Comment author: Nornagest 14 December 2013 10:52:53PM 1 point [-]

If you're dealing with creatures good enough at modeling the world to predict the future and transfer skills, then you're dealing with memetic factors as well as genetic. That's rather beyond the scope of natural selection as typically defined.

Granted, I suppose there are theoretical situations where that argument wouldn't apply -- but I'm having trouble imagining an animal smart enough to make decisions based on projected consequences more than one selection round out, but too dumb to talk about it. We ourselves aren't nearly that smart individually.

Comment author: timtyler 14 December 2013 11:54:25PM *  -2 points [-]

If you're dealing with creatures good enough at modeling the world to predict the future and transfer skills, then you're dealing with memetic factors as well as genetic. That's rather beyond the scope of natural selection as typically defined.

What?!? Natural selection applies to both genes and memes.

I suppose there are theoretical situations where that argument wouldn't apply

I don't think you presented a supporting argument. You referenced "typical" definitions of natural selection. I don't know of any definitions that exclude culture. Here's a classic one from 1970 - which explicitly includes cultural variation. Even Darwin recognised this, saying: "The survival or preservation of certain favoured words in the struggle for existence is natural selection."

If anyone tells you that natural selection doesn't apply to cultural variation, they are simply mistaken.

I'm having trouble imagining an animal smart enough to make decisions based on projected consequences more than one selection round out, but too dumb to talk about it.

I recommend not pursuing this avenue.

Comment author: Nornagest 14 December 2013 09:13:39PM 1 point [-]

Optimization involves going uphill - but you might be climbing a mountain that is sinking into the sea. However, that doesn't mean that you weren't really optimizing - or that you were optimizing something other than altitude.

The question's more about what function's generating the fitness landscape you're looking at (using "fitness" now in the sense of "fitness function"). "Survival" isn't a bad way to characterize that fitness function -- more than adequate for eighth-grade science, for example. But it's a short-term and highly specialized kind of survival, and generalizing from the word's intuitive meaning can really get you into trouble when you start thinking about, for example, death.

Comment author: timtyler 14 December 2013 10:46:04PM 1 point [-]

The question's more about what function's generating the fitness landscape you're looking at (using "fitness" now in the sense of "fitness function"). "Survival" isn't a bad way to characterize that fitness function -- more than adequate for eighth-grade science, for example. But it's a short-term and highly specialized kind of survival [...]

Evolution is only as short-sighted as the creatures that compose its populations. If organisms can do better by predicting the future (and sometimes they can) then the whole process is a foresightful one. Evolution is often characterised as 'blind to the future' - but that's just a mistake.

Comment author: Nornagest 14 December 2013 08:39:02PM *  1 point [-]

Even to the extent that natural selection can be said to care about anything, saying that survival is that thing is kind of misleading. It's perfectly normal for populations to hill-climb themselves into a local optimum and then get wiped out when it's invalidated by changing environmental conditions that a more basal but less specialized species would have been able to handle, for example.

(Pandas are a good example, or would be if we didn't think they were cute.)

Comment author: timtyler 14 December 2013 09:07:00PM 0 points [-]

Even to the extent that natural selection can be said to care about anything, saying that survival is that thing is kind of misleading.

Well, I have gone into more details elsewhere.

It's perfectly normal for populations to hill-climb themselves into a local optimum and then get wiped out when it's invalidated by changing environmental conditions that a more basal but less specialized species would have been able to handle, for example.

Sure. Optimization involves going uphill - but you might be climbing a mountain that is sinking into the sea. However, that doesn't mean that you weren't really optimizing - or that you were optimizing something other than altitude.
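The sinking-mountain point can be made concrete with a toy sketch (all names and numbers here are hypothetical, invented purely for illustration): a greedy climber on a hill whose entire surface drops over time. Each accepted step genuinely increases altitude at that instant, yet the final absolute altitude can end up below the starting point.

```python
import math
import random

def altitude(x, t):
    # Hypothetical landscape: a hill centred at x = 3 whose whole
    # surface sinks by 0.5 per time step ("sinking into the sea").
    return 10.0 * math.exp(-(x - 3.0) ** 2) - 0.5 * t

def hill_climb(steps=50, step_size=0.1, seed=0):
    random.seed(seed)
    x = 0.0
    for t in range(steps):
        # Greedy local move: accept only if altitude improves at time t.
        # The sinking term -0.5 * t cancels out of this comparison, so
        # the climber is really optimizing altitude at every step.
        candidate = x + random.choice([-step_size, step_size])
        if altitude(candidate, t) > altitude(x, t):
            x = candidate
    return x
```

Running `hill_climb()` moves the climber toward the peak at x = 3, and every accepted move was an improvement at the moment it was made; the climber was optimizing altitude the whole time, even though the mountain's descent leaves it lower, in absolute terms, than where it began.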

Comment author: ialdabaoth 14 December 2013 07:48:56PM *  4 points [-]

Nature's most common solution involves sexual reproduction - and not "tiling". It's not necessarily a good thing to rule out the most common solution in the statement of a problem.

True, but nature's goals are not our own.

The reason sexual reproduction is acceptable is that Nature doesn't care about the outcome, as long as the outcome includes 'be fruitful and multiply'. If we have an agent with its own goals, it will need more robust strategies to avoid its descendants' behaviors falling back to Nature's fundamental Darwinian imperatives.

Comment author: timtyler 14 December 2013 08:25:17PM -6 points [-]

Nature only "cares" about survival. However, that's also exactly what we should care about - assuming that our main priority is avoiding eternal obliteration.

Comment author: ialdabaoth 14 December 2013 04:38:45PM *  3 points [-]

It also raises other questions - such as: how will such a monoculture resist exploitation by parasites?

Aren't these actually the same question? "Exploitation by parasites" is actually a behavior, so it's a subset of the general trust question.

Comment author: timtyler 14 December 2013 07:42:43PM -3 points [-]

In biology, the "how can you trust your descendants?" question is rarely much of an issue - typically, you can't.

The issue of how to ensure your descendants don't get overrun by parasites is more of a real problem.

Nature's most common solution involves sexual reproduction - and not "tiling". It's not necessarily a good thing to rule out the most common solution in the statement of a problem.

Comment author: timtyler 14 December 2013 04:34:03PM *  -3 points [-]

We want to be able to consider agents which build slightly better versions of themselves, which build slightly better versions of themselves, and so on. This is referred to as an agent "tiling" itself. This introduces a question: how can the parent agent trust its descendants?

It also raises other questions - such as: how will such a monoculture resist exploitation by parasites?

Comment author: byrnema 11 December 2013 03:40:13PM *  0 points [-]

This is not very surprising, given his background in handwriting and image recognition.

Could you elaborate on the connections between image recognition / interpretation and prediction? For this reply, it's fine to be only roughly accurate. (In case an inability to be sufficiently rigorous is what prevented you from sketching the connection.)

...naively, I think of intelligence as, say, an ability to identify and solve problems. Is LeCun saying perhaps that this is equal to prediction, or not as important as prediction, or that he's more interested in working on the latter?

Comment author: timtyler 14 December 2013 04:18:36PM 1 point [-]

Here is one of my efforts to explain the links: Machine Forecasting.

Comment author: ShardPhoenix 11 December 2013 08:08:35AM 1 point [-]

As an outsider I kind of get the impression that there is a bit of looking-under-the-streetlamp syndrome going on here where world-modelling is assumed to be the most/only important feature because that's what we can currently do well. I got the same impression seeing Jeff Hawkins speaking at a conference recently.

Comment author: timtyler 14 December 2013 04:16:19PM 0 points [-]

I'm pretty sure that we suck at prediction - compared to evaluation and tree-pruning. Prediction is where our machines need to improve the most.

Comment author: Eliezer_Yudkowsky 11 December 2013 09:09:48PM 2 points [-]

Agreed. And search is not the same problem as prediction, you can have a big search problem even when evaluating/predicting any single point is straightforward.
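The point that cheap point-evaluation doesn't make a search problem small can be illustrated with a standard toy example (my choice of illustration, not anything from the thread): subset-sum, where checking any single candidate is one sum and a comparison, but the candidate space doubles with every added element.

```python
from itertools import combinations

def evaluate(subset, target):
    # Evaluating a single point is trivial: one sum, one comparison.
    return sum(subset) == target

def search(numbers, target):
    # The search space, however, has 2 ** len(numbers) candidates:
    # easy evaluation does not imply an easy search problem.
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if evaluate(subset, target):
                return subset
    return None
```

For example, `search([3, 9, 8, 4, 5, 7], 15)` finds a subset summing to 15 after scanning candidates whose count grows exponentially with the input size, even though each `evaluate` call is constant-time in practice.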

Comment author: timtyler 14 December 2013 03:01:56AM 0 points [-]

search is not the same problem as prediction

It is when what you are predicting is the results of a search. Prediction covers searching.

Comment author: timtyler 14 December 2013 02:50:45AM *  0 points [-]

It is interesting that his view of AI is apparently that of a prediction tool [...] rather than of a world optimizer.

If you can predict well enough, you can pass the Turing test - with a little training data.
