Ghatanathoah comments on In Praise of Boredom - Less Wrong

23 Post author: Eliezer_Yudkowsky 18 January 2009 09:03AM

Comment author: Ghatanathoah 13 June 2012 07:16:49AM 7 points [-]

Our civilisation maximises entropy - not paperclips - which hardly seems much more interesting.

No it doesn't. It tries to maximize fun. It might maximize entropy as a side effect, but saying that we act to maximize entropy is as ludicrous as saying cows act to maximize atmospheric methane content. You're confusing a side-effect with the real goal.

A constant theme I've noticed in your posts is that you take some (usually evolutionary) trend that is occurring in our society or history and then act as if that trend is an actual conscious goal of human beings and of life itself. You then exhibit confusion when someone states that their real conscious goals are in opposition to this trend, and tell them that because the trend is occurring they must really want it to occur, and that it is part of their real goals. This demonstrates rather confused thinking, both about how human minds work and about what a "goal" really is.

Scientists often metaphorically describe trends as having goals and "maximizing" things; you seem to have taken these metaphors excessively literally and act as if these trends literally have goals and literally maximize things. Terms like "goals" and "maximization" only refer, in the literal sense, to the computations of consequentialist thinking beings (consequentialist in this case meaning a being that can forecast the future, not the moral theory). A goal is a forecast of the future that a consequentialist assigns favorable utility to. Maximizing refers to a consequentialist that values a certain property of the future so much that it assigns very favorable utility to increasing it as much as possible. These are the only appropriate times to literally use the terms "goal" and "maximize"; all other uses are metaphorical.
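The distinction being drawn here can be sketched in code (all names and numbers below are purely illustrative, not from the discussion): a literal maximizer forecasts candidate futures and evaluates them, while a mindless trend just applies a fixed update rule with no forecasting at all.

```python
def consequentialist_choice(actions, forecast, utility):
    """A literal maximizer: forecast the future each action leads to,
    and pick the action whose forecast future has the highest utility."""
    return max(actions, key=lambda a: utility(forecast(a)))

def mindless_trend(state, steps):
    """A trend: a quantity rises as a side effect of a fixed rule.
    Nothing here forecasts or evaluates futures."""
    for _ in range(steps):
        state = state * 0.9 + 10  # fixed dynamics; drifts toward 100
    return state

# A toy agent that values "fun": it forecasts, evaluates, and chooses.
actions = ["work", "play", "sleep"]
forecast = lambda a: {"work": 3, "play": 9, "sleep": 5}[a]  # predicted fun
utility = lambda fun: fun
best = consequentialist_choice(actions, forecast, utility)  # "play"
```

Only the first function has the machinery ("forecast" plus "utility") that makes "goal" and "maximize" literally applicable; the second merely exhibits a trend an observer could describe metaphorically as "maximizing" its limit.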

Evolution and other trends do not literally maximize anything. They have no goals. It is just sometimes useful to metaphorically pretend they do, because it makes thinking about them easier for human brains, which are better at modeling other consequentialists than at modeling abstract trends. Any claim that evolution has a goal in a non-metaphorical sense is a blatant falsehood. Until you realize this fact you will fail to understand virtually everything that Eliezer is talking about.

There is no good reason why this would lead to a "worthless, valueless future" - which is why Yudkowsky fails to provide one.

Yes there is. Such a future would be boring. It is bad for things to be boring and good for things to be interesting, so such a future would be bad. And don't ask me why boringness is bad; that's like asking why water is H2O. You are asking Eliezer to provide some meaning of good separate from things like truth, fun, novelty, life, etc., when he has clearly explained that there is no meaning of good separate from those things. Those things are good, so a world where they don't exist, or are reduced, would be bad, full stop.

Comment author: timtyler 13 June 2012 10:09:59AM *  -2 points [-]

No it doesn't. It tries to maximize fun. It might maximize entropy as a side effect, but saying that we act to maximize entropy is as ludicrous as saying cows act to maximize atmospheric methane content. You're confusing a side-effect with the real goal.

Really? How do you know that? Are plants trying to maximise "fun"? Is "fun" even a measurable quantity?

If "fun" is being maximised, why is there so much suffering in the world? If two systems are in contention, is it really the one that is having the most fun that will win? The "fun-as-maximand" theory seems trivially refuted by the facts.

"Fun" - if we are trying to treat the concept seriously - is better characterised as the proxy that brains use for the inclusive fitness of their associated organism.

There's a scientific literature on the subject of what God's utility function is. Entire books have been written about the topic. I'm familiar with this literature, are you?

Terms like "goals" and "maximization" only refer, in the literal sense, to the computations of consequentialist thinking beings (consequentialist in this case meaning a being that can forecast the future, not the moral theory). A goal is a forecast of the future a consequentialist assigns favorable utility to. Maximizing refers to a consequentialist that values a certain property in the future so much it assigns very favorable utility to increasing it as much as possible. This is the only appropriate time to literally use the terms "goal" and "maximize," all other times are metaphorical.

We had better talk about "optimization" then, or we will talk past each other.

Evolution and other trends do not literally maximize anything.

Really? How do you know that? Evolution is a gigantic optimization process with a maximand. You claimed above that it is "fun" - and my claim is that it is entropy. As I say, there's a substantial scientific literature on the topic - have you looked at it?

Comment author: RichardKennaway 13 June 2012 11:47:07AM 3 points [-]

Evolution is a gigantic optimization process with a maximand

Success for the fox is failure for the rabbit; success for the rabbit is failure for the fox. What is the maximand?

Comment author: TheOtherDave 13 June 2012 01:32:56PM 0 points [-]

OTOH, as rabbits become better fox-evaders, foxes become better rabbit-hunters. If there exists some thing X that fox-evasion and rabbit-hunting have in common, it's possible (I would even say likely) that X is increasing throughout this process.
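This coevolutionary point can be illustrated with a toy arms-race sketch (the model and all its parameters are invented for illustration): neither the fox nor the rabbit "wins" overall, yet a shared quantity, here speed, rises on both sides.

```python
import random

def coevolve(generations=100, seed=0):
    """Toy fox-rabbit arms race. Each generation, the side that is
    behind is selected to close the gap, and mutation nudges both
    upward on average. The shared quantity X (speed) rises for both."""
    rng = random.Random(seed)
    fox_speed, rabbit_speed = 1.0, 1.0
    for _ in range(generations):
        fox_speed += max(0.0, rabbit_speed - fox_speed) * 0.1 \
            + rng.uniform(0.0, 0.05)
        rabbit_speed += max(0.0, fox_speed - rabbit_speed) * 0.1 \
            + rng.uniform(0.0, 0.05)
    return fox_speed, rabbit_speed

fox_final, rabbit_final = coevolve()
# Success for one side is failure for the other, but the quantity
# they compete over increases throughout the process on both sides.
```

The sketch is not a claim about what X actually is in real ecosystems; it only shows that "no winner" is compatible with "some shared quantity monotonically increases."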

Comment author: timtyler 13 June 2012 11:22:36PM 1 point [-]

Increasing != maximising, though. Methane is increasing in both cases - but evolution doesn't maximise methane production.

Comment author: TheOtherDave 14 June 2012 02:36:25AM 0 points [-]

Not sure why it's relevant, but certainly true.

Comment author: timtyler 13 June 2012 11:18:03PM *  0 points [-]

So: entropy, as far as we can tell. See the works of Dewar, referenced here. Or, for a popular version from someone other than me, try Whitfield, John, "Survival of the Likeliest?"

Comment author: Ghatanathoah 14 June 2012 02:09:36AM *  2 points [-]

Really? How do you know that?

You said: "Our civilisation maximises entropy." Our civilization consists of all the humans in the world. When you're asking what our civilization is trying to maximize you're asking what the humans of the world are trying to maximize. Humans try to do things they enjoy, things that are fun. Therefore our civilization tries to maximize fun.

I know that because that's basic human psychology 101. Humans want to be happy and have fulfilled preferences.

Are plants trying to maximise "fun"?

We're talking about our civilization. In other words, all the humans in the world. Plants aren't human, so whether they maximize fun is irrelevant. I suppose if you regarded human tools and artifacts as part of our civilization then agricultural plants could be regarded as part of it. But they aren't the part of our civilization that makes decisions on what to maximize, humans are.

Plants aren't trying to maximize anything. They're plants; they don't have minds. If I were to use the word "maximize" as liberally as you do, I could actually argue that agricultural plants do try to maximize fun, because humans grow them for the purpose of eating, and eating is fun. But that wouldn't be strictly accurate: plants just execute their genetically coded behaviors, and any purpose they have is really the purpose of the consequentialist minds that grow them, not of the plants. Saying that agricultural plants have any purpose at all is the mind-projection fallacy.

If "fun" is being maximised, why is there so much suffering in the world?

Because some humans are selfish and try to maximize their own fun at the expense of the fun of others. And sometimes we make big mistakes when trying to make the world more fun. But still, most of the time we try to work together to have fun. We aren't that good at it yet, but we're trying and keep improving. The world is getting progressively more fun.

If two systems are in contention, is it really the one that is having the most fun that will win?

Yes. Humans who are enjoying life the most are generally regarded as being more successful at life than humans who are not. This is a basic and easily observable fact.

The "fun-as-maximand" theory seems trivially refuted by the facts.

It's easily confirmed by the facts. As humans have grown richer and more technologically advanced they have devoted more and more of their resources to having fun. Look at the existence of places like Disneyworld for evidence.

"Fun" - if we are trying to treat the concept seriously - is better characterised as the proxy that brains use for the inclusive fitness of their associated organism.

No it isn't. Brains don't care about inclusive genetic fitness. At all. They never have. If you want evidence for that, note the fact that humans do things like use condoms. Also note that the growth of the world's population is slowing and will probably stop by the end of the 21st century if trends continue.

There's a scientific literature on the subject of what God's utility function is. Entire books have been written about the topic. I'm familiar with this literature, are you?

That literature has exactly zero relevance to our current discussion, which is what human beings value, care about, and try to maximize. You learn about that by studying basic psychology. Evolutionary theory may give us insights into how humans came to have our current values, but it has no bearing on what we should do now that we have them.

Our values are what we value, how we came to have them is irrelevant. If our values were bestowed on us by an alien geneticist rather than evolution we would behave exactly the same as we do now. Humans don't give a crap about "god's utility function." If they end up increasing entropy it is as a side effect to obtaining their real goals.

We had better talk about "optimization" then, or we will talk past each other.

Optimization has the same problem. Optimization literally refers to a consequentialist creature using its future forecasting abilities to determine how an object or meme would better suit its goals and altering that thing accordingly. Evolution can be metaphorically said to optimize, but that isn't strictly true. It's just a form of personification to make thinking about evolution easier.

Strictly speaking, evolution is just a description of a series of trends. Since human minds are bad at modeling trends, but good at modeling other consequentialists, sometimes it's useful to pretend that evolution is a consequentialist with "goals" and a "utility function" to help people understand it. It's less scientifically accurate than modeling evolution as a series of trends, but it makes up for that by being easier for a human brain to compute. The problem is that, while most scientists understand this, there are some people who misinterpret this to mean that evolution literally has goals, desires, and utility functions. You appear to be one of these people.

Really? How do you know that?

Because literally speaking, only consequentialist minds maximize things. You might be able to say evolution maximizes things as a useful metaphor, but literally speaking it isn't true.

Evolution is a gigantic optimization process with a maximand.

No it isn't. It is useful to pretend that it is because doing so makes it a little easier for the human mind to think about evolution. But really, evolution is just an abstract series of mindless trends.

You claimed above that it is "fun" - and my claim is that it is entropy.

I never claimed evolution tries to maximize fun. I claimed our civilization does. In other words, that the consequentialist minds making up human civilization use their forecasting abilities to foresee possible futures, and then steer the universe towards the one where they are having the most fun.

As I say, there's a substantial scientific literature on the topic - have you looked at it?

I'm familiar with some of the literature, and I've looked at your website. You constantly confuse the metaphorical "goals" evolution has with the real goals that consequentialist minds such as human beings have. For instance you say:

Another example: currently, researchers at ITER in France are working on an enormous fusion reactor, to allow us to accelerate the conversion of order into entropy still further.

This is trivially false, the reason researchers are working on a fusion reactor is to secure human beings cheap renewable energy to have more fun with. The fact that it increases entropy is a side-effect. The consequentialist human minds do not foresee a future with more entropy and take action in order to secure that future. They foresee a future where humans are using cheap energy to have more fun and take actions to secure that future. The entropy increase is an unfortunate, but acceptable side effect.

What you remind me of is one of those theologians who describe God as an "unmoved mover" or something like that and suggest such a thing must exist (which was a reasonable hypothesis at one point in history, even if it isn't now). They then make the ridiculous leap of logic that because an unmoved mover must exist, and you can call such a thing "God," that therefore a God with all the ludicrously specific human-like properties described in the Bible must exist.

Similarly, you take some basic facts about evolution and physics that every educated person agrees are true. Then you make bizarre leaps of logic to conclude that human beings care about maximizing IGF and maximizing entropy and other obvious falsehoods. I am not objecting to the evolutionary biology research you cite, I am objecting to the bizarre and unjustified inferences about human psychology and moral philosophy you use that research to make.

Comment author: timtyler 14 June 2012 11:38:13PM *  -2 points [-]

Then you make bizarre leaps of logic to conclude that human beings care about maximizing IGF and maximizing entropy and other obvious falsehoods.

To clarify, many humans fail to maximise their own inclusive fitnesses - largely because they are malfunctioning - with many of the most common malfunctions being caused by parasites - and the most common parasites being responsible for memetic hijacking. Humans and the ecosystems they are part of really do maximise entropy (subject to constraints) - or at least maximum entropy production (MEP) is a deep and powerful explanatory principle when it comes to complex adaptive systems and living systems.

Comment author: timtyler 15 June 2012 12:03:01AM *  -2 points [-]

Another example: currently, researchers at ITER in France are working on an enormous fusion reactor, to allow us to accelerate the conversion of order into entropy still further.

This is trivially false, the reason researchers are working on a fusion reactor is to secure human beings cheap renewable energy to have more fun with. The fact that it increases entropy is a side-effect. The consequentialist human minds do not foresee a future with more entropy and take action in order to secure that future. They foresee a future where humans are using cheap energy to have more fun and take actions to secure that future. The entropy increase is an unfortunate, but acceptable side effect.

This line of reasoning is intuitive, but, I believe, wrong. Destroying energy gradients is actively selected for in lots of ways. For example, it actively deprives competitors of resources. Organisms compete to dissipate sources of order by reaching them quickly and eliminating them before others can. The picture of entropy as an inconvenient side effect seems attractive initially, but doesn't withstand close inspection.

I don't deny that properly functioning brains act like hedonic maximisers. Hedonic maximisation is a much weaker explanatory principle than entropy maximisation, though. The latter explains why water flows downhill. Hedonic maximisation is a narrow and weak idea - by comparison.

Comment author: timtyler 15 June 2012 12:19:59AM -2 points [-]

Strictly speaking, evolution is just a description of a series of trends. Since human minds are bad at modeling trends, but good at modeling other consequentialists, sometimes it's useful to pretend that evolution is a consequentialist with "goals" and a "utility function" to help people understand it. It's less scientifically accurate than modeling evolution as a series of trends, but it makes up for that by being easier for a human brain to compute. The problem is that, while most scientists understand this, there are some people who misinterpret this to mean that evolution literally has goals, desires, and utility functions. You appear to be one of these people.

Feel free to substitute "maximisation" terminology if my preferred lingo causes you conceptual problems. Selfishness, progress and optimisation can all be "cashed out" in more long-winded terms. Remember: teleonomy is just teleology in new clothes.

Comment author: timtyler 15 June 2012 12:22:50AM 3 points [-]

We had better talk about "optimization" then, or we will talk past each other.

Optimization has the same problem. Optimization literally refers to a consequentialist creature using its future forecasting abilities to determine how an object or meme would better suit its goals and altering that thing accordingly.

Nonsense. Look it up.