Nick_Tarleton comments on Complexity of Value ≠ Complexity of Outcome - Less Wrong

32 Post author: Wei_Dai 30 January 2010 02:50AM


Comment author: Nick_Tarleton 30 January 2010 06:16:15PM *  2 points

Things like that seem simply misguided to me. IMO, there are good reasons for thinking that that would lead to enormous complexity

...but not in the least convenient possible world with an ontologically simple turn-everything-into-orgasmium button; and the sort of complexity you mention, which (I agree) would be involved in the actual world, isn't a sort that most people regard as terminally valuable.

Comment author: timtyler 30 January 2010 08:27:41PM *  -1 points

Here we were talking about a superintelligent agent whose "fondest desire is to fill the universe with orgasmium". About the only way such an agent would fail to produce enormous complexity is if it died - or was otherwise crippled or imprisoned.

Whether humans would want to live - or would survive - in the same universe as an orgasmium-loving superintelligence seems like a totally different issue to me - and it seems rather irrelevant to the point under discussion.

Comment author: Nick_Tarleton 30 January 2010 09:32:21PM *  3 points

Here we were talking about a superintelligent agent whose "fondest desire is to fill the universe with orgasmium". About the only way such an agent would fail to produce enormous complexity is if it died - or was otherwise crippled or imprisoned.

Or if the agent has a button that, through simple magic, directly fills the universe with (stable) orgasmium. Did you even read what I wrote?

Whether humans would want to live - or would survive - in the same universe as an orgasmium-loving superintelligence seems like a totally different issue to me - and it seems rather irrelevant to the point under discussion.

Human morality is the point under discussion, so of course it's relevant. It seems clear that the chief kind of "complexity" that human morality values is that of conscious (whatever that means) minds and societies of conscious minds, not complex technology produced by unconscious optimizers.

Comment author: timtyler 30 January 2010 09:52:16PM *  -1 points

Re: Did you even read what I wrote?

I think I missed the bit where you went off into a wild and highly-improbable fantasy world.

Re: Human morality is the point under discussion

What I was discussing was the "tendency to assume that complexity of outcome must have been produced by complexity of value". That is not specifically to do with human values.