wedrifid comments on Rationality Quotes September 2012 - Less Wrong

7 Post author: Jayson_Virissimo 03 September 2012 05:18AM


Comment author: Cyan 17 September 2012 02:14:05PM 1 point

"I wish for this wish to have no further effect beyond this utterance."

Comment author: wedrifid 17 September 2012 03:22:49PM 6 points

"I wish for this wish to have no further effect beyond this utterance."

Overwhelmingly probable dire consequence: you and everyone you love die (over a period of 70 years); then, eventually, your entire species goes extinct. But hey, at least it's not "your fault".

Comment author: Cyan 17 September 2012 07:36:40PM 0 points

But, alas, it's the wish that maximizes my expected utility -- for the malicious genie, anyway.

Comment author: wedrifid 18 September 2012 06:48:57AM 2 points

But, alas, it's the wish that maximizes my expected utility -- for the malicious genie, anyway.

Possibly. I don't offhand see what a malicious genie could do about that statement. However, it does at least require the genie to honor a certain interpretation of your words, as well as your philosophy about causality: in particular, to accept a certain idea of what the 'default' is, relative to which 'no effect' can have meaning. There is enough flexibility in how to interpret your wish that I begin to suspect that, conditional on the genie being sufficiently amiable and constrained that it gives you what you want in response to this wish, it is likely possible to construct another wish that has no side effects beyond something you can exploit as a fungible resource.

"No effect" is a whole heap more complicated and ambiguous than it looks!

Comment author: JulianMorrison 17 September 2012 03:36:12PM -2 points

You can't use that tool to solve that problem.

Meanwhile, you have <= 70 years to solve it another way.

Comment author: wedrifid 17 September 2012 04:10:23PM 1 point

You can't use that tool to solve that problem.

You can't? So much the worse for your species. I quite possibly couldn't either. I'd probably at least think about it for five minutes first, and I might even make a phone call. And if my advisers and I conclude that, for some bizarre reason, "no further effect beyond this utterance" is better than any other simple wish that is an incremental improvement, then I may end up settling for it. But I'm not going to pretend that I have found some way to wash my hands of responsibility.

Meanwhile, you have <= 70 years to solve it another way.

Yes, that's better than the catastrophic instant death of my species. And if I happen to estimate that my species has a 90% chance of extinction within a couple of hundred years, then I would be making the choice to accept a 90% chance of that doom. I haven't cleverly tricked my way out of a moral conundrum; I have made a gamble with the universe at stake, for better or for worse.
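The gamble here can be made concrete with a toy expected-survival comparison. Only the 90% background extinction estimate comes from the comment; the 40% misfire risk for an alternative wish is an invented illustrative number, not a claim from the thread.

```python
# Toy expected-survival comparison for two wishes.
# Only the 90% background estimate comes from the comment;
# the 40% figure is an invented assumption for illustration.

def p_survival(p_catastrophe: float) -> float:
    """Probability the species survives, given a wish's catastrophe risk."""
    return 1.0 - p_catastrophe

# The "no further effect" wish changes nothing, so the background
# extinction estimate (90% here) still applies.
null_wish = p_survival(0.90)

# A carefully worded incremental wish with an assumed 40% misfire risk.
incremental_wish = p_survival(0.40)

# The incremental wish dominates whenever its risk is below the
# background 90%; choosing the null wish is still choosing the gamble.
print(null_wish, incremental_wish)
```

The point of the sketch is that "no effect" is not a neutral option: it locks in whatever the background odds already were.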

Relevant reading: The Parable of the Talents.

Comment author: MugaSofer 17 September 2012 04:19:11PM 0 points

You can't use that tool to solve that problem.

"I wish for all humans to be immortal."

Sure, you need to start heavily promoting birth control, and there can be problems depending on how you define "immortal", but ...

It's a wish. You can wish for anything.

Unless, I suppose, that would have been your first wish. But the OP basically says your first wish was an FAI.

Comment author: mfb 17 September 2012 04:49:24PM -1 points

Immortal humans can go horribly wrong, unless "number of dying humans" is really what you want to minimize.

"Increase my utility as much as you can"?

Comment author: MugaSofer 18 September 2012 07:45:52AM 0 points

I said:

there can be problems depending on how you define "immortal"

You replied:

Immortal humans can go horribly wrong

I am well aware that this wish has major risks as worded. I was responding to the claim that "you can't use that tool to solve that problem."

Yes, obviously you wish for maximised utility. But that requires the genie to understand your utility.

Comment author: chaosmosis 17 September 2012 04:53:09PM 0 points

"Increase my utility as much as you can"?

That would just cause them to pump chemicals into your head, I think. But it's definitely thinking in the right direction.

"number of dying humans" is really what you want to minimize.

Even with pseudo-immortality, accidents happen, which means that the best way to minimize the number of dying humans is either to sterilize the entire species or to kill everyone. The goal shouldn't be to minimize death but to maximize life.
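The degenerate optimum described above can be sketched as a toy calculation (the accident rate, horizon, and policy names are all invented for illustration): over a long enough horizon, a literal death-count minimizer prefers ending reproduction to merely curing mortality, because every human ever born eventually meets an accident.

```python
# Toy model of the mis-specified objective "minimize the number of dying
# humans". All rates and horizons are invented for illustration.

ACCIDENT_RATE = 0.001      # fraction of immortals lost to accidents per century
POP = 8_000_000_000        # current population (held steady by births, if any)
HORIZON = 100_000          # centuries considered by the optimizer

def total_deaths(policy: str) -> float:
    if policy == "immortality, keep reproducing":
        # A fixed fraction still dies by accident every century, forever,
        # so total deaths grow without bound as the horizon grows.
        return POP * ACCIDENT_RATE * HORIZON
    if policy == "immortality, sterilize everyone":
        # No new humans are ever born, so total deaths are capped at the
        # current population (each person eventually meets an accident).
        return POP
    raise ValueError(policy)

policies = ["immortality, keep reproducing", "immortality, sterilize everyone"]

# The naive death-minimizer picks sterilization: fewer deaths, no species.
naive_choice = min(policies, key=total_deaths)
print(naive_choice)
```

This is why the comment reframes the goal as maximizing life rather than minimizing death: the naive objective is satisfied best by an empty world.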

Overwrite my current utility function upon your previous motivational networks, leaving no motivational trace of their remains.

That actually seems like it'd work.

Comment author: wedrifid 17 September 2012 05:33:31PM 1 point

"Increase my utility as much as you can"?

That would just cause them to pump chemicals in you head, I think.

It wouldn't do that (except in the sense that it is able to do arbitrary things you don't mean when given complicated or undefined requests).

Comment author: mfb 18 September 2012 04:52:08PM -1 points

That would just cause them to pump chemicals in you head, I think. But it's definitely thinking in the right direction.

As long as I am not aware of it (or do not dislike it)... well, why not? However, MugaSofer is right: the genie has to understand the (future) utility function for that. But if it can alter the future without restriction, it can change the utility function itself (maybe even to an unbounded one... :D)