Alicorn comments on Rationality Quotes September 2012 - Less Wrong

7 Post author: Jayson_Virissimo 03 September 2012 05:18AM


Comment author: Eliezer_Yudkowsky 05 September 2012 10:34:06PM 22 points [-]

I should like to point out that anyone in this situation who wishes what would've been their first wish if they had three wishes is a bloody idiot.

Comment author: Alicorn 05 September 2012 10:49:57PM 9 points [-]

So: A genie pops up and says, "You have one wish left."

What do you wish for? Because presumably the giftwrapped FAI didn't work so great.

Comment author: CCC 18 September 2012 07:46:25AM 5 points [-]

"I wish to know what went wrong with my first wish."

This way, I at least end up with improved knowledge of what to avoid in the future.

Alternatively, "I wish for a magical map, which shows me, in real time, the location of every trapped genie and other potential source of wishes in the world." Depending on how many there are, I can potentially get a lot more feedback that way.

Comment author: siodine 05 September 2012 11:56:43PM 4 points [-]

I bet he'd wish "to erase all uFAI from existence before they're even born. Every uFAI in every universe, from the past and the future, with my own hands."

Comment author: Eliezer_Yudkowsky 06 September 2012 05:18:28AM 1 point [-]

Nobody believes in the future.

Nobody accepts the future.

Then -

Comment author: MugaSofer 17 September 2012 01:08:29PM 0 points [-]

Perhaps I'm simply being an idiot, but ... huh?

Comment author: ArisKatsaris 17 September 2012 01:39:28PM *  1 point [-]

It's a reference to an anime; you're not an idiot, just unlikely to get the reference and its appropriateness if you've not seen it yourself. PM me for the anime's name, if you are one of the people who either don't mind getting slightly spoiled, or are pretty sure that you would never get a chance to watch it on your own anyway.

Comment author: RichardKennaway 17 September 2012 02:15:02PM 2 points [-]

Could you just rot13 it? I'm curious too, I don't mind the spoiler, and whatever it is, I'd probably be more likely to watch it (even if only 2epsilon rather than epsilon) for knowing the relevance to LW.
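(The rot13 convention mentioned here shifts each letter 13 places along the 26-letter alphabet, so applying it a second time recovers the original text — readers decode only if they choose to. A minimal Python sketch:)

```python
import codecs

# rot13 shifts each letter 13 places; since the alphabet has 26
# letters, encoding twice is the identity, which is what makes it
# a reader-opt-in spoiler cipher.
encoded = codecs.encode("Hello", "rot_13")
print(encoded)                           # Uryyb
print(codecs.encode(encoded, "rot_13"))  # Hello
```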

Comment author: ArisKatsaris 17 September 2012 02:29:59PM 0 points [-]

I'll just PM you the title too, and anyone else who wants me to likewise. Sorry, it just happens to be one of my favourite series, and all other things being equal I tend to prefer that people go into it as completely unspoilered as possible... Even knowing Eliezer's quote is a reference to it counts as a mild spoiler... explanation about how it is a reference would count as a major spoiler.

Comment author: TimS 17 September 2012 01:13:20PM 1 point [-]

I think that's Eliezer's prediction of the results of siodine's wish. Because wishes are NOT SAFE.

Comment author: MugaSofer 17 September 2012 01:21:02PM 1 point [-]

But what is he predicting, exactly?

Comment author: Cyan 17 September 2012 02:14:05PM 1 point [-]

"I wish for this wish to have no further effect beyond this utterance."

Comment author: wedrifid 17 September 2012 03:22:49PM *  6 points [-]

"I wish for this wish to have no further effect beyond this utterance."

Overwhelmingly probable dire consequence: You and everyone you love dies (over a period of 70 years) then, eventually, your entire species goes extinct. But hey, at least it's not "your fault".

Comment author: Cyan 17 September 2012 07:36:40PM 0 points [-]

But, alas, it's the wish that maximizes my expected utility -- for the malicious genie, anyway.

Comment author: wedrifid 18 September 2012 06:48:57AM *  2 points [-]

But, alas, it's the wish that maximizes my expected utility -- for the malicious genie, anyway.

Possibly. I don't offhand see what a malicious genie could do about that statement. However, it does at least require the genie to honor a certain interpretation of your words as well as your philosophy about causality---in particular, to accept a certain idea of what the 'default' is, relative to which 'no effect' can have meaning. There is enough flexibility in how to interpret your wish that I begin to suspect that, conditional on the genie being sufficiently amiable and constrained that it gives you what you want in response to this wish, it is likely possible to construct another wish that has no side effects beyond something you can exploit as a fungible resource.

"No effect" is a whole heap more complicated and ambiguous than it looks!

Comment author: JulianMorrison 17 September 2012 03:36:12PM -2 points [-]

You can't use that tool to solve that problem.

Meanwhile, you have <= 70 years to solve it another way.

Comment author: wedrifid 17 September 2012 04:10:23PM 1 point [-]

You can't use that tool to solve that problem.

You can't? So much the worse for your species. I quite possibly couldn't either. I'd probably at least think about it for five minutes first. I may even make a phone call. And if I and my advisers conclude that for some bizarre reason "no further effect beyond this utterance" is better than any other simple wish that is an incremental improvement, then I may end up settling for it. But I'm not going to pretend that I have found some sort of way to wash my hands of responsibility.

Meanwhile, you have <= 70 years to solve it another way.

Yes, that's better than catastrophic instant death of my species. And if I happen to estimate that my species has 90% chance of extinction within a couple of hundred years then I would be making the choice to accept a 90% chance of that doom. I haven't cleverly tricked my way out of a moral conundrum, I have made a gamble with the universe at stake, for better or for worse.

Relevant reading: The Parable of the Talents.

Comment author: MugaSofer 17 September 2012 04:19:11PM 0 points [-]

You can't use that tool to solve that problem.

"I wish for all humans to be immortal."

Sure, you need to start heavily promoting birth control, and there can be problems depending on how you define "immortal", but ...

It's a wish. You can wish for anything.

Unless, I suppose, that would have been your first wish. But the OP basically says your first wish was an FAI.

Comment author: mfb 17 September 2012 04:49:24PM -1 points [-]

Immortal humans can go horribly wrong, unless "number of dying humans" is really what you want to minimize.

"Increase my utility as much as you can"?

Comment author: MugaSofer 18 September 2012 07:45:52AM *  0 points [-]

I said:

there can be problems depending on how you define "immortal"

You replied:

Immortal humans can go horribly wrong

I am well aware that this wish has major risks as worded. I was responding to the claim that "you can't use that tool to solve that problem."

Yes, obviously you wish for maximised utility. But that requires the genie to understand your utility.

Comment author: chaosmosis 17 September 2012 04:53:09PM *  0 points [-]

"Increase my utility as much as you can"?

That would just cause them to pump chemicals into your head, I think. But it's definitely thinking in the right direction.

"number of dying humans" is really what you want to minimize.

Even with pseudo-immortality, accidents happen, which means that the best way to minimize the number of dying humans is either to sterilize the entire species or to kill everyone. The goal shouldn't be to minimize death but to maximize life.

Overwrite my current utility function upon your previous motivational networks, leaving no motivational trace of their remains.

That actually seems like it'd work.

Comment author: wedrifid 17 September 2012 05:33:31PM 1 point [-]

"Increase my utility as much as you can"?

That would just cause them to pump chemicals into your head, I think.

It wouldn't do that (except in some sense in which it is able to do arbitrary things you don't mean when given complicated or undefined requests).

Comment author: mfb 18 September 2012 04:52:08PM -1 points [-]

That would just cause them to pump chemicals into your head, I think. But it's definitely thinking in the right direction.

As long as I am not aware of that (or do not dislike it)... well, why not. However, MugaSofer is right, the genie has to understand the (future) utility function for that. But if it can alter the future without restrictions, it can change the utility function itself (maybe even to an unbounded one... :D)

Comment author: JulianMorrison 17 September 2012 01:29:09PM -1 points [-]

"Destroy yourself as near to immediately as possible, given that your method of self destruction causes no avoidable harm to anything larger than an ant."

Comment author: MugaSofer 17 September 2012 01:35:34PM 2 points [-]

They shrink the planet down to below our Schwarzschild radius, holding spacetime in place for just long enough to explain what you just did.

Alternatively, they declare your wish logically contradictory: genies are larger than ants.
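(For scale: the Schwarzschild radius is r_s = 2GM/c^2, the radius below which a given mass collapses into a black hole; for Earth's mass it comes out to just under 9 millimetres, which is the sense in which shrinking the planet "below our Schwarzschild radius" does the job. A rough back-of-the-envelope sketch, with rounded constants:)

```python
# Schwarzschild radius r_s = 2*G*M / c**2 -- the radius below which
# a mass M collapses into a black hole.  Constants are rounded.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_earth = 5.972e24   # Earth's mass, kg

r_s = 2 * G * M_earth / c**2
print(r_s)           # about 8.9e-3 m, i.e. just under 9 millimetres
```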

Comment author: JulianMorrison 17 September 2012 01:51:45PM -1 points [-]

At the start of the scenario, you are already dead with probability approaching 1. Trying to knock the gun away can't hurt.

Comment author: MugaSofer 17 September 2012 02:15:28PM *  2 points [-]

I was criticizing the wording of the "ant" qualifier, not the attempt to destroy the genie.