eternal_neophyte comments on Crazy Ideas Thread - Less Wrong Discussion

22 Post author: Gunnar_Zarncke 07 July 2015 09:40PM

Comment author: eternal_neophyte 08 July 2015 09:25:19PM *  11 points [-]

Socrates has some sound advice on making requests of powerful beings:

Our prayers should be for blessings in general, for God knows best what is good for us.

Comment author: polymathwannabe 09 July 2015 05:52:07PM 4 points [-]

The human way is not leaving things to be managed by the gods.

Comment author: eternal_neophyte 09 July 2015 06:03:36PM 3 points [-]

Indeed. But if you're going to appeal to an omniscient being, let them in their omniscience decide what's good for you.

Comment author: polymathwannabe 09 July 2015 06:28:22PM *  3 points [-]

I'd feel dirty letting anyone, even a god, overwrite my terminal goals.

Comment author: Lumifer 09 July 2015 06:47:47PM 5 points [-]

That feeling of being dirty can be overwritten, too X-)

Comment author: eternal_neophyte 09 July 2015 06:42:15PM 2 points [-]

Has no human being ever overwritten your terminal goals?

Comment author: polymathwannabe 09 July 2015 07:18:55PM 0 points [-]

I have, a number of times. My parents tried, but at most were able to overrule them.

Comment author: eternal_neophyte 09 July 2015 07:44:21PM 1 point [-]

And it was always for the worse?

Comment author: polymathwannabe 09 July 2015 08:20:00PM 0 points [-]

The ripples keep multiplying.

Comment author: VoiceOfRa 12 July 2015 02:47:22AM 1 point [-]

Do you even have a terminal goal?

Comment author: hyporational 15 July 2015 08:49:24PM 0 points [-]

A god smart enough to know what's good for us is smart enough not to need a prayer to be summoned.

Comment author: James_Miller 19 July 2015 06:11:07PM *  2 points [-]

The god might give great weight to individual preferences. I have tried to convince lots of people to sign up for cryonics. When I say something like "if it were free and you knew it would work, would you sign up?" some people have said "no", or even "of course not." Plus, the god might have resource constraints, and at the margin it could be a close call whether to bring me back; my stating a desire to be brought back could tip the god to do so with probability high enough to justify the time I spent making the original comment.

Comment author: David_Bolin 19 July 2015 07:07:00PM 1 point [-]

For many people, 32 karma would also be sufficient benefit to justify the investment made in the comment.

Comment author: hyporational 20 July 2015 11:49:01AM *  -1 points [-]

Our stated preferences are predictably limited and often untrue accounts of what actually constitutes our well-being and our utility to those around us. I'm not sure I want to wake up to a god psychologically incompetent enough to revive people based on weighing wishes heavily. If there are resource constraints, which I highly doubt, it's especially important to make decisions based on reliable data.

> When I say something like "if it were free and you knew it would work, would you sign up?" some people have said "no", or even "of course not."

I think this much more likely reflects the dynamics of the discussion, the perceived unlikelihood of the hypothetical, and the badness of death than actual preferences. If the hypothetical is improbable enough, changing your mind has only the cost of losing social status and whatever comforting lies you have learned to keep death off your mind, and not much upside to speak of.

Comment author: ChristianKl 20 July 2015 11:59:41AM 2 points [-]

Consent seems to be an important ethical principle for many people and an FAI might well end up implementing it in some form.

Comment author: hyporational 20 July 2015 12:16:19PM *  0 points [-]

True. Since people are so irrational, not to mention inconsistent and slow, it might be one of the most difficult problems of FAI. The whole concept of consent in the presence of a much more powerful mind seems pretty shaky.

Comment author: eternal_neophyte 15 July 2015 09:32:01PM 0 points [-]

I can easily imagine that if I ran a simulation of mankind's evolutionary history, I'd adopt a principle of responding to the requests of simulants, provided the requests are small enough and don't interfere with the goals of the simulation, just in case the simulants have some awareness. If the purpose of the simulation isn't simply to satisfy all the simulants' needs for them (and doing so would in fact be orthogonal to its actual purpose), they would have to make some kind of request for me to do anything.