Elithrion comments on Discussion: Which futures are good enough? - Less Wrong

Post author: WrongBot 24 February 2013 12:06AM


Comments (50)


Comment author: RomeoStevens 24 February 2013 03:16:55AM 5 points

If believing you inhabit the highest level floats your boat, be my guest; just don't mess with the power plug on my experience machine.

Comment author: Elithrion 24 February 2013 04:12:26AM 0 points

From an instrumental viewpoint, I hope you plan to figure out how to make everyone sitting around on a higher level credibly precommit to not messing with the power plug on your experience machine; otherwise, it probably won't last very long. (Other than that, I see no problem with us not sharing some terminal values.)

Comment author: Lightwave 24 February 2013 10:59:25AM 0 points

"figure out how to make everyone sitting around on a higher level credibly precommit to not messing with the power plug"

That's FAI's job. Living on the "highest level" has the same problem: you have to protect your region of the universe from anything that could "de-optimize" it, and FAI will (attempt to) make sure that doesn't happen.

Comment author: RomeoStevens 24 February 2013 06:16:45AM 0 points

I just have to ensure that the inequality (amount of damage I cause if outside my experience machine > cost of running my experience machine) holds.
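The deterrence condition above is just a comparison of two quantities. A minimal sketch, with entirely hypothetical cost figures (none of these numbers come from the discussion):

```python
def machine_is_safe(damage_if_unplugged: float, running_cost: float) -> bool:
    """True when outsiders are better off leaving the plug alone:
    the damage the occupant would cause outside the machine exceeds
    the cost of simply keeping the machine running."""
    return damage_if_unplugged > running_cost

# Illustrative values only: large damage, small upkeep -> safe to stay plugged in.
print(machine_is_safe(damage_if_unplugged=1000.0, running_cost=10.0))
# Cheap-to-ignore occupant: unplugging is the better deal for outsiders.
print(machine_is_safe(damage_if_unplugged=1.0, running_cost=10.0))
```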

Comment author: RichardKennaway 24 February 2013 10:01:20AM 1 point

Translating that back into English, I get "unplug me from the Matrix and I'll do my best to help Skynet kill you all".

Comment author: Elithrion 24 February 2013 06:03:45PM 0 points

You also have to ensure that killing you outright isn't optimal.

Comment author: RomeoStevens 24 February 2013 09:30:01PM 0 points

I can't do much about scenarios in which it is optimal to kill humans; we're probably all screwed in such a case. "Kill some humans according to these criteria" is a much smaller target than the vast swathes of futures that simply kill us all.