Zetetic comments on Resetting Gandhi-Einstein - Less Wrong

Post author: Stuart_Armstrong 13 June 2011 10:02AM


Comment author: Zetetic 13 June 2011 10:34:53AM 6 points

I've thought of a few comments:

1) If they are reset every hour of subjective time, that would put serious bounds on how much information they could usefully pass on, especially if it is in the form of a virtual book. Maybe this would work if you rewrote the component of the upload corresponding to memory, but then why bother to reset? Is it to avoid boredom? I suppose you could rewrite only a restricted part of the upload's memory. Why not try to tweak the upload to alleviate whatever issues you are anticipating (make it not get bored, etc.)?

2) Assuming this upload is actually smart enough to make any progress in taking over the world, how do you guard against them deciding that they don't like being reset and cleverly passing on a plan to eventually prevent you from resetting them? Even Gandhi might not appreciate being put in this sort of scenario.

3) I'm a little unsure of the effect that isolating it from the entire intellectual community might have on its effectiveness as a researcher; it seems like a large part of academic effectiveness comes from the availability of multiple perspectives. Maybe it would make more sense to simulate a small community of scholars rather than just one?

Comment author: benelliott 13 June 2011 10:44:59AM 2 points

I can't speak for the OP, but I imagine the reason for the reset is to prevent some sort of personality change. History generally indicates that no matter how altruistic you start out, there's a good chance you will turn nasty given enough power.

Comment author: Zetetic 13 June 2011 10:50:30AM 4 points

I imagine sheer boredom and the prospect of a total lack of personal freedom could also play a role in that. In any event, this still makes the transfer of memory tricky, since you want to preserve work done over an unbounded span of time while only selectively 'letting through' memories, to avoid this sort of personality change.

Comment author: benelliott 13 June 2011 11:02:26AM *  3 points

I fully agree.

A possible solution would be to lengthen the time interval: at a guess, you could give them a subjective week without worrying about too much personality change, making it more feasible for them to write down everything important.

I'm still very worried about the morality of it; as I see it, the resetting amounts to mass-murder.

Comment author: MileyCyrus 13 June 2011 10:43:09PM 8 points

I'm still very worried about the molarity of it

Absolutely. We need to add a few liters of solvent to get the concentration down to acceptable molarity.

Comment author: Stuart_Armstrong 13 June 2011 07:44:56PM 0 points

I'm still very worried about the morality of it, as I see it the resetting amounts to mass-murder.

So do I. I think it's a hideous, immoral idea. Only because the lives of everyone else are in the balance would I consider it.

Comment author: Barry_Cotter 13 June 2011 08:51:14PM 1 point

How about if you get saved at the end of the hour/week, not deleted?

Comment author: Stuart_Armstrong 14 June 2011 10:00:57AM 2 points

That would be better. And then, after the dust settles, all the copies could be resurrected?

Comment author: benelliott 13 June 2011 08:12:07PM 1 point

If it were determined that you were the best candidate to be Gandhi-Einstein, would you volunteer?

Comment author: Yasuo 14 June 2011 06:14:58AM *  2 points

I would. I'd want to do some shorter test runs first, though, to get used to the idea, and I'd want to be sure I was in a good mood for the main reset point.

It would probably be good to find a candidate who was enlightened in the Buddhist sense, not only because they'd be generally calmer and more stable, but specifically because enlightenment involves confronting the incoherent naïve concept of self and understanding the nature of impermanence. From the enlightened perspective, the peculiar topology of the resetting subjective experience would not be a source of anxiety.

Comment author: Stuart_Armstrong 14 June 2011 10:02:05AM 1 point

Only if there were no other alternatives. And yes, that is a selfish sentiment.

Comment author: endoself 14 June 2011 01:28:17AM 1 point

I'm not Stuart, but I would.

Comment author: Desrtopa 22 June 2011 05:00:26PM 0 points

If it were determined that I was the best candidate, I would lose quite a bit of trust in the world. But if I thought it within my abilities to optimize the world an hour at a time, yes, I would volunteer.

Around the age of ten I made a precommitment that if I were ever offered an exchange of personal torment for saving the world, I should take it.

Comment author: Zetetic 13 June 2011 11:09:27AM 0 points

I'm still very worried about the morality of it, as I see it the resetting amounts to mass-murder.

This is a little difficult to gauge. It seems like it should be roughly equivalent to a surgical memory alteration during cryogenic stasis or something like that, since you're essentially starting the thing right back up again after removing some of the memories. In fact, I don't see why you can't just do a memory alteration and bypass the reset altogether, given that it seems desirable to retain some parts of the memory and not others.

Comment author: Stuart_Armstrong 13 June 2011 07:41:24PM 3 points

Yep. And not just the whole "power corrupts" thing: an isolated mind, with no peers, capable of direct or indirect self-modification... So many ways it can go wrong.

Comment author: Stuart_Armstrong 13 June 2011 07:43:15PM 0 points

2) Start with someone willing to be reset, and whose willingness will extend for at least an hour. This scenario does involve sacrificing a heroic being, I admit.

3) Maybe a reset community might work?