
Zack_M_Davis comments on Could/would an FAI recreate people who are information-theoretically dead by modern standards? - Less Wrong Discussion

Post author: AlexMennen, 22 January 2011 09:11PM




Comment author: Zack_M_Davis, 22 January 2011 11:09:27PM, 9 points

Does anyone have any compelling arguments on why an FAI would or would not recreate me if I die, decompose, and then the singularity occurs a long time after my death?

A possible reason why not, besides physical impossibility: are you sure you want to be recreated? I don't mean to invoke any tired cishumanist tropes about death being the natural order of things, or any nonsense like that. But it seems plausible to me that a benevolent superintelligence would have no trouble contriving configurations of matter that at least some of us would prefer to personal survival; specifically, the AI could create people better than us by our own standards.

Presumably you don't want to live on exactly as you are now; you have a desire for self-improvement. But given the impossibility of ontologically fundamental mental entities, there's an identity between "self-improvement" and "being deleted and immediately replaced with someone else who happens to be very similar in such-and-such ways"; once you know what experiences are taking place at what points in spacetime, there's no further fact of the matter as to which of them are "you." In our own world, this is an unimportant philosophical nicety, but superintelligence has the potential to open up previously inaccessible edge cases, such as extremely desirable outcomes that don't preserve enough personal identity to count as "survival" in the conventional sense.