infotropism comments on Rationality, Cryonics and Pascal's Wager - Less Wrong

12 [deleted] 08 April 2009 08:28PM




Comment author: infotropism 09 April 2009 10:11:55AM 1 point [-]

So you have one precise reason not to want to live on, and it hinges on quite a few assumptions, right? Adding details to a story makes it less probable.

Could you imagine a few other scenarios where things go right instead? As long as you're alive, there's always at least the unexpected. You can't predict what your future will be, especially post-singularity. Even if you can't imagine right now how your life could be pleasant or how to make it turn out well, you aren't expected to outsmart your future self, so why wouldn't it find that solution?

I'll mention the most obvious point here, if that can help: I don't see how expanding yourself equates with merging with other minds and losing your individuality. If there were no way to get individual minds bigger than ours without a loss of individuality, then please explain how humans came to be as individual and complex as they are compared to earlier lifeforms, upstream in the tree of life.