
"Most Americans of the time were unabashedly racist, had little concept of electricity and none of computing, had vaguely heard of automobiles, etc."

So if you woke up in a strange world with technologies you don't understand (at first) and mainstream values you disagree with (at first), you would rather commit suicide than try to learn about this new world and see if you can have a pleasant life in it?

Yvain wrote: "The deal-breaker is that I really, really don't want to live forever. I might enjoy living a thousand years, but not forever."

I'm curious to know how you know that in advance? Isn't it like a kid making a binding decision on its future self?

As Aubrey says (I'm paraphrasing): "If I'm healthy today and enjoying my life, I'll want to wake up tomorrow. And so on." You live a very long time one day at a time.

Eliezer, could we get a status update on the books that will (I hope) come out of all this material you've been writing?

Is it still part of the grand plan, or did that change?

"I think that unless you're revived very quickly after death you'll most likely wake up in a weirdtopia."

Indeed, though a sufficiently advanced weirdtopia might have pretty good ways to help you adapt and feel at home (e.g. by modifying your self so you can keep up with all the post-humans, or by starting you out in a VR world you can relate to and progressively introducing you to the current one).

"What if you wake up in Dystopia?"

What is the counterargument to this?

I'm not sure it can be argued convincingly, but one counterargument is that a dystopia bad enough to not be worth living in probably wouldn't care much about its citizens, and even less about its cryo-suspended ones, so if things get bad enough your chances of being revived are very low.

I'm currently reading Global Catastrophic Risks by Nick Bostrom and Cirkovic, and it's pretty scary to think of how arbitrarily everything could go bad and we could all live through very hard times indeed.

That kind of reading usually keeps me from having my soul sucked into this imagined great future...

"so you don't throw up every time you remember what you did on your vacation."

Oh man. If this AI thing doesn't work out, maybe you can try comedy?

I read on some skeptics blog that Jim Carrey left $50 million to Jenny McCarthy. That sure could fund the SIAI for a while...

"So lack of robustness against insufficient omega 6 does indeed cause much mental illness. (One reason my son has been raised on lots of fish oil.)"

Patri, did you mean Omega 3?

"The paperback has an additional 40-page "Afterword"."

Argh. I already have two copies of the hardback, including an autographed one. Now you're tempting me to get a third copy (makes a good gift, I guess).
