Yvain wrote: "The deal-breaker is that I really, really don't want to live forever. I might enjoy living a thousand years, but not forever. "
I'm curious to know how you know that in advance? Isn't it like a kid making a binding decision on its future self?
As Aubrey de Grey says (I'm paraphrasing): "If I'm healthy today and enjoying my life, I'll want to wake up tomorrow. And so on." You live a very long time one day at a time.
Eliezer, could we get a status update on the books that will (I hope) come out of all this material you've been writing?
Is it still part of the grand plan, or did that change?
"I think that unless you're revived very quickly after death you'll most likely wake up in a weirdtopia."
Indeed, though a sufficiently advanced weirdtopia might have pretty good ways to help you adapt and feel at home (e.g. by modifying your self to keep up with all the post-humans, or by starting you out in a VR world you can relate to and progressively introducing you to the current one).
"What if you wake up in Dystopia?"
What is the counterargument to this?
One could argue that a dystopia bad enough not to be worth living in probably wouldn't care much about its citizens, and even less about its cryo-suspended ones, so if things get that bad your chances of being revived are very low. I'm not sure how convincing that is, though.
I'm currently reading Global Catastrophic Risks by Nick Bostrom and Cirkovic, and it's pretty scary to think of how arbitrarily everything could go bad and we could all live through very hard times indeed.
That kind of reading usually keeps me from having my soul sucked into this imagined great future...
"so you don't throw up every time you remember what you did on your vacation."
Oh man. If this AI thing doesn't work out, maybe you can try comedy?
I read on some skeptics blog that Jim Carrey left $50 million to Jenny McCarthy. That sure could fund the SIAI for a while...
"So lack of robustness against insufficient omega 6 does indeed cause much mental illness. (One reason my son has been raised on lots of fish oil.)"
Patri, did you mean Omega 3?
"The paperback has an additional 40-page "Afterword"."
Argh. I already have two copies of the hardback, including an autographed one. Now you're tempting me to get a third copy (makes a good gift, I guess).
Personally this year I'm thankful for the Earth's molten interior:
http://michaelgr.com/2008/11/28/be-thankful-for-the-earths-molten-interior/
"Why is daily posting a shibboleth ?
I would still read the site if EY posted once a week"
I second that. Even if OB were updated only 1-3 times a week with posts of the current level of quality, it would still be one of my favorite sites. In fact, I'm having a hard time keeping up with the current quantity of content, and I often need to set time aside to clear my OB backlog.
A better software platform would be good, but I doubt that user-generated content could ever become central to the site. Maybe as a sidebar, with a few rare posts getting promoted...
"Eliezer must be disappointed. He makes a thread to help a recovering irrationalist deal with the social consequences of deconversion, and half the posts he gets are from religious believers and believers in religious belief."
I think this has been linked from some social media sites, which could explain the influx of non-regular OB readers.
Maybe this is fodder for another post:
A few people here said: "If that person was really special, there would be no problem with you telling him."
But are things really that simple? Not so long ago, Jo would probably have reacted badly if her special person had told her that he no longer believed in what she believed. Loving someone and getting along well with them doesn't mean that you will accept everything they do without problems, and vice versa.
Think about the people that you find "special" in your life and imagine telling them that...
"Jack, I've spoken on many occasions previously but I was never in Toastmasters."
If you're planning to speak for money, Toastmasters might be a good investment. I would recommend at least checking it out to see what you could get out of it. Since you are not a beginner, you should look into 'advanced' clubs.
With public speaking, there's nothing like experience. TM allows you to practice in a friendly environment where you can try new approaches (doesn't matter if they fail), and to benefit from the knowledge of a group of people who have be...
"A very impressive interview - I have gained much respect for Eliezer's patience."
In a way, I think that maybe the most important stuff in this interview is what didn't happen. Eliezer indeed seems to possess super-human patience.
"Jaron's laughter seems largely the laughter of frustrated politesse. This comes out in his speech when he repeats to EY "I've been having this discussion for decades.""
I think that's BS. If Jaron didn't want to discuss AI, then why agree to a BGTV episode with Eliezer, a research fellow at the Singularity Institute for Artificial Intelligence?
Eliezer tried to understand what Jaron was saying and asked him questions to get him to explain his positions better. Jaron pretty much never tried to make himself clear (probably because there wa...
I'm 20 minutes in and wish Lanier's connection would just cut off and Eliezer would talk by himself.
"Eliezer occasionally looked like he was having trouble following Lanier's reasoning. I certainly did. My guess is that this is because, on those occasions, Lanier didn't have a reasoning."
That's my feeling too. He seemed to love calling anyone who disagrees with him an "idiot" or "religious nut" without ever really explaining why.
I'm going to keep watching because I expect Eliezer to say some interesting stuff.
Here's my theory on this particular AI-Box experiment:
First you explain to the gatekeeper the potential dangers of AIs. General stuff about how large mind design space is, and how it's really easy to screw up and destroy the world with AI.
Then you try to convince him that the solution to that problem is building an AI very carefully, and that a theory of Friendly AI is essential to increase our chances of a future we would find "nice" (and the stakes are so high that even increasing these chances a tiny bit is very valuable).
THEN
You explain to t...
Small Typo Alert: The second quote should be attributed to "Mastering Eishin-Ryu Swordsmanship"
"Ryu", not "Ruy".
"The capitalists are trying to save capitalism from the capitalists!"
Actually, if we didn't have fiat money that can be printed at will, and if interest rates were set by market forces rather than by a bunch of white men picked by the government, we wouldn't be in this mess.
"Most Americans of the time were unabashedly racist, had little concept of electricity and none of computing, had vaguely heard of automobiles, etc."
So if you woke up in a strange world with technologies you don't understand (at first) and mainstream values you disagree with (at first), you would rather commit suicide than try to learn about this new world and see if you can have a pleasant life in it?