Yvain wrote: "The deal-breaker is that I really, really don't want to live forever. I might enjoy living a thousand years, but not forever. "
I'm curious: how do you know that in advance? Isn't it like a kid making a binding decision for its future self?
As Aubrey de Grey says (I'm paraphrasing): "If I'm healthy today and enjoying my life, I'll want to wake up tomorrow. And so on." You live a very long time one day at a time.
"I think that unless you're revived very quickly after death you'll most likely wake up in a weirdtopia."
Indeed, though a sufficiently advanced weirdtopia might have pretty good ways to help you adapt and feel at home (f.ex. by modifying yourself so you can keep up with all the post-humans, or by starting you out in a VR world you can relate to and progressively introducing you to the current world).
"What if you wake up in Dystopia?"
What is the counterargument to this?
I'm not sure if it's possible to argue this convincingly, but a dystopia bad enough not to be worth living in probably wouldn't care much about its citizens, and even less about its cryo-suspended ones; so if things get bad enough, your chances of being revived are very low.
I'm currently reading Global Catastrophic Risks by Nick Bostrom and Milan Ćirković, and it's pretty scary to think of how easily everything could go bad and how we could all live through very hard times indeed.
That kind of reading usually keeps me from having my soul sucked into this imagined great future...
Personally this year I'm thankful for the Earth's molten interior:
http://michaelgr.com/2008/11/28/be-thankful-for-the-earths-molten-interior/
"Why is daily posting a shibboleth ?
I would still read the site if EY posted once a week"
I second that. Even if OB were updated only 1-3 times a week with posts of the current quality, it would still be one of my favorite sites. In fact, I'm having a hard time keeping up with the current quantity of content, and I often need to set time aside to clear my OB backlog.
A better software platform would be good, but I doubt that user-generated content could ever become central to the site. Maybe as a sidebar, with a few rare posts getting promoted...
"Eliezer must be disappointed. He makes a thread to help a recovering irrationalist deal with the social consequences of deconversion, and half the posts he gets are from religious believers and believers in religious belief."
I think this has been linked from some social media sites, which would explain the influx of non-regular OB readers.
Maybe this is fodder for another post:
A few people here said: "If that person was really special, there would be no problem with you telling him."
But are things really that simple? Not so long ago, Jo would probably have reacted badly if her special person had told her that he no longer believed in what she believed. Loving someone and getting along well with them doesn't mean that you will accept anything they do without a problem, and vice versa.
Think about the people that you find "special" in your life and imagine telling them that...
"Jack, I've spoken on many occasions previously but I was never in Toastmasters."
If you're planning to speak for money, Toastmasters might be a good investment. I would recommend at least checking it out to see what you could get out of it. Since you're not a beginner, you should look into 'advanced' clubs.
With public speaking, there's nothing like experience. TM lets you practice in a friendly environment where you can try new approaches (it doesn't matter if they fail) and benefit from the knowledge of a group of people who have be...
"Jaron's laughter seems largely the laughter of frustrated politesse. This comes out in his speech when he repeats to EY "I've been having this discussion for decades.""
I think that's BS. If Jaron didn't want to discuss AI, then why agree to a BHTV episode with Eliezer, a research fellow at the Singularity Institute for Artificial Intelligence?
Eliezer tried to understand what Jaron was saying and asked him questions to get him to better explain his positions. Jaron pretty much never tried to make himself clear (probably because there wa...
I'm 20 minutes in and wish Lanier's connection would just cut off and Eliezer would talk by himself.
"Eliezer occasionally looked like he was having trouble following Lanier's reasoning. I certainly did. My guess is that this is because, on those occasions, Lanier didn't have a reasoning."
That's my feeling too. He seemed to love calling anyone who disagreed with him an "idiot" or a "religious nut" without ever really explaining why.
I'm going to keep watching because I expect Eliezer to say some interesting stuff.
Here's my theory on this particular AI-Box experiment:
First you explain to the gatekeeper the potential dangers of AIs. General stuff about how large mind design space is, and how it's really easy to screw up and destroy the world with AI.
Then you try to convince him that the solution to that problem is building an AI very carefully, and that a theory of Friendly AI is essential to increase our chances of a future we would find "nice" (and the stakes are so high that even increasing those chances a tiny bit is very valuable).
THEN
You explain to t...
"The Second World War, as a whole, was probably the most catastrophic event in humanity's recorded history. The world was pretty much screwed as soon as it started -- indeed, probably as soon as Hitler acquired control of Germany."
WWII was just a continuation of WWI, which was a much less 'noble' war, if such a thing can even be said to exist.
War begets more war.
"What if the alternative was for the U.S. to firebomb and blockade Japan [...]"
That was probably another possibility, but certainly not the only alternative to nuking cities.
How about nuking somewhere very visible but not so populated with the message: "We have more where that came from. Surrender or the next one won't be in a daisy field." ?
I wrote something a little while ago about how Nagasaki was a secondary target, and Kokura was saved by cloudy conditions.
http://michaelgr.com/2008/02/01/nagasakis-nuke-was-supposed-to-be-dropped-on-kokura/
"Actually, the apple-recognition machinery in the human brain really does turn off on a regular basis. You have to be awake in order to recognize an apple; you can't do it while sleeping."
I don't remember ever dreaming about fruits, but I'm pretty sure I could recognize an apple if it happened. Did I just set myself up to have a weird dream tonight? Oh boy...
The fact that the pattern that makes the apple module light up comes from different places while dreaming than while awake doesn't matter; you don't stop recognizing the apple, so the module probably isn't 'off'.
"That's what Richard Dawkins understands that Michael Rose doesn't - that Reason is not a game."
Dawkins is also acutely aware that his opponents won't always play fair; they have often quoted him and other scientists out of context to make it seem like they hold positions they don't actually hold. That's why he wants a tape recorder running when he dies, so there can't be rumors about his "deathbed conversion".
While on the topic of conferences that might interest this crowd, Aging 2008 will take place on June 27th in Los Angeles at UCLA.
"leading scientists and thinkers in stem cell research and regenerative medicine will gather in Los Angeles at UCLA for Aging 2008 to explain how their work can combat human aging, and the sociological implications of developing rejuvenation therapies.
Aging 2008 is free, with advance registration required"
More details here:
Definitely good advice on textbooks.
I've been slowly, sloooowly reading Molecular Biology of the Cell (5th ed, brand new) by Alberts, and Lehninger: Principles of Biochemistry (4th ed). So far, I prefer the first one.
Until recently I was too intimidated to buy them because the subject is far from what I studied, but now I regret waiting. I should have started sooner.
I might be a bit late with this, but here's my 2 cents:
"Maybe spend another month or two doing large transhumanist sequences, either on the Singularity Institute blog (currently fairly defunct) or here on Overcoming Bias if the readers really want that."
I suggest writing the Transhumanist stuff on either a new blog (wordpress.com is free and easy to set up) or the SIAI blog, and link that prominently from Overcoming Bias. Maybe even do a weekly roundup/linkfest post here to remind people.
This would mean that you wouldn't lose those who are interested, and that those who aren't can easily skip it and spare us the complaining.
"I agree, but Eli has already announced his intentions to rewrite most of the material, which will require a great deal of work."
Indeed, but less work than coming up with it in the first place, and the total return on investment will likely be much higher.
F.ex., if 1,000 hours of work got him 5,000 regular readers here, then for 1,300 hours of work he might be able to reach 100,000+ (not all at once, but over a few years), with relatively little overlap between the two groups.
"Publishing this kind of a book is essentially a shotgun approach"
I'm not su...
""Almost nobody (relatively) will be rediscovering them in a few years. That's simply the nature of blogging. Who's reading 3 years old BoingBoing posts right now?""
"This is a very good point."
I think it would be a waste, and very sad, if Eliezer spent over a year writing enough high-quality material for a book only for that material to stay buried in Overcoming Bias' archives, nearly forgotten in a matter of years.
For a little more effort, he can produce books that have a much better chance of making a difference.
Tom, you say that peop...
I'd definitely like to have that 500-page book in my library as a reference, and give the shorter popular book as a gift to friends (or my future kids?).
Only a small subset of the (relatively) small group of people who read these blog posts as they were published will use them as a reference later. Almost nobody (relatively speaking) will be rediscovering them in a few years. That's simply the nature of blogging. Who's reading three-year-old BoingBoing posts right now?
I'm currently reading "Godel, Escher, Bach", and from what I've read here, I think that Eliezer's book could become something like that. Maybe not a Pulitzer (but who knows?), but certainly something special that changes the way people think.
"I was planning a nigh-complete rewrite for the book - less than 50% previously published sentences, say. Would it be a problem if the ideas, but not sentences, all exist elsewhere?"
You might need to remove some posts from the net if you decide to use them as-is in the book, but if it's all modified, there shouldn't be a problem, AFAIK.
Ideas can't be copyrighted, and you wouldn't be the first person to turn blog material into dead tree.
"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"
To avoid all the dust specks, yeah, I'd pay a penny and more. Not a penny per speck, though ;)
The reason is to avoid having to deal with the "unintended consequences" of being responsible for that very, very small change across such a large number of people. It's bound to have some significant indirect consequences, both positive and negative, out at the far edges of the bell curve... the net impact could be negative, and a penny is a small price to pay to avoid responsibility for that possibility.
"Most Americans of the time were unabashedly racist, had little concept of electricity and none of computing, had vaguely heard of automobiles, etc."
So if you woke up in a strange world with technologies you don't understand (at first) and mainstream values you disagree with (at first), you would rather commit suicide than try to learn about this new world and see if you can have a pleasant life in it?