All of Michael_G.R.'s Comments + Replies

"Most Americans of the time were unabashedly racist, had little concept of electricity and none of computing, had vaguely heard of automobiles, etc."

So if you woke up in a strange world with technologies you don't understand (at first) and mainstream values you disagree with (at first), you would rather commit suicide than try to learn about this new world and see if you can have a pleasant life in it?

Yvain wrote: "The deal-breaker is that I really, really don't want to live forever. I might enjoy living a thousand years, but not forever. "

I'm curious to know how you know that in advance? Isn't it like a kid making a binding decision on its future self?

As Aubrey says (I'm paraphrasing): "If I'm healthy today and enjoying my life, I'll want to wake up tomorrow. And so on." You live a very long time one day at a time.

Eliezer, could we get a status update on the books that will (I hope) come out of all this material you've been writing?

Is it still part of the grand plan, or did that change?

"I think that unless you're revived very quickly after death you'll most likely wake up in a weirdtopia."

Indeed, though a technologically advanced enough weirdtopia might have pretty good ways to help you adapt and feel at home (f.ex. by modifying your own self to keep up with all the post-humans, or by starting you out in a VR world that you can relate to and progressively introducing you to the current world).

"What if you wake up in Dystopia?"

What is the counterargument to this?

I'm not sure it's possible to argue this convincingly, but: a dystopia bad enough to not be worth living in probably wouldn't care much about its citizens, and even less about its cryo-suspended ones, so if things get bad enough your chances of being revived are very low.

I'm currently reading Global Catastrophic Risks by Nick Bostrom and Cirkovic, and it's pretty scary to think of how arbitrarily everything could go bad and we could all live through very hard times indeed.

That kind of reading usually keeps me from having my soul sucked into this imagined great future...

"so you don't throw up every time you remember what you did on your vacation."

Oh man. If this AI thing doesn't work out, maybe you can try comedy?

I read on some skeptics blog that Jim Carrey left $50 million to Jenny McCarthy. That sure could fund the SIAI for a while...

"So lack of robustness against insufficient omega 6 does indeed cause much mental illness. (One reason my son has been raised on lots of fish oil.)"

Patri, did you mean Omega 3?

"The paperback has an additional 40-page "Afterword"."

Argh. I already have two copies of the hardback, including an autographed one. Now you're tempting me to get a third copy (makes a good gift, I guess).

"Why is daily posting a shibboleth ?

I would still read the site if EY posted once a week"

I second that. Even if OB was updated only 1-3 times a week by posts of the current level of quality, it would still be one of my favorite sites. In fact, I'm having a hard time keeping up with the current quantity of content and I often need to set time aside to clear my OB backlog.

A better software platform would be good, but I doubt that user-generated content could ever become central to the site. Maybe as a sidebar, with a few rare posts getting promoted...

"Eliezer must be disappointed. He makes a thread to help a recovering irrationalist deal with the social consequences of deconversion, and half the posts he gets are from religious believers and believers in religious belief."

I think this has been linked from some social media sites, which can explain the influx of non-regular OB readers.

Maybe this is fodder for another post:

A few people here said: "If that person was really special, there would be no problem with you telling him."

But are things really that simple? Not so long ago, Jo would probably have reacted badly if her special person had told her that he no longer believed in what she believed. Loving someone and getting along well with them doesn't mean that you will accept anything they do without issue, and vice versa.

Think about the people that you find "special" in your life and imagine telling them that...

"Jack, I've spoken on many occasions previously but I was never in Toastmasters."

If you're planning to speak for money, Toastmasters might be a good investment. I would recommend at least checking it out to see what you could get out of it. Since you are not a beginner, you should check out 'advanced' clubs.

With public speaking, there's nothing like experience. TM allows you to practice in a friendly environment where you can try new approaches (doesn't matter if they fail), and to benefit from the knowledge of a group of people who have be...

"A very impressive interview - I have gained much respect for Eliezer's patience."

In a way, I think that maybe the most important stuff in this interview is what didn't happen. Eliezer indeed seems to possess super-human patience.

"Jaron's laughter seems largely the laughter of frustrated politesse. This comes out in his speech when he repeats to EY "I've been having this discussion for decades.""

I think that's BS. If Jaron didn't want to discuss AI, then why agree to a BGTV episode with Eliezer, a research fellow at the Singularity Institute for Artificial Intelligence?

Eliezer tried to understand what Jaron was saying and asked him questions to get him to better explain his positions. Jaron pretty much never tried to make himself clear (probably because there wa...

I'm 20 minutes in and wish Lanier's connection would just cut off and Eliezer would talk by himself.

"Eliezer occasionally looked like he was having trouble following Lanier's reasoning. I certainly did. My guess is that this is because, on those occasions, Lanier didn't have a reasoning."

That's my feeling too. He seemed to love calling anyone who disagrees with him an "idiot" or "religious nut" without ever really explaining why.

I'm going to keep watching because I expect Eliezer to say some interesting stuff.

Here's my theory on this particular AI-Box experiment:

First you explain to the gatekeeper the potential dangers of AIs. General stuff about how large mind design space is, and how it's really easy to screw up and destroy the world with AI.

Then you try to convince him that the solution to that problem is building an AI very carefully, and that a theory of friendly AI is essential to increase our chances of a future we would find "nice" (and the stakes are so high that even increasing these chances a tiny bit is very valuable).

THEN

You explain to t...

paulfchristiano
When someone described the AI-Box experiment to me this was my immediate assumption as to what had happened. Learning more details about the experimental set-up made it seem less likely, but learning that some of them failed made it seem more likely. I suspect that this technique would work some of the time. That said, none of this changes my strong suspicion that a transhuman could escape by more unexpected and powerful means. Indeed, I wouldn't be too surprised if a text-only channel with no one looking at it was enough for an extraordinarily sophisticated AI to escape.
handoflixue
If I was being intellectually honest and keeping to the spirit of the agreement, I'd have to concede that this line of logic is probably enough for me to let you out of your box. Congratulations. I'd honestly been wondering what it would take to convince me :)

Small Typo Alert: The second quote should be attributed to "Mastering Eishin-Ryu Swordsmanship"

"Ryu", not "Ruy".

"The capitalists are trying to save capitalism from the capitalists!"

Actually, if we didn't have fiat money that can be printed at will and interest rates were set by market forces and not a bunch of white men picked by the government, we wouldn't be in this mess.

I want to echo others here and thank you for the great article, and wish you a good break. Get a nice omega 3/folic acid/vitamin D/zinc cocktail and recharge that brain :)

"The Second World War, as a whole, was probably the most catastrophic event in humanity's recorded history. The world was pretty much screwed as soon as it started -- indeed, probably as soon as Hitler acquired control of Germany."

WWII was just a continuation of WWI, which was a much less 'noble' war, if such a thing can even be said to exist.

War begets more war.

"What if the alternative was for the U.S. to firebomb and blockade Japan [...]"

That was probably another possibility, but certainly not the only alternative to nuking cities.

How about nuking somewhere very visible but not so populated with the message: "We have more where that came from. Surrender or the next one won't be in a daisy field." ?

I wrote something a little while ago about how Nagasaki was a secondary target, and Kokura was saved by cloudy conditions.

http://michaelgr.com/2008/02/01/nagasakis-nuke-was-supposed-to-be-dropped-on-kokura/

"Actually, the apple-recognition machinery in the human brain really does turn off on a regular basis. You have to be awake in order to recognize an apple; you can't do it while sleeping."

I don't remember ever dreaming about fruits, but I'm pretty sure I could recognize an apple if it happened. Did I just set myself up to have a weird dream tonight? Oh boy...

The fact that the pattern that makes the apple module light up comes from different places while dreaming than while awake doesn't matter; you don't stop recognizing it, so the module probably isn't 'off'.

Thank you for writing, Anne. Your comments here, as well as your recent 'interview' posts on your blog, have been most interesting.

"Once upon a time, when all of civilization was a single galaxy and a single star: and a single planet, a place called Earth."

Did this dialogue take place aboard the Battlestar Galactica? :-P

Great post!

"later came to reject (on a deliberate level) the idea that the Bible was not written by the hand of God

Don't you mean "was written by..." here?

Not too different from the rest of this crowd: Fund a bunch of high risk, high reward scientific research projects.

Will the eBooks also be available as hard copies? I'm probably not alone in preferring to read long texts on paper, and printing them out isn't quite the same.

"That's what Richard Dawkins understands that Michael Rose doesn't - that Reason is not a game."

Dawkins is also acutely aware that his opponents won't always play fair, and have often quoted him and other scientists out of context to try to make it seem like they hold positions that they don't actually hold. That's why he wants to have a tape recorder when he dies, so there can't be rumors about his "deathbed conversion".

You need a chess clock next time. John talks way too much.

While on the topic of conferences that might interest this crowd, Aging 2008 will take place on June 27th in Los Angeles at UCLA.

"leading scientists and thinkers in stem cell research and regenerative medicine will gather in Los Angeles at UCLA for Aging 2008 to explain how their work can combat human aging, and the sociological implications of developing rejuvenation therapies.

Aging 2008 is free, with advance registration required"

More details here:

http://www.mfoundation.org/ADCI/

It seems like this post isn't as clear as it could be - or at least not as clear as Eliezer's best posts.

Either it needs another draft, or the problem lies with me and I just need to re-read it more carefully...

All I can say is that when I scrolled down and saw the photo, my first thought was 'awesome'.

Nice illustration of your previous post, Eliezer.

Definitely good advice on textbooks.

I've been slowly, sloooowly reading Molecular Biology of the Cell (5th ed, brand new) by Alberts, and Lehninger: Principles of Biochemistry (4th ed). So far, I prefer the first one.

Until recently I was too intimidated to buy them because that's far from what I studied, but now I regret waiting. I should have started sooner.

I wish this kind of stuff was taught to more children. Too few people fall in love with reality.

Trivia: The stars & girlfriend story was mentioned by Richard Feynman in "What Do You Care What Other People Think?"

I might be a bit late with this, but here's my 2 cents:

"Maybe spend another month or two doing large transhumanist sequences, either on the Singularity Institute blog (currently fairly defunct) or here on Overcoming Bias if the readers really want that."

I suggest writing the Transhumanist stuff on either a new blog (wordpress.com is free and easy to set up) or the SIAI blog, and link that prominently from Overcoming Bias. Maybe even do a weekly roundup/linkfest post here to remind people.

This would mean that you wouldn't lose those who are interested, and that those who aren't can easily skip it and spare us the complaining.

"I agree, but Eli has already announced his intentions to rewrite most of the material, which will require a great deal of work."

Indeed, but less than coming up with it in the first place, and the total return on investment will likely be much higher.

f.ex., if for 1000 hours of work he got 5,000 regular readers here, for 1300 hours of work he might be able to get 100,000+ (not all at once, but over a few years) with relatively little overlap between both groups.

"Publishing this kind of a book is essentially a shotgun approach"

I'm not su...

""Almost nobody (relatively) will be rediscovering them in a few years. That's simply the nature of blogging. Who's reading 3 years old BoingBoing posts right now?""

"This is a very good point."

I think it would be a waste, and very sad, if Eliezer spent over a year writing enough high-quality material for a book only for that material to stay buried in Overcoming Bias' archives, nearly forgotten in a matter of years.

For a little more effort, he can produce books that have a better chance of making a difference.

Tom, you say that peop...

I'd definitely like to have that 500-page book in my library as a reference, and give the shorter popular book as a gift to friends (or my future kids?).

Only a small subset of the small group (relatively) of people who have read these blog posts as they were published will use them as a reference later. Almost nobody (relatively) will be rediscovering them in a few years. That's simply the nature of blogging. Who's reading 3-year-old BoingBoing posts right now?

I'm currently reading "Gödel, Escher, Bach", and from what I've read here, I think that Eliezer's book could become something like that. Maybe not a Pulitzer (but who knows?), but certainly something special that changes the way people think.

Kenny
I'm re-reading this post now! From the future!
Dojan
I don't know, but I am reading this. I would be very careful before dismissing older blog posts and forum threads; how often does one of those turn out to be the critical help found in some Google search?

"I was planning a nigh-complete rewrite for the book - less than 50% previously published sentences, say. Would it be a problem if the ideas, but not sentences, all exist elsewhere?"

You might need to remove some posts from the net if you decide to use them as is in the book, but if it is all modified, there shouldn't be a problem AFAIK.

Ideas can't be copyrighted, and you wouldn't be the first person to turn blog material into dead tree.

Keep using whatever examples and anecdotes you think best make your points, Eliezer. If that person doesn't like what you write, he/she can just skip it.

I really enjoyed reading this. Thank you Eliezer.

This post put a big smile on my face. Thanks Eliezer.

Excellent post, Eliezer. Thank you.

Eliezer, it certainly seems that you got over your "writer's molasses". Congrats!

Great stuff, Eliezer. I'm really looking forward to you compiling your writings in a book.

"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"

To avoid all the dust specks, yeah, I'd pay a penny and more. Not a penny per speck, though ;)

The reason is to avoid having to deal with the "unintended consequences" of being responsible for that very very small change over such a large number of people. It's bound to have some significant indirect consequences, both positive and negative, on the far edges of the bell curve... the net impact could be negative, and a penny is little to pay to avoid responsibility for that possibility.
