Rain comments on 12-year-old challenges the Big Bang - Less Wrong

1 [deleted] 29 March 2011 05:40AM


Comment author: XiXiDu 29 March 2011 11:28:49AM *  5 points

He's a smart 12-year-old who has some silly ideas, as smart 12-year-olds often do, and now he'll never be able to live them down because some reporter wrote a fluff piece about him.

Reminds me of this old article (04.19.01) about Yudkowsky:

Since then, Yudkowsky has become not just someone who predicts the Singularity, but a committed activist trying to speed its arrival. "My first allegiance is to the Singularity, not humanity," he writes in one essay. "I don't know what the Singularity will do with us. I don't know whether Singularities upgrade mortal races, or disassemble us for spare atoms.... If it comes down to Us or Them, I'm with Them."

[...]

Yudkowsky takes it a step further, writing that he believes AI "will be developed on symmetric-multiprocessing hardware, at least initially." He said he expects Singularity could happen in the very near future: "I wouldn't be surprised if tomorrow was the Final Dawn, the last sunrise before the Earth and Sun are reshaped into computing elements."

When one researcher booted up a program he hoped would be AI-like, Yudkowsky said he believed there was a 5 percent chance the Singularity was about to happen and human existence would be forever changed.

[...]

In an autobiographical essay, he writes: "I think my efforts could spell the difference between life and death for most of humanity, or even the difference between a Singularity and a lifeless, sterilized planet... I think that I can save the world, not just because I'm the one who happens to be making the effort, but because I'm the only one who can make the effort."

Comment author: Rain 30 March 2011 12:51:04AM 9 points

It truly is astonishing, the number of quotes that XiXiDu has about Eliezer. It's like he has a thick dossier, slowly accumulating negative content...

Comment author: timtyler 30 March 2011 11:54:19AM *  13 points

It would be interesting to see a list of all the material that has been deleted in cover-up operations over the years. We really need a SIAIWatch organisation.

[Added] some deletions that spring to mind:

  • Physics Workarounds (archived here)
  • Coding a Transhuman AI (archived here)
  • Eliezer, the person (archived here)
  • The deleted posts from around the time of Roko's departure
  • Algernon's Law (archived here)
  • Love and Life Just Before the Singularity
  • Flare, though remnants survive
  • SysopMind (archived here)
  • Gaussian Humans (archived here)
  • The Seed AI page
  • Becoming a Seed AI Programmer (archived here)

The “Commitments” vanished from: http://singinst.org/aboutus/ourmission

They used to look like this:

Commitments

  • SIAI will not enter any partnership that compromises our values.
  • Technology developed by SIAI will not be used to harm human life.
  • The challenge, opportunity and risk of artificial intelligence is the common concern of all humanity. SIAI will not show ethnic, national, political, or religious favoritism in the discharge of our mission.
Comment author: Jayson_Virissimo 17 January 2012 07:12:07AM 3 points

So, did anyone actually save Roko's comments before the mass deletion?

Comment author: Humbug 17 January 2012 04:41:39PM 2 points

So, did anyone actually save Roko's comments before the mass deletion?

Google Reader fetches every post and comment made on Less Wrong. Editing or deleting won't remove it: all comments and posts that have ever been made are still there, saved by Google. You just have to add the right RSS feeds to Google Reader.
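
Roughly, the mechanism looks like this (a minimal sketch in Python; the feed URL is my guess at the site-wide comments feed, and Google Reader's own cache isn't directly queryable, so this only shows the RSS side):

    # Sketch: poll the site-wide comments feed the way a feed reader would.
    # FEED_URL is an assumed/hypothetical location for the Less Wrong
    # comments RSS feed; substitute the real one.
    import feedparser

    FEED_URL = "http://lesswrong.com/comments/.rss"

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        # Each entry is a snapshot taken at fetch time. A reader that
        # polls regularly and stores entries keeps its copy even if the
        # comment is later edited or deleted on the site itself.
        print(entry.get("published", "?"), entry.get("title", ""))
        print(entry.get("link", ""))

Anything a reader stored before a deletion stays in its archive, which is why the comments remain retrievable.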

Comment author: Houshalter 21 September 2013 12:32:54AM 0 points

Ok, now what?

Comment author: timtyler 17 January 2012 02:07:23PM *  2 points

They did. There's a very brief synopsis here.

Comment author: hairyfigment 30 March 2011 08:34:35PM *  3 points

See, while I'm not sure about the Commitments and obviously I'm reasoning from partial data across the board, most of these look like aspects of Eliezer's pre-2003 mistake(s). I thought of that but decided calling it a cover-up didn't make much sense; he spent a lot of time explaining his mistake and how his later views on it motivated his posts on LW.

[edited slightly for clarity]

Comment author: lukeprog 17 January 2012 06:15:34AM *  1 point

Eliezer, the person (archived here)

Updated link.

Reading it for the first time today, I'm amused by how much section 1.8 resembles my own Singularitarian conversion moment.

And boy, is this quote ever true of me: "I do my best thinking into a keyboard."

Comment author: XiXiDu 17 January 2012 12:30:44PM 0 points

I'm amused by how much section 1.8 resembles my own Singularitarian conversion moment.

It is quite funny how my story differs. See the banner on my homepage from 2005:

"Towards the Singularity and a Posthuman Future"

I was a believer. It seemed completely obvious that we'd soon see superhuman AI. When reading 'Permutation City' and 'Diaspora' I was bothered by how there was no AI, just emulations. That didn't seem right.

I changed my mind. I now think that a lot of what I believed I knew was based on extrapolations of current trends mixed with pure speculation. That incredible amount of technological optimism just seems naive now. It all sounds really cool and convincing when formulated in English, but that's not enough.

Comment author: timtyler 17 January 2012 01:54:54PM 2 points

I was a believer. It seemed completely obvious that we'd soon see superhuman AI. When reading 'Permutation City' and 'Diaspora' I was bothered by how there was no AI, just emulations. That didn't seem right.

A common plot device: humans need human-like protagonists to identify with, or the story doesn't sell.

Such scenarios then get "reified" in people's minds, and a whole lot of nonsense results.

Comment author: JoshuaZ 17 April 2011 03:20:51PM *  1 point

Deleting content that is no longer relevant is not the same thing as a cover-up. It might be best to keep copies of such content around, but there's nothing inherently sinister about not doing so.

Comment author: Nick_Tarleton 30 March 2011 06:45:16PM 1 point

Citations needed.

Comment author: timtyler 30 March 2011 07:07:09PM 0 points

I added some.

Comment author: hairyfigment 30 March 2011 05:54:00PM 1 point

I only know of one cover-up operation, and that didn't include material directly about Eliezer or the SIAI.

Hey, maybe this was the point of that exercise: a deliberately flawed cover-up, to make me underestimate how easily the SI can hide any facts they really want to keep secret!

Comment author: Quirinus_Quirrell 31 March 2011 12:03:54AM 10 points

I wouldn't exactly call it a cover-up. It looks to me like the actual goal was to ensure that a particular subject wouldn't develop further, by derailing any discussions about it into meta-discussions about censorship. Lots of noise was made, but no one ever published a sufficiently detailed description of the spell, so this did in fact succeed in averting a minor disaster.

Comment author: Lambda 03 December 2012 03:03:05AM 0 points

Was this the "PUA Controversy"?

Comment author: XiXiDu 30 March 2011 11:21:11AM 5 points

It truly is astonishing, the number of quotes that XiXiDu has about Eliezer. It's like he has a thick dossier, slowly accumulating negative content...

I don't save them anywhere and I never actively searched for negative content. It was either given to me by other people or I came across it by chance. That particular link is from a comment David Pearce made on Facebook, under a link posted there to the latest interview between Eliezer Yudkowsky and John Baez.

Do you think it is a bad idea to take a closer look at what is said about and by someone who is working on fooming AI?