Wei_Dai comments on Thoughts on the Singularity Institute (SI) - Less Wrong

256 Post author: HoldenKarnofsky 11 May 2012 04:31AM


Comments (1270)


Comment author: Wei_Dai 16 May 2012 11:01:57PM 14 points

"And if Novamente should ever cross the finish line, we all die."

And yet SIAI didn't do anything to Ben Goertzel (except make him Director of Research for a time, which is kind of insane in my judgement, but obviously not in the sense you intend).

Comment author: Eliezer_Yudkowsky 17 May 2012 09:31:20PM *  12 points

Ben Goertzel's projects are knowably hopeless, so I didn't too strongly oppose Tyler Emerson's project from within SIAI's then-Board of Directors; it was being argued to have political benefits, and I saw no noticeable x-risk so I didn't expend my own political capital to veto it, just sighed. Nowadays the Board would not vote for this.

And it is also true that, in the hypothetical counterfactual conditional where Goertzel's creations work, we all die. I'd phrase the email message differently today to avoid any appearance of endorsing the probability, because today I understand better that most people have trouble mentally separating hypotheticals. But the hypothetical is still true in that counterfactual universe, if not in this one.

There is no contradiction here.

Comment author: Wei_Dai 18 May 2012 12:23:06AM 6 points

To clarify, by "kind of insane" I didn't mean you personally, but was commenting on SIAI's group rationality at that time.

Comment author: private_messaging 17 May 2012 05:16:35AM *  -2 points

Did I claim they beat him up or something? Ultimately, a more recent opinion which I've seen somewhere is that Eliezer ended up considering Ben harmless, as in unlikely to achieve the result. I also see you guys really loving trolley problems, including extreme forms of them (with 3^^^3 dust specks in 3^^^3 eyes).

Having it popularly said that your project is going to kill everyone is already a risk, given all the other nutjobs out there:

http://www.nature.com/news/2011/110822/full/476373a.html

Even if it is later atoned for by making you head of SI or something (with unclear motivations, which may well be creepy in nature).

See, I did not say he was definitely going to get killed or anything. I said there was a risk. Yeah, nothing happening to Ben Goertzel's person is proof positive that the risk is zero. Geez, why won't you for once reason like this about AI risk, for example.

Ultimately: encounters with a nutjob* who may, after a presentation of the technical details, believe you are going to kill everyone, are about as safe as making credible death threats against a normal person and his relatives and his family, etc. Or even less safe. Neither results in a 100% probability of anything happening.

*though of course the point may be made that he doesn't believe the stuff he says he believes, or that a sane portion of his brain will reliably enact akrasia over the decision, or something.

Comment author: othercriteria 17 May 2012 03:51:07PM 2 points

The existence of third-party anti-technology terrorists adds something to the conversation beyond the risks FinalState can directly pose to SIAI-folk and vice versa. I'm curious about gwern's response, especially, given his interest in Death Note, which describes a world where law enforcement can indirectly have people killed just by publishing their identifying information.