cousin_it comments on This Failing Earth - Less Wrong

Post author: Eliezer_Yudkowsky 24 May 2009 04:09PM



Comment author: JoeShipley 28 May 2009 12:55:45AM, 1 point

Phil, I'm sorry if this sounds negative, but I don't understand this attitude at all. Intelligence is how accurately you can mine the information of the past in order to predict the future. Surely you don't think that all of our problems would go away if you gave everybody in the world a lobotomy? Or is there some preferable lower limit of intelligence we should engineer people toward?

I think the historical problem with intelligence is uneven progress across different fields, or outright misuse. That isn't a problem with the tool, but with the protocols and practices surrounding it.

Comment author: cousin_it 05 June 2009 11:16:38AM, 2 points

Devil's advocate mode!

This isn't a problem with the tool, but the protocols and practices surrounding it.

Joe, I can't discern any verifiable meaning in this sentence. Both the tool and the protocols/practices surrounding it contribute to the problem.

You can't possibly think that all of our problems would go away if you gave everybody in the world a lobotomy?

No more nukes. No more surveillance states. No more military/police robots. No more AI threat, nanotech threat, biotech threat. A lobotomy for everyone would increase our chances of surviving the century as a species.

I guess most people would object to an intelligence decrease because they feel it would make them personally worse off. But in a world where the growth of intelligence and knowledge leads to multiple high-probability scenarios of global collapse, that stance looks eerily like defecting in the Prisoner's Dilemma: each individual keeps their advantage while the collective risk compounds. I wonder what Eliezer, an outspoken proponent of cooperating, would say about global lobotomies viewed in this light.