Rain comments on Other Existential Risks - Less Wrong

Post author: multifoliaterose 17 August 2010 09:24PM




Comment author: Rain 09 September 2010 05:57:43PM

I'm surprised you bring up Mikhail Gromov as a counterexample to Eliezer, considering that Gromov's solution to existential risk, as presented in the quote above, can be paraphrased as: increase education so someone has a good idea on how to fix everything.

(Actual quote: "People must have ideas and they must prepare now. In two generations people must be educated. Teachers must be educated now, and then the teachers will educate a new generation. Then there will be sufficiently many people to face the difficulties. I am sure this will give a result.")

If he doesn't have any other concrete ideas, then I would think he'd recognize Eliezer as being a knowledgeable person with a potential solution fitting his criteria, and thus support him.

Comment author: multifoliaterose 26 October 2010 06:22:40PM

I don't think that Gromov's views and Eliezer's views are necessarily incompatible.

My reading of Gromov's quotation is that he does not have his eyes on a technological intelligence explosion and that the existential risk that he's presently most concerned about is natural resource shortage.

This is in contrast with Eliezer, who does have his eyes on a technological singularity and does not presently seem concerned about natural resource shortage.

I would be very interested in seeing Gromov study the evidence for a near-term intelligence explosion and seeing how this affects his views.

I may eventually approach him personally about this matter (although I hesitate to do so, as I think it's important that whoever approaches him on this point make a good first impression, and I'm not sure I'm in a good position to do that at the moment).