Perplexed comments on Existential Risk and Public Relations - Less Wrong

36 points · Post author: multifoliaterose 15 August 2010 07:16AM


Comment author: Perplexed 15 August 2010 06:22:57PM 10 points

I believe that more likely than not, the reason why Eliezer has missed the point that I raise in this post is social naivete on his part rather than willful self-deception.

I find it impossible to believe that the author of Harry Potter and the Methods of Rationality is oblivious to the first impression he creates. However, I can well believe that he imagines it to be a minor handicap which will fade in importance with continued exposure to his brilliance (as was the fictional case with HP). The unacknowledged problem in the non-fictional case, of course, is in maintaining that continued exposure.

I am currently skeptical that the singularity represents an existential risk. But having watched Eliezer completely confuse and irritate Robert Wright, and having read half of the "debate" with Hanson, I am quite willing to hypothesize that the explanation of what the singularity is (and why we should be nervous about it) ought to come from anyone but Eliezer. He speaks and writes clearly on many subjects, but not on that one.

Perhaps he would communicate more successfully on this topic if he tried a dialog format. But it would have to be one in which his constructed interlocutors are convincing opponents, rather than straw men.

Comment author: timtyler 15 August 2010 06:31:09PM -1 points

It depends on exactly what you mean by "existential risk". Development will, in my opinion, likely produce genetic and phenotypic takeovers in due course, as the bioverse becomes engineered. That will mean no more "wild" humans.

That is something some people wail and wave their hands about, calling it the end of the human race.

The end of earth-originating civilisation also seems highly unlikely to me, which is not to say that its small probability is too insignificant to discuss.

Eliezer's main case for that appears to be made at http://lesswrong.com/lw/y3/value_is_fragile/

I think that document is incoherent.