timtyler comments on Intellectual Hipsters and Meta-Contrarianism - Less Wrong

147 Post author: Yvain 13 September 2010 09:36PM

Comment author: Will_Newsome 01 October 2010 07:30:58PM *  8 points [-]

FWIW, I don't see what I am saying as particularly "contrarian". A lot of people would be pretty sceptical about the end of the world being nigh - or the idea that a bug might take over the world - or the idea that a bunch of saintly programmers will be the ones to save us all. Maybe contrary to the ideas of the true believers - if that is what you mean.

Right, I said metacontrarian. Although most LW people seem SIAI-agnostic, a lot of the most vocal and most experienced posters are pro-SIAI or SIAI-related, so LW comes across as having a generally pro-SIAI attitude, which is a traditionally contrarian attitude. Thus going against the contrarian status quo is metacontrarian.

You encourage me to speculate about the motives of the individuals involved. While that might be fun, it doesn't seem to matter much - the SIAI itself is evidently behaving as though it wants dollars, attention, and manpower - to help it meet its aims.

I'm confused. Anyone trying to accomplish anything is going to try to get dollars, attention, and manpower, so how is that relevant to the merit of SIAI's purpose? SIAI has never claimed to be fundamentally opposed to having resources. Can you expand on this?

I hope I don't come across as thinking "the worst" about those involved. I expect they are all very nice and sincere. By way of comparison, not all cults have deliberately exploitative ringleaders.

What makes that comparison spring to mind? Everyone is incredibly critical of Eliezer, probably much more so than he deserves, because everyone is racing to be first to establish their non-cult-victim status. Everyone at SIAI has different beliefs about the relative merits of different strategies for successful FAI development. That isn't a good thing -- a fractured strategy is never good -- but it is evidence against cultishness. SIAI grounds its predictions in clear and careful epistemology. SIAI publishes in academic journals, attends scientific conferences, and hosts the Singularity Summit, where tons of prominent, high-status people show up to speak about Singularity-related issues. Why is cult your choice of reference class? It is no more a cult than a typical global warming awareness organization. It's just that 'science fiction' is a low-status literary genre in modern liberal society.

Comment author: timtyler 01 October 2010 08:00:40PM *  0 points [-]

Anyone trying to accomplish anything is going to try to get dollars, attention, and manpower. I'm confused as to how this is relevant to the merit of SIAI's purpose.

To recap, the SIAI is funded by donations from those who think that they will help prevent the end of the world at the hands of intelligent machines. For this pitch to work, the world must be at risk - in order for them to be able to save it. The SIAI face some resistance over this point, and these days much of their output is oriented towards convincing others that these may be the end days. There will also be a selection bias: those most convinced of a high p(DOOM) are the most likely to be involved. Like I said, not necessarily the type of organisation one would want to approach if seeking the facts of the matter.

You pretend to fail to see connections between the SIAI and an END OF THE WORLD cult - but it isn't a terribly convincing act.

For the connections, see here. For protesting too much, see You're calling who a cult leader?

Comment author: Will_Newsome 01 October 2010 09:17:56PM 6 points [-]

You pretend to fail to see connections between the SIAI and an END OF THE WORLD cult - but it isn't a terribly convincing act.

No, I see it, look further, and find the model lacking in explanatory power. It selectively leaves out all kinds of useful information that I can use to control my anticipations.

Hmuh, I guess we won't be able to make progress, 'cuz I pretty much wholeheartedly agree with Vladimir when he says:

This whole "outside view" methodology, where you insist on arguing from ignorance even where you have additional knowledge, is insane (outside of avoiding the specific biases such as planning fallacy induced by making additional detail available to your mind, where you indirectly benefit from basing your decision on ignorance).

and Nick Tarleton when he says:

We all already know about this pattern match. Its reiteration is boring and detracts from the conversation.

Comment author: wedrifid 02 October 2010 01:58:23AM 1 point [-]

No, I see it, look further, and find the model lacking in explanatory power. It selectively leaves out all kinds of useful information that I can use to control my anticipations.

"This one is right" for example. ;)

Comment deleted 02 October 2010 04:25:03AM [-]
Comment author: timtyler 02 October 2010 12:17:54PM *  0 points [-]

I didn't say anyone was "racing to be first to establish their non-cult-victim status" - but it is certainly a curious image! [deleted parent comment was a dupe].

Comment author: wedrifid 02 October 2010 01:25:43PM 0 points [-]

Oops, connection troubles then missed.