cupholder comments on Open Thread June 2010, Part 3 - Less Wrong
Comment on markup: I saw the first version of your comment, where you were using "(*)" as a textual marker, and I see you're now using "#" because the asterisks were messing with the markup. You should be able to get the "(*)" marker to work by putting a backslash before the asterisk (and I preferred the "(*)" indicator because that's more easily recognized as a footnote-style marker).
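To illustrate (assuming Less Wrong's standard Markdown comment syntax), escaping the asterisk with a backslash stops it from being read as emphasis:

```
(\*) This renders as a literal "(*)" footnote marker.
(*) Unescaped asterisks risk being interpreted as emphasis delimiters.
```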
Feels weird to post an entire paragraph just to nitpick someone's markup, so here's an actual comment!
Let me try to rephrase this in a way that might be more testable and easier to think about. It sounds like the question here is what causes the correlation between being a member of LW/SIAI and agreeing with LW/SIAI that future AI is one of the most important things to worry about. There are several possible causes:
And we want to know whether #1 is strong enough that we're drifting towards a cult attractor or some other groupthink attractor.
I'm not instantly sure how to answer this, but I thought it might help to rephrase this more explicitly in terms of causal inference.
I'm not sure that your rephrasing accurately captures what I was trying to get at. In particular, strictly speaking (*) doesn't require that one be part of a group, although being part of a group often plays a role in enabling (*).
Also, I'm not only interested in possible irrational causes for LW/SIAI members' belief that future AI is one of the most important things to worry about, but also in possible irrational causes for each of:
(1) SIAI members' belief that donating to SIAI in particular is the most leveraged way to reduce existential risks. Note that it's possible to devote one's life to a project without believing that it's the best project for additional funding - see GiveWell's blog posts on Room For More Funding:
For reference, PeerInfinity says
(2) The belief that refining the art of human rationality is very important.
On (2), I basically agree with Yvain's post Extreme Rationality: It's Not That Great.
My own take is that the Less Wrong community has greatly enriched some of its members' lives by giving them the opportunity to connect with people similar to themselves, and that the very positive feelings attached to their Less Wrong experience have led some of them to overrate the overall importance of Less Wrong's stated mission. I can write more about this if there's interest.
Thank you for clarifying. I don't think I really have an opinion on this, but I figure it's good to have someone bring it up as a potential issue.
I'm interested. I've been thinking about this issue myself for a bit, and something like an 'internal review' would go a long way toward bringing any biases the community holds to light.