Bart119 comments on Stupid Questions Open Thread Round 2 - Less Wrong Discussion

15 Post author: OpenThreadGuy 20 April 2012 07:38PM

Comment author: Bart119 24 April 2012 05:05:28PM 2 points [-]

I've been aware of the concept of cognitive biases going back to 1972 or so, when I was a college freshman. I think I've done a decent job of avoiding the worst of them -- or at least better than a lot of people -- though there is an enormous amount I don't know and I'm sure I mess up. Less Wrong is a very impressive site for looking into nooks and crannies and really following things through to their conclusions.

My initial question is perhaps about the social psychology of the site. Why are two popular subjects here (1) extending lifespan, including cryonics, and (2) increasingly powerful AIs leading to a singularity? Is there an argument that concern for these things is somehow derivable from a Bayesian approach? Or is it more or less an accident that these things are of interest to the people here?

Examples of other things that might be of interest could be (a) "may I grow firmer, quieter, warmer" (rough paraphrase of Dag Hammarskjöld), (b) I want to make the very best art, (c) economics rules and the key problem is affording enough for everyone. I'm not saying those are better, just that they're different. Are there reasons people here talk about the one set and not the other?

Comment author: Alejandro1 24 April 2012 06:11:26PM 5 points [-]

Welcome to LW! You pose an interesting question.

My initial question is perhaps about the social psychology of the site. Why are two popular subjects here (1) extending lifespan, including cryonics, and (2) increasingly powerful AIs leading to a singularity? Is there an argument that concern for these things is somehow derivable from a Bayesian approach? Or is it more or less an accident that these things are of interest to the people here?

I think there is a purely sociological explanation. LW was started by Eliezer Yudkowsky, who is a transhumanist and an AI researcher very concerned about the singularity, and his writings at Overcoming Bias (the blog from which LW split off) naturally tended to attract people with the same interests. But as LW grows and attracts more diverse people, I don't see why transhumanist/futurist topics must necessarily stay at the forefront, though they might (a path-dependence effect). I guess time will tell.

Examples of other things that might be of interest could be (a) "may I grow firmer, quieter, warmer" (rough paraphrase of Dag Hammarskjöld), (b) I want to make the very best art, (c) economics rules and the key problem is affording enough for everyone. I'm not saying those are better, just that they're different. Are there reasons people here talk about the one set and not the other?

If you have something interesting to say about these topics and the application of rationality to them, by all means do! However, about topic (c) you must bear in mind that there is a community consensus to avoid political discussions, which often translates to severely downvoting any post that maps too closely to an established political/ideological position.

Comment author: gwern 24 April 2012 06:59:08PM 4 points [-]

Why are two popular subjects here (1) extending lifespan, including cryonics

This is factually false. I suspect if you looked through the last 1000 Articles or Discussion posts, you'd find <5% on life extension (including cryonics) and surely <10%.

Cryonics does not even command much support; in the last LW survey, 'probability cryonics will work' averaged 21%; 4% of LWers were signed up, 36% opposed, and 54% merely 'considering' it. So if you posted something criticizing cryonics (which a number of my posts could be construed as...), you would be either supported or regarded indifferently by ~90% of LW.

Comment author: Vladimir_Nesov 24 April 2012 08:11:55PM *  1 point [-]

As I wrote in a comment to the survey results post, interpreting a low probability assignment to cryonics as some sort of disagreement or opposition is misleading:

... if ... probability of global catastrophe ... [is] taken into account ... even though I'm almost certain that cryonics fundamentally works, I gave only something like 3% probability. Should I really be classified as "doesn't believe in cryonics"?

Comment author: gwern 24 April 2012 08:21:47PM *  0 points [-]

Of course not. The low probability is important because it defeats the simplistic, non-probabilistic accounts of cultists as believers in dogmatic shibboleths; if Bart119 were sophisticated enough to say that 10% is still too much, then we could move the discussion to a higher plane of disagreement than simply claiming 'LW seems obsessed with cryonics', and hopefully reach good arguments like '$250k is too much to pay for such a risky shot at future life' or 'organizational mortality implies a <1% chance of cryopreservation over centuries, and the LW average is shockingly optimistic', etc.

To continue your existential risk analogy, this is like introducing someone to existential risks, telling them it's really important stuff, and then hearing them say 'but all those risks have never happened to us!' This person clearly hasn't grasped the basic cost-benefit claim, so you need to start at the beginning, in a way you would not with someone who immediately grasps it and makes a sophisticated counter-claim like 'anthropic arguments show that existential risks have been overestimated'.

Comment author: Bart119 24 April 2012 07:38:16PM 1 point [-]

Where can I find survey results? I had just been thinking I'd be interested in a survey, also hopefully broken down by frequency of postings and/or karma. But if they've been done, in whatever form, great.

Comment author: gwern 24 April 2012 08:10:49PM 0 points [-]

Comment author: MinibearRex 26 April 2012 09:19:27PM 0 points [-]

Why are two popular subjects here (1) extending lifespan, including cryonics, and (2) increasingly powerful AIs leading to a singularity? Is there an argument that concern for these things is somehow derivable from a Bayesian approach? Or is it more or less an accident that these things are of interest to the people here?

The short answer is that the people who originally created this site (SIAI, FHI, Yudkowsky, etc.) were all working on these topics as their careers, and using Bayesian rationality in order to do so. So the community initially consisted, in large part, of people who were interested in both those topics and rationality. There is a bit more variation in the group now, but it's still generally true.