kodos96 comments on Taking Ideas Seriously - Less Wrong
I would not question what you are taking seriously; it seems fairly typical of the LW group.
On the other hand, I am surprised that climate change is rarely or never mentioned on LW. The loss of biodiversity and the rate of extinction - ditto. We are going through a biological crisis. It is bad enough that a 'world economic collapse' might even be a blessing in the long term.
You do not mention the neuroscience revolution but I am sure I have noticed some of the LW group taking it seriously.
This may be the place to mention cryonics without starting another riot. It is boring to me and I do not take it seriously - but I have reasons, and I am not judging others who are not in my situation. (1) I had cancer when I was very young, in the olden days when almost everyone died. I waited for years for it to reappear. I got used to my mortality. Now I am 70 and quite comfortable with death within the next decade or two. (2) I am very poor, living in a small pension. I could not pay for it if I wanted to. (3) I don't believe that being brought back to life is a technical question only. I think that future generations will not actually value the preserved bodies, because they will not value what these people know or can do. They may want some of the famous people alive today, but they will not value someone as ordinary as myself. (4) I do not want to be without a body as some disembodied brain, or to be in a damaged body. I doubt that I would be happy as an immortal.
My hypothesis would be that this is due to these issues falling within the Correct Contrarian Cluster.
I don't understand your comment. Do you mean that climate change and biodiversity are not discussed because everyone in LW thinks the same about them? Because there is nothing to say? Because there is nothing that can be done? Because it is settled science? Please explain how issues falling within the Correct Contrarian Cluster can go undiscussed entirely, and why you think these issues fall within that cluster.
Well, I was just speculating - I don't actually have any idea what the LW community in general thinks of the issue. My speculation was that these topics aren't discussed much because the contrarian/skeptical position on them is clustered with the set of contrarian positions commonly held by LWers. On that view, the contrarian position is essentially that these topics don't deserve much attention, especially relative to the kinds of existential risks LW is concerned with - so they don't get discussed.
I'm not sure how much more detail I can go into on my thinking without violating the "no current politics" rule.
Something I should have said in my previous reply: I agree with the "no current politics" rule. My problem is with what counts as politics - to some people everything does, and to some almost nothing does. When a subject is a purely scientific one, and the disagreement is about whether there is evidence and how to interpret it, then this is an area for rationality. We should be looking at evidence and evaluating it. That does not involve what I would call politics.
When I first got here I thought "existential risk" referred to a generalization of the ideas related to catastrophic climate change. That is, if we should plan for the low-probability but deadly event that climate change will be very severe, then we should also plan for other low-probability (or far-future) catastrophes: asteroid impacts, biological and nuclear weapons, and unfriendly AI, among others. I was surprised that, of the existential risks discussed, catastrophic climate change never seems to come up at all.
It's possible that this is an innocent result of specialization: people here spend most of their time thinking about AI, and not about other things that they aren't trained for.
If there were an organization committed to clarifying how we think about planning for low-probability risks, that organization really ought to consider climate change among other risks. It would be an interesting thing to study: how far in the future is it reasonable for present-day institutions to plan? How can scientists with predictions of possible catastrophe effectively communicate to governments, businesses, etc. that they need to plan, without starting a panic? The art of planning for existential risks in general is something that could really benefit from more study.
And it ought to include well-studied and well-publicized risks (like climate change) in addition to less-studied and less-publicized risks (like risks from technology not yet developed). People have been planning for floods for a long time; surely people concerned about other risks can learn something from people who plan for the risk of floods.
But I don't think SIAI or LessWrong is equipped for that mission.
I think you're looking for the Future of Humanity Institute and their work on Global Catastrophic Risks.
It would be nice if people could use some rationality in deciding which ideas to be contrarian on. Maybe I live in an ivory tower but I don't see any connection between biological/environmental dangers and politics.