For most people, climate change is pretty much the only world-scale issue they've heard of. That makes it seem very important (in relative terms): climate change has a world-scale impact, and no other issue they're familiar with does.
LessWrong has a history of dealing with other world-scale issues, and EA (an overlapping neighboring community) likes to make a habit of pointing out all the cause areas and weighing them against each other. When climate change is weighed against AI risk, animal welfare, biosafety, developing-world poverty, and various meta-level options... well, AGW didn't get any less important in absolute terms, but you can see why people's enthusiasm and concern might lie elsewhere.
As a secondary issue, this is a community that prides itself on having high epistemic standards, so when the advocates of a cause area have conspicuously low epistemic standards, it winds up being a significant turn-off. When you have a skeptical eye, you start to automatically notice when people make overblown claims and recommend interventions that obviously won't help or will do more harm than good. Most of what I see about AGW on social media and on newspaper front pages falls into these categories, and while this fact isn't going to show up on any cause-prioritization spreadsheets, on a gut level it's a major turn-off.
For an example of what I'm talking about, look into the publicity surrounding hydrogen cars. They're not a viable technology, and this is obvious to sufficiently smart people, but because they claim to be relevant to AGW, they get a lot of press anyways. The result is a con-artist magnet and a strong ick-feeling which radiates one conceptual level out to AGW-interventions in general.
For an example of what I'm talking about, look into the publicity surrounding hydrogen cars. They're not a viable technology, and this is obvious to sufficiently smart people, but because they claim to be relevant to AGW
Elon made his bet on battery-driven cars. It's not clear to me that it's the right call. Hydrogen can be stored over longer timeframes, which means that in a world where most of our energy comes from solar cells you can create it from surplus energy in the summer and use it in the winter, while batteries can only be charged with energy that's available at the particular time you want to charge your car.
For most people, climate change is pretty much the only world-scale issue they've heard of. That makes it very important (in relative terms)
Suppose climate change were like air pollution: greenhouse gas emissions in New York made it hotter in New York but not in Shanghai, and greenhouse gas emissions in Shanghai made it hotter in Shanghai but not in New York. I don't see how that would make it less important.
Furthermore, if you seek to contribute to a global cause via technical means then it often makes sense to specialize. If you know you can have a greater marginal impact in biosafety than AGW then you should allocate (almost) all of your altruistic attention to biosafety and (almost) none of it to AGW.
Epistemic status: You asked, so I'm answering, though I'm open to having my mind changed on several details if my assumptions turn out to be wrong. I probably wouldn't have written something like this without prompting. If it's relevant, I'm the author of at least one paper commissioned by the EPA on climate-related concerns.
I don't like the branding of "Fighting Climate Change" and would like to see less of it. The actual goal is providing energy to sustain the survival and flourishing of 7.8+ billion people, fueling a technologically advanced global civilization, while simultaneously reducing the negative externalities of energy generation. In other words, we're faced with a multi-dimensional optimization problem, while the rhetoric of "Fighting Climate Change" almost universally only addresses the last dimension, reducing externalities. Currently 80% of worldwide energy comes from fossil fuels and only 5% comes from renewables. So, simplistically, renewables need to generate 16x as much energy as they do right now. This number is "not so bad" if you assume that technology will continue to develop, putting renewables on an exponential curve, and "pretty bad" if you assume that renewables continue to be implemented at about the current rate.
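To make the two scenarios at the end concrete, here is a minimal back-of-the-envelope sketch; the 16x factor comes from the shares above, but the annual growth rates are purely illustrative assumptions:

```python
# Rough sketch: years needed for renewables to scale up 16x at a constant
# annual growth rate. Growth rates below are illustrative assumptions only.
import math

target_factor = 0.80 / 0.05  # renewables replacing fossil's 80% share from a 5% base -> 16x

for annual_growth in (0.05, 0.10, 0.20):  # hypothetical exponential build-out rates
    years = math.log(target_factor) / math.log(1 + annual_growth)
    print(f"{annual_growth:.0%}/yr growth -> ~{years:.0f} years to reach 16x")
```

At a sustained 20%/yr the scale-up takes roughly 15 years, at 5%/yr closer to 60; a purely linear build-out at today's pace would take far longer, which is the gap between the "not so bad" and "pretty bad" readings.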
And we need more energy generating capacity than we have now. A lot more. Current energy generation capacity only really provides a high standard of living for a small percentage of the world population. Everybody wants to lift Africa out of poverty, but nobody seems interested in asking how many new power plants that will require. These power plants will be built with whatever technology is cheapest. We cannot dictate policy in power plant construction in the developing world; all we can do is try to make sure that better technologies exist when those plants are built.
I have seen no realistic policy proposal that meaningfully addresses climate change through austerity (voluntary reduced consumption) or increased energy usage efficiency. These sorts of things can help on the margins, but any actual solution will involve technology development. Direct carbon capture is also a possible target for technological breakthrough.
So, simplistically, renewables need to generate 16x as much energy as they do right now.
You can drown in a river that's on average 1cm deep. The problem is a lot harder than simply producing 16x as much energy with renewables.
We have been working on technological fixes for over 50 years, and we don't have anything that could realistically address the problem to show for it.* We should at least consider the possibility that a technological fix will not be available. **
Humans are often wrong-genre savvy. Most people in the rationalist community seem to think we're in a Star Trek prequel, but we may actually be in a big budget reboot of Decline and Fall of the Roman Empire. For what it's worth, the guy who's playing Caligula is a great performer. Huge talent...
I believe climate change will have a significant (>90%) net-negative (>90%) impact on future human welfare in the foreseeable future. I believe (>90%) that climate change is a large-scale policy and technology problem, i.e. individual self-regulation has more to do with fuzzies than utilons. I have triaged climate change as a less-than-optimal target of my limited resources. Climate change therefore deserves no further attention from me.
To put it bluntly, I believe
AGI therefore overwhelms all my other actionable large-scale altruistic concerns. In particular, climate change is a relatively minor (>75%) threat I am unlikely to significantly influence directly via intentional action (>95%). Furthermore, climate change is likely (>75%) to have a relatively minor (affecting <10% of my overall material standard of living) negative personal impact on me. Though climate change is important to human welfare, I ought not to be "concerned" about it at all.
As for mass media's representation of climate change, I think it's crap—just like all other propaganda. This is by design.
I'll bite --- as a "not feeling alarmed" sort of person. First, though, I'll clarify that I'm reading "climate change" as shorthand for "climate change that is net-negative for human welfare" (herein CCNNHW), since obviously the climate is in a state of constant change.
Confidence levels expressed as rough probabilities:
0.70 : we are observing CCNNHW
0.80 : current human behavior increases probability of CCNNHW
0.10 : future magnitude of CCNNHW will be massive
0.98 : future human behavior will change, given CCNNHW
0.90 : some current and proposed mitigations are themselves NNHW
0.60 : some proposed mitigations have negative effects rivaling that of CC
0.50 : it's possible to design a net-positive mitigation [1]
0.10 : it's possible to implement a net-positive mitigation [2]
Taken together, I assign higher risk to collective, politically directed efforts to mitigate CC than to CC itself.
---
[1] non-linear feedback effects depress this value
[2] political processes depress this value
Human activities contribute to climate change.
Changing to renewable energy is both very expensive, as we don't have a good way to store energy, and a source of systemic risk, because sun and wind are unreliable in many geographies; and as more of our infrastructure depends on electricity instead of oil, an electricity outage of one or two weeks produces bigger problems.
There were studies that modeled hydropower plants as being able to store a lot of energy and release it when needed, but that's not how hydropower plants work. If a hydropower plant releases much more energy than on average in a shorter timeframe, it floods the regions further down the river.
The electricity system requires that the amount of energy that gets pulled from the system is equal to the electricity that's put into the system. If that equality breaks down, the system breaks down. We don't have good mechanisms to reduce power consumption, so if not enough energy gets produced we usually have to create full power outages in a region.
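A toy sketch of the constraint described above; the supply figure and regional demands are made-up numbers, and the only point is that with no way to shave demand gradually, whole regions get disconnected until the balance holds:

```python
# Toy illustration: supply and demand on the grid must balance at all times.
# Lacking fine-grained demand reduction, operators black out whole regions
# until remaining demand no longer exceeds supply. Numbers are made up.
supply_mw = 900
regional_demand_mw = {"A": 400, "B": 350, "C": 300}  # hypothetical regions

remaining_demand = sum(regional_demand_mw.values())
blacked_out = []
for region, demand in sorted(regional_demand_mw.items(), key=lambda kv: kv[1]):
    if remaining_demand <= supply_mw:
        break
    remaining_demand -= demand
    blacked_out.append(region)

print(f"Regions blacked out: {blacked_out}; "
      f"demand {remaining_demand} MW <= supply {supply_mw} MW")
```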
There's not enough political will to switch our energy system to either renewable energy or nuclear plants in a timeframe that's enough to prevent undesirable climate changes on its own. Given that a single actor can implement geoengineering, geoengineering will be used starting sometime between 2030 and 2060 to reduce climate impact.
Nitrogen and phosphorus pollution should likely get more attention than it's currently getting, as those cycles are not working well.
Coal kills many people through direct air pollution. In cities, non-electric cars emit both air pollution and noise pollution. That means it's desirable to switch to electric cars and fewer coal plants quite apart from climate change concerns.
There's a good chance that the great stagnation is partly caused by the stagnation in energy prices, which had been falling year by year before the great stagnation. This means it's very valuable for future technological growth to have cheap energy.
AI safety, biosafety, and global peace all seem like more important cause areas, as they have more associated risk than climate change.
I agree with the other answers that say climate change is a big deal and risky and worth a lot of resources and attention, but it’s already getting a lot of resources and attention, and it’s pretty low as an existential threat.
Also, my impression is that there are important facts about how climate change works that are almost never mentioned. For example, this claim that there are diminishing greenhouse effects to CO2: https://wattsupwiththat.com/2014/08/10/the-diminishing-influence-of-increasing-carbon-dioxide-on-temperature/
Also, I think most of the activism I see around climate change is dumb and counterproductive and moralizing, e.g. encouraging personal lifestyle sacrifices.
That link is from a climate change denier, so it is probably taken grossly out of context or something.
Some actual facts I think most people don't know: Sea level rise is caused by melting glaciers + thermal expansion, not melting sea ice (because physics). Warming oceans might cause a decrease in tropical cyclone frequency and increase in intensity (page 163 of this IPCC report).
The link says a lot of things, but the basic claim that greenhouse forcing is logarithmic as a function of concentration is as far as I know completely uncontroversial.
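For reference, the commonly cited simplified expression for CO2 forcing (Myhre et al. 1998, used in IPCC reports) is:

$$\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}$$

where $C_0$ is a reference concentration, so each doubling of CO2 adds roughly the same forcing, about $5.35 \ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}$; "diminishing returns per added ppm" and "substantial further warming from continued emissions" are both consistent with this.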
Climate change is obviously real and getting worse. We are seeing the early effects already, and they are straining our emergency measures beyond capacity. Immediate and widespread systemic changes are needed to alter course.
I am powerless to effect such changes.
I suspect that climate change is both overhyped and underhyped.
I expect that the current models underestimate the rate of change, and that the Arctic, permafrost, Greenland and eventually Antarctic will melt much sooner than projected, with the corresponding sea level rise. A lot of formerly livable places will stop being so, whether due to temperature extremes or ending up underwater.
That said, even the highest possible global warming will not exceed what happened 50 million years ago. And that time was actually one of the best for the diversity of life on Earth, and it could be again. What we have now is basically frozen leftovers of what once was.
That said, the scale of the warming is unprecedented, and so a lot of wildlife will not be able to adapt, and will go extinct, only for the new varieties of species to take their habitats.
That said, humans will suffer from various calamities and from forced migration north into livable areas. There will be population pressures that will result in disappearance of the current Arctic states like Russia, Canada and Denmark's Greenland. And this will not happen without a fight, hopefully not a nuclear one, but who knows.
That said, there are plenty of potential technological ways to cool the planet down, and some may end up being implemented, whether unilaterally or consensually. This may happen as a short-term measure until other technologies are used to remove carbon dioxide from the atmosphere.
TL;DR: Climate change is a slow-moving disaster, but not an X-risk.
I am generally concerned, and also think this makes me an outlier. I don't have any specific model of what will happen.
This is a low information belief that could definitely change in the future. However, it doesn't seem important to figure out how dangerous climate change is exactly because doing something about it is definitely not my comparative advantage, and I'm confident that it's less under-prioritized and less important than dangers from AI. It's mostly like, 'well the future of life institute has studied this problem, they don't seem to think we can disregard it as a contributor to existential risk, and they seem like the most reasonable authority to trust here'.
A personal quibble I have is that I've seen people dismiss climate change because they don't think it poses a first-order existential risk. I think this is a confused framing that comes from asking 'is climate change an existential risk?' rather than 'does climate change contribute to existential risk?', which is the correct question because existential risk is a single category. The answer to the latter question seems to be trivially yes, and the follow-up question is just how much.
It's mostly like, 'well the future of life institute has studied this problem, they don't seem to think we can disregard it as a contributor to existential risk, and they seem like the most reasonable authority to trust here'.
Woah, yeah, just let it be known that I don't think you should trust FLI with this kind of stuff. They seem to pretty transparently have messed up prioritization in this way a few times, trying to be more appealing to a broader audience, by emphasizing hypotheses that seem intuitively compelling but not actually very likely to be true...
FWIW I don't think the FLI is that reasonable an authority here, I'm not sure why you'd defer to them.
They do a good job coordinating lots of things to happen, but I think their public statements on AI, nukes, climate change, etc, are often pretty confused or are wrong. For example, their focus on lethal autonomous weapons seems confused about the problem we have with AI, focusing on the direct destructive capabilities of AI instead of the alignment problem where we don't understand what decisions they're even making and so cannot in-principle align their ...
As per above, it is a difficult question. However, even if we found a good solution, the issue has become so politicised that carrying out any plan without massive disruption by interest groups seems impossible.
When I began writing this, I thought very little good could be done by working on climate change, because of how popular the topic is. But as I wrote, and thought about the issue, I realized that you have a point, and that working on effective solutions to the problem has a high chance of being effective, if not particularly suited for me. I would enjoy seeing more in-depth analyses which do actual research, and attach numbers to the vague feelings of importance I express here.
Using EA's usual decision matrix of Scale, Solvability, and Neglectedness:
Neglectedness, at first glance, seems very low. For the past 20 years there's been a huge media campaign to get people to "solve" climate change, and everyone's aware of it. However, very little effort is expended working & advocating for effective solutions to the problem (ie helping developing countries prepare), and much of the effort seems to be going to low-Solvability & Scale tasks such as attempting to prevent carbon emissions. Thus, despite near-constant media attention, it seems likely that effective solutions are actually very Neglected.
Scale seems pretty large. Those hit hardest will be the people with the least ability to invest in mitigation technologies, and most reliance on nature. Aka: developing countries. Thus lifting developing countries out of their poverty will be much harder in the near-term future. Notably, this poses little risk to the long-term flourishing of the human race, whereas other global catastrophic risks such as dangerous AI, nuclear war, biological war, etc. seem to have both a higher Scale, and higher Neglectedness.
Solvability seems like it'd range from insurmountably low to medium-high, depending on what you choose to focus on. Many of the problems that affect more affluent nations seem like they'd be solved through mitigating technologies, and not through reversing climate change's effects. Things like dams and levees are technologies we already have, and things that the Dutch (note: I looked that up, so I could provide a source, but I knew it was a thing already from an Environmental Science course I took during high school) already use to keep their cities above sea-level. I would bet there are other, similarly low-hanging technologies which would vastly lower the effects of climate change on developing countries. These developing countries would likely develop and implement these technologies once effects from climate change are seen, regardless of what they believe the cause of such climate change is.
Increases in resources here though, seem like they'd have little impact on the outcome for these developing countries. Since there is a large incentive for cities and companies to make and invest in these technologies, they will likely be developed regardless of what interventions are worked on.
By my understanding, even if we stopped all of our carbon output immediately, there'd still be a devastating 2C increase in the average temperature of the earth. And developing countries would be at a great disadvantage developing the infrastructure needed to mitigate its effects, so the Solvability here is incredibly low.
Thus the goal of "fighting" climate change should focus on providing developing countries the infrastructure they need to be prepared. This doesn't seem like particularly interesting work to me, nor particularly suited to my skills when compared to other ways of improving the world. However, I'd need more knowledge about the effects and the current effective interventions to be confident in my conclusions. Currently, counter to what I thought before writing this, the field seems promising.
Developing technologies and best practices for enabling people to quickly adapt farming practices to a different local environment (rainfall, temperature, etc), including education and outreach and possibly switching crops (or genetically engineering / breeding new varieties) along with associated farming tools and know-how and distribution and storage systems etc., would seem helpful for mitigating the damage of not only climate change but also nuclear winter / volcanic winter. While this seems very hard to do completely, it seems feasible to make progress...
By my understanding, even if we stopped all of our carbon output immediately, there'd still be a devastating 2C increase in the average temperature of the earth.
I don't think this is true:
...According to an analysis featured in the recent IPCC special report on 1.5C, reducing all human emissions of greenhouse gases and aerosols to zero immediately would result in a modest short-term bump in global temperatures of around 0.15C as Earth-cooling aerosols disappear, followed by a decline. Around 20 years after emissions went to zero, global temperatures would...
This analysis assumes that we won't do geoengineering. If we do geoengineering to keep temperatures from increasing too much over the present point all the spending on mitigation is wasted.
I am greatly concerned about the risks associated with climate change and have been for several years now, though earlier in my adult life I didn't know much about it and gave too much credence to skeptics such as Bjorn Lomborg. I anticipate that (barring some kind of singularity that makes a mockery of all prediction) the greatest harms from climate change this century will come from mass displacement and migration ("climate refugees"); indeed already there are folks talking about leaving California to escape the ever-growing annual fire seasons. The same will happen (is happening) for those along flooding coastlines or increasingly drought-stricken or fish-depleted regions. Also important to consider are tail-risks, the small but non-negligible possibility that actual warming turns out rather higher than the (already bad!) average-case predictions (see Martin Weitzman's work, or David Wallace-Wells's famous NY Mag article "The Uninhabitable Earth").
If the recent hype from MIT about nuclear fusion is for real, maybe we can all breathe a sigh of great relief—it could turn out to be some of the best, and most significant, news of the century. We should have been building out old-fashioned nuclear power for decades now, but we are civilizationally inadequate to this sort of basic collective foresight and action. Other high-value actions include modernizing the electrical grid and increasing by orders of magnitude funding for basic research in clean energy, and of course a hefty carbon tax, for Christ's sake (civilizational inadequacies abound!). Geoengineering should be a last resort, since messing with the world's atmospheric/oceanic systems is what got us into this mess in the first place. They are complex nonlinear systems that we literally rely on being relatively stable for the continued existence of humanity; screwing up geoengineering, like screwing up artificial superintelligence, could be the last mistake our species makes.
Just writing out some current beliefs in stream of consciousness. Percentages in parentheses are confidence levels.
Global warming and climate change are happening, and we're well on track to pass 2.5 C total rise. The best way to mitigate this is to reduce fossil fuel use approximately yesterday and cut other GHG emissions (85%). The second-most-important thing to do is to adapt to the changes, and trying to sequester carbon or turn back the clock by other means is lower priority than that (60%).
The most camera-friendly impact felt by the developed world will be sea level rise, but I think the biggest problem will be drought and shifting climate patterns in the developing world (70%). A lot of people are going to die or be left in precarious situations (gut estimate: 300M displaced over the next 80 years). I think cataclysmic scenarios such as runaway greenhouse effect, releasing atmosphere-changing amounts of methane hydrates, or melting the Greenland ice sheet are relatively unlikely and not the main problem even after weighting them by importance (75%).
I am concerned about climate change, and I believe we need to have serious changes in the way our world works. We need to be more sustainability-minded in terms of economic growth, for example -- infinite growth is not reasonable. And we seriously need to switch to clean energy sources. The Inflation Reduction Act gives me a lot of hope for this. It's ridiculous how subsidized fossil fuels are. We're dumping money into destroying ourselves when we should be investing in reversing the damage. I'm concerned, but I have to be hopeful -- otherwise I'd be constantly depressed. I don't like the word "alarmed" because I am fighting for a better future by donating, voting, protesting, etc. and "alarmed" sounds like panic, which isn't helpful. I try not to think about climate change too often or else I'd feel totally stuck. Instead I generally try to stay up-to-date on how to help environmentalist movements, try to stay up-to-date on policies and science, etc.
I'm very concerned about climate change having a large negative impact. It seems unlikely to be threatening systemic collapse, but a lot of unnecessary suffering and a slowdown of long-run progress seems likely. Some small risk of extreme scenarios also seems to exist.
My view is that it's not a technological problem but a political one: it would be easy to solve with a global governing body. We have nuclear technology for power, and temporarily reduced availability of vehicle fuel seems to be a small problem that we could easily adapt to. The standard solution of taxing negative externalities should work just as well for this as for other things.
Because of our inept institutions, I believe the only likely solution, apart from accepting a dramatic adaptation with huge biodiversity loss, is that the damages motivate a coalition of major powers to strike a deal and use economic or political leverage to force everyone else into it. Either China and the US change their minds and the EU agrees happily, or India threatens unilateral solar radiation management. Both scenarios seem a couple of decades away.
While the issue is important and interesting I'm quite pessimistic about getting a timely solution, and about the possibility of individuals to make a difference. I still believe most people in a hundred years will have a standard of living similar to Western people now, but lots of suffering will come between now and then.
My position is similar to that of 80000 hours: it seems like a super high impact cause, vying for the top with AI risk, pandemic risk, global poverty, and maybe 1 or 2 others. But is far more widely recognized and worked-on than those other causes. Enough so that it doesn’t seem like the marginal thing I can do is interesting compared to other problems I could work on.
My models for how to work on it if I did decide to work on it: 1) technology - we should have technology that solves the problem if widely enough deployed. I think we are basically there with nuclear and solar PV+energy storage, so I would probably only spend 10% or so of time getting up to speed on the technology before focusing on
I don’t have a clear policy agenda but it seems like some combination of carbon tax, investment in PV, and nuclear is the right way to go. I currently would expect that work on the nuclear blind spot would be the most leveraged thing. The reason we have a blind spot seems to be the work of environmentalists from the 70s. As long as we could get them to flip, that could propagate through society in a useful way.
I completely agree, and would like to add that I personally draw a clear line between "the importance of climate change" and "the importance of me working on/worrying about climate change". All the arguments and evidence I've seen so far suggest solutions that are technological, social(/legal), or some combination of both. I have very little influence on any of these, and they are certainly not my comparative advantage.
If OP has a scheme where my time can be leveraged to have a large (or, at least, more than likely cost-effective) impact on climate change then this scheme would instantly be near the top of my priorities. But as it stands my main options are mostly symbolic.
As an aside, and also to engage with lincoln's points, I am highly sceptical of proposed solutions that require overhauls in policy and public attitude. These may or may not be the way forward, but my personal ability to tip the scales on these matters is slim to none. Wishing for societal change to suit any plans is just that, a wish.
I'm trying to reply as little as possible to the comments of this post to avoid influencing the future replies I'll get, but in this case I felt it was better to do so, since this point will likely determine how much interest users have in this subject, and consequently how many replies I'll get.
I'm aware that it wouldn't be very useful to make a post exclusively aimed at making the users of this site feel more worried about climate change.
What the individual users of this site can do about it, considering the cost-effectiveness of the possible actions, will be treated extensively in the post I'm planning on the subject. I'd rather not try to summarise them here because I couldn't explain them effectively.
If anyone reading this comment has the same opinions as Major, please write them anyway.
Regarding one’s ability to effect social change: It seems like the standard arguments about small-probability, high-impact paths apply. I think a lot of STEM types tend to shy away from policy change by default, not because of comparative advantage (which would often be a good reason) but because of some blind spot in the way technologists talk about how to get things done in society. I think for historical reasons (the way the rationality community has grown) we tend to be biased towards technical solutions and away from policy ones.
I definitely agree that there is a bias in this community for technological solutions over policy solutions. However, I don't think that this bias is the deciding factor for judging 'trying to induce policy solutions on climate change' to not be cost-effective. You (and others) already said it best: climate change is far more widely recognised than other topics, with a lot of people already contributing. This topic is quite heavily politicized, and it is very difficult to distinguish "I think this policy would, despite the high costs, be a great benefit to humanity as a whole" from "Go go climate change team! This is a serious issue! Look at me being serious!".
Which reminds me: I think the standard counter-argument to applying the "low probability, high impact" argument to political situations applies: how can you be sure that you're backing the right side, or that your call to action won't be met with an equal call to opposite action by your political opponents? I'm not that eager to have an in-depth discussion on this in the comments here (especially since we don't actually have a policy proposal or a method to implement it), but one of the main reasons I am hesitant about policy proposals is the significant chance for large negative externalities, and the strong motivation of the proposers to downplay those.
Emiya said cost-effectiveness will be treated extensively, and I am extremely eager to read the full post. As I said above, if there is a cost-effective way for me to combat climate change this would jump to (near) the top of my priorities instantly.
My impression of the consensus is that at the scale of human civilization, climate change is expected to slowly inflict significant (but not global catastrophic) economic and humanitarian damage, sometimes forcing whole regions to change how they do things, and that it's cost-effective to coordinate while it's not entirely too late to reduce this damage by reducing climate change. Many people know this (it's not a neglected cause), so additional personal effort won't meaningfully affect things. There is essentially no direct existential risk. Is this a mistaken impression (about the consensus, not about the issue itself), are there more significant aspects?
So I'm not at all concerned about this. In the sense that being justifiably concerned is to expect that additional effort or study in this direction is one of the most valuable things I should be doing.
First, upvotes and kudos for asking about current attitudes and opinions before diving into specifics and explanation/exhortation on the topic. This is awesome - well done!
My general opinion is that this topic is too politicized to be a great fit for LessWrong. Objective modeling of climate change and making predictions might be OK, but somewhere before you get to "mass media attitude", you've crossed the line into talking about outgroup errors and things other than truth-seeking in one's own beliefs. Even when focusing on predictions and truth (for which other people's actions is definitely in scope), this is hard-mode discussion, and likely to derail by confusing positive with normative elements of the analyses.
I'd keep it off of LW, or perhaps have a linkpost and see the reaction - maybe I'm wrong.
My personal opinion: climate change (and more directly, conflict caused or exacerbated by it) is the single biggest risk to human-like intelligence flourishing in the galaxy - very likely that it's a large component of the Great Filter. And it's caused by such deep human drives (procreation and scope-insensitive caring about our young in the short-term) that it's probably inevitable - any additional efficiency or sustainability we undertake will get used up by making more people. I'd like to see more focus on how to get truly self-sufficient Mars (and asteroid/moons) colonies of at least 100K people with a clear upward slope, and on how to get at least 0.5B people to survive the collapse of earth, with enough knowledge and resources that the dark age lasts less than 300 years. I don't currently see a path to either, nor to a reduction in human population and resource usage that doesn't include more destruction in war than it's worth.
My personal opinion: climate change (and more directly, conflict caused or exacerbated by it) is the single biggest risk to human-like intelligence flourishing in the galaxy - very likely that it's a large component of the Great Filter.
I don't think that the idea of the Great Filter fits very well here. The Great Filter would be something so universal that it eliminates ~100% of all civilizations. Climate change seems to be conditional on a number of factors specific to earth, e.g. carbon-based life, green-house gas effects, interdependent civilization etc., that it doesn't really work well as a factor that eliminates nearly all civilizations at a specific level of development.
My suspicion is that it generalizes well beyond mechanisms of greenhouse gasses or temperature ranges. The path from "able to manipulate a civilization's environment at scale" to "able to modulate use of resources in order not to destroy said civilization", with an added element of "over-optimization for a given environment rendering a nascent civilization extremely vulnerable to changes in their environment" could easily be universal problems.
It's the fragility that worries me most - I believe that if we could remain calm and coordinate the application of mitigations, we could make it through most of the projected changes. But I don't believe that we CAN remain calm - I suspect (and fear) that humans will react violently to any significant future changes, and our civilization will turn out to be much much easier to destroy than to maintain.
Regardless of whether it's universal, that's the x-risk I see to our brand of human-like intelligent experiences. Not climate change directly, but war and destruction about how to slow it down, and over who gets the remaining nice bits as it gets worse.
An angle that's interesting (though only tangentially connected with climate change) is how civilizations deal with waste heat.
My personal views on climate change are extremely heterodox among the rationalist community*, but not uncommon among intellectuals of other stripes:
Since this thread is a poll, it should go without saying that reasonable people disagree. But I said it anyway.
*The "rationalist community" is centered around Silly Con Valley and tends to be credulous about the potential of technology.
**Energy return on energy invested. A recent solar plant in Spain managed an EROEI of roughly 3. An EROEI of 12 is thought to be sufficient to support a stripped-down, efficiency-oriented, zero-growth version of civilization as we know it. In the 1960s, oil wells with EROEIs of thousands were available; they're all but gone now.
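One way to make the metric concrete (a minimal sketch of the arithmetic implied by these figures; no new data, just the definition):

```python
# EROEI = lifetime energy delivered / energy invested in building and running
# the source. The share of gross output that must be fed back into the energy
# sector itself is 1/EROEI; the rest is available to the wider economy.
for eroei in (3, 12, 1000):  # the solar-plant, "sufficient", and old-oil figures above
    net_fraction = 1 - 1 / eroei
    print(f"EROEI {eroei:>4}: {net_fraction:.1%} of gross output left for society")
```

At EROEI 3, a third of everything generated goes back into making more energy; at 12, about 8%; the old oil wells made that overhead negligible.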
Well, I share the majority of your points. I think that in 30 years millions of people will try to relocate to more fertile areas. And I think that not even the firing of the clathrate gun will force humans to coordinate globally. Although I am a bit more optimistic about technology, the actual status quo is broken beyond repair.
I’m surprised at these EROEI figures: that solar PV is producing energy at very low levelised cost but utterly pathetic EROEI fails the sniff test. A quick scoot through Wikipedia finds a methodological argument (comments on https://www.sciencedirect.com/science/article/abs/pii/S0360544213000492?via%3Dihub).
Part of it is that high-performance solar cells require single-crystal silicon or gallium arsenide. The purification process for semiconductors is extremely energy intensive. The device fabrication processes are resource and energy intensive as well. But yes, storage is also a huge problem (especially for winter heating, etc.)
I suspect this is one of those cases where the truth is (moderately) outside the Overton window and forcing people to spell this out has the potential to cause great harm to the rationalist community.
Can I ask for more information on why you think this could cause great harm to the community? I'd really rather not cause that, and I'm relatively new here, so I might be wrong in my expectations of how people could react.
If we add expected technological change to the picture, climate change starts mattering even less. It's plausible that this kind of conclusion, when presented in sufficient detail as the position of the forum, can then be framed as some sort of climate change denialism or an attitude of gross irresponsibility. If the position on this topic is spelled out, it presents an opportunity for an attack that would otherwise be unavailable.
The alternative of only saying more politically correct things is even worse, encourages motivated reasoning. To some extent my own comment is already a victim (and perpetrator) of this problem, as the potential for such an attack motivated me to avoid mentioning that climate change matters much less than otherwise because of AGI timelines, where with sufficient AGI-accelerated technological progress the whole issue becomes completely irrelevant in a reasonable time, unless we get some civilization-shattering catastrophe that delays AGI by more than decades (compared to a no-catastrophe scenario), in which case climate change would also be the least of our problems. So I chose to talk about the consensus and not the phenomenon itself, as AGI timeline considerations are not part of the standard discussion on the topic.
These arguments are not needed for my thesis that people who are not already working on climate change shouldn't be concerned about it. And regulating climate change is still the cost-effective thing to do (probably), similarly to buying more of the things that are on sale at a grocery store; no reason to abandon that activity. But the above point is a vital component of my understanding of the seriousness of climate change as a civilizational concern, making it another order of magnitude less important than it would be otherwise.
It isn't at all my intention to frame the position of the forum as one of gross irresponsibility, or to use the replies I'll get to present the forum's position as one which is pro climate change denialism (either in the sense that climate change isn't happening, that it won't be harmful, or that it shouldn't be avoided).
I also won't try to censor my post by including only statements that would be uncontroversial in a lay discussion (I don't like to use "politically correct" with that meaning); I believe this is one of the few sites where one can be both polite and accurate in one's statements and also be perceived as such.
If you were instead worried that my question, the replies it got, or my planned future post, could be used by someone to attack the site or its users, I'd like to know more about it.
If it seems like a real risk, I'd take countermeasures such as avoiding stating what the users' beliefs are in my future post (NOTE: I'm not planning to link any beliefs I'd talk about to any specific users; my current plan is just to address the common beliefs about the subject and try to provide good information and analysis about them) and preventing people from commenting on it. If what's been already said could already be a likely source of damage, I could try to find ways to sink or delete this question and the replies I got.
So far the greatest potential risk I see is to create a toxic discussion between the members of this community.
I don't want that to happen, of course, but I feel that if the different positions aren't explained, and if any errors in them aren't corrected, toxic discussions could form every time related topics are mentioned in future posts.
In another discussion that touched related topics, I wrote at least two comments that I still endorse and think have correct information and reasoning, but that I realised were unnecessarily angry in tone. Even worse, I realised my brain had switched on its "political debate" mode as I was writing a third one. All around, the discussion felt to me remarkably more similar to the kind of discussion one can see on an average site than to the level of discussion I usually see here, and I believe that an important part of that is that there wasn't a widespread attempt to understand why people had different beliefs about the subject, and to figure out where the mistakes were.
The risk is of inciting a discussion that's easy to exploit for a demagogue (whether they participate in the discussion or quote it long after the fact). You don't have to personally be the demagogue, though many people get their inner demagogues awakened by an appropriate choice of topic. This indirectly creates a risk of motivated reasoning to self-censor the vulnerable aspects of the discussion. There's also a conflict about acceptability and harm of self-censoring of any kind, though discussing this at a sufficient level of abstraction might be valuable.
My reply in the grandparent is half-motivated by noticing that I probably self-censored too much in the original comment on this post. When it's just my own comment, however noncontroversial I expect its thesis to be, it's not yet particularly exploitable. If it eventually turns into a recurring topic of discussion with well-written highly upvoted posts, or ideologically charged highly debated posts, that might be a problem.
(To be clear, I don't know if the concern I'm discussing is close to what steven0461 was alluding to. Both the relevant aspect of truth and the harm that its discussion could cause might be different.)
I see. My current aim is to provide knowledge and reasoning that would actually lower the chances of such discussions happening, moving the subject of climate change away from ideology and political opinions.
I'll try to think of ways to further reduce the likelihood of exploitable discussions and demagoguing happening in my post. Knowing what I plan to write, I don't think such discussions would easily be created even if I didn't, though.
As for my attempt ending up increasing the likelihood of future posts, and that leading to harmful discussions... I think it would require people so determined to argue about this, and to ignore all the points I'd try to make, that the current lack of posts on the subject wouldn't be a sufficient barrier to stop them from arguing about it now.
Lastly, the site seems to me to have been designed with very effective barriers against such things spiralling out of control enough to do non-trivial damage; though, since you have been on this site a lot longer than me, I feel I should value your intuition on the subject more than mine.
All considered, it feels to me that if I consider the risks in leaving the situation as it is and the benefits good reasoning on the subject could provide, what I should do is write my post and try to minimise the chances of the discussion on that turning out badly.
I'm not claiming that this is a likely scenario (it might be, but that's not the point). It's about the meaning, not the truth. The question is what kind of hazards specifically steven0461 might be referring to, regardless of whether such hazards are likely to occur in reality ("has the potential to cause great harm" is also not a claim about high credence, only about great harm).
Personally I feel the forum finds the topic uninteresting, so that it's hard to spark a continuing discussion, even if someone decides to write a lot of good posts on it. I also don't expect a nontrivial amount of toxic debate. But that's the nature of risks, they can be a cause for concern even when unlikely to materialize.
I mostly agree with Vladimir's comments. My wording may have been over-dramatic. I've been fascinated with these topics and have thought and read a lot about them, and my conclusions have been mostly in the direction of not feeling as much concern, but I think if a narrative like that became legibly a "rationalist meme" like how the many worlds interpretation of quantum mechanics is a "rationalist meme", it could be strategically quite harmful, and at any rate I don't care about it as a subject of activism. On the other hand, I don't want people to be wrong. I've been going back and forth on whether to write a Megapost, but I also have the thing where writing multiple sentences is like pulling teeth; let me know if you have a solution to that one.
I agree with your evaluation of the strategic harm this meme would cause if spread. I will have to be careful not to spread this narrative when I write about this subject; it's not a risk I had considered before.
The likelihood of this narrative spreading doesn't feel any lower to me if I don't write my post, or if I hadn't written this question, though.
I posted this question specifically because I had noticed on several occasions comments that would support that narrative, especially if taken out of context, partly because they weren't thoroughly explaining the thought processes behind them, and because I think I've seen a number of conversations below average for users of this site, so others could get that idea as well. But after hearing the reasoning in these comments, "not caring about climate change" is not how I see the viewpoint of the community anymore, and I have a model of it that's a lot less negative (in terms of the utility values I assign to it). I still feel like I can provide an improvement, though.
I'd be very interested in knowing how not to write a Megapost half the time I comment, instead. I can't help but obsess over whether I've been explanatory or precise enough; writing this took me thirty-eight minutes by the clock.
I would really like to see estimates of the cost of climate change, along with their probabilities. A paper I found is this one, but it is not quite up to the standard I'd like. It states that 1 billion humans will die counterfactually due to climate change.
Also, the probabilities given for human extinction from climate change are quite low in comparison to other risks (8% for a 10% human population decline from climate disaster conditional on such a decline occurring, 1% (probably even less) for a 95% decline by 2100).
Current belief: Something (nuclear war, biological catastrophe, unaligned AI, something else entirely) will either get humanity before climate change does (27%), humanity gets through and grows insanely quickly (Roodman 2020) (33%), neither happens & basically status quo with more droughts, famines, poverty, small scale wars etc. due to climate change, which cause several hundred million/few billion deaths over next centuries, but don't let humanity go extinct (17%), something else entirely (23%) (scenarios fitting very loosely onto reality, probabilities are intuition after calibration training).
My master's thesis was on the subject of climate change and mass media attitudes toward it. Working on it, I've read over 300 newspaper articles on the subject and studied a large number of papers regarding its consequences, its solutions, and the attempts to frame the debate over them, so I consider myself to be well informed on it.
In the months I've been on this site, though, I've been greatly surprised by the apparently unconcerned attitude many of the users have toward this theme.
Given that
I plan to write a post that would cover this subject and hopefully resolve the disagreement.
Before doing that, though, I'd find it extremely useful if people would write to me about what they believe, feel and anticipate about climate change, its future consequences and the processes that would be required to stop it, and why they think they believe, feel and anticipate that way.
I'm greatly interested in receiving replies both from people feeling alarmed and from people not feeling alarmed.
If, while reading this question or writing your reply, you realised you'd like to read up more on the subject and update your beliefs, that's perfectly fine, but I'd ask that you write me your thoughts about it before doing so. If, instead, you had already read up on it before reading this question, that's also perfectly fine and I'd like to read your thoughts anyway.
What I'm trying to understand are the beliefs of the users of this site at the current moment.
You can be as brief or as detailed as you'd like, and can write me either a reply under this post or a private message.
I'd like to thank in advance everyone who'll send me their thoughts and dedicate a bit of their time to this question.