blogospheroid comments on Vote Qualifications, Not Issues - Less Wrong
Opposition to nuclear power?
OK, but apart from Marxism, nuclear power, coercive eugenics, Christianity, psychoanalysis, and the respective importance of nature and nurture - when has the intellectual establishment ever been an unreliable guide to finding truth?
Come to think of it, one thing I'm surprised nobody mentioned is the present neglect of technology-related existential risks.
Yeah, that provides some more examples. The elite was very worried about existential risks from nuclear war ("The Fate of the Earth"), resource shortages and mass starvation ("Club of Rome"), and technology-based totalitarianism ("1984"). Now, having been embarrassed by falling for too many cries of wolf (or at least, for worrying prematurely), they are wary of being burned again.
I don't think worrying about nuclear war during the Cold War constituted either "crying wolf" or worrying prematurely. The Cuban Missile Crisis, the Able Archer 83 exercise (a year after "The Fate of the Earth" was published), and various false alert incidents could have resulted in nuclear war, and I'm not sure why anyone who opposed nuclear weapons at the time would be "embarrassed" in the light of what we now know.
I don't think an existential risk has to be a certainty for it to be worth taking seriously.
In the US, concerns about some technology risks like EMP attacks and nuclear terrorism are still taken seriously, even though these are probably unlikely to happen and the damage would be much less severe than a nuclear war.
I agree. And nuclear war was certainly a risk that was worth taking seriously at the time.
However, that doesn't make my last sentence any less true, especially if you replace "embarrassed" with "exhausted". The risk of a nuclear war, somewhere, some time within the next 100 years, is still high - more likely than not, I would guess. It probably won't destroy the human race, or even modern technology, but it could easily cost 400 million human lives. Yet, in part because people have become tired of worrying about such things, having already worried for decades, no one seems to be doing much about this danger.
When you say that no one seems to be doing much, are you sure that's not just because the efforts don't get much publicity?
There is a lot that's being done:
Most nuclear-armed governments have massively reduced their nuclear weapon stockpiles, and try to stop other countries getting nuclear weapons. There's an international effort to track fissile material.
After the Cold War ended, the West set up programmes to employ Soviet nuclear scientists, which have run until today (Russia is about to end them).
South Africa had nuclear weapons, then gave them up.
Israel destroyed the Iraqi and Syrian nuclear programmes with airstrikes. OK, self-interested, but if existing nuclear states stop their enemies getting nuclear weapons, then that reduces the risk of a nuclear war.
Somebody wrote the Stuxnet worm to attack Iran's enrichment facilities (probably) and Iran is under massive international pressure not to develop nuclear weapons.
Western leaders are at least talking about the goal of a world without nuclear weapons. OK, probably empty rhetoric.
India and Pakistan have reduced the tension between them, and now keep their nuclear weapons stored disassembled.
The US is developing missile defences to deter 'rogue states' who might have a limited nuclear missile capability (although I'm not sure why the threat of nuclear retaliation isn't a better deterrent than shooting down missiles). The Western world is paranoid about nuclear terrorism, even putting nuclear detectors in its ports to try to detect weapons being smuggled into the country (which a lot of experts think is silly, but I guess it might make it harder to move fissile material around on the black market).
etc. etc.
Sure, in the 100 year timeframe, there is still a risk. It just seems like a world with two ideologically opposed nuclear-armed superpowers, with limited ways to gather information and their arsenals on a hair trigger, was much riskier than today's situation. Even when "rogue states" get hold of nuclear weapons, they seem to want them to deter a US/UN invasion, rather than to actually use offensively.
Plus we invented the internet, greatly strengthening international relations and creating social and economic interdependency.
This doesn't appear to be the case at all. There are a variety of claimed existential risks which the intellectual elite are in general quite worried about. They just don't overlap much with the kind of risks people here talk about. Global warming is an obvious example (and some people here probably think they're right on that one) but the overhyped fears of SARS and H1N1 killing millions of people look like recent examples of lessons about crying wolf not being learned.
I don't know about SARS, but in the case of H1N1 it wasn't "crying wolf" so much as being prepared for a potential pandemic which didn't happen. I mean, very severe global flu pandemics have happened before. Just because H1N1 didn't become as virulent as expected doesn't mean that preparing for that eventuality was a waste of time.
Obviously the crux of the issue is whether the official probability estimates and predictions for these types of threats are accurate. It's difficult to judge any individual case that fails to develop into a serious problem, but if you can observe a consistent ongoing pattern of dire predictions that do not pan out, that is evidence of an underlying bias in the estimates of risk. Preparing for an eventuality as if it had a 10% probability of happening when the true risk is 1% will lead to serious misallocation of resources.
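To make the misallocation point concrete, here's a back-of-the-envelope sketch (all numbers purely illustrative, not taken from any real threat assessment):

```python
# Illustrative only: how an inflated probability estimate distorts
# how much it is rational to spend on preparedness.
damage_if_disaster = 1_000_000  # assumed cost if the threat materializes

believed_p = 0.10   # official (biased) probability estimate
true_p = 0.01       # actual underlying risk

# Expected loss under each estimate: probability times damage.
believed_expected_loss = believed_p * damage_if_disaster
true_expected_loss = true_p * damage_if_disaster

# A planner willing to spend up to the expected loss on prevention
# would overspend by roughly the ratio of the two estimates.
overspend_factor = believed_expected_loss / true_expected_loss
print(round(overspend_factor, 2))
```

On these toy numbers the planner allocates about ten times more than the true risk warrants, which is the "serious misallocation" being claimed.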
It looks to me like there is a consistent pattern of overstating the risks of various catastrophes. Rigorously proving this is difficult. I've pointed to some examples of what look like over-confident predictions of disaster (there's lots more in The Rational Optimist). I'm not sure we can easily resolve any remaining disagreement on the extent of risk exaggeration, however.
Well, you also need to factor in the severity of the threat, as well as the risk of it happening.
Since the era of cheap international travel, there have been about 20 new flu subtypes, and one of those killed 50 million people (the Spanish flu, one of the greatest natural disasters ever), with a couple of others killing a few million. Plus, having almost everyone infected with a severe illness tends to disrupt society.
So to me that looks like there is a substantial risk (bigger than 1%) of something quite bad happening when a new subtype appears.
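A crude base-rate estimate along these lines, using the figures cited above and treating each new subtype as an independent draw (a strong simplifying assumption):

```python
# Rough base rate from the historical record cited above.
# Assumes each new flu subtype is an independent draw from the
# same distribution -- a strong simplification.
new_subtypes = 20   # new flu subtypes since cheap international travel
catastrophic = 1    # Spanish flu (~50 million deaths)
severe = 2          # "a couple of others killing a few million"

p_catastrophic = catastrophic / new_subtypes
p_severe_or_worse = (catastrophic + severe) / new_subtypes

print(p_catastrophic, p_severe_or_worse)
```

Even on this crude reckoning, the chance that a new subtype turns out "quite bad" comes in well above the 1% figure mentioned earlier.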
Given how difficult it is to predict biological systems, I think it makes sense to treat the arrival of a new flu subtype with concern and for governments to set up contingency programmes. That's not to say that the media didn't hype swine flu and bird flu, but that doesn't mean that the government preparations were an overreaction.
That's not to say that some threats aren't exaggerated, and others (low-probability, global threats like asteroid strikes or big volcanic eruptions) don't get enough attention.
I wouldn't put much trust in Matt Ridley's abilities to estimate risk:
http://news.bbc.co.uk/1/hi/7052828.stm (yes, it's the same Matt Ridley)
Well obviously. I refer you to my previous comment. At this point our remaining disagreement on this issue is unlikely to be resolved without better data. Continuing to go back and forth repeating that I think there is a pattern of overestimation for certain types of risk and that you think the estimates are accurate is not going to resolve the question.
Maybe at first, but I clearly recall that the hype was still ongoing even after it was known that this was a milder flu-version than usual.
And the reactions were not well designed to handle the flu either. One example is that my university installed hand sanitizers, well, pretty much everywhere. But the flu is primarily transmitted not from hand-to-hand contact, but by miniature droplets when people cough, sneeze, or just talk and breathe:
http://www.cdc.gov/h1n1flu/qa.htm
Wikipedia takes a more middle-of-the-road view, noting that it's not entirely clear how much transmission happens in which route, but still:
http://en.wikipedia.org/wiki/Influenza
Which really suggests to me that hand-washing (or sanitizing) just isn't going to be terribly effective against flu. The best preventative is making sick people stay home.
Now, regular hand-washing is a great prophylactic against many other disease pathways, of course. But not against the one the sanitizers were supposedly installed for.
I interpret what happened with H1N1 a little differently. Before it was known how serious it would be, the media started covering it. Now, even given that H1N1 was relatively harmless, it is quite likely that similar but non-harmless diseases will appear in the future, so having containment strategies and knowing what works is important. By making H1N1 sound scary, they gave countries and health organizations an incentive to test their strategies with lower consequences for failure than there would be if they had to test them on something more lethal. The reactions make a lot more sense if you look at it as a large-scale training exercise. If people had known that it was harmless, they would have behaved differently and lowered the validity of the test.
This looks like a fully general argument for panicking about anything.
It isn't fully general; it only applies when the expected benefits (from lessons learned) exceed the costs of that particular kind of drill, and there's no cheaper way to learn the same lessons.
Just because some institutions over-reacted or implemented ineffective measures, doesn't mean that the concern wasn't proportionate or that effective measures weren't also being implemented.
In the UK, the government response was to tell infected people to stay at home and away from their GPs, and provide a phone system for people to get Tamiflu. They also ran advertising telling people to cover their mouths when they sneezed ("Catch it, bin it, kill it").
If anything, the government reaction was insufficient, because the phone system was delayed and the Tamiflu stockpiles were limited (although Tamiflu is apparently pretty marginal anyway, so making infected people stay at home was more important).
The media may have carried on hyping the threat after it turned out not to be so severe. They also ran stories complaining that the threat had been overhyped and the effort wasted. Just because the media or university administrators say stupid things about something, that doesn't mean it's not real.
SARS and H1N1 both looked like media-manufactured scares, rather than actual concern from the intellectual elite.
It wasn't just the media:
So Nabarro explicitly says that he's talking about a possibility and not making a prediction, and ABC News reports it as a prediction. This seems consistent with the media-manufactured scare model.
Haha, ok point taken. I'm clearly wrong on this and there are a lot of examples. (At this point I'm also reminded of this Monty Python sketch although this is sort of the inverse).