taw comments on Thoughts on the Singularity Institute (SI) - Less Wrong
Existential risk reduction is a very worthy cause. As far as I can tell there are a few serious efforts: they address scenarios which, by the outside view, have non-negligible chances, and for many of these scenarios the efforts make a non-negligible difference to the outcome.
Such efforts are:
That's about the entire list I'm aware of (are there any others?)
And then there's a huge number of efforts which claim to do something about existential risk, but where either the theories behind the risk they concern themselves with, or the theories behind why their efforts are likely to help, rest on assumptions not shared by the vast majority of competent people.
All FAI-related stuff suffers from both of these problems: the risk is not based on any established science, and the proposed answer is even less grounded in reality. If it suffered from only one of these problems it might be fixable, but as far as I can tell it is extremely unlikely ever to join the category of serious efforts.
The best claim those non-serious efforts can make is that tiny chance the risk is real * tiny chance the organization will make a difference * huge risk is still a big number, but that's not a terribly convincing argument.
I'm under the impression that we're doing far less than we could with these serious efforts, and that we haven't really identified everything that can be dealt with by such serious effort. We should focus there (and on a lot of things which are not related to existential risk).
Here is the list from Global Catastrophic Risks.
Most entries on the list are not quantifiable, even approximately to within an order of magnitude. Of those that are (which is pretty much only "risks from nature" in Bostrom's system), many are still bad candidates for putting significant effort into, because:
About the only new risk I see on the list which can and should be dealt with is having some backup plans for massive solar flares, but I'm not sure what we can do about them other than putting some extra money into astrophysics departments so they can figure things out better and give us better estimates.
nuclear holocaust. biological holocaust. super eruptions whose ash blocks significant levels of sunlight.
I understand that global thermonuclear war could cause serious damage, but I'm not aware of any credible efforts that can prove they're moving things in the right direction.
What do you mean by "biological holocaust"?
Super eruptions surely follow some kind of power law, and as far as I can tell (and as we can check by extrapolating from that power law), they don't get anywhere remotely near the level of destroying all life on Earth.
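The kind of power-law extrapolation being gestured at can be sketched like this. The anchor frequency and the scaling factor below are hypothetical round numbers chosen purely for illustration, not fitted values from any eruption catalog:

```python
# Toy power-law extrapolation for eruption return periods.
# Assumption (hypothetical): each step up the magnitude scale is ~10x
# rarer, anchored at one magnitude-4 eruption per decade.
def return_period_years(magnitude, anchor_mag=4, anchor_period=10, step=10):
    """Approximate years between eruptions of at least `magnitude`."""
    return anchor_period * step ** (magnitude - anchor_mag)

for m in range(4, 9):
    print(m, return_period_years(m))
# magnitude 8 comes out at one eruption per ~100,000 years
```

The point of the exercise: even the rarest, largest events the tail predicts are catastrophic but nowhere near "sterilize the planet" territory.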
And we certainly know how to heat the Earth significantly in no time - just release some of these into the atmosphere. It would only increase temperature, not reduce sunlight; food production and such would still be affected, but we already produce way more food per capita than is needed to feed everyone, so even a pretty big reduction wouldn't get anywhere near compromising food security for the majority of people, let alone threatening to kill everyone.
http://en.wikipedia.org/wiki/New_START
This stuff, as slow and grinding as it is, does make a difference.
There's no particular reason to believe this is going to make global thermonuclear war any less likely. Russia and the United States aren't particularly likely to start a global thermonuclear war anytime soon, and in the longer term any major developed country, if it wanted to, could build nuclear arsenals sufficient to make a continent uninhabitable within a few years.
There's also the argument that mutually assured destruction was somehow stabilizing and prevented nuclear warfare - the only use of nuclear weapons so far happened when the other side had no way to retaliate. I'm quite neutral on this - I'm unwilling to say that nuclear arms reductions either increase or decrease the risk of global war (which would eventually turn nuclear, or otherwise very nasty).
They don't have to destroy all life on Earth to be existential risks. They just have to damage human civilization to the point where it can't recover. We've already used up basically all of the easily accessible, non-renewable natural resources; for example, a future civilization reduced to Roman Empire-level technology would find itself with a severe shortage of exploitable ores - good luck running your empire without iron or copper!
The remains of the prior civilization would provide quite a bit. Indeed, for some metals this would be even easier. Aluminum for example requires a lot of technology to refine, but if one has already refined aluminum lying around one can easily make things out of it. A more serious problem would be the substantial reduction in easily accessible coal and oil. The remaining fossil fuels require a lot more technology to access.
Yeah, this is one of the scarier future prospects I've heard kicking around. We can really only bootstrap an industrial civilization once, because the available energy simply isn't going to be there next time. We'd better get it right. Fortunately, we've done pretty well on that score thus far, but it's one of those lingering distant-future fears.
That reasoning is just extremely unconvincing, essentially 100% wrong and backwards.
The renewable energy available annually is many orders of magnitude greater than all the fossil fuels we're using, and it was the primary source of energy for almost all of history up to the industrial revolution. Biomass for everything, animal muscle power, wind and gravity for water transport, charcoal for smelting, etc. were all used successfully at massive scale before anybody even thought of oil or gas or made much use of coal.
Other than energy, most other resources - like ores - are trivially recyclable. If New Rome wanted iron and copper and so on, they'd just need to head to the nearest dump and dig. The amount of ore we've dug out and made trivially accessible is ridiculously greater than what they had available.
Annual iron ore mining, for example, is 2.4 billion metric tons, or about 1 kg per person per day. Annual steel production is 1.49 billion metric tons, or over 200 kg per person per year (OK, some of that steel is from recycled iron). The vast majority of it would be easily extractable if civilization collapsed. If we went back to Roman levels of population, each Roman could easily extract tens or hundreds of tons of usable steel from just the stuff we extracted that their technology couldn't.
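The per-capita figures above are easy to sanity-check; the only assumption added here is a world population of roughly 7 billion (the approximate figure when these production numbers were current):

```python
# Sanity check of the per-capita mining/steel figures.
population = 7e9            # assumed world population, ~2012

iron_ore_t = 2.4e9          # annual iron ore mined, metric tons
steel_t = 1.49e9            # annual steel production, metric tons

ore_kg_per_person_day = iron_ore_t * 1000 / population / 365
steel_kg_per_person_year = steel_t * 1000 / population

print(round(ore_kg_per_person_day, 2))    # ~0.94 kg per person per day
print(round(steel_kg_per_person_year))    # ~213 kg per person per year
```

Both stated figures check out to within rounding.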
The same applies to every other metal, and most non-metal resources. It doesn't apply to a few resources like phosphorus and helium, but they'll figure it out somehow.
And even if civilization "collapsed", it's not like our scientific and organizational knowledge would disappear, which would make rebuilding ridiculously easier than building everything in the first place.
Okay, this has been driving me bonkers for years now. I keep encountering blatantly contradictory claims about what is "obviously" true about the territory. taw, you said:
And you might well be right. But the people involved in transition towns insist quite the opposite: I've been explicitly told, for one example, that it would take the equivalent of building five Three Gorges Dams every year for the next 50 years to keep up with the energy currently provided by fossil fuels. By my reading, these two claims cannot both be correct. One says that civilization can rebuild just fine if we run out of fossil fuels, and the other says that we may well hit something dangerously close to a whimper.
I'm not asking for a historical analysis here about whether we needed fossil fuels to get to where we are. I'd like clarification on a fact about the territory: is it the case that renewable forms of energy can replace fossil fuels without modern civilization having to power down? I'm asking this as an engineering question, not a political one.
They are incorrect. Here's a helpful diagram of available energy.
Can you pretty, pretty please tell me where this graph gets its information from? I've seen similar graphs that basically permute the cubes' labels. It would also be wonderful to unpack what they mean by "solar" since the raw amount of sunlight power hitting the Earth's surface is a very different amount than the energy we can actually harness as an engineering feat over the next, say, five years (due to materials needed to build solar panels, efficiency of solar panels, etc.).
And just to reiterate, I'm really not arguing here. I'm honestly confused. I look at things like this video and books like this one and am left scratching my head. Someone is deluded. And if I guess wrong I could end up wasting a lot of resources and time on projects that are doomed to total irrelevance from the start. So, having some good, solid Bayesian entanglement would be absolutely wonderful right about now!
The diagram comes from Wikipedia (tineye says this) but it seems they recently started merging and reshuffling content in all energy-related articles, so I can no longer find it there.
That's total energy available of course, not any 5 year projection.
Thank you!
Do you happen to know anything about the claim that we're running out of the supplies we need to build solar panels needed to tap into all that wonderful sunlight?
Right, and the energy demands of those societies were substantially lower than those of later societies which used oil and coal. The industrial revolution would likely not have been possible without the presence of oil and coal in easily accessible locations. Total energy isn't all that matters - the efficiency of the energy, ease of transport, and energy density all matter a lot too. In those respects, fossil fuels are substantially better and more versatile.
This argument is only convincing to people who never bothered to look at the timeline of historical events in technology. No country had any significant amount of coal mining before, let's say, the UK from the 1790s onward, and even there it was primarily to replace wood and charcoal.
The technologies we managed to build by then were absolutely amazing. Until 1870 the majority of locomotives in the USA ran on wood, and canal transport was as important as railroads while being even less dependent on dense fuels, so transportation was perfectly fine.
Entire industries operated on water power just fine for decades before coal or electricity.
Just look at how well science and technology were doing before coal came about.
Even mentioning oil in this context is pretty ridiculous - it only rose to importance around 1950. Cars can be modified to run on wood, of all things, without much difficulty, and this happened at mass scale in many economies under wartime conditions.
Most of your analysis seems accurate, but there do seem to be some issues.
While you are correct that until 1870 the majority of locomotives in the USA operated on wood, the same article you linked to notes that this was phased out as the major forests were cut down and demand went up. This was not a long-term sustainable process that was converted over to coal simply because coal was more efficient. Even if forests grew back to pre-industrial levels (not a completely unlikely possibility if most of humanity has been wiped out), you don't have much time to use wood on a large scale before you need to switch over.
You are also underestimating the transformation that occurred in the second half of the 19th century. In particular, while it is true that industries operated on water power, the total number of industries, and the energy demands they made, were much smaller. Consider for example chip-making plants, which have massive energy needs. One can't run a modern economy on water power because there wouldn't be nearly enough to go around. This is connected to how, while many of the first power plants in the US in the 1870s and 1880s were hydroelectric, supporting a substantial grid required the switch to coal, which could both provide more power and have plants built at the most convenient locations. This is discussed in Maggie Koerth-Baker's book "Before the Lights Go Out", which has a detailed discussion of the history of the US electric grids.
And while it is true that no country had major coal mining before 1790 by modern standards, again, the replacement of wood and charcoal occurred to a large extent because people were running out of cheap wood, and because growing industry benefited substantially from coal's greater energy density. And even well before that, coal was already being used in the late Middle Ages for specialized purposes, such as working metals that required high temperatures. While not a large industry, it was large enough that there was coal regulation in the 1300s, and by the 1620s it was economically practical to have coal mines that included large-scale drainage and pumping systems so one could mine coal well below sea level.
It is relevant in this context in that it rose to importance partly due to the rising price of coal (as easy-to-access coal had been depleted). It isn't a coincidence that in World War II a major goal of the German invasion of Russia was to gain access to the Baku oil fields.
Wood ran out because forests weren't properly managed, not because photosynthesis is somehow too slow at growing forests - and in any case there are countless agricultural alternative energy sources, like ethanol from sugar cane.
In 1990, 3.5 billion m^3 of wood were harvested. With a density of about 900 kg per cubic meter and an energy content of about 15 MJ/kg, that's about 47 trillion MJ (if we burned it all, which we're not going to).
All coal produced in 1905 was about 0.9 billion tons, or about 20 trillion MJ.
In 2010, worldwide biofuel production reached 105 billion liters (about 2.4 trillion MJ). That's a very modest amount, but according to the International Energy Agency, biofuels have the potential to meet more than a quarter of world demand for transportation fuels by 2050. And this isn't any new technology; we knew how to extract alcohol from plants thousands of years ago.
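The three energy totals in this exchange can be reproduced from the stated inputs. The energy densities used below are rough assumptions (about 22 MJ/kg for early-20th-century coal and about 23 MJ per liter of ethanol-type biofuel), not figures from the thread:

```python
# Wood harvest, 1990: 3.5 billion m^3, ~900 kg/m^3, ~15 MJ/kg.
wood_MJ = 3.5e9 * 900 * 15

# Coal, 1905: ~0.9 billion metric tons, assuming ~22 MJ/kg.
coal_MJ = 0.9e9 * 1000 * 22

# Biofuel, 2010: 105 billion liters, assuming ~23 MJ per liter.
biofuel_MJ = 105e9 * 23

print(f"wood    ~{wood_MJ / 1e12:.0f} trillion MJ")   # ~47
print(f"coal    ~{coal_MJ / 1e12:.0f} trillion MJ")   # ~20
print(f"biofuel ~{biofuel_MJ / 1e12:.1f} trillion MJ")  # ~2.4
```

So the 1990 wood harvest alone carried over twice the energy of all coal mined in 1905, which is the comparison the argument rests on.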
We don't have enough hydropower to cover all our energy use, but it could cover a very large fraction of our needs - definitely enough to jumpstart civilization - and there's many times more wind, solar, biomass, or nuclear power than we need, all of it available to any new civilization.
The fact that we used something for a certain purpose is no evidence that it was necessary for that purpose; it's just evidence that we're not total idiots who leave a resource unused. Alternatives that would have worked nearly as well were available in pretty much every single case.
The key point of economics you are missing here is that the price of wood was driven up by increased demand. Wood never ran out, but it did become so expensive that some uses became uneconomical. This allowed substitution of the previously more expensive coal. That did not happen because of poor management of forests; good management of forests might even have encouraged it, by limiting the amount of wood taken for burning.
This is especially true because we are not talking about a modern globalized economy where cheap sugar from Brazil, corn from Kansas, or pine from the Rockies can come into play. We are talking about the 19th century in industrializing Europe. The energy use of England could not have been met by better forestry. All stats from 200 years later are a red herring.
If there were other alternatives that were almost as good, please produce them. Not now, but at the time being discussed.
I'm a bit sceptical about that. Compare the technological level of Europe in AD 100 with that of Europe in AD 700.
Which part of "Europe" are you talking about? The western peripheries of the Roman Empire regressed somewhat, and that was after the massive demographic collapse of late Antiquity; the rest of Europe didn't really change all that drastically, or even progressed quite a lot.
pandemics, man-made or natural.
Yeah, I've mentioned pandemics already.
I'm not terribly willing to treat them as an "existential" risk, since countless pandemics have already happened, and for natural reasons they never actually kill the entire population.
And how well we dealt with SARS is a good data point suggesting that pandemics might actually be under control now. At the least, we should have far more confidence in our ability to deal with pandemics than in our ability to deal with just about any other existential threat.
And one nice side effect of plain old medicine is a reduction in this existential risk, even without any efforts aimed specifically at existential risk. Every antibiotic, every antiviral, every new way of keeping patients alive longer, every diagnostic improvement, every improvement in hygiene in poor countries, etc. - they all make pandemics less likely and more manageable.
Most major pandemics have occurred before modern transport was common. The presence of easy air travel makes a serious pandemic more problematic. And in fact if one looks at emergent diseases in the last sixty years, such as HIV, one sees that they are effectively taking advantage of the ease of transport in the modern world.
HIV emerged before modern medicine developed. It was discovered in 1981 - almost prehistory by medical standards - but it actually transferred to humans sometime in the late 19th century. It wreaks the most havoc in places which are extremely far from modern medicine; in developed countries HIV is a fairly minor problem.
SARS is a much better example of a new disease and how modern medicine can deal with it.
Even in Africa, HIV has taken advantage of modern transport. Migrant workers are a major cause of HIV spread in sub-Saharan Africa. This has advanced to the point where new road building projects think about what they will do to disease transmission. These laborers and the like aren't just walking- the possibility of such migrant labor is connected to the fact that even in the developing world, buses exist.
Oh, I somehow skipped seeing that in the OP. I don't think our ability to deal with mundane bugs has much transferability to our ability to deal with super bugs.
There's really no such thing as a "super bug". All organisms follow the same constraints of biology and epidemiology. If there were some magical "super bug", it would infect everything of any remotely compatible species, not be constrained to one species - and to a small subset of cell types within it.
We might not have any drugs ready for a particular infection, but we didn't have any for SARS - which was extremely infectious and extremely deadly - and things worked out perfectly fine in the end. We have tools like quarantine, detection, etc. which work against any disease, known or unknown.
Medicine has made massive progress since then - mass sequencing of pathogen genomes for quick reaction times is now far more practical, and we might soon even get broad-spectrum antivirals.
And we've already eradicated two diseases (smallpox, rinderpest), with two more very close to eradication (polio, dracunculiasis), and it's not like anybody has any intention of stopping the total at four. We'll keep eradicating diseases, even if it takes a decade or two for each attempt. Every time we manage it, there's one less source of potential pandemic.
I cannot really imagine how it could be going better than that.
This doesn't fully apply to hypothetical man-made pandemics, but currently we don't really know how to make such a thing (the best we can do is modify an existing disease to be a bit nastier; creating diseases de novo is far beyond our capabilities), nobody has any particular desire to do so, and any broad-spectrum countermeasures we develop against natural diseases will likely at least partly apply to man-made diseases in any case.
AFAIK nothing precludes extremely lethal bugs with long incubation periods. As for "nobody has any particular desire to", I hope you are right.
Except for the fact that they wouldn't be particularly lethal.
If 100% of humans had HIV, it would probably make most countries disregard patent laws on a few drugs, and human life spans would shorten by something like 5-10 years on average.
This should keep things in perspective.
My Google-fu seems to indicate a drop of about 20 years.
The term does not imply magic, it merely implies nasty. Smallpox and Spanish flu were both superbugs in every meaningful sense, but they worked on DNA just like everything else. The question is not whether someone builds a flesh-eating nanite our immune system can't handle or whatever, it's just about whether an infectious disease comes along that's worse than our medical system can cope with. That is a much lower bar.
Smallpox wasn't that bad if you look at the statistics, and Spanish flu happened at a time when humans had been murdering each other at an unprecedented rate and normal society was either suspended or had collapsed altogether everywhere.
Usually the chance of getting infected is inversely correlated with the severity of symptoms (by the laws of epidemiology), and nastiness is inversely correlated with breadth of host range (by the laws of biology), so you get diseases that are really extreme by any one criterion, but they tend to be really weak by some other criterion.
And in any case we're getting amazingly better at this.
Not that bad?
I agree that there were aggravating factors, particularly in the Spanish flu case, and that tradeoffs between impact and spread generally form a brake. But nasty diseases do exist, and our medical science is sufficiently imperfect that the possibility of one slipping through even in the modern world is not to be ignored. Fortunately, it's a field we're already pouring some pretty stupendous sums of money into, so it's not a risk we're likely to be totally blindsided by, but it's one to keep in mind.