A question: why does anything about global warming get downvoted, even a readable popular explanation of the fairly mainstream scientific consensus? edit: Okay, this is loaded. I should put it more carefully: why is the warming discussion generally considered inappropriate here? That seems to be the case, and there are pretty good reasons for it. But why can't the AGW debate be invoked as an example controversy? The disagreement on AGW is pretty damn unproductive, and so it is a good example of an argument whose productivity could be improved.
Global warming is a pretty damn good reason to build FAI. It's quite seriously possible that we won't be able to do anything else about it. Even a mildly superhuman intelligence, though, should be able to eat the problem for breakfast. Even practical sub-human AIs could massively help with space-based efforts to limit the issue (e.g. friendly space-worthy von Neumann machinery would let us solve the problem almost immediately). We would probably still have extra CO2 in the atmosphere, but that is probably not a bad thing overall - it is good for plants.
For that to be important it is sufficient to have a 50/50 risk of global warming. Even probabilities well below 0.5 for the 'strong' warming scenarios are still a big factor - in terms of expected deaths and expected suffering, considering how many humans on this planet lack access to air conditioning. I am frankly surprised that a group of people fascinated with AI would have such trouble with the warming controversy as to make it too hot a topic even as an example of highly unproductive arguments.
I do understand that LW does not want political controversies. Politics is the mind-killer. But this stuff matters. And I trust it has been explained here that non-scientists are best off not trying to second-guess the science, but relying on expert opinion. Global warming is our first example of the man-made problems that are going to kill us if there is no AI. Engineered diseases, grey goo, that sort of thing comes later, and will likely be equally controversial. For now we have coal.
The uFAI risk is also going to be extremely controversial as soon as those with commercial interests in AI development take notice - way more controversial than AGW, for which we do have fairly solid science. If we cannot discuss AGW now, we won't be able to discuss AI risks once Google - or any other player - deems those discussions a PR problem. The discussions at any given time will be restricted to the issues about which no one really has to do anything at that time.
I understand that, but would AI be able to remain an exception if particular AI risks become as controversial as AGW?
With regard to global warming: if you provisionally grant that a rational person tends to hold a stance on AGW aligned with the scientific consensus, then the AGW supporters who would join over the issue are on average better at rationality - especially applied rationality - not worse. If, however, you posit that a rational person tends to disagree with the scientific consensus on AGW, then okay, that is a valid reason not to want those aligned with the consensus to join. Furthermore, I don't see what's so special about religion.
I am a sort of atheist, but I see the support for atheism as much, much shakier than the support for AGW. I know many people who are theists of various kinds and are otherwise quite rational, while I do not know anyone even remotely rational who disagrees with a scientific consensus, unless they are a scientist whose own novel research disagrees with that consensus.
If AI in general or uFAI in particular becomes a politicized issue (not quite identical to "controversial") to the extent that AGW now is, I suspect it'll be grandfathered in here by the same mechanism that religion now is; it's too near and dear a topic to too many critical community members for it to ever be entirely dismissed. However, its relative prominence might go down a notch or two -- moves to promote this may already be happening, given the Center for Modern Rationality's upcoming differentiation from SIAI.