rwallace comments on Climate change: existential risk? - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
It is the way of extinction that what kills the last individual commonly has nothing to do with the underlying factors that doomed the species. Could climate change by itself kill everyone on earth? No. Could it be a significant contributing factor to a downward spiral that ends in extinction? I hope not, but I don't really know. Leaving aside the purely fictional versions of AI and nanotech to which you refer, could real-life versions of those technologies help us develop sustainable energy sources? Yes. Will they do so fast enough? I hope so, but I don't really know.
As for whether there's anything we here can usefully do about it, I don't think it would be useful for us to get bogged down in the sort of bickering about politics that all too often goes with this kind of territory, but LW does have a good track record of avoiding that; and perhaps it would be useful for us to explore potential solutions.
Any ideas about how capable computer programs would need to be to give significant help to researchers with hypothesis generation, with evaluating whether research programs make sense, or with checking whether abstracts match experimental results?
It seems to me that some of this could be done without even having full natural language.
As long as we're talking, as you say, about significant help rather than solving the whole problem - about what can be done without full natural language understanding - then I think this is one of the more promising areas of AI research for the next couple of decades.
A few months ago I talked to somebody doing biomedical research - one of the smartest guys I know - and asked what AI might be able to do to make his job easier. His answer was that the one thing likely to be feasible in the near future that would really help would be better text mining: something that could do better than plain keyword matching at, e.g., flagging papers likely to be relevant to a particular problem.
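To make that concrete, here is a minimal sketch of one step beyond keyword matching: ranking abstracts against a query by TF-IDF cosine similarity, so that papers sharing rare, informative terms with the query score higher than ones sharing only common words. All function names and the toy corpus below are illustrative assumptions, not anything from the discussion; a real system would use stemming, phrase handling, and a proper retrieval library.

```python
# Hypothetical sketch: flag abstracts relevant to a query using TF-IDF
# cosine similarity rather than exact keyword matching.
# The corpus, query, and all names here are illustrative only.
import math
from collections import Counter

def tokenize(text):
    # Naive whitespace tokenizer with basic punctuation stripping.
    return [w.strip(".,;:!?()").lower() for w in text.split()]

def tfidf_vectors(docs):
    """Build a TF-IDF vector (term -> weight dict) for each document."""
    tokenized = [tokenize(d) for d in docs]
    n = len(tokenized)
    # Document frequency: in how many documents each term appears.
    df = Counter(t for doc in tokenized for t in set(doc))
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(a, b):
    # Cosine similarity between two sparse term-weight dicts.
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_abstracts(query, abstracts):
    """Return abstracts sorted by similarity to the query, best first."""
    vecs = tfidf_vectors(abstracts + [query])
    qvec, dvecs = vecs[-1], vecs[:-1]
    scored = sorted(zip(abstracts, (cosine(qvec, v) for v in dvecs)),
                    key=lambda pair: pair[1], reverse=True)
    return [a for a, _ in scored]
```

This is roughly what "better than keyword matching" means at its simplest: a query about "yeast gene expression" will rank an abstract sharing the rare terms "gene" and "expression" above one that merely shares the common term "yeast", with no manual keyword list needed.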
We don't need help finding sustainable energy sources. We already have nuclear power. We just need to convince everyone that nuclear isn't bad.
How much climate change are we talking? :)
Let's say the amount realistically liable to occur in the next few centuries :) If we still have to worry about it on megayear timescales, we'll already have failed.