Since XiXiDu also asked this question on my blog, I answered over there.
If I tell you that all you have to do is read the LessWrong Sequences and the publications of the SIAI to agree that working on AI is much more important than climate change, are you going to take the time and do it?
I have read most of those things, and indeed I've been interested in AI and the possibility of a singularity at least since college (say, 1980). That's why I interviewed Yudkowsky.
The content of John Baez's This Week's Finds: Week 310 includes:
Note: The upcoming This Week's Finds: Week 311 is an interview with Eliezer Yudkowsky by John Baez.