The content of John Baez's This Week's Finds: Week 310:
Includes:
- Discussion of global warming and geoengineering.
- A reference to a paper by David Wolpert and Gregory Benford on Newcomb's paradox
Note: The upcoming This Week's Finds: Week 311 is an interview with Eliezer Yudkowsky by John Baez.
Would you be willing to write a blog post reviewing his arguments and explaining why you either reject them, don't understand them, or accept them and will start working to mitigate risks from AI? It would be valuable to have someone like you, who is not deeply involved with the SIAI (Singularity Institute) or LessWrong.com, write a critique of their arguments and objectives. I myself don't have the education (yet) to do so, and I welcome any reassurance that would help me to take action.
If you don't have the time to write a blog post, maybe you can answer just the following question: if someone were going to donate $100k and you could pick the charity, would you choose the SIAI? A Yes/No answer if you're too busy, a short explanation if you have the time. Thank you!
You mean, "before we take on the galaxy, let's do a smaller problem"? So you don't think that we'll have to face risks from AI before climate change takes a larger toll? You don't think that working on AGI means working on the best possible solution to the problem of climate change? And even if we had to start taking active measures against climate change in the 2020s, don't you think we should spend that time on AI instead, because we can survive a warmer world but not a runaway AI? Gregory Benford writes that "we still barely glimpse the horrors we could be visiting on our children and their grandchildren's grandchildren". That sounds to me like he assumes there will be grandchildren, which might not be the case if some kind of AGI doesn't take care of a lot of the other problems we'll have to face soon.
If I tell you that all you have to do is read the LessWrong Sequences and the publications written by the SIAI to agree that working on AI is much more important than climate change, are you going to take the time to do it?
Since XiXiDu also asked this question on my blog, I answered over there.
I have read most of those things, and indeed I've been interested in AI and the possibility of a singularity at least since college (say, 1980). That's why I interviewed Yudkowsky.