Eliezer_Yudkowsky comments on Do Earths with slower economic growth have a better chance at FAI? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Some Facebook discussion here including Carl's opinion:
https://www.facebook.com/yudkowsky/posts/10151665252179228
I'm reposting Carl's Facebook comments to LW, for convenience. Carl's comments were:
Eliezer replied to Carl:
It's worth noting that the relationship between economic growth and the expected quality of global outcomes need not be monotonic. The optimal growth rate may be neither very slow nor very fast, but some intermediate "just right" value that makes peace, cooperation, and long-term thinking commonplace while avoiding technological advancement substantially faster than what we see today.
The possibility of AI being invented to deal with climate change hadn't occurred to me, but now that it's mentioned, it doesn't seem impossible, especially if climate engineering is on the agenda.
Any thoughts about whether climate is a sufficiently hard problem to inspire work on AIs?
Climate seems far easier: at least the causes of climate change are known, more or less. No one knows what it would take to build an AGI.
I didn't mean work on climate change might specifically be useful for developing an AI, I meant that people might develop AI to work on weather/climate prediction.
Right, and my reply was that AGI is much harder, so unlikely. Sorry about not being clear.