If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.
What do all of you think the effect will be, now that awareness of superhuman intelligence as a catastrophic risk is going "mainstream", culturally?
Examples:
Bill Gates, Stephen Hawking, and Elon Musk voice concern over Artificial Intelligence after Superintelligence was published
Network TV series Elementary covers AI risk
The release of major Hollywood films touching upon what dangerous AI might really be like: Transcendence, Ex Machina, Avengers: Age of Ultron
The pop-culture examples demonstrate an awareness of what Nick Bostrom and Eliezer Yudkowsky caution the world about, but an understanding of the real issues only barely better than "AI is 'The Terminator', right?" Personally, I don't expect all this increased awareness to change what the Future of Humanity Institute, the Future of Life Institute, and the Machine Intelligence Research Institute would otherwise (plan to) do, unless there is an increased proportion of people who have a philosophical or technical understanding of how challenging and crucial the development of superhuman machine intelligence is. It will take more than buzz on social media and Hollywood depictions to inculcate more than the shallowest of understandings.
I wonder if a movie with an AI-box-based story would have any potential. Perhaps something treated as a psychological horror/thriller rather than a guns-and-explosions action movie might help to distance people's intuitions from "AI is 'The Terminator', right?"