What do we want out of AI? Is it happiness? If so, then why not research wireheading directly and avoid the risks of an unfriendly AI?
The promise of AI is irresistibly seductive because an FAI would make everything easier, including wireheading and survival itself.
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one.
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday and end on Sunday.