I want to start my reply by saying I am dubious that the best future for humanity is one in which a super-intelligence we build ends up handing all control and decision-making to humans. However, the tone of the post feels somewhat too anti-human (treating a future where humans have greater agency as necessarily "bad", not just sub-optimal) and too narrow in its interpretation for me to move on without comment. There is a lot to be learned from considering the necessary conflict between human and FAI agency. Yes, conflict.
The first point I don't fully agree with is the claim that humans lack the capacity to change or grow, even as adults. You...