http://www.wired.com/2016/03/fault-microsofts-teen-ai-turned-jerk/
Could this be a lesson for future AIs? The AI control problem?
[future AIs may be shut down, and martyred..]
That seems hard to test beforehand.
They could have run an internal beta and told testers "fuck with us". They could have allocated time for a dedicated internal team to do exactly that. Don't they have internal hacking teams that test their security in the same adversarial way?