lmm comments on Summoning the Least Powerful Genie - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (48)
I've known plenty of cases where people's programs were more agentive than they expected. And we don't have a good track record on predicting which parts of what people do are hard for computers - we thought chess would be harder than computer vision, but the opposite turned out to be true.
"Doing something other than what the programmer expects" != "agentive". An optimizer picking a solution that you did not consider is not being agentive.
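To make the distinction concrete, here is a minimal sketch of a hypothetical scenario: a plain exhaustive search returns a solution the programmer forgot to rule out. The surprise comes from a gap in the cost function, not from anything agent-like in the optimizer. (The `cost` function and scenario are illustrative assumptions, not from the thread.)

```python
def cost(batch_size):
    # Intended model: per-item overhead falls as batches grow,
    # storage cost rises with batch size.
    # Oversight: the programmer forgot that batch_size == 0 means
    # "process nothing", which trivially has zero cost.
    if batch_size == 0:
        return 0.0
    return 100.0 / batch_size + 0.5 * batch_size

# Exhaustive search over the allowed range: no goals, no planning,
# just argmin over a finite set.
best = min(range(0, 101), key=cost)
print(best)  # picks 0: unexpected by the programmer, but not agentive
```

The optimizer "did something other than what the programmer expects" only because the objective admitted a degenerate solution; nothing in the search procedure resembles agency.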
I haven't: have you any specific examples?