
TheAncientGeek comments on Summoning the Least Powerful Genie - Less Wrong Discussion

Score: -1 | Post author: Houshalter 16 September 2015 05:10AM




Comment author: TheAncientGeek 19 September 2015 08:09:32AM 0 points

It's all standard software engineering.

Comment author: lmm 20 September 2015 06:39:24PM 0 points

I'm a professional software engineer, feel free to get technical.

Comment author: TheAncientGeek 21 September 2015 09:45:10AM 1 point

Have you ever heard of someone designing a non-agentive program that unexpectedly turned out to be agentive? Because to me that sounds like going into the workshop to build a skateboard and coming out with an F1 car.

Comment author: lmm 25 September 2015 06:48:43AM 1 point

I've known plenty of cases where people's programs were more agentive than they expected. And we don't have a good track record on predicting which parts of what people do are hard for computers - we thought chess would be harder than computer vision, but the opposite turned out to be true.

Comment author: Lumifer 25 September 2015 02:54:08PM 1 point

> I've known plenty of cases where people's programs were more agentive than they expected.

"Doing something other than what the programmer expects" != "agentive". An optimizer picking a solution that you did not consider is not being agentive.

Comment author: TheAncientGeek 28 September 2015 02:43:57PM 0 points

> I've known plenty of cases where people's programs were more agentive than they expected.

I haven't: do you have any specific examples?