johnlawrenceaspden comments on Fake Utility Functions - Less Wrong

22 Post author: Eliezer_Yudkowsky 06 December 2007 04:55PM


Comment author: JulianMorrison 07 December 2007 10:25:51AM 0 points [-]

I'm not sure that friendly AI even makes conceptual sense. I think of it as the "genie to an ant problem". An ant has the ability to give you commands, and by your basic nature you must obey the letter of the command. How can the ant tie you up in fail-safes so you can't take an excuse to stomp him, burn him with a magnifying glass, feed him poison, etc? (NB: said fail-safes must be conceivable to an ant!) It's impossible. Even general benevolence doesn't help - you might decide to feed him to a starving bird.

Comment author: johnlawrenceaspden 27 August 2012 04:18:47PM 2 points [-]

What about, if by your basic nature you like the ant? Hell, you might even find yourself doing things like moving him off the road he'd wandered onto on his own.

Comment author: DaFranker 27 August 2012 04:24:25PM *  3 points [-]

But just liking the ants is also not sufficient. You might kill the bird for wanting to eat the ant, then realize that all birds are threats, and kill all birds without telling the ants, because that's the maximizing solution, even though the ants might not want this and would have objected had they known about it.

FAI is not impossible, but it's certainly a hard problem in many ways.

Also, there are problems with "by your basic nature you like the ant". Have you read the Guide to Words yet?

Comment author: johnlawrenceaspden 27 August 2012 04:59:17PM 0 points [-]

Also, there are problems with "by your basic nature you like the ant".

Indeed. I was hoping to refute the refutation in its own language.

Comment author: DaFranker 27 August 2012 05:14:09PM *  1 point [-]

Ah, thanks for clarifying that. While I don't want to other-optimize, I feel compelled to tell you that many people would say such a strategy is usually suboptimal, and often leads to falling into the trap of an "argumentative opponent".

Incidentally, this strikes me as particularly vulnerable to the worst argument in the world (probably due to my availability heuristic).

Comment author: johnlawrenceaspden 27 August 2012 06:38:52PM 0 points [-]

Actually I rather like the idea of being optimized. Have you got any good links to sources of argument/counterargument strategies? The more I read about the Dark Arts, the more I wish to learn them.

Comment author: DaFranker 27 August 2012 06:50:37PM 0 points [-]

Being optimized is net positive, and generally understood as good. Other-optimizing, on the other hand, is prone to heuristic errors, map imperfections, scaling problems, mind projection, and many other problems, such that attempting to optimize someone else's strategy for anything not already reduced is very risky and has low, often negative, expected utility. Telling others which argumentative techniques to use definitely falls into this category.

That's the thing with the Dark Arts. They lure you in, craft a beautiful song, fashion an intricate and alluring web of rapid-access winning arguments with seemingly massive instrumental value towards achieving further-reaching goals... but they trap and ensnare you, they freeze your thought into their logic, they slow down and hamper solid rationality, and they sabotage the thinking of others.

It takes quite a master of rationality to use the art of Shadowdancing with reliably positive net expected utility. As long as you want to optimize for the "greater good", that is.

Comment author: johnlawrenceaspden 27 August 2012 07:11:22PM 0 points [-]

I thank you for your concern. I'd never even think about being evil unless it were for the greater good.

Comment author: RichardKennaway 27 August 2012 07:39:50PM 0 points [-]

I'd never even think about being evil unless it were for the greater good.

Most evil people would say that. They'd even believe it.

Comment author: johnlawrenceaspden 27 August 2012 08:20:11PM 1 point [-]

curses. there goes my cover.