HungryHobo comments on The Hidden Complexity of Wishes - Less Wrong

Post author: Eliezer_Yudkowsky 24 November 2007 12:12AM

Comment author: J_Thomas 24 November 2007 03:15:13PM 5 points

This generalises. Since you don't know everything, anything you do might wind up being counterproductive.

Like, I once knew a group of young merchants who wanted their shopping district revitalised. They worked at it and got their share of federal money that was assigned to their city, and they got the lighting improved, and the landscaping, and a beautiful fountain, and so on. It took several years and most of the improvements came in the third year. Then their landlords all raised the rents and they had to move out.

That one was predictable in hindsight, but I didn't predict it. There could always be things like that.

When anything you do could backfire, are you better off to stay in bed? No, the advantages of that are obvious but it's also obvious you can't make a living that way.

You have to make your choices and take your chances. If I had an outcome pump and my mother was trapped in a burning building and I had no other way to get her out, I hope I'd use it. The result might be worse than letting her burn to death but at least there would be a chance for a good outcome. If I can just get it to remove some of the bad outcomes the result may be an improvement.

Comment author: HungryHobo 28 August 2015 12:38:49PM 0 points

I think the unlimited potential for bad outcomes may be a problem there.

After all, the house might not explode; instead, a military transport plane nearby might suffer a failure, and the nuclear weapon on board might suffer a very unlikely set of failures and trigger on impact, killing everyone for miles and throwing your mother's body far, far away. The pump isn't just dangerous to those involved and nearby.

Most consequences are limited in scope. You have a slim chance of killing many others through an everyday accident, but a pump would magnify that terribly.

Comment author: nyralech 28 August 2015 11:49:23PM 1 point

> Most consequences are limited in scope. You have a slim chance of killing many others through an everyday accident, but a pump would magnify that terribly.

That depends entirely on how the pump works. If it picks uniformly among bad outcomes, your point might be correct. However, it might still be biased towards narrow local effects for the sheer sake of computability. If that is the case, I don't see why it would necessarily shift towards bigger bad outcomes rather than more limited ones.

Comment author: HungryHobo 14 September 2015 10:46:09AM 0 points

In the example I gave, the nuke exploding would be a narrow local effect which bleeds over into a large area. I agree that a pump which needed to monitor everything might very well choose only quite local direct effects, but those could still have a lot of long-range bad side effects.

Bursting the dam a few hundred meters upriver might have the effect of carrying your mother, possibly even alive, far from the center of the building. It may also extinguish the fire, if you've thought to add that in as a desirable element of the outcome, yet wipe out a whole town ten miles downstream. That's sort of the point: the pump wouldn't care about those side effects.

Comment author: nyralech 14 September 2015 11:04:36AM 1 point

But those outcomes which have a limited initial effect yet have a very large overall effect are very sparsely distributed among all possible outcomes with a limited initial effect.

I still do not see why the pump would magnify the chance of those outcomes terribly. The space of possible actions which have a very large negative utility grows by a huge amount, but so does the space of actions which have trivial consequences besides doing what you want.
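The disagreement here can be made concrete with a toy simulation. This is purely my own construction, not anything from the original post: the outcome counts, the 1-in-10,000 catastrophe rate, and the locality `bias` factor are all made-up numbers chosen only to illustrate how a uniform pump and a locality-biased pump would differ in how often catastrophic outcomes get selected:

```python
import random

random.seed(0)

# Hypothetical assumption: among all goal-satisfying outcomes ("mother
# leaves the building"), 1 in 10,000 has catastrophic long-range side
# effects (the burst dam, the dropped nuke).
P_CATASTROPHIC = 1e-4

def pump_uniform():
    """A pump that picks uniformly among goal-satisfying outcomes.

    Returns True if the selected outcome is catastrophic.
    """
    return random.random() < P_CATASTROPHIC

def pump_locality_biased(bias=100):
    """A pump that, for the sake of computability, downweights
    wide-scope outcomes by a factor of `bias` relative to narrow
    local ones before picking.
    """
    # Reweighted probability of landing on a catastrophic outcome.
    p = P_CATASTROPHIC / (P_CATASTROPHIC + bias * (1 - P_CATASTROPHIC))
    return random.random() < p

def catastrophe_rate(pump, trials=200_000):
    """Empirical frequency of catastrophic outcomes over many wishes."""
    return sum(pump() for _ in range(trials)) / trials

print(f"uniform pump: {catastrophe_rate(pump_uniform):.6f}")
print(f"biased pump:  {catastrophe_rate(pump_locality_biased):.6f}")
```

Under these assumed numbers, the uniform pump realizes catastrophes at roughly the base rate of the outcome space, while the locality-biased pump suppresses them by the bias factor, which is the crux of the disagreement: whether the pump magnifies rare wide-scope outcomes depends entirely on how it weights its search, not just on how many such outcomes exist.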