Banish the Clippy-creating Bias Demon!

12 Post author: Stuart_Armstrong 18 January 2013 02:57PM

I posted on Practical Ethics, arguing that if we mentally anthropomorphised certain risks, then we'd be more likely to give them the attention they deserved. Slaying the Cardiovascular Vampire, defeating the Parasitic Diseases Death Cult, and banishing the Demon of Infection... these stories give a mental picture of the actual good we're doing when combating these issues, and the bad we're doing by ignoring them. Imagine a politician proclaiming:

  • I will not let the Cardiovascular Vampire continue his unrelenting war upon the American people, slaying over a third of our citizens - the eldest, in their weakened state, among his most numerous victims. There is no negotiating with such a terrorist - I will direct the full resources of the state to crushing his campaign of destruction.

An amusing thing to contemplate - except, of course, if there were a real Cardiovascular Vampire, politicians and pundits would be falling over themselves with those kinds of announcements.

The field of AI is already over-saturated with anthropomorphisation, so we definitely shouldn't be imagining Clippy as some human-like entity that we can heroically combat, with all the rules of narrative applying. Still, it can't hurt to dream up a hideous Bias Demon in its misshapen (though superficially plausible) lair, cackling in glee as someone foolishly attempts to implement an AI design without the proper safety precautions, smiling serenely as prominent futurists dismiss the risk... and dissolving, hit by the holy water of increased rationality and proper AI research. Those images might help us make the right emotional connection to what we're achieving here.

Comments (37)

Comment author: Luke_A_Somers 18 January 2013 07:08:39PM 17 points [-]

:facepalm:

This is dark artsy to the point of self-parody. The outcome seems to me highly dependent on the parity of the number of meta levels the viewer goes to.

Comment author: Emile 18 January 2013 03:28:24PM *  14 points [-]

Many religions do anthropomorphize evil - the devil may not actually exist, but we may all be better off if we talk about him as if he did.

I suspect that there are quite a few things like this, where religion is kinda right, as long as you don't take it too literally. Maybe the best solution isn't to reject religion wholesale, but to reform it so that it's tacitly acknowledged that it isn't really true, a bit like Santa Claus, or professional wrestling. Arguably that may already be the attitude of many Anglicans and Unitarian Universalists.

Comment author: Peterdjones 21 January 2013 06:02:26PM 0 points [-]

The extreme is Bokonism.

That reminds me of when I shared an office with a Scottish Catholic atheist and a Scottish Protestant atheist, who still managed to wrangle all the time.

Comment author: ESRogs 18 January 2013 10:43:04PM 11 points [-]

I'm reminded of this early GiveWell post :)

"When I was younger, I loved playing video games. [...] I just liked killing bad guys. Well, more than that, I hated not killing bad guys. When Heat Man killed my guy and stood around smugly, I wanted to throw the TV across the room, and I couldn’t stop until he was dead.

What sucked about this experience was that it was all fake, and in the back of my head I knew that. In the end I felt pretty empty and lame. Enter altruism – where the bad guys are ACTUALLY BAD GUYS. [...] it’s infinitely better because it’s real. I don’t care whether the kids are cute, or whether the organizations are nice to me, or whether my friends like my decisions. As with video games, I probably spend 99% of my time frustrated rather than happy. But … Malaria Man just pisses me off. It’s that simple." http://blog.givewell.org/2007/04/03/charity-the-video-game-thats-real/

Comment author: Document 27 January 2013 04:43:08AM 0 points [-]

I'd play a game where scoring points or the equivalent wired tiny payments to a nonprofit of my choice.

Comment author: Alicorn 27 January 2013 07:46:43PM 1 point [-]

You don't get to pick the nonprofit, but there's Free Rice.

Comment author: Document 27 January 2013 07:53:57PM *  0 points [-]

I meant payments out of funds I provided, the idea being to maximize the fuzzies produced by a donation by increasing the effort expended to make it. But thanks for the link.

Comment author: DaFranker 18 January 2013 05:37:20PM *  7 points [-]

Your example political speech makes me want to just run for office and do it.

"I now solemnly vow, on all honors, to rid our country of the vile terrorists who call themselves the Slippery Baths. If I am elected, our people shall be safe and squeaky-clean once more!"

Hey, I figure it's almost worth a try. If someone could find the right Mass Media people to bribe for help, I think there's a lot of potential here.

Comment author: shminux 18 January 2013 05:41:36PM 10 points [-]

What about the mindless roaring four-wheeled blood-thirsty flashy-eyed monsters roaming our streets?

Comment author: DaFranker 18 January 2013 05:49:18PM *  11 points [-]

I thought of them too, but they've got their filthy money-laundering hands in too many pockets and they're controlling too many people - it would be a losing battle. The triads would be a more realistic target.

Besides, they literally take our people hostage, wear ablative carbon-composite / high-tech-metal-alloy armor, and lug around gallons of flamethrower fuel. They also tend to hunt in packs¹ and establish war camps on our bridges every morning.

We'll need a lot more than one good politician and a few bribes to the media to win that war.²


  1. Most deaths involve multiple of them, IIRC.

  2. But please, if you can, I strongly encourage anyone to prove me wrong. The implication here is that lots of science and engineering and money is needed to fix the dangers and reduce the risks. The kind of science and engineering and money that Google already started doing a while ago.

Comment author: shminux 18 January 2013 06:34:37PM *  7 points [-]

Well, there is a movement afoot to tame their wild nature. Some day being trampled or squished into pulp by these creatures will be but a distant memory, as their descendants follow the path of domestication well traveled by other animals, the past perils replayed only in the highly scripted spectacle of corrida de coches.

Comment author: Desrtopa 19 January 2013 05:11:56AM 2 points [-]

What's probably going to be really difficult is not getting automated cars on the market, but getting all the non-automated cars off the road. An entirely automated traffic flow would be much safer than a partly automated traffic flow, but there are going to be lots of holdouts who refuse to trust an automated car over their own driving ability, or who simply can't or won't buy an up-to-date car.

Comment author: shminux 19 January 2013 07:27:45PM *  2 points [-]

All good points, I addressed some of them in my previous comment on self-driving cars.

Comment author: Stuart_Armstrong 22 January 2013 11:51:54AM 0 points [-]

When automated cars are at 90% or so, and if you keep on getting statistics like how many accidents and deaths are caused by humans versus machines, I think the pressure to go all automated will be strong. Some municipalities and states will go for it, and then it'll be hard to get anywhere with a human-driven car.

Comment author: Desrtopa 22 January 2013 02:09:02PM 1 point [-]

I suspect getting the prevalence as high as 90% will be pretty difficult itself.

Comment author: [deleted] 18 January 2013 07:21:34PM 10 points [-]

Adjusts her prior that you are a biker in Seattle waaaaay upward

Comment author: DanArmak 18 January 2013 04:47:35PM *  6 points [-]

The great enemy of humanity is already anthropomorphised: it is Death Himself we do battle with, the Lord of Entropy.

Comment author: CronoDAS 21 January 2013 10:18:10PM 0 points [-]

Nah, he's actually a pretty nice guy once you get to know him. He doesn't cause deaths; he's just the one who cleans up afterward. And he'd probably be grateful for a chance to retire peacefully.

The proper incarnation of entropy is the Frost Giant, not the bony-looking guy in a cape.

Comment author: ikrase 20 January 2013 12:37:59AM 0 points [-]

You beat me to it. I already tend to narrativise this. Other cases, though, are very risky; an alternative, striving-based narrative might be better.

Comment author: Qiaochu_Yuan 18 January 2013 07:54:31PM 16 points [-]

I would suggest that this is a useful thing to do on an individual level (to adjust for scope insensitivity and so forth) but a terrible thing to do on a group level (because it's mind-killing). Smells too much like the Yellow Peril for my taste.

The Anthropomorphization Cannon is a powerful weapon, and if it were to fall into the wrong hands...

Comment author: patrickmclaren 19 January 2013 06:34:41AM 1 point [-]

I feel that this position could be equally argued if the scopes were switched, given the following motivation.

...if we mentally anthropomorphised certain risks, then we'd be more likely to give them the attention they deserved. -- OP

For example, a harmless :-) play on your comment. All the while, keeping the above maximization criteria in mind.

I would suggest that this is a useful thing to do on a group level (because it's mind-killing; take Yellow Peril for example) but a terrible thing to do on an individual level (to adjust for scope insensitivity and so forth).

Comment author: drethelin 18 January 2013 07:11:31PM 11 points [-]

I don't think anthropomorphising al Qaeda in the form of Osama bin Laden or demonizing Saddam Hussein was a net good for America. Framing the arguments over drug control as "The War On Drugs" has almost certainly led to the loss of billions of dollars and many lives. Do you really think encouraging this idea in general is good?

Comment author: Stuart_Armstrong 18 January 2013 07:21:27PM 12 points [-]

Do you really think encouraging this idea in general is good?

I'd certainly prefer if the serious risks were the anthropomorphised ones, rather than the trivial ones.

Comment author: Pavitra 19 January 2013 04:41:48AM 3 points [-]

So it's a great idea as long as only causes you agree with get to use the superweapon?

Comment author: Desrtopa 19 January 2013 05:14:43AM 7 points [-]

Well, if you can't stop people from using a superweapon for bad causes, it may be an improvement to see to it that it's also used for good causes.

Comment author: Pavitra 20 January 2013 01:33:10PM -1 points [-]

The original question was:

Do you really think encouraging this idea in general is good?

That is: assuming it is possible to reduce bad uses at the cost of also reducing good uses, should one do so?

Your reply seems to assume that the bad uses can't be reduced, which contradicts the pre-established assumptions. If you want to change the assumptions of a discussion, please include a note that you are doing so and ideally a short explanation of why you think the previous assumptions should be rejected in favor of the new ones.

Comment author: Desrtopa 20 January 2013 01:52:27PM 0 points [-]

I don't assume that bad uses can't be reduced, and my answer is somewhat tongue in cheek, but I do suspect that getting people to stop using this mode of thought for bad ideas would be very difficult. Getting people to apply it to good causes as well might be worse, outcome-wise, than getting them to stop applying it all, but trying to get people to apply it to good causes might still have a better return on investment than trying to get them to stop, simply because it's easier.

Comment author: Pavitra 20 January 2013 01:55:31PM -1 points [-]

You may be right, but I don't trust a human to only arrive at that conclusion if it's true. I think we ought to refrain from pressing D, just in case.

Comment author: CellBioGuy 19 January 2013 10:45:30PM 4 points [-]

Isn't this basically what the saner strand of occultists do when they personify archetypes and aspects of humanity into minor deities?

Comment author: Dre 19 January 2013 06:59:47AM 4 points [-]

I worry that this would bias the kind of policy responses we want. I obviously don't have a study or anything, but it seems that the framing of the War on Drugs and the War on Terrorism have encouraged too much violence. Which sounds like a better way to fight the War on Terror, negotiating in complicated local tribal politics or going in and killing some terrorists? Which is actually a better policy?

I don't know exactly how this would play out in a case where no violence makes sense (like the Cardiovascular Vampire). Maybe increased research as part of a "war effort" would work. But it seems to me that this framing would encourage simple and immediate solutions, which would be a serious drawback.

Comment author: shminux 18 January 2013 04:58:39PM 4 points [-]

The field of AI is already over-saturated with anthropomorphisation

Actually, on this forum Clippymorphisation is rather prevalent.

Comment author: private_messaging 20 January 2013 04:21:04PM *  1 point [-]

Clippy is very anthropomorphic though - it magically has a real-world goal, it equates the algorithm that it is from the inside with the hardware it sees through its eyes from the outside, it will 'improve' that hardware inclusive of the paperclip counter's accuracy, but it won't improve the output of the paperclip counter. It's easy to imagine - in your mind you have the number of paperclips counted externally, and a paperclip maximizer increases this count - and hard or impossible to actually define, let alone build.

Comment author: CronoDAS 21 January 2013 10:19:03PM 3 points [-]
Comment author: [deleted] 18 January 2013 07:37:16PM *  3 points [-]

Religion being the only social structure that is known to be able to endure even a fraction of the time required, it has been proposed that religion is the least bad means of warning distant future generations about nuclear waste sites. Not money or architecture or language, but ghost stories.

Comment author: private_messaging 20 January 2013 04:06:28PM *  4 points [-]

How about a Bias Demon where people who read far too much sci-fi, via the same biases that on a national scale produced the TSA, become overly concerned about things like Clippy, thereby creating the concept. Now that's a Clippy-creating Bias Demon.

Comment author: David_Gerard 20 January 2013 04:50:41PM 1 point [-]

Yes, the whole thing is a bit close to "but some biases are good!" No, they aren't.

Comment author: Rain 30 January 2013 08:47:35PM 0 points [-]

This was used in an episode of South Park.