RolfAndreassen comments on Near-Term Risk: Killer Robots a Threat to Freedom and Democracy - Less Wrong

Post author: Epiphany, 14 June 2013 06:28AM (10 points)

Comment author: Emile 14 June 2013 08:24:37AM, 2 points

it would allow a small number of people to concentrate a very large amount of power

Possibly a smaller number than with soldiers, but not that small - you still need to deal with logistics, maintenance, programming...

it's unthinkable today that American soldiers might suddenly decide to follow a tyrannical leader tomorrow whose goal is to have total power and murder all opponents. It is not, however, unthinkable at all that the same tyrant, if empowered by an army of combat drones, could successfully launch such an attack without risk of mutiny.

It might be a bit more likely, but it still seems like a very unlikely scenario (0.3% instead of 0.1%?), and still less likely than other disaster scenarios (a breakdown of infrastructure/economy leading to food shortages, panic, and riots; a big war starting in one of the less stable parts of the world (ex-Yugoslavia, China/Taiwan, the Middle East) and spilling over; an ideological movement motivating a big part of the population into violent action; UFAI; etc.)

EDIT: to expand on this a bit, I don't think replacing soldiers with drones increases risk much, all else being equal, because the kinds of things soldiers would refuse to do are also the kinds of things the (current) command structure is unlikely to want to do anyway.

Comment author: Epiphany 14 June 2013 08:44:09AM, -1 points

Ok let's get some numbers.

I highly doubt that either one of us would be able to accurately estimate how many employees it would require to make a robot army large enough to take over a population, but looking at some numbers will at least give us some perspective. I'll use the USA as an example.

The USA has 120,022,084 people fit for military service, according to Wikipedia. (The current military is much smaller, but if there were a takeover in progress, that's the maximum number of hypothetical American soldiers we could have defending the country.)

We'll say that making a robot army takes as many programmers as Microsoft and as many engineers and factory workers as Boeing:

Microsoft employees: 97,811
Boeing employees: 171,700

That's 0.22% of the number of soldiers.

I'm not sure how many maintenance people and logistics people it would require, but even if we double that .22%, we still have only .44%.
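A quick back-of-the-envelope check of this arithmetic (all figures are the ones quoted above; the Python is just to show the division):

```python
# Figures quoted in the comment above.
fit_for_service = 120_022_084   # Americans fit for military service (Wikipedia)
microsoft = 97_811              # stand-in for the programming workforce
boeing = 171_700                # stand-in for the engineering/factory workforce

ratio = (microsoft + boeing) / fit_for_service
print(f"{ratio:.2%}")       # 0.22%
print(f"{2 * ratio:.2%}")   # 0.45% with maintenance/logistics doubled
```

(Doubling the unrounded ratio gives ~0.45% rather than .44%, but it's the same order of magnitude either way.)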

Is it possible that 1 in 200 people or so are crazy enough to build and maintain a robot army for a tyrant?

Number of sociopaths: 1 in 20.

And you wouldn't even have to be a sociopath to follow a new Hitler.

I like that you brought up the point that it would take a significant number of employees to make a robot army happen, but I'm not convinced that this makes us safe, especially because they could do something like this: build military robots that are very close to lethal autonomy but not quite, tell people they're making something else, write software to run basic functions like walking and seeing, and then have a very small number of people modify the hardware and/or software to turn them into autonomous killers.

Of course, once the killer robots are made, then they can just use them to coerce the maintenance and logistics people.

How many employees would have to be aware of their true ambitions? That might be the key question.

Comment author: RolfAndreassen 14 June 2013 09:59:08PM, 2 points

The USA has 120,022,084 people fit for military service according to Wikipedia. (...) That's 0.22% of the number of soldiers.

Excuse me? You are taking the number of military-age males and using it as the number of soldiers! The actual US armed forces are a few million. 5% would be a much better estimate. This aside, you are ignoring that "lethal autonomy" is nowhere near the same thing as "operational autonomy". A Predator drone requires more people to run it - fuelling, arming, polishing the paint - than a fighter aircraft does.

Of course, once the killer robots are made, then they can just use them to coerce the maintenance and logistics people.

How? "Do as I say, or else I'll order you to fire up the drones on your base and have them shoot you!" And while you might credibly threaten to instead order the people on the next base over to fire up their drones, well, now you've started a civil war in your own armed forces. Why will that work better with drones than with rifles?

Again, you are confusing lethal with operational autonomy. A lethally-autonomous robot is just a weapon whose operator is well out of range at the moment of killing. It still has to be pointed in the general direction of the enemy, loaded, fuelled, and launched; and you still have to convince the people doing the work that it needs to be done.

Comment author: gwern 14 June 2013 11:14:20PM, 5 points

A Predator drone requires more people to run it - fuelling, arming, polishing the paint - than a fighter aircraft does.

It does? I would've guessed the exact opposite and that the difference would be by a large margin: drones are smaller, eliminate all the equipment necessary to support a human, don't have to be man-rated, and are expected to have drastically less performance in terms of going supersonic or executing high-g maneuvers.

Comment author: Randaly 15 June 2013 11:10:02PM, 0 points

Yes. An F-16 requires 100 support personnel; a Predator, 168; a Reaper, 180. Source.

It seems like some, but not all, of the difference is that manned planes have only a single pilot, whereas UAVs not only have multiple pilots, but also perform much more analysis on recorded data and split the job of piloting into multiple subtasks for different people, since they are not limited by the need to have only 1 or 2 people controlling the plane.

If I had to guess, some of the remaining difference is probably due to the need to maintain the equipment connecting the pilots to the UAV, in addition to the UAV itself; the most high-profile UAV failure thus far was due to a failure in the connection between the pilots and the UAV.

Comment author: gwern 16 June 2013 01:14:57AM, 0 points

I'm not sure that's an apples-to-apples comparison. From the citation for the Predator figure:

About 168 people are needed to keep a single Predator aloft for 24 hours, according to the Air Force. The larger Global Hawk surveillance drone requires 300 people. In contrast, an F-16 fighter aircraft needs fewer than 100 people per mission.

I'm not sure how long the average mission for an F-16 is, but if it's less than ~12 hours, then the Predator would seem to have a manpower advantage; and the CRS paper cited also specifically says:

In addition to having lower operating costs per flight hour, specialized unmanned aircraft systems can reduce flight hours for fighter aircraft
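The "~12 hours" figure can be made concrete with simple division (the personnel counts are the ones quoted above; treating them as people-per-flight-hour is my own simplification):

```python
# People per flight-hour, from the figures quoted above.
predator_per_hour = 168 / 24   # 168 people per 24 h aloft = 7 people/flight-hour

# Break-even F-16 mission length: the duration at which 100 people per
# mission works out to the same people-per-flight-hour as the Predator.
breakeven_hours = 100 / predator_per_hour
print(predator_per_hour)   # 7.0
print(breakeven_hours)     # ~14.3 h; any shorter mission favors the Predator
```

By these numbers the break-even is closer to 14 hours than 12, so the "~12 hours" threshold is, if anything, conservative.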

Comment author: Randaly 16 June 2013 04:06:39AM, 0 points

The F-16 seems to have a maximum endurance of 3-4 hours, so I'm pretty sure its average mission is less than 12 hours.

My understanding was that Rolf's argument depended on the ratio personnel:plane, not on the ratio personnel:flight hour; the latter is more relevant for reconnaissance, ground attack against hidden targets, or potentially for strikes at range, whereas the former is more relevant for air superiority or short range strikes.

Comment author: gwern 16 June 2013 05:05:27PM, 0 points

I don't think it saves Rolf's point:

The actual US armed forces are a few million. 5% would be a much better estimate. This aside, you are ignoring that "lethal autonomy" is nowhere near the same thing as "operational autonomy". A Predator drone requires more people to run it - fuelling, arming, polishing the paint - than a fighter aircraft does.

If you are getting >6x more flight-hours out of a drone for <2x as many people as its alternative, then by switching a fleet entirely to drones, effectiveness or lethality increases by >6x for a manpower increase of <2x. Even if you keep manpower constant and shrink the fleet to compensate for that <2x penalty, you've still got a new fleet which is somewhere around 3-4x more lethal, because each drone goes so much further than its equivalent. Or you could take the tradeoff even further and have an equally lethal fleet with a small fraction of the total manpower. So a drone fleet of similar lethality does have more operational autonomy!
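The claimed tradeoff can be checked with the thread's own numbers (the ~4-hour F-16 endurance is Randaly's figure from above; this is a rough sketch, not a fleet model):

```python
# Ratios built from the figures quoted earlier in the thread.
flight_hour_ratio = 24 / 4     # Predator: 24 h aloft vs. a ~4 h F-16 mission -> 6x
manpower_ratio = 168 / 100     # Predator vs. F-16 support personnel -> 1.68x

# Flight-hours per person: how much further each unit of manpower goes.
per_person = flight_hour_ratio / manpower_ratio
print(round(per_person, 2))    # 3.57 -> an equal-manpower drone fleet flies ~3.6x more
```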

That's why per flight hour costs matter - because ultimately, the entire point of having these airplanes is to fly them.

Comment author: Epiphany 15 June 2013 01:24:31AM, -2 points

Would you happen to be able to provide these figures:

The ratio of human resources-to-firepower on the current generation of weapons.

The ratio of human resources-to-firepower on the weapons used during eras where oppression was common.

I'd like to compare them.

Hmm, "firepower" is vague. I think the relevant number here would be something along the lines of how many people can be killed or subdued in a conflict situation.

Comment author: gwern 15 June 2013 03:30:29AM, 1 point

I have no idea; as I said, my expectations are just guesses based on broad principles (slow planes are cheaper than ultra-fast planes; clunky planes are cheaper than ultra-maneuverable ones; machines whose failure does not immediately kill humans are cheaper to make than machines whose failure does entail human death; the cheapest, lightest, and easiest-to-maintain machine parts are the ones that aren't there). You should ask Rolf, since apparently he's knowledgeable on the topic.

Comment author: Epiphany 15 June 2013 04:28:58AM, 0 points

Thanks. I will ask Rolf.

Comment author: Epiphany 15 June 2013 01:03:22AM, -1 points

Excuse me? You are taking the number of military-age males and using it as the number of soldiers!

Yes!

The actual US armed forces are a few million. 5% would be a much better estimate.

If the question here is "How many people are currently in the military" my figure is wrong. However, that's not the question. The question is "In the event that a robot army tries to take over the American population, how many American soldiers might there be to defend America?" You're estimating in a different context than the one in my comment.

This aside, you are ignoring that "lethal autonomy" is nowhere near the same thing as "operational autonomy"

Actually, if you're defining "operational autonomy" as "how many people it takes to run weapons", I did address that when I said "I'm not sure how many maintenance people and logistics people it would require, but even if we double that .22%, we still have only .44%." If you have better estimates, would you share them?

How? "Do as I say, or else I'll order you to fire up the drones on your base and have them shoot you!"

Method A. They could wait until the country is in turmoil and prey on people's irrationality like Hitler did.

Method B. They could get those people to operate the drones under the guise of fighting for a good cause. Then they could threaten to use the army to kill anyone who opposes them. This doesn't have to be sudden - it could happen quite gradually, as a series of small and oppressive steps and rules wrapped in doublespeak that eventually lead up to complete tyranny. If people don't realize that most other people disagree with the tyrant, they will feel threatened and probably comply in order to survive.

Method C. Check out the Milgram experiment. Those people didn't even need to be coerced to apply lethal force. It's a lot easier than you think.

Method D. If they can get just a small group to operate a small number of drones, they can coerce a larger group of people to operate more drones. With the larger group of people operating drones, they can coerce even more people, and so on.

Why will that work better with drones than with rifles?

This all depends on the ratio of the number of people it takes to operate the weapons to the number of people the weapons can subdue. Your perception appears to be that Predator drones require more people to run than fighter aircraft. My perception is that it doesn't matter how many people it takes to operate a Predator drone, because war technology is likely to be optimized further than it is today. If it becomes possible to decrease the number of people required to build/maintain/run the killer robots significantly below the number it would take to get the same amount of firepower otherwise, then of course they can take over a population more easily.

A high firepower to human resource ratio means takeovers would work better.

A lethally-autonomous robot is just a weapon whose operator is well out of range at the moment of killing.

That's not what Suarez says. And even if he's wrong, do you deny that technology is likely to advance to the point where people can make robots capable of killing without a human making the decision? That's what this conversation is about. Don't let us get all mixed up, like Eliezer warns us about in 37 Ways Words Can Be Wrong. If we're talking about robots that can kill without a human's decision, those are a threat, and could potentially reduce the human-resources-to-firepower ratio enough to threaten democracy. If you want to disagree with me about what words I should use to talk about this, that's great. In that case, though, I'd like to know where your credible sources are, so that I can read authoritative definitions, please.

and you still have to convince the people doing the work that it needs to be done.

Hitler.

Milgram experiment.

Number of sociopaths: 1 in 20.

Is rationality taught in school?: No.

Comment author: RolfAndreassen 16 June 2013 02:42:19AM, 1 point

Methods A, B, C and D

What prevents these methods from being used with rifles? What is special about robots in this context?

Even if he's wrong do you deny that it's likely that technology will advance to the point where people can make robots capable of killing without a human making the decision?

No, we already have those. The decision to kill has nothing to do with it. The decisions of where to put the robot, its ammunition, its fuel, and everything else it needs, so that it's in a position to make the decision to kill, are what we cannot yet make programmatically. You're confusing tactics and strategy. You cannot run an army without strategic decisionmakers. Robots will not be in a position to do that for, I would guess, at least twenty years.

Hitler. Milgram experiment. Number of sociopaths: 1 in 20. Is rationality taught in school?: No.

OK, this being so: how come we don't already have oppressive societies being run with plain old rifles?