

What is the value of your life? Not a life, but your life. Would you consider yourself expendable?

The last time I almost died, it was over a missing shotgun. I was in Afghanistan at the time, somewhere close to the border with Pakistan. The roads are rough, particularly when there aren't any, and during our travels we would occasionally have to stop and re-secure our cargo. During one such stop, it seems a weapon was left on the bed of the truck and later fell off somewhere. Once the weapon was discovered missing, we were ordered to go back out and find it.

It's important to recognize early on that there was no expectation of success here - this was a punishment. It was more than three hours' travel, through mountains and river valleys. Spotting a shotgun through the armored window was not likely. Up the chain of command went the observation that driving up and down the same road a bunch of times is a bad plan considering we were in Afghanistan. Down the chain of command came the orders to go out anyway. It was clear that we were expected to operate continuously until we were attacked, or the higher-ups relented. Everyone knew. Everyone knew that everyone knew. It was common knowledge. These are moments that test men and their oaths: we looked at one another grimly, seeing the hubris and stupidity at work; there was talk of refusing to go.

We went.

The fourth time we covered the ground, ~300lbs of home-made explosive went off under the back tire of the truck I was in. Where there should have been a sound there was a mighty shockwave, and everything went silent and in slow motion. The world was queerly bright, and I wondered why the smell of dust was so strong. Then I saw my arms floating in front of me, and the tied down equipment only held by its ties; I knew what had happened, and what was next. I thought very clearly: "This part is going to suck."

Everybody lived - the new trucks have all kinds of tricks for dispensing with kinetic energy, so every nut and bolt on the thing blew off and they found one of the tires 400 meters away. Those weighed 700lbs or so, with the wheel in them; we made a game of trying to flip one in our camp, and perhaps a quarter succeeded. Three of us left in a helicopter, myself included. One or more of us would probably have died but for a series of recent changes: we stopped having a rear gunner (three weeks); we left the emergency hatches cracked in order to avoid overpressure (two weeks); we made sure our seatbelts were as tight as possible (that day). Whether we would die was not subject to our control. That missing shotgun can be had for about $650. Last I checked - really checked - that was at the object level what I was willing to lay down my life for.

Guess the number that is 2/3 of the average of all guesses. Among rational guessers, that number is 0.

Gamble your life for the value that is less than the average of all gambles. Among warriors, that value is nothing.

20 comments
gjm:

Let's be clearer about what this little military farce shows about whose valuations of whose lives.

ryan_b himself did not have the option of giving up $650 instead of risking his life. No doubt he could have offered, but the offer would have been refused. His actual alternative to risking his life was something like getting kicked out of the military for insubordination, together with whatever knock-on effects that might have had on (e.g.) other people's attitudes to him, his future career prospects, etc., plus any further impacts (positive and negative) on the other people involved (e.g., some other people might have been inspired by his example, and also been kicked out; the expedition might have continued with reduced personnel, perhaps putting other people in more danger; the higher-ups' cavalier attitude to the lives of their men might have been exposed, with interesting consequences for their careers, for morale, for future decisions, etc.)

The higher-ups had the option of giving up $650 to avoid risking Ryan's (and others') lives, sort of. They could just have ordered a replacement shotgun. But, as Ryan pointed out, retrieving the shotgun wasn't the main point of this fiasco. The point was (I guess) to make the men fear their superiors, to motivate them to be more careful in future (hence hopefully less likely to lose equipment or to make other similar mistakes), to exercise power arbitrarily and thus reinforce the hierarchy of power and authority, and so forth. Perhaps they also hoped that an attack would give the men some live-action combat training in a way less likely to kill them than starting an actual battle. These may be good things or bad things for senior military officers to want (I say "bad" but clearly those officers, who know more about military life than I do, disagreed) but in any case the impact there is a lot more than $650.

I think it's fair to say that in this context both Ryan and his military superiors valued his life a lot lower than the usual millions-of-dollars figure. But it's hardly news that the military treats soldiers' lives as cheap, and trains them to do likewise.

But I mean, isn't it obvious that damage to the truck alone as a result of the attack would imply a far higher cost than whatever the shotgun was worth? (And yes, I think this is clearly the case even when you consider that the probability of being attacked is quite a bit less than 100%.) I don't think this shows lives being insufficiently valued in the military; I think it just shows the sort of pervasive dysfunction we would expect in any large-scale organization lacking internal mechanisms to ensure accountability and proper response to incentives.

But I mean, isn't it obvious that damage to the truck alone as a result of the attack would imply a far higher cost than whatever the shotgun was worth?

This still feels like focusing on the wrong aspect of the equation. The dollar value of the shotgun (and truck) is just completely irrelevant. The issues at stake here are:

1. (if you're being charitable) maintaining a hierarchy where you know that if command asks someone to risk their life, they will risk their life even if they disagree with command's goals (so that everyone going into battle knows that nobody else is going to defect).

(I think this is more important than the "don't forget your rifle next time", although there's at least a bit of that too)

2. (if you're being uncharitable) maintaining the hierarchy as a raw exercise in primate power for the sake of it.

I had a conversation with a friend on IM about this, in which I fleshed out this position a bit. The friend said something about "the point of the post was to give people a taste of the warrior ethos". Which seemed sort of true but also not quite the point. (Truth be told I'm not sure I understood my friend's comment, but I think my response mostly stands on its own.)

...

I saw the entire essay through the lens of "if you're going into combat, you absolutely need to be able to trust everyone to be following predictable procedures that allow you to make snap decisions in combat without worrying if other people are going to follow through."

The charitable/interesting version of this post, to me, isn't about warrior ethos and honor. Or, maybe it is (and I'm not steeped in warrior ethos enough to get it), but the relevant thing is about trust ethos, and common knowledge, and reliability.

It's not intrinsically the case that the way to accomplish this is by orders from high command getting followed absolutely – there are other ways you can come to trust the others in your battalion.

The Ender Wiggin way is to trust that people are doing things that make sense to them and to have a good model of what sort of procedures they'll be following.

I have no idea if the Ender Wiggin way actually works in practice. I suspect it depends on how accurate you think command's information is generally going to be in different circumstances, and how quickly people in the chain might need to make decisions in the face of new information. But the cost of having to model "are your allies who are not in communication going to stick to the plan?" can be pretty high in the heat of the moment.

"Dollar value of an insult" is relevant insofar as it's a proxy for the ultimately utility cost: Can you win the war? Which in turn is directly affected by "can you trust that people will absolutely follow orders from command?"

I felt like the OP and several comments were missing the mark by not at least factoring this into the calculus.

(the alternate interpretation of "high command is having a stupid power trip" or "making a questionable judgment about attention-to-detail in particular" also seem quite plausible, just, those should not be the only things getting discussed here)

I like your lens! For my purposes you are both sitting firmly in true-but-not-quite-the-point, which I consider a good outcome; I realize it is normally good practice to clearly articulate what the larger point is at the outset, but I am walking deliberately into ineffable territory so the usual method of telling readers what the conclusion is seems disingenuous. That being said, I can provide more detail about the lens(es) from which I wrote the post:

1) I have different intuitions about the value of human life from most of the community. These intuitions have been heavily shaped by my experiences in war.

2) I can't assume anyone else has similar experiences, and they are famously difficult to communicate directly - I need a different way to bridge the inferential gulf.

3) I opted for a concrete question about the reader's perspective (are you expendable) and provided a concrete personal experience (which says yes) to start.

Speaking to the questions of warrior ethos etc., I put it to you that this is not as distant and exotic a thing as most people suspect. Rather it is made up of things which are closer and you already understand, like trust, common knowledge, and reliability. The hard-to-grok part is exactly how they are arranged, and why they are arranged that way. One important detail is that it does not require understanding, merely execution: for example, lots of soldiers wouldn't be able to articulate why common knowledge is important even to themselves, but they are perfectly good soldiers because they accept the knowledge-which-is-common and conduct themselves in such a way that it is maintained.

I have decided to write a follow up, which will include a little clarification and a counterfactual to help illustrate. I will continue to use game-theoretic metaphors, but my motivation is not to achieve agreement about any particular detail of military affairs but rather to interrogate the intuitions which allow one to accept them.

A math metaphor, which I may repeat if it makes sense: we could probably come to an agreement about the details of some combat-related point, but my interest is in communicating the parallax between our perspectives of that point.

I strongly suspect that if I do well enough at this, the shift in perspective will allow more nuance about the important problems we are concerned with and our relationship to them. I think this would be valuable to the community.

I'm interested in seeing where you go from here. With the old LessWrong demographic, I would predict you would struggle, due to cryonics/life extension being core to many people's identities.

I'm not so sure about current LW though. The fraction of the EA crowd that is total utilitarian probably won't be receptive.

I'm curious what it is that your intuitions do value highly. It might be better to start with that.

I am also uncertain. But it appears to me that even an informed rejection will still be valuable. Follow up here.

1 actually seems way worse than 2 because it concentrates risk. 2 can only get so bad. 1 ends in nuclear holocaust or worse.

I tend to agree with this view. I think that is also one of the aspects implied by the implicit and explicit communication post: maintaining a highly cohesive and committed team may be of higher value (for a military force) than the risk of loss of life, because in a real war many more lives will be lost (at least I guess that is the military's reasoning).

This was a really interesting read and I really liked the whole thing right up until the last two sentences which are for some reason asserted with no argument in favor of them? Or possibly I am not following how the preceding content is an argument in favor?

The warrior ethos is the result of an iterated hawk-dove game.

In the hawk-dove game, playing 'hawk' only works if the ratio at which you value winning over surviving is higher than your opponent's.

Gaining the 'warrior ethos' requires driving the amount that you value surviving down to epsilon, so that no one can successfully back you down with a threat to your life.
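To make that concrete, here is a minimal sketch of the standard hawk-dove payoffs (the specific values of V and C below are my own illustrative assumptions, not anything from the thread). Against another hawk, playing hawk pays off exactly when you value winning more than surviving, so driving your valuation of survival toward epsilon makes hawk dominant:

```python
# Standard hawk-dove payoffs; the numbers used below are illustrative.
# V = value of winning the contest; C = value placed on what an
# escalated fight risks, i.e. surviving.

def hawk_payoff(V: float, C: float, opponent: str) -> float:
    """Expected payoff for playing hawk against the given opponent."""
    if opponent == "dove":
        return V              # the dove backs down; you take the prize
    return (V - C) / 2        # hawk vs hawk: win half the time, risk the cost

def dove_payoff(V: float, opponent: str) -> float:
    """Expected payoff for playing dove."""
    return V / 2 if opponent == "dove" else 0.0  # split the prize, or back down

# Against a hawk, hawk beats dove exactly when (V - C)/2 > 0, i.e. V > C.
V = 1.0
for C in (10.0, 1.0, 0.001):  # value placed on surviving, driven toward epsilon
    print(f"C={C}: hawk-vs-hawk pays {hawk_payoff(V, C, 'hawk'):+.3f}, "
          f"dove-vs-hawk pays {dove_payoff(V, 'hawk'):+.3f}")
# Once C is near zero, hawk dominates: no threat to your life backs you down.
```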

there was no expectation of success here - this was a punishment. ... Everybody lived

You really can't summarize this as "lay down my life for $650". This was a series of decisions about being in the military in Afghanistan in the first place, and a specific decision to accept a serious risk to life in order to set an example and stay out of jail.

Nowhere in this chain was the dollar value important.

[ and just to be clear, I'm not questioning the choices, just pointing out that "value" in decision-making is distinct from dollars - it's about the predicted state of the world in each choice-path. ]

You seem to have used the price of the shotgun directly, without factoring in the likelihood of actually dying. I would be interested in how you would model the underlying probabilities.

This is true. I would rate the odds of actually dying 'very low', which my gut wants to ballpark in the <5% region. However, I do have some information that would let us make this a little more precise, so here goes:

I was in-country for a little under 90 days. The first 30 of them were acclimatization and specialized training, which didn't involve being exposed to any real risk of combat. According to the paperwork associated with the incident, we were involved in 19 ambushes while I was there, which includes things like IEDs, but also just people spraying the vehicles with their guns to show us they are not afraid (this is almost zero risk to us). This leaves 19 ambushes over 60 days of operations, going out on average once per day.

So my prior is ~1/3 chance of someone trying to kill us each trip. Now there are two additional factors to consider: one, trips along the same route are not independent; two, the likelihood of dying in an attack is low. As it happens, 'never take the same way twice' is the literal first rule they taught us about movement when I went through training, perhaps because all of the cadre had been deployed previously. The more you drive a route, the more likely the enemy is to be prepared for you to drive the route - so each time you go out, the risk of attack is higher than it was before. When I was trained, they told us it was a given you would be hit the second time around - this is definitely an exaggeration, but I'll say it is half again as likely as the previous trip at a guess, for a multiplier of 3/2. The likelihood of dying in an attack is low because everyone has first-responder training and medical kits, and we have medical evacuation helicopters, so dying usually needs one of three things: dying instantly; being so badly injured you die before medical evacuation; medical error (overlooked internal bleeding, or the like). Further, relatively few attacks are bad enough that someone is injured. I don't actually know how to model these, though I know the Army has statistics for all of them.

Naively I can take a stab and say something like 1 in 20 attacks where someone is injured leads to one of the death conditions, with the proviso that it is only a guess. I can make a better guess for the probability of injury given an attack - of the 19 ambushes, I think there were 3 events where someone was injured (each corresponding to an IED blowing up a truck, incidentally). So this is what we have currently:

Prior for being attacked on a given trip: 1/3

Additional trips on the same route multiplier: 3/2

Likelihood of someone being injured during an attack: 3/19

Estimated likelihood of death given injury: 1/20

So that looks like:

(1/3)(3/19)(1/20) + (1/3)(3/2)(3/19)(1/20) + (1/3)(3/2)(3/2)(3/19)(1/20) + (1/3)(3/2)(3/2)(3/2)(3/19)(1/20)

This comes out to a little over 0.02 probability of someone dying, for four trips over the same route. Such is the power of history's mightiest civilization.
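For anyone who wants to check the arithmetic, here is a minimal sketch of the sum above (variable names are mine):

```python
# The commenter's numbers, as listed above.
p_attack   = 1/3    # prior probability of being attacked on a given trip
escalation = 3/2    # per-repeat multiplier for re-driving the same route
p_injury   = 3/19   # probability someone is injured, given an attack
p_death    = 1/20   # guessed probability of death, given an injury

# Summing the per-trip probabilities (rather than computing
# 1 - product of per-trip survival probabilities) is an approximation
# that holds while each term is small, as it is here.
total = sum(p_attack * escalation**trip * p_injury * p_death
            for trip in range(4))
print(f"P(someone dies over four trips) ~ {total:.4f}")  # ~0.0214
```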

Of course there's plenty of stuff we left out: dying isn't the only risk, being crippled is much more common; risk of injury is not equal throughout the group, and injuries tend to be clumpy because most of them come from trucks of people getting blown up; I knew none of these calculations at the time.

This doesn't change the underlying intuition. For example, in the rational-guesser game it does not matter what fraction of the average-of-guesses is specified; the difference between 2/3 and 99/100 is only the number of iterations to 0.
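A quick sketch of that convergence (the starting guess and tolerance below are arbitrary assumptions):

```python
# For any fraction f < 1, iterated best responses in the
# guess-f-of-the-average game shrink the consensus guess toward 0;
# f only changes how many rounds it takes.
def rounds_until_negligible(f: float, start: float = 50.0,
                            eps: float = 1e-6) -> int:
    """Count best-response rounds until the guess drops below eps."""
    guess, rounds = start, 0
    while guess >= eps:
        guess *= f            # everyone best-responds to the last average
        rounds += 1
    return rounds

for f in (2/3, 99/100):
    print(f"f = {f:.2f}: reaches ~0 after {rounds_until_negligible(f)} rounds")
# f = 2/3 converges in ~44 rounds; f = 99/100 in ~1764 rounds --
# the same fixed point at 0, just more iterations.
```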

The United States Army is probably the foremost institution of its kind in history. It has high prestige, enormous resources, and a great deal of experience. The feedback loops of violence are tight; the risks to personnel are immediate; the decisions it makes are as close as humanity gets to a stock market for death. And yet, at the point of decision, we will risk our lives for zero gain.

This causes me to view the approach to valuing lives in X-risk uneasily, and from the opposite direction of sacredness.

One or more of us would probably have died but for a series of recent changes: we stopped having a rear gunner (three weeks)

Why did not having the rear gunner help?

The blast rotated the truck 180 degrees on both axes of the horizontal plane, so the crew compartment was facing back the way we came. The rear gunner position was to stand up out of a rear emergency hatch, on one of the seats; we just had a jerry-rigged mechanism to secure them to the truck. They would have had no shock distribution from the seats or protection from the seatbelts (which are like the ones they use in NASCAR).

As a consequence, anyone in that position might have been torn in half, or splattered against the roof of the truck as it came up to meet them, or, if the makeshift harness failed, been flung hundreds of meters away. We stopped using the position after units started losing guys when trucks rolled over.

As it happens, that was my job until we changed the procedure.

The shotgun, or its price, wasn't at stake - money is never at stake with a military that has F-35s, and the Taliban likely has little use for a shotgun. What was at stake was you being put in your proper place before a corrupt institution. And many, many warriors throughout history simply weighed their options in chance of dying and weight of loot - which is, if nothing else, smarter than being in the US Armed Forces.

Did this incident change your expected-value calculations about following such orders in the future? Was there a surprise (and subsequent Bayesian update of your beliefs) from any of:

a) the order being confirmed by the higher-ups

b) your immediate superiors following THEIR orders in ordering you to go back

c) your teammates' willingness to follow this order

d) the actual outcome (blown up, survived)

a) Confidence in the ability and good faith of the small group of people in a position to issue the order went down. In particular, my estimation of their ability to understand risks and trade-offs.

b) No change. This was expected.

c) No change. This was expected. I would also like to flag this as the point that appears to be the largest inferential gulf between most of the audience and me.

d) Confidence in the hardware went up, and confidence that my buddies would live through future engagements went up as a result. I've ridden in two prior generations of vehicle, and the odds of there being no fatalities in either approach zero, in my estimation. The progress of the MRAP series of vehicles is on the short list of success stories from those wars. Also, I updated in favor of seatbelts.

It is worth mentioning I had no notion of Bayes at the time of this event.

Promoted to frontpage.