(Part 7 of 8 in "Three Worlds Collide")

Standing behind his target, unnoticed, the Ship's Confessor had produced from his sleeve the tiny stunner - the weapon which he alone on the ship was authorized to use, if he made a determination of outright mental breakdown. With a sudden motion, his arm swept outward -

- and anesthetized the Lord Akon.

Akon crumpled almost instantly, as though most of his strings had already been cut, and only a few last strands had been holding his limbs in place.

Fear, shock, dismay, sheer outright surprise: that was the Command Conference staring aghast at the Confessor.

From the hood came words absolutely forbidden to originate from that shadow: the voice of command. "Lord Pilot, take us through the starline back to the Huygens system. Get us moving now, you are on the critical path. Lady Sensory, I need you to enforce an absolute lockdown on all of this ship's communication systems except for a single channel under your direct control. Master of Fandom, get me proxies on the assets of every being on this ship. We are going to need capital."

For a moment, the Command Conference was frozen, voiceless and motionless, as everyone waited for someone else to do something.

And then -

"Moving the Impossible now, my lord," said the Lord Pilot. His face was sane once again. "What's your plan?"

"He is not your lord!" cried the Master of Fandom. Then his voice dropped. "Excuse me. Confessor - it did not appear to me that our Lord Administrator was insane. And you, of all people, cannot just seize power -"

"True," said the one, "Akon was sane. But he was also an honest man who would keep his word once he gave it, and that I could not allow. As for me - I have betrayed my calling three times over, and am no longer a Confessor." With that same response, the once-Confessor swept back the hood -

At any other time, the words and the move and the revealed face would have provoked shock to the point of fainting. On this day, with the whole human species at stake, it seemed merely interesting. Chaos had already run loose, madness was already unleashed into the world, and a little more seemed of little consequence.

"Ancestor," said the Master, "you are twice prohibited from exercising any power here."

The former Confessor smiled dryly. "Rules like that only exist within our own minds, you know. Besides," he added, "I am not steering the future of humanity in any real sense, just stepping in front of a bullet. That is not even advice, let alone an order. And it is... appropriate... that I, and not any of you, be the one who orders this thing done -"

"Fuck that up the ass with a hedge trimmer," said the Lord Pilot. "Are we going to save the human species or not?"

There was a pause while the others figured out the correct answer.

Then the Master sighed, and inclined his head in assent to the once-Confessor. "I shall follow your orders... kiritsugu."

Even the Kiritsugu flinched at that, but there was work to be done, and not much time in which to do it.

In the Huygens system, the Impossible Possible World was observed to return from its much-heralded expedition, appearing on the starline that had shown the unprecedented anomaly. Instantly, without a clock tick's delay, the Impossible broadcast a market order.

That was already a dozen ways illegal. If the Impossible had made a scientific discovery, it should have broadcast the experimental results openly before attempting to trade on them. Otherwise the result was not profit but chaos, as traders throughout the market refused to deal with you; just conditioning on the fact that you wanted to sell to or buy from them was reason enough for them not to. The whole market seized up as hedgers tried to guess what the hidden experimental results could have been, and which of their counterparties had private information.

The Impossible ignored the rules. It broadcast the specification of a new prediction contract, signed with EMERGENCY OVERRIDE and IMMINENT HARM and CONFESSOR FLAG - signatures that carried extreme penalties, up to total confiscation, for misuse; but any one of which ensured that the contract would appear on the prediction markets at almost the speed of the raw signal.

The Impossible placed an initial order on the contract backed by nearly the entire asset base of its crew.

The prediction's plaintext read:

In three hours and forty-one minutes, the starline between Huygens and Earth will become impassable.
Within thirty minutes after, every human being remaining in this solar system will die.
All passage through this solar system will be permanently denied to humans thereafter.
(The following plaintext is not intended to describe the contract's terms, but justifies why a probability estimate on the underlying proposition is of great social utility:
ALIENS. ANYONE WITH A STARSHIP, FILL IT WITH CHILDREN AND GO! GET OUT OF HUYGENS, NOW!)

In the Huygens system, there was almost enough time to draw a single breath.

And then the markets went mad, as every single trader tried to calculate the odds, and every married trader abandoned their positions and tried to get their children to a starport.

"Six," murmured the Master of Fandom, "seven, eight, nine, ten, eleven -"

A holo appeared within the Command Conference, a signal from the President of the Huygens Central Clearinghouse, requesting (or perhaps "demanding" would have been a better word) an interview with the Lord Administrator of the Impossible Possible World.

"Put it through," said the Lord Pilot, now sitting in Akon's chair as the figurehead anointed by the Kiritsugu.

"Aliens?" the President demanded, and then her eye caught the Pilot's uniform. "You're not an Administrator -"

"Our Lord Administrator is under sedation," said the Kiritsugu beside; he was wearing his Confessor's hood again, to save on explanations. "He placed himself under more stress than any of us -"

The President made an abrupt cutting gesture. "Explain this - contract. And if this is a market manipulation scheme, I'll see you all tickled until the last sun grows cold!"

"We followed the starline that showed the anomalous behavior," the Lord Pilot said, "and found that a nova had just occurred in the originating system. In other words, my Lady President, it was a direct effect of the nova and thus occurred on all starlines leading out of that system. We've never found aliens before now - but that's reflective of the probability of any single system we explore having been colonized. There might even be a starline leading out of this system that leads to an alien domain - but we have no way of knowing which one, and opening a new starline is expensive. The nova acted as a common rendezvous signal, my Lady President. It reflects the probability, not that we and the aliens encounter each other by direct exploration, but the probability that we have at least one neighboring world in common."

The President was pale. "And the aliens are hostile."

The Lord Pilot involuntarily looked to the Kiritsugu.

"Our values are incompatible," said the Kiritsugu.

"Yes, that's one way of putting it," said the Lord Pilot. "And unfortunately, my Lady President, their technology is considerably in advance of ours."

"Lord... Pilot," the President said, "are you certain that the aliens intend to wipe out the human species?"

The Lord Pilot gave a very thin, very flat smile. "Incompatible values, my Lady President. They're quite skilled with biotechnology. Let's leave it at that."

Sweat was running down the President's forehead. "And why did they let you go, then?"

"We arranged for them to be told a plausible lie," the Lord Pilot said simply. "One of the reasons they're more advanced than us is that they're not very good at deception."

"None of this," the President said, and now her voice was trembling, "none of this explains why the starline between Huygens and Earth will become impassable. Surely, if what you say is true, the aliens will pour through our world, and into Earth, and into the human starline network. Why do you think that this one starline will luckily shut down?"

The Lord Pilot drew a breath. It was good form to tell the exact truth when you had something to hide. "My Lady President, we encountered two alien species at the nova. The first species exchanged scientific information with us. It is the second species that we are running from. But, from the first species, we learned a fact which this ship can use to shut down the Earth starline. For obvious reasons, my Lady President, we do not intend to share this fact publicly. That portion of our final report will be encrypted to the Chair of the Interstellar Association for the Advancement of Science, and to no other key."

The President started laughing. It was wild, hysterical laughter that caused the Kiritsugu's hood to turn toward her. From the corner of the screen, a gloved hand entered the view; the hand of the President's own Confessor. "My lady..." came a soft female voice.

"Oh, very good," the President said. "Oh, marvelous. So it's your ship that's going to be responsible for this catastrophe. You admit that, eh? I'm amazed. You probably managed to avoid telling a single direct lie. You plan to blow up our star and kill fifteen billion people, and you're trying to stick to the literal truth."

The Lord Pilot slowly nodded. "When we compared the first aliens' scientific database to our own -"

"No, don't tell me. I was told it could be done by a single ship, but I'm not supposed to know how. Astounding that an alien species could be so peaceful they don't even consider that a secret. I think I would like to meet these aliens. They sound much nicer than the other ones - why are you laughing?"

"My Lady President," the Lord Pilot said, getting a grip on himself, "forgive me, we've been through a lot. Excuse me for asking, but are you evacuating the planet or what?"

The President's gaze suddenly seemed sharp and piercing like the fire of stars. "It was set in motion instantly, of course. No comparable harm done, if you're wrong. But three hours and forty-one minutes is not enough time to evacuate ten percent of this planet's children." The President's eyes darted at something out of sight. "With eight hours, we could call in ships from the Earth nexus and evacuate the whole planet."

"My lady," a soft voice came from behind the President, "it is the whole human species at stake. Not just the entire starline network beyond Earth, but the entire future of humanity. Any incrementally higher probability of the aliens arriving within that time -"

The President stood in a single fluid motion that overturned her chair, moving so fast that the viewpoint bobbed as it tried to focus on her and the shadow-hooded figure standing beside. "Are you telling me," she said, and her voice rose to a scream, "to shut up and multiply?"

"Yes."

The President turned back to the camera angle, and said simply, "No. You don't know the aliens are following that close behind you - do you? We don't even know if you can shut down the starline! No matter what your theory predicts, it's never been tested - right? What if you create a flare bright enough to roast our planet, but not explode the whole sun? Billions would die, for nothing! So if you do not promise me a minimum of - let's call it nine hours to finish evacuating this planet - then I will order your ship destroyed before it can act."

No one from the Impossible spoke.

The President's fist slammed her desk. "Do you understand me? Answer! Or in the name of Huygens, I will destroy your ship -"

Her Confessor caught the President's body, very gently supporting it as it collapsed.

Even the Lord Pilot was pale and silent. But that, at least, had been within law and tradition; no one could have called that thinking sane.

On the display, the Confessor bowed her hood. "I will inform the markets that the Lady President was driven unstable by your news," she said quietly, "and recommend to the government that they carry out the evacuation without asking further questions of your ship. Is there anything else you wish me to tell them?" Her hood turned slightly, toward the Kiritsugu. "Or tell me?"

There was a strange, quick pause, as the shadows from within the two hoods stared at each other.

Then: "No," replied the Kiritsugu. "I think it has all been said."

The Confessor's hood nodded. "Goodbye."

"There it goes," the Ship's Engineer said. "We have a complete, stable positive feedback loop."

On screen was the majesty that was the star Huygens, sun of the inhabited planet Huygens IV. Overlaid in false color was the recirculating loop of Alderson forces which the Impossible had steadily fed.

Fusion was now increasing in the star, as the Alderson forces encouraged nuclear barriers to break down; and the more fusions occurred, the more Alderson force was generated. Round and round it went. All the work of the Impossible, the full frantic output of their stardrive, had only served to subtly steer the vast forces being generated; nudge a fraction into a circle rather than a line. But now -

Did the star brighten? It was only their imagination, they knew. Photons take centuries to exit a sun, under normal circumstances. The star's core was trying to expand, but it was expanding too slowly - all too slowly - to outrun the positive feedback that had begun.

"Multiplication factor one point oh five," the Engineer said. "It's climbing faster now, and the loop seems to be intact. I think we can conclude that this operation is going to be... successful. One point two."

"Starline instability detected," the Lady Sensory said.

Ships were still disappearing in frantic waves on the starline toward Earth. Still connected to the Huygens civilization, up to the last moment, by tiny threads of Alderson force.

"Um, if anyone has anything they want to add to our final report," the Ship's Engineer said, "they've got around ten seconds."

"Tell the human species from me -" the Lord Pilot said.

"Five seconds."

The Lord Pilot shouted, fist held high and triumphant: "To live, and occasionally be unhappy!"

This concludes the full and final report of the Impossible Possible World.

(To be completed.)

91 comments
                     MISSION COMPLETE

               Ships      Ships   Planets    Planets
               Destroyed  Lost    Destroyed  Lost
Humans             0        1         0        1
Babyeaters         0        1         0        0
Superhappies       1        0         0        0

               SD   SL   PD   PL
Humans          X    X    1    1
Babyeaters      0    Y    0    Z
Superhappies    Y    0    Z   -Z

X = Ships unable to escape Huygens
Y = Ships in Babyeater Fleet
Z = Planets Babyeaters Have

Over the last parts the pace is too fast; it feels rushed. This leads to a loss in quality of the fiction, imo. Besides, it glosses over some holes in the story, such as: why would Akon keep his word under these circumstances? Why would the Happies not foresee the detonation of Huygens? Why 3 hours to evacuate...?

Cannibal, what exactly is your point and aren't you forgetting all the Babyeater casualties we'd expect in the next week?

The point is that the Normal Ending is the most probable one.

If blowing up Huygens could be effective, why did it even occur to you to blow up Earth before you thought of this?

Hmm. I think I'd rather have agreed to the Superhappies' deal.

One reason is that with their rate of expansion -- which they might be motivated to increase now, too -- they'll probably surprisingly soon find an alternative starline route to humans anyway. (Though even if this was guaranteed not to happen, I probably still would rather have agreed to the deal.)

Also, I think I would prefer blowing up the nova instead. The Babyeater children's suffering is unfortunate, no doubt, but hey, I spend money on ice cream instead of saving starving children in Africa. The Superhappies' degrading of their own, more important, civilization is another consideration.

(you may correctly protest about the ineffectiveness of aid - but would you really avoid ice cream to spend on aid, if it were effective and somehow they weren't saved already?)

Shutting up and multiplying suggests that we should neglect all effects except those on the exponentially more powerful species.

Cannibal: Heh.

Spuckblase: You know, you're right. I revised/returned some paragraphs that were deleted earlier, starting after "...called that thinking sane."

Simon: It just didn't happen to cross my mind; as soon as I actually generated the option to be evaluated, I realized its superiority.

Steven: I thought of that, but decided not to write up the resulting conversation about Babyeater populations versus Babyeater expansion rates versus humans etcetera, mostly because we then get into the issue of "What if we make a firm commitment to expand even faster?" The Superhappies can expand very quickly in principle, but it's not clear that they're doing so - human society could also choose a much higher exponential on its population growth, with automated nannies and creches.

Aleksei: Part of the background here (established in the opening paragraphs of chapter 1) is that the starline network is connected in some way totally unrelated to space as we know it - no star ever found is within reach of Earth's telescopes (or the telescopes of other colonies). Once the Huygens starline is destroyed, it's exceedingly unlikely that any human will find the Babyeaters or ... (read more)

The Superhappies can expand very quickly in principle, but it's not clear that they're doing so

We (or "they" rather; I can't identify with your fanatically masochist humans) should have made that part of the deal, then. Also, exponential growth quickly swamps any reasonable probability penalty.

I'm probably missing something but like others I don't get why the SHs implemented part of BE morality if negotiations failed.

The point is that the Normal Ending is the most probable one.

Historically, humans have not typically surrendered to genocidal conquerors without an attempt to fight back, even when resistance is hopeless, let alone when (as here) there is hope. No, I think this is the true ending.

Nitpick: eight hours to evacuate a planet? I think not, no matter how many ships you can call. Of course the point is to illustrate a "shut up and multiply" dilemma; I'm inclined to think both horns of the dilemma are sharper if you change it to eight days.

But overall... (read more)

So what is next? 7/8 implies a next part, yet it also seems to be finished.

Steven: They're being nice. That's sort of the whole premise of the Superhappies - they're as nice as an alien species can possibly get and still be utterly alien and irrevocably opposed to human values. So nice, in fact, that some of my readers find themselves agreeing with their arguments. I do wonder how that's going to turn out in real life.

Russell: It's stated that most colony worlds are one step away from Earth (to minimize the total size of the human network). This means there's going to be a hell of a lot of ships passing through Earth space ... (read more)

Well, it's not like it's hard to see reason in the Superhappies' values.

1) I, personally, don't have a terminal value of non-cannibalism. The actual reason I don't eat babies now is a result of multiple other values:

  • I value human life, so I consider killing a human to get some food a huge utility loss.
  • Any diseases the meat's original owner had contracted are almost 100% transmissible to me. Any toxins that accumulated in the original owner's body will also accumulate in mine. Also, humans eat a lot of junk food. Eating humans is bad for one's health.

So, I don't have any problem with eating safe-to-eat human meat that is not produced by killing conscious human beings. I would actually be curious to taste, for example, vat-grown clone meat grown from a sample of my own cells. This position may not be held by the average human, but I don't think it's particularly disturbing from a transhumanist point of view.

2) I consider humans' desire to keep their identity and humanity to be, at least in part, status-quo bias. Also, humans don't really stay themselves for long. For example, a 5-year-old human is quite different from that same human at 10, 15, 20, 25, etc. Change is gradual, but it's real and quite bi... (read more)

Russell Wallace,

A good fable is one thing; the most probable outcome (especially here) is another story. The undiscussed advantages the Superhappies have accumulated so far - and are accumulating at this very moment - are crucial.

Isn't this a win-win? The babyeaters get saved too, by the superhappies, who were not cut off from the babyeater starline. The only losers are the superhappies, who can't "save" the humans.

Julian,

And possibly billions of Huygens humans. Don't forget those.

It seems like an easy solution would be to just inform the superhappies a little more about oral sex (how humans "eat" their young). They could make a few tweaks, and we'd lose the least (some guys might consider that an improvement).

Anyone else think the Superhappies sounded a whole lot like the Borg?

Something like this possibility occurred to me, but I don't think this actually is better.

At least, I think I'd have to be walked through the reasoning, since right now I THINK I'd prefer Last Tears to Sacrificial Fire, conditioned on, well, the conditions I list in this comment holding.

ie, giving up pain/suffering in a non wireheading way, and being altered to want to eat non-and-never-have-been-conscious "pre humans" really doesn't seem all that bad to me, compared to the combined costs of defecting in single iteration PD (again, same ole metar... (read more)

Eliezer: How can the superhappies consider their offer fair if they made it up and accept it, and the babyeaters reject it? Why do they think that their payment to the babyeaters is in any way adequate?

It seems to me that they would have to at least ramp up the payment to/costs for the babyeaters, until there was an offer the babyeaters would accept, even if the superhappies would reject it. Then there are points to negotiate from.

But just to make an offer that you predict the other side will reject, and then blow them up? The babyeaters were nicer.

I agree with the President of Huygens; the Babyeaters seem much nicer than the Lotuseaters. Maybe that's just because they don't physically have the ability to impose their values on us, though.

Strange this siding with Babyeaters here ... strange.

I prefer the ending where we ally ourselves with the babyeaters to destroy the superhappies. We realize that we have more in common with the babyeaters, since they have notions of honor and justified suffering and whatnot, and encourage the babyeaters to regard the superhappies as flawed. The babyeaters will gladly sacrifice themselves blowing up entire star systems controlled by the superhappies to wipe them out of existence due to their inherently flawed nature. Then we slap all of the human bleeding-hearts that worry about babyeater children, we come... (read more)

Personally, I side with the Hamburgereaters. It's just that the Babyeaters are at the very least sympathetic; I can see viewing them as people. As they've said, the Babyeaters even make art!

The remarkable thing about this story is the conflicting responses it has provoked. The fact that a relatively homogeneous group of humans can have totally different intuitions about which ending is better and which aliens they prefer means, to me, that aliens (or an AI, whatever) have the potential to be, well - alien, far in excess of what is described in this story. Both aliens have value systems which, while different from ours, are almost entirely comprehensible. I think we might be vastly underestimating how radically alien aliens could be.

sdmitch16:
If the alien value systems weren't comprehensible, how could we explain them in a story? Even if we didn't comprehend it, we could probably still figure out if they deceive. If they don't, we just figure out their demands and decide if they're acceptable. If the demands aren't, we either try to wipe them out or flee. If they do deceive, we can either guess what their final plan is, or wipe them out or flee. We wouldn't fully understand their values, and we don't fully understand other humans' values. When I see a moral dilemma I realize I don't fully understand my own values. The only way to understand another being's values would be to share thoughts, and since we could never know if the thoughts were being shared accurately, we couldn't be sure what others really value.
quintopia:
How can incomprehensible value systems be represented in story form? With abortive attempts at those who hold them trying to explain them. Like a garuda trying to explain how "theft of choice (of when and with whom to have sex)" is a different crime than "rape" to a human (who doesn't value individual choice in the same way). Or like a superhappy who just knows that we'd absolutely love to be able to Untranslatable 4.
Anyone else find it ironic that this blog has measures in place to prevent robots from posting comments?

Only for those stupid robots who can't read a few funny-written letters. Babyeater-level robots can't talk here.

And now there will be many cults trying as hard as they can to make contact with the superhappies.

I say this because I witnessed many people discussing Brave New World as an actual utopia... Humans can have incompatible values too.

Eugene:
A late response, but for what it's worth, it could be said that part of the point of the climax and "true" conclusion of this story was to demonstrate how rational actors, using human logic, can be given the same information and yet come up with diametrically opposing solutions.

Eliezer, I hope you'll consider expanding this story into a novel. I'd buy it.

I wonder, do the people preferring the Babyeaters over the Superhappies, remember that as a necessary consequence of Babyeater values nearly half* of their species is, at any given time, dying in severe pain?

*From part 2, ~10 children eaten/year/adult, ~1 month for digestion to complete.
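
A rough check of that footnote's arithmetic (a sketch only, not canon: it assumes steady-state digestion and counts just adults plus currently-digesting children, ignoring children not yet eaten):

```python
# Back-of-envelope check of the footnote above, under assumed simplifications.
eaten_per_adult_per_year = 10   # ~10 children eaten/year/adult (part 2)
digestion_years = 1 / 12        # ~1 month for digestion to complete

# Little's law: average number of children mid-digestion per adult at any time
digesting_per_adult = eaten_per_adult_per_year * digestion_years  # ~0.83

# Fraction of the (adults + digesting children) population dying in pain
fraction_dying_in_pain = digesting_per_adult / (1 + digesting_per_adult)
print(f"{fraction_dying_in_pain:.0%}")  # -> 45%, i.e. "nearly half"
```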

Psy-Kosh: I don't see the final situation as a prisoner's dilemma - by destroying Huygens, humanity shows a preference for mutual "defection" over mutual "cooperation".

Simon: err... descriptive, normative... ? Maybe you genuinely value ice cream over saving lives, but your behavior isn't a justificatory argument for this, or, given akrasia, even strong evidence.

Nick,

Behavior isn't an argument (except when it is), but it is evidence. And it's akrasia when you say, "Man, I really think spending this money on saving lives is the right thing to do, but I just can't stop buying ice cream" - not when you say "buying ice cream is the right thing to do". Even if you are correct in your disagreement with Simon about the value of ice cream, that would be a case of Simon being mistaken about the good, not a case of Simon suffering from akrasia. And I think it's pretty clear from context that Simon believes he values ice cream more.

And it sounds like that first statement is an attempt to invoke the naturalistic fallacy fallacy. Was that it?

It's evidence of my values, which are evidence of typical human values. Also, I invite other people to really think about whether they are so different.

Eliezer tries to derive his morality from human values, rather than simply assuming that it is an objective morality, or asserting it as an arbitrary personal choice. It can therefore be undermined in principle by evidence of actual human values.

Also, I'm not at all confident that compromising with the Superhappies would be very bad, even before considering the probably larger benefit of them becoming more like us. I think I'd complain more about the abruptness and exogenousness of the change than the actual undesirability of the end state. As others have pointed out, though, a policy of compromise would lead to dilution of everyone's values into oblivion, and so may be highly undesirable.

More generally and importantly, though, I wonder if the use of wireheading as a standard example of "the ... (read more)

If the Super-Happies were going to turn us into orgasmium, I could see blowing up Huygens. Nor would it necessarily take such an extreme case to convince me to take that extreme measure. But this . . . ?

"Our own two species," the Lady 3rd said, "which desire this change of the Babyeaters, will compensate them by adopting Babyeater values, making our own civilization of greater utility in their sight: we will both change to spawn additional infants, and eat most of them at almost the last stage before they become sentient." ... &quo
... (read more)

Simon: "Eliezer tries to derive his morality from human values"

I would correct the above to "Eliezer tries to derive his morality from stated human values."

That's where many of his errors come from. Everyone is a selfish bastard. But Eliezer cannot bring himself to believe it, and a good fraction of the sorts of people whose opinions get taken seriously can't bring themselves to admit it.

Tyrrell: Agreed. As I said in what, well, I said, my acceptance of the SuperHappy bargain was conditional in part on, well, the change being engineered in such a way that it doesn't make the rest of our cognitive structure, values, etc go kablewey. But, given that the changes are as advertised, and there aren't hidden surprises of the "if I really thought through where this would lead, I'd see this is very very bad" variety, well, sure seems to me that the choice in this ending is the wrong one.

Nick: And do we really want, in general, defection t... (read more)

I'm not sure, but was this line:

But, from the first species, we learned a fact which this ship can use to shut down the Earth starline

supposed to read "the Huygens starline"?

Sure, I would turn this down if it were simply offered as a gift. But I really, really, cannot see preferring the death of fifteen billion people over it.

How many humans are there not on Huygens?

Psy-Kosh: Yeah, I meant to have a "as Psy-Kosh has pointed out" line in there somewhere, but it got deleted accidentally while editing.

ad:

How many humans are there not on Huygens?

I'm pretty sure that it wouldn't matter to me. I generally find on reflection that, with respect to my values, doing bad act A to two people is less than twice as bad as doing A to one person. Moreover, I suspect that, in many cases, the badness of doing A to n people converges to a finite value as n goes to infinity. Thus, it is possible that doing some other act ... (read more)

Oh, I'm starting to see why the Superhappies are not so right after all, what they lack, why they are alien, in the Normal Ending and in Eliezer's comments. I think this should have been explained in more detail in the story, because I initially failed to see their offer as anything but good, let alone bad enough to kill yourself. I want untranslatable 2!

Still, if I had been able to decide on behalf of humanity, I would have tried to make a deal - not outright accepted their offer, but negotiated to keep more of what matters to us, maybe by adopting more o... (read more)

Julian Morrison: The only losers are the superhappies, who can't "save" the humans.

You are ignoring the human children.

As the superhappies pointed out, they are in a situation comparable to that of the babyeater children - suffering before having internalized a philosophy that makes it okay, only because the adults want them to. (Which was the whole reason why the superhappies wanted to intervene.)

I think that this is the "right" ending in the sense that I think it's the kind of thing that typical present-day non-singularitarian humans would do: Be so afraid of being altered that they would consign a large number of their own kind to death rather than face alteration (correct or incorrect, this is the same kind of thinking you see in resistance to life extension and various other H+ initiatives). I'm not confident that it's what rational humans should do.

Small changes in the story could make me get off the fence in either direction. If the... (read more)

Eugene:
The only way - at least within the strangely convenient convergence happening in the story - to remove the Babyeater compromise from the bargain is for the humans to outwit the Superhappies such that they convince the Superhappies to be official go-betweens amongst all three species. This eliminates the necessity for humans to adopt even superficial Babyeater behavior, since the two incompatible species could simply interact exclusively through the Superhappies, who would be obligated by their moral nature to keep each side in a state of peace with the other. It should be taken as a given, after all, that the Superhappies will impose the full extent of their proposed compromises on themselves. They'd theoretically be the perfect inter-species ambassadors. That said - given the Superhappies' thinking speed, alien comprehension (plus their selfishness and unreasonable impatience, either of which could be a narrative accident) and higher technological advancement - I'm fairly confident that it would be impossible for this story's humans to outwit them.

Eliezer, thanks. I mostly read OB for the bias posts and don't enjoy narratives or stories, but this one was excellent.

Tyrrell, we aren't told how many humans exist. There could be 15 trillion, so the death of one system may not even equal the number of people who would commit suicide if the SHs had their way.

I don't find the SHs to be "nice" in any sense of the word. In my reading, they aren't interested in making humans happy. They can't be - they don't even understand the human brain. I think they are a biological version of Eliezer's smiley f... (read more)

I've enjoyed the story very much so far, Mr. Yudkowsky.

Incidentally, and fairly off-topic, there's a "hard" sci-fi roleplaying game that uses an idea similar to the starlines in this story. It can be found here:

http://phreeow.net/wiki/tiki-index.php?page=Diaspora

Come to think of it, I have no idea if there's //anyone// with an interest in roleplaying games in this forum... if there is, have fun!

Patrick (orthonormal), I'm fairly sure that "Earth" is correct. They haven't admitted that what they're going to do is blow up Huygens (though of course the President guesses), and the essential thing about what they're doing is that it stops the aliens getting to Earth (and therefore to the rest of humanity). And when talking to someone in the Huygens system, talk of "the Huygens starline" wouldn't make much sense; we know that there are at least two starlines with endpoints at Huygens.

Eliezer, did you really mean to have the "multiplication factor" go from 1.5 to 1.2 rather than to something bigger than 1.5?


Are bodily pain and embarrassment really that important? I'm rather fond of romantic troubles, but that seems like the sort of thing that could be negotiated with the superhappies by comparing it to their empathic pain. It also seems like the sort of thing that could just be routed around, by removing our capacity to fall out of love and our preference for monogamy and heterosexuality.

Grant: I don't find the SHs to be "nice" in any sense of the word. ... They are offended by mankind's expression of pain (its a negative externality to them) and want to remove what offends them.

I'm not entirely sure how "they are offended by helpless victims being forced to suffer against their will and want to remove that" translates into "the SHs aren't nice in any sense of the word".

Manon, thanks for pointing that out - I'd left that out of my analysis entirely. I too would like untranslatable 2. It doesn't change my answer though, as it turns out.

If the SHs find humans via another colony world, blowing up Earth is still an option. I don't believe the SHs could have been bargained with. They showed no inclination towards compromise in any sense other than whichever one they have calculated as optimal based on their understanding of humans and babyeaters. Because the SHs don't seem to value the freedom to make sub-optimal choices (free will), they may also worry much less about making incorrect choices based on imperfect information (this is the only rational reason I can come up with for them wantin... (read more)

Kaj,

I'm not entirely sure how "they are offended by helpless victims being forced to suffer against their will and want to remove that" translates into "the SHs aren't nice in any sense of the word".
They aren't offended by suffering, but by the expression of it. They don't even understand human brains, and can't exchange experiences with them via sex, so how could they? Maybe the SHs are able to survive and thrive without processing certain stimuli as being undesirable, but they never made an argument that humans could.

Psy-Kosh: I understand the metarationality arguments; my point is that we didn't defect in a prisoner's dilemma. PD requires C/C to be preferable to D/D; but if destroying Huygens is defecting for humans, that can only be the case (under the story's values) if cooperating for Superhappies involves modifying themselves and/or giving us their tech without us being modified. I don't think that was ever on the table. (BTW, I liked your explanation of why the deal isn't so bad.)

Simon: Eliezer tries to derive his morality from human values... Common mistake; see... (read more)

Nick,

There is a tendency for some folks to distinguish between descriptive and normative statements, in the sense of 'one cannot derive an ought from an is' and whatnot. A lot of this comes from hearing about the "naturalistic fallacy" and believing this to mean that naturalism in ethics is dead. Naturalists in turn refer to this line of thinking as the "naturalistic fallacy fallacy", as the strong version of the naturalistic fallacy does not imply that naturalism in ethics is wrong.

As for the fallacy you mention, I disagree that it's... (read more)

So, what about the fact that all of humanity now knows about the supernova weapon? How is it going to survive the next few months?

Reading the comments, I find that I feel more appreciation for the values of the Superhappies than I do for the values of some OB readers.

This probably mostly indicates that Eliezer's aliens aren't all that terribly alien, I suppose.

@Wei:
It's just another A-Bomb, only bigger. By now, they must have some kind of policy that limits problems from A-Bombs and whatever other destructive thingies they have. On the other hand, the damage from blowing up Sol is even more catastrophic than blowing up any other world: it shatters humanity, with no prospect of reunion.

Nick, note that he treats the pebblesorters in parallel with the humans. The pebblesorters' values lead them to seek primeness and Eliezer optimistically supposes that human values lead humans to seek an analogous rightness.

What Eliezer is trying to say in that post, I think, is that he would not consider it right to eat babies even conditional on humanity being changed by the babyeaters to have their values.

But the choice to seek rightness instead of rightness' depends on humans having values that lead to rightness instead of rightness'.

Simon: Well, the understanding I got from all this was that human development would be sufficiently tweaked so that the "Babies" that humans would end up eating would not actually be, nor ever have been, conscious. Non-conscious entities don't seem to really be too tied to any of my terminal values, near as I can tell.

Of course, if the alteration was going to lead to us eating conscious babies, that's a whole other thing, and if that was the case, I'd say "blow up Huygens twice as hard, then blow it up again just to be sure."

However,... (read more)

I agree with tarleton, I think. Can someone briefly summarize what is so objectionable about the superhappy compromise? It seems like a great solution in my view. What of importance is humanity actually giving up? They have to eat non-sentient children. Hard to see why we should care about that when we will never once feel a gag reflex and no pain is caused to anyone. Art and science will advance, not retreat, due to superhappy technology being applied to them. The sex will be better and there will be other cool new emotions which will have positive value to us.... (read more)

Dan: Obviously part 8 is the 'Weirdtopia' ending!

(I mean, we've had utopia, dystopia, and thus by Eliezer's previous scheme we are due for a weirdtopia ending.)

This lurker has objections to being made to eat his own children and being stripped of pain: the SH plan is not a compromise, but an order. From a position of authority, they can make us agree to anything by debate or subterfuge or outright exercise of power; the mere fact that they seem so nice and reasonable changes nothing about their intentions, which we do not know and which we cannot trust. How do we know that the SH ship's crew are true representatives of the rest of their race? Why is it that they seemingly trust/accept Akon as the representative of ... (read more)

Don't expand this into a novel, it was superb but I'd rather see a wider variety of short works exploring many related themes.

Perhaps this is just me not buying the plot justifications that set up the strategic scenario, but I would be inclined to accept the SuperHappy deal because of a concern that the next species that comes along might have high technology and not be so friendly. I want the defense of the increased level of technology, stat. Sure, it involves giving up some humanity, but better than giving up all of humanity. Once I find that there ar... (read more)

I agree that this section of the story feels a bit rushed, but maybe that is the intention.

I don't really like how easily these people in high positions of authority are folding under the pressure. The President in particular was taken out with what was to me very little provocation.

Plus, I just can't relate to a human race that is suicidally attached to preserving its pain and hardships. The offer made by the Superhappies is just not that bad.

Eliezer tries to derive his morality from stated human values.

In theory, Eliezer's morality (at least CEV) is insensitive to errors along these lines, but when Eliezer claims "it all adds up to normality," he's making a claim that is sensitive to such an error.

I agree that deriving morality from stated human values is MUCH more ethically questionable than deriving it from human values, stated or not, and suggest that it is also more likely to converge. This creates a probable difficulty for CEV.

It seems to me that if it's worth destroying Huygens to stop the Superhappies it's plausibly worth destroying Earth instead to fragment humanity so that some branch experiences an infinite future so long as fragmentation frequency exceeds first contact frequency. Without mankind fragmented, the normal ending seems ine... (read more)

Psy-Kosh: I was using the example of pure baby eater values and conscious babies to illustrate the post Nick Tarleton linked to rather than apply it to this one.

Michael: if it's "inevitable" that they will encounter aliens then it's inevitable that each fragment will in turn encounter aliens, unless they do some ongoing pre-emptive fragmentation, no? But even then, if exponential growth is the norm among even some alien species (which one would expect) the universe should eventually become saturated with civilizations. In the long run, the only e... (read more)

It's interesting to note that those oh-so-advanced humans prefer saving children to saving adults, even though there don't seem to be any limits to natural lifespan anymore.
At our current tech-level this kind of thing can make sense because adults have less lifespan left; but without limits on natural lifespan (or neural degradation because of advanced age) older humans have, on average, had more resources invested into their development - and as such should on average be more knowledgeable, more productive and more interesting people.
It appears to me t... (read more)

tamtrible:
There are at least a couple of factors I see as relevant: choice, responsibility, and the notion of giving them a chance to live.

Children, necessarily, have much of their life controlled for them. They are not allowed to make a lot of important choices for themselves, whether they want to or not. So, it is important for those making choices for them to make the right ones, to justify not allowing them that control. I'm not sure I'm quite articulating the concept here, but...

It is the explicit social, legal, and moral obligation of parents to appropriately care for their children. In a broader sense, it is a general obligation of society to care for the weak, helpless, etc.

Part of why the death of a young person is a greater relative tragedy today is that they have greater remaining potential lifespan, but part of it, in many people's minds, is that they have not yet had a chance to experience various major things. You'd feel a little sad for someone who, for example, died without ever having been in love, even if the person is 83, right? A little kid has missed a lot of experiences.

Sebastian,

Here there is an ambiguity between 'bias' and 'value' that is probably not going to go away. EY seems to think that bias should be eliminated but values should be kept. That might be most of the distinction between the two.

Are bodily pain and embarrassment really that important? I'm rather fond of romantic troubles, but that seems like the sort of thing that could be negotiated with the superhappies by comparing it to their empathic pain. It also seems like the sort of thing that could just be routed around, by removing our capacity to fall out of love and our preference for monogamy and heterosexuality.

The problem with much of the analysis is that the culture already has mutated enough to allow for forcible rape to become normative.

I'm not sure that the superhappy changes as to "romantic troubles" are much more of a change than that.

Humanity is doomed in this scenario. The Lotuseaters are smarter and the gap is widening. There's no chance humans can militarily defeat them now or at any point in the future. As galactic colonization continues exponentially, eventually they will meet again, perhaps in the far future - but the Lotusfolk will be even stronger, relatively, at that point. The only way humans can compete is by developing an even faster strong AI, which carries a large chance of ending humanity on its own.
So the choices are:
- accept the Lotusfolk offer now
- blow up the starline, continue exp... (read more)

This was an interesting story, though I wonder if the human capitulation either option offers is the only option. Bluntly, the Superhappies don't strike me as being that tough: even if their technology is higher and their development is orders of magnitude faster than ours, they are completely unwilling to accept suffering, even when it comes through their own sense of empathy. All humans have to do is offer a credible threat of Superhappy suffering and convince them to modify themselves not to care about our suffering - i.e., "We will resist you every step of the way, thus maximizing our suffering; plus, you cannot be 100% sure you'll be able to convert us without us inflicting at least some harm."

Hm I think the spam guard ate my last comment so I'll repeat:

I don't think the SH are really up to converting an unwilling humanity. For all their superiority, they are fundamentally unwilling to be inconvenienced, so humans only have to argue their case successfully - by pointing out the probable mass suicides depicted in the alternate ending, and that SH society might take some casualties. Since they are almost completely risk-averse, even the possibility of losing a single ship might be enough to scare them off.

It's a bit like the world being unwilling to... (read more)

Bugger there's my original comment after all. Whoops.

The only real solutions for humanity seem to be either to supernova the colony world's star, or, if this is unacceptable, to prepare supernova devices around all human stars and threaten to supernova everyone if the Super Happies enter our space.

I can't help but wonder why the humans in this story did not simply say "We long ago invented chemical means for individual humans to achieve perfect, undifferentiated happiness, but most individuals seem to consider themselves happier without their constant usage." This is perfectly true, and, if it perhaps would not have completely satisfied the Super-Happies (no doubt they would want immature humans anesthetized until they were old enough to choose) it might at least have served as a significant piece of evidence. I can hardly imagine a society that has legalized rape retaining a taboo against the use of Ecstasy or some future derivative thereof.

CronoDAS:
Well, the Superhappies would have already known that if they read the data dump correctly...
thelittledoctor:
In that case, I notice that I find myself confused.
Desrtopa:
The Superhappies don't have perfect, undifferentiated happiness. Note their shock and distress when they find out about the lifestyle of the Babyeaters. They've simply excised some sources of unhappiness from their psychology.
[anonymous]:
Which ones?
Desrtopa:
Embarrassment, relationship anxiety.... I'd have to reread the story to remember the full list; it's been over a year since I read it.

But the individuals don't consider themselves happier without their constant usage. It's just that happiness isn't these individuals' supreme value, the same way it seems to be for the SuperHappies.

Consider a human mother who was told that she could take a pill and live in perfect happiness ever after, but her children would have to die for it. If she loves her children, she won't take the pill; it doesn't matter that she knows she would be happy with the pill, it's just that her children's well-being is more important to her than her own future happiness.

thelittledoctor:
Oh, I see. I've been confusing happiness as a state of present bliss with happiness as a positive feeling regarding a situation, which are not quite the same thing. Excellent reply, thank you.
Psy-Kosh:
Alternately, imagine the pill would alter the structure of her mind so that she would become the sort of being that would be happy about her children dying? So even in the case where it relates to situations, one might reject such a pill.

The thing I wonder is why humanity didn't insist that the superhappies refrain from acting on humanity until they had a better understanding of us. They made a snap judgement, one that was obviously incomplete, given what fraction of humanity opted for suicide under their plan - given more time, they likely could have come up with a plan that would reach their desired aims (not being made unhappy by humanity) with a minimum of distress to all parties...

I wonder to what degree civilization is going to fracture a few years after the shock. At the very least, I'd wager that several large deontological factions/communities/cults would spring up with a sentiment of "Look where utilitarianism led us!", possibly taking over some colonies. Violence or major secession are more questionable.