
Normal Ending: Last Tears (6/8)

Post author: Eliezer_Yudkowsky 04 February 2009 08:45AM

(Part 6 of 8 in "Three Worlds Collide")

Today was the day.

The streets of ancient Earth were crowded to overbursting with people looking up at the sky, faces crowded up against windows.

Waiting for their sorrows to end.

Akon was looking down at their faces, from the balcony of a room in a well-guarded hotel.  There were many who wished to initiate violence against him, which was understandable.  Fear showed on most of the faces in the crowd, rage in some; a very few were smiling, and Akon suspected they might have simply given up on holding themselves together.  Akon wondered what his own face looked like, right now.

The streets were less crowded than they might have been, only a few weeks earlier.

No one had told the Superhappies about that part.  They'd sent an ambassadorial ship "in case you have any urgent requests we can help with", arriving hard on the heels of the Impossible.  That ship had not been given any of the encryption keys to the human Net, nor allowed to land.  It had made the Superhappies extremely suspicious, and the ambassadorial ship had disgorged a horde of tiny daughters to observe the rest of the human starline network -

But if the Superhappies knew, they would have tried to stop it.  Somehow.

That was a price that no one was willing to include in the bargain, no matter what.  There had to be that - alternative.

A quarter of the Impossible Possible World's crew had committed suicide, when the pact and its price became known.  Others, Akon thought, had waited only to be with their families.  The percentage on Earth... would probably be larger.  The government, what was left of it, had refused to publish statistics.  All you saw was the bodies being carried out of the apartments - in plain, unmarked boxes, in case the Superhappy ship was using optical surveillance.

Akon swallowed.  The fear was already drying his own throat, the fear of changing, of becoming something else that wasn't quite him.  He understood the urge to end that fear, at any price.  And yet at the same time, he didn't, couldn't understand the suicides.  Was being dead a smaller change?  To die was not to leave the world, not to escape somewhere else; it was the simultaneous change of every piece of yourself into nothing.

Many parents had made that choice for their children.  The government had tried to stop it.  The Superhappies weren't going to like it, when they found out.  And it wasn't right, when the children themselves wouldn't be so afraid of a world without pain.  It wasn't as if the parents and children were going somewhere together.  The government had done its best, issued orders, threatened confiscations - but there was only so much you could do to coerce someone who was going to die anyway.

So more often than not, they carried away the mother's body with her daughter's, the father with the son.

The survivors, Akon knew, would regret that far more vehemently, once they were closer to the Superhappy point of view.

Just as they would regret not eating the tiny bodies of the infants.

A hiss went up from the crowd, the intake of a thousand breaths.  Akon looked up, and he saw in the sky the cloud of ships, dispersing from the direction of the Sun and the Huygens starline.  Even at this distance they twinkled faintly.  Akon guessed - and as one ship grew closer, he knew that he was right - that the Superhappy ships were no longer things of pulsating ugliness, but gently shifting iridescent crystal, designs that both a human and a Babyeater would find beautiful.  The Superhappies had been swift to follow through on their own part of the bargain.  Their new aesthetic senses would already be an intersection of three worlds' tastes.

The ship drew closer, overhead.  It was quieter in the air than even the most efficient human ships, twinkling brightly and silently; the way that someone might imagine a star in the night sky would look close up, if they had no idea of the truth.

The ship stopped, hovering above the roads, between the buildings.

Other bright ships, still searching for their destinations, slid by overhead like shooting stars.

Long, graceful iridescent tendrils extended from the ship, down toward the crowd.  One of them came toward his own balcony, and Akon saw that it was marked with the curves of a door.

The crowd didn't break, didn't run, didn't panic.  The screams failed to spread, as the strong hugged the weak and comforted them.  That was something to be proud of, in the last moments of the old humanity.

The tendril reaching for Akon halted just before him.  The door marked at its end dilated open.

And wasn't it strange, now, the crowd was looking up at him.

Akon took a deep breath.  He was afraid, but -

There wasn't much point in standing here, going on being afraid, experiencing futile disutility.

He stepped through the door, into a neat and well-lighted transparent capsule.

The door slid shut again.

Without a lurch, without a sound, the capsule moved up toward the alien ship.

One last time, Akon thought of all his fear, of the sick feeling in his stomach and the burning that was becoming a pain in his throat.  He pinched himself on the arm, hard, very hard, and felt the warning signal telling him to stop.

Goodbye, Akon thought; and the tears began falling down his cheek, as though that one silent word had, for the very last time, broken his heart.

 

 

 

 

 

 

 

 

And he lived happily ever after.

Comments (64)

Comment author: Eliezer_Yudkowsky 04 February 2009 08:50:30AM 11 points [-]

This is the original ending I had planned for Three Worlds Collide.

After writing it, it seemed even more awful than I had expected; and I began thinking that it would be better to detonate Sol and fragment the human starline network, guaranteeing that, whatever happened in the future, true humans would continue somewhere.

Then I realized I didn't have to destroy the Earth - that, like so many other stories I'd read, my very own plot had a loophole. (I might have realized earlier, if I'd written part 5 before part 6, but the pieces were not written in order.)

Tomorrow the True Ending will appear, since it was indeed guessed in the comments yesterday.

If anyone wonders why the Normal Ending didn't go the way of the True Ending - it could be because the Superhappy ambassador ship got there too quickly and would have been powerful enough to prevent it. Or it could be because the highest decision-makers of humankind, like Akon himself, decided that the Superhappy procedure was the categorically best way to resolve such conflicts between species. The story does not say.

Comment author: AaronAgassi 14 February 2011 01:33:18PM 1 point [-]

In the spirit of true Soft Science Fiction, it seems more plausible that once they gain understanding of human interaction, the Superhappies would simply make a technological gift of their communications modality, and allow social change to take its course. The end result might be much the same, with the Confessor feeling progressively more alienated as events unfold.

As for the Baby Eaters, quite frankly, they'd likely be sadists. There is plenty of precedent in human societies of sadism as a value, one way or another. But that might pose a conundrum even for the Superhappies.

Comment author: PhilGoetz 28 June 2011 08:06:48PM 1 point [-]

I like this original ending better; it's more thought-provoking (which is almost a synonym for more disturbing). And I'd like to see this submitted and published in a print SF magazine, likely Analog.

Comment author: wobster109 20 August 2011 07:08:41AM 4 points [-]

I returned home from Rationality Camp just a couple of days ago, and by my best estimate about half the participants prefer this ending. Among rationalists I encounter elsewhere, a non-trivial portion prefer this ending as well. What am I saying? Other than the mass suicides, it is not immediately obvious that this original ending is "awful" in any way.

Comment author: AndrewH 29 January 2012 02:48:12AM 4 points [-]

Other than the mass suicides...

And including the mass suicides? Remember that in this story, 6 billion people become 1 in a million, and over 25% of people died in this branch of the story. Destroying Huygens resulted in 15 billion deaths.

As they say, shut up and multiply.

Comment author: Steve_Rayhawk 04 February 2009 09:11:41AM 0 points [-]

If those are the two endings, then that definition procedure for the term "True Ending" was not very meta-ethically instructive.

Comment author: Manuel_Mörtelmaier 04 February 2009 11:41:41AM 15 points [-]

Awww... I was so looking forward to a heartwarming description of a family feast some years later where Akon delivers a toast to the SHs for "finally restoring the natural order of things."

HAPPY ENDING.

Comment author: Eliezer_Yudkowsky 08 June 2009 05:12:08AM 14 points [-]

Upon due reflection, I have edited the story somewhat to include the final line, in accordance with your suggestion.

Comment author: Baughn 29 June 2010 01:23:08PM *  31 points [-]

And never before has that sentence looked quite so horrifying.

Comment author: Sara 26 August 2011 05:05:51PM *  0 points [-]

Seconded. For some reason, it makes this version of the ending feel an order of magnitude more horrific than it otherwise would have.

Comment author: infotropism 04 February 2009 12:00:07PM 3 points [-]

How long will the superhappy-human-babyeater conglomerate last? How many other species will they meet in the universe? How arbitrary can and will the aesthetics, morality, and utilities of those new species be? If they are arbitrary enough, and enough of them are met, what will the resulting compromises look like?

Depending on how many goals, values, etc. are more or less universal - and some would perhaps be, since most if not all of those species will have come into being through evolution in the same universe - those are the only things that'll remain, the only values and particularities. As the rest is arbitrary, averaging will probably cancel any subtlety out.

The longer you go, the more monomaniacal and bland the resulting compromise will become. In the end, you'll have something like orgasmium, for maybe a handful of values that were shared by a majority of those species. The rest, noise. Would that be ok?

Comment author: Andrew 04 February 2009 01:04:45PM 2 points [-]

Up next: Three Worlds/Unlimited Blade Works!

I hope the Confessor gets more face time. He's so badass.

Comment author: Svein_Ove2 04 February 2009 01:16:22PM 2 points [-]

I have my own doubts, but I don't think it would have exactly that effect.

Remember, the Superhappies actually adopted some of the values of the humans and baby-eaters; it seems to be a volume-conserving operation, not set-intersection. Not, I think, that that makes it very much better.

Comment author: Abigail 04 February 2009 01:30:56PM 6 points [-]

I have a very strong personal motivation for making the moral assertion, "Diversity is good". I am transsexual, often meet people who have never met a TS before and am rarely in a group which is majority TS. Yet, I do believe in it as a moral requirement. If we are all the same, we all have the same blind spots. If we are all different, we see different things, and this is good, and interesting, and in our interests.

I rather hope that the more powerful alien race we meet will also value diversity as a moral good. I even believe it is a moral good even when, for example during the Scramble for Africa, almost no-one or no-one at all believes it.

Comment author: Konkvistador 25 January 2011 11:50:41AM 1 point [-]

Upvoted.

I think we are a long way off from genuinely being able to value diversity among societies. The universalistic impulse to convert the infidels or enlighten the other is still very strong.

I hope we will allow a diverse range of minds to exist. And consequently I hope that humans will someday be ok with humanity branching off into several societies with different values. I value genetic and cultural diversity quite a bit.

Comment author: FiftyTwo 31 March 2011 01:58:01AM 4 points [-]

While I agree with the value of diversity in general and your points for it, I disagree that it is a good in itself. Consider the ways in which we find it morally acceptable to limit diversity and, by extension, individual freedom. We limit the free choices of many people, the most relevant example here being child abusers. We don't value the diversity of a society which contains the viewpoints of child abusers anywhere near as highly as we value a society where children are not abused.

The difference with the super-happies is that they are not just limiting humanity's ability to harm one another, and to harm its children, but its ability to harm itself. Analogously, we prevent people from committing suicide in most cases, prevent access to certain drugs, and so on; whether this is moral is a separate question.

A classical Mill-style liberal would say that an individual can be restricted from actions that affect only themselves only when they are either irrational or not in possession of all of the facts (e.g. a child or a mentally ill person is considered irrational, and we prevent people accidentally harming themselves through ignorance).

So are the super-happies behaving morally under this remit? Assuming they consider us rational then they are not. A better solution would be to allow all of humanity the option to turn their pain on and off, and either prevent all children being born or prevent children feeling pain before emotional maturity. That would allow individuals to make a rational choice between super-happy and pain/pleasure ways of life, and humanity as a whole could absorb the information and gradually change.

Comment author: Eliezer_Yudkowsky 04 February 2009 01:41:15PM 6 points [-]

I don't expect humanity to ever encounter any aliens - I would guess that the explanation for the Fermi Paradox is that life is rare, but I can easily see how a civilization built out of colliding values could continue to occupy the fun border between complexity and chaos. If one of the contributing species valued that sort of thing, and the others didn't object.

Comment author: RussellThor 22 October 2012 11:20:23PM 3 points [-]

It could be the case that civilization always goes down something like the super-happy route, but without such rationality. So rather than getting disappointed about not achieving space travel, they just turn off such disappointment. There would be no reason for ambition; you can just give yourself the feeling of satisfied ambition without actually achieving anything. Once you have access to your own source code, perhaps things always end up that way.

Comment author: PrometheanFaun 25 May 2013 03:05:53AM *  0 points [-]

No. I personally exhibit a viable human idiogenetic strain which places no value on comfort or pleasure as end-goals - a living counterexample. I try to adhere to the essence of the dictum of life as closely as possible; survive and explore. I'd expect that to be a more enduring invariant shared by distinct lineages than a fear of pain.

Though if humanity were a species whose members truly couldn't resist merging thoughts in every moment - and we very nearly are - I wouldn't exist. But that still only speaks of humanity.

Comment author: Steve_Rayhawk 04 February 2009 02:16:40PM 0 points [-]

Or, wait... To find the plot hole that permits the other ending takes searching. If no commenter had recognized that they preferred the other ending strongly enough, they would not have searched deeply enough. Was the meta-ethics test only that?

Comment author: Eliezer_Yudkowsky 04 February 2009 02:20:33PM 6 points [-]

Steve, there's no incredibly deep metaethical lesson in the fact that I, as author, decided that the second ending would only be the True one "that actually happened" if a reader thought of it. I just wanted to take advantage of the blog format to offer a choice a bit more interactive than picking "1" or "2".

The most important advice you can offer to a rationalist is to avoid motivated skepticism; the second most important advice is not to overcomplicate things. Not everything I do is incredibly deep. Some things, sure, and even some things that aren't obvious at a first glance, but not everything.

On the other hand, no one has decoded the names of the ships yet, so I guess there's also something to be said for looking deeper.

Comment author: Emiezer_Shirouski 04 February 2009 02:23:08PM 6 points [-]

I am the core of my mind.
Belief is my body and choice is my blood.
I have revised over a thousand judgments.
Unaware of fear
Nor aware of hope.
Have withstood pain to update many times
Waiting for truth's arrival.
This is the one uncertain path.
My whole life has been...
Unlimited Bayes Works!

Comment author: Steve_Rayhawk 04 February 2009 02:40:12PM 0 points [-]

My whole life has been...

Should read:

So, as I strive...

(The original is idiomatic and hard to filk cleanly.)

Comment author: Anonymous48 04 February 2009 03:03:44PM 8 points [-]

It's rather funny to see this ending described as awful by Eliezer, who, at the same time, endorses things such as: "In my head I have an image of the parliament of volitional shadows of the human species, negotiating a la Nick Bostrom. The male shadows and the female shadows are pretty much agreed that (real) men need to be able to better read female minds; but since this is a satisfaction of a relatively more 'female' desire - making men more what women wish they were - the male shadows ask in return that the sex-drive mismatch be handled more by increasing the female sex drive, and less by decreasing male desire..."

So, intraspecies convergence of values is somehow ok, but interspecies isn't?

Comment author: michael_vassar3 04 February 2009 03:21:27PM 5 points [-]

The trouble is that some years later Akon is not a super-happy baby-eating human but rather a hodge-podge of zillions of values. The super-happy population or resources can double in 35 hrs at current tech. Their tech advances much faster than human tech does at current population. This is their first encounter at current tech and population but in a year they will probably encounter and mix with over 2^240 new species!

More practically, severing the human starline system, in addition to being a cliche, seems very positive values utilitarian and very anti-CEV in that it imposes a decision to maintain disunion and thus the continued existence of true humans upon all future human generations. I see the appeal, but it doesn't seem to minimize the ratio of bad human worlds to good human worlds in a big universe. Really I can't seem to look away from the instant doom implications of a big universe with superluminal travel and exponentially growing populations of finite starting density.

Comment author: Kaj_Sotala 04 February 2009 03:24:34PM 9 points [-]

I don't see this ending as awful at all, except of course for the suicides. But a quarter of the ship's crew, with even higher rates among the general population? That strikes me as unrealistically high. For most people, it takes a lot to be pushed over the edge.

I also note that this is part 6. That means either that the true ending is in two parts, or that there'll be Something Completely Different as part eight, maybe an "author's comments" or some such.

Comment author: Urizen 04 February 2009 03:26:04PM 0 points [-]

Not everything I do is incredibly deep. Some things, sure, and even some things that aren't obvious at a first glance, but not everything.

Sometimes, a cigar is just a cigar...

Comment author: Martin4 04 February 2009 03:50:39PM 2 points [-]

How can the superhappies not see THAT happening?

Martin

Comment author: Peter_de_Blanc 04 February 2009 04:39:04PM 0 points [-]

Mike said: Really I can't seem to look away from the instant doom implications of a big universe with superluminal travel and exponentially growing populations of finite starting density.

Maybe the universe itself grows exponentially faster than populations of life.

Comment author: Faré 04 February 2009 04:51:58PM -2 points [-]

25% suicide rate? Over something completely abstract that they haven't felt yet?

You didn't tell us about humans having been overcome by some weird Death Cult.

But, now it makes sense why they would give power to the Confessor.

Obviously, in this fantasy of would-be-immortal 21st century abstract thinker, your immortal 21st century abstract thinkers are worshipped as gods. And unhappily, they were told too much about Masada and other Kool-Aid when they were young.

There comes your Judeo-Christian upbringing again, in addition to the intellectual masturbation.

Eliezer -- get a life! The worst thing that ever happened to your intelligence was to be disconnected from reality by too early success.

Comment author: Nebu_Pookins 04 February 2009 05:02:03PM 0 points [-]

"Just as they would regret not eating the tiny bodies of the infants." is one of the more moving passages I've read in a long time. Well done Eliezer.

Comment author: John_Maxwell2 04 February 2009 06:09:09PM 1 point [-]

Why did the SuperHappies adopt the Babyeater's ethics? I thought that they exterminated them. Or is 6/8 an alternative to 5/8 instead of its sequel?

It might be better to number the sections 1, 2, 3, 4, 5A, 6A, 5B, 6B.

Comment author: Ian_Maxwell 04 February 2009 07:02:23PM 3 points [-]

Has anyone else noticed that in this particular 'compromise', the superhappies don't seem to be actually sacrificing anything?

I mean, their highest values are being ultra super happy and having sex all the time, and they still get to do that. It's not as if they wanted not to create literature or eat hundreds of pseudochildren. Whereas humans will no longer get to feel frustrated or exhausted, and babyeaters will no longer get to eat real children.

I don't think the superhappies are quite as fair-minded as Akon thought. They agreed to take on traits of humanity and babyeating in an attempt to placate everyone, not because it was a fair trade.

Comment author: Furcas 04 February 2009 07:13:50PM 6 points [-]

Sure. To the Superhappies, letting a sentient being experience pain or discomfort is evil. Since they're the strongest, why would they willingly do something they consider to be evil?

Akon isn't entirely wrong, though. The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them. But they didn't. That does show a certain degree of fair-mindedness that humans probably wouldn't have shown had they been in the same position.

Comment author: ad2 04 February 2009 08:01:24PM 0 points [-]

The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them.

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

Comment author: complexmeme 26 December 2012 04:27:57PM 1 point [-]

that I will be changed again, also against my will, the next time

The next time, it presumably wouldn't be against your will, due to the first set of changes.

Comment author: Richard4 04 February 2009 08:29:38PM 0 points [-]

If we sufficiently value episodes of aesthetic appreciation (in general, not only when done by us), etc., then the "compromise" could be a net positive, even from the perspective of our current values.

(But perhaps the point is that our values are in fact not so agent-neutral.)

Comment author: Anonymous_Coward4 04 February 2009 10:32:57PM 0 points [-]

Regarding ship names in the koan....

Babyeaters: http://en.wikipedia.org/wiki/Midshipman's_Hope. Haven't read, just decoded from the name in the story.

But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring. It should be known to many people on this thread, but it's been about 10 years since I last read it: Asimov, The Gods Themselves.

Anonymous.

Comment author: Tamfang 14 August 2010 03:37:19AM 0 points [-]

Ah! I read it as "Sailor's Heart's Desire", with no particular significance.

Comment author: a_soulless_automaton 04 February 2009 10:48:34PM 4 points [-]

There seems to be a fairly large contingent of humanity who regard self-determination as the most significant terminal value to roughly the same single-minded extent that the Babyeaters view baby eating; including a willingness to sacrifice every other moral value to very large degrees in its favor. I assume many of the suicides fell into this group.

While not universal among humanity as baby eating is among the Babyeaters, the concept should have been fairly explicit in at least some of the cultural material transmitted. I wonder, were the Superhappies being willfully oblivious to this value, considering the extent to which they willingly violate it?

Comment author: Ryan 04 February 2009 10:51:30PM 9 points [-]

I'm with Kaj Sotala in not finding this ending to be awful.

The prospect of never feeling pain again would not in the least disturb me. Oh, I may 'enjoy' the pain of a good workout, but only because I believe it will help to reduce or postpone more pain later on.

The babyeating is weird, but we are talking about being transformed to want to do that, not being forced to do something we would actually find disgusting.

What's the trouble there? I don't regret my past self being unable to forever prevent my current self from enjoying Brussels sprouts.

Comment author: simon2 04 February 2009 11:59:54PM 2 points [-]

John Maxwell:

No, they are simply implementing the original plan by force.

When I originally read part 5, I jumped to the same conclusion you did, based presumably on my prior expectations of what a reasonable being would do. But then I read nyu2's comment which assumed the opposite and went back to look at what the text actually said, and it seemed to support that interpretation.

Comment author: simon2 05 February 2009 12:53:54AM 0 points [-]

Actually, I'm not sure if that's what I thought about their intentions towards the babyeaters, but I at least didn't originally expect them to still intend to modify themselves and humanity.

Comment author: simon2 05 February 2009 12:54:49AM 0 points [-]

...with babyeater values.

Comment author: infotropism 05 February 2009 01:34:23AM 1 point [-]

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

Wouldn't that be The Player of Games by Banks? Would kinda make sense, no?

Comment author: Brian_Macker 05 February 2009 02:02:57AM 2 points [-]

"Why did the SuperHappies adopt the Babyeater's ethics? I thought that they exterminated them."

They only exterminated the one ship so that it wouldn't blow up the star.

Comment author: Martin4 05 February 2009 04:21:41AM 0 points [-]

Regarding the ship names: "Impossible Possible World" would point to Heinlein's The Number of the Beast.

Comment author: Doug_S. 05 February 2009 06:04:50AM 1 point [-]

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.

Comment author: Daniel_Franke 05 February 2009 06:36:33AM 0 points [-]

Hmm, I just noticed that there's a slight contradiction here:

"I know. Believe me, I know. Only youth can Administrate. That is the pact of immortality."

Then how is it possible for there to be such a person as a Lord Administrator, if the title takes 100 years to obtain? While a civilization of immortals would obviously redefine its concept of youth, it seems like a stretch to call a centenarian young if 500 is still considered mind-bogglingly old.

Comment author: Eliezer_Yudkowsky 05 February 2009 10:41:50AM 1 point [-]

Daniel, is it a stretch to call a 20-year-old young if you would be impressed to meet a 100-year-old? Though the actual relation would be more like "Akon is 30 years old, the Confessor is a 90-year-old survivor of a famous catastrophe."

Comment author: Anonymous_Coward4 05 February 2009 03:15:02PM 1 point [-]

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.

Yes, I believe I already identified the story in the final sentence of my post. But thanks anyway for clarifying it for those that didn't keep reading till the end :-)

Anonymous.

Comment author: Nominull3 05 February 2009 03:42:13PM 0 points [-]

"Normal" End? I don't know what sort of visual novels you've been reading, but it's rare to see a Bad End worse than the death of humanity.

Comment author: Aaron_D._Ball 05 February 2009 05:31:59PM 2 points [-]

"Ion" Banks was cute. I'm finally catching up on this series days late, so it's astonishing that nobody else got that one. (But that's the only one I got.)

Comment author: Nebu_Pookins 06 February 2009 05:46:05AM 1 point [-]

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

The bargain that the Superhappies are offering is to change you less than if they had just changed you by force. I'm guessing if the humans didn't agree to the deal, the Superhappies would have either exterminated the humans completely, or convert them completely to superhappy values.

The benefit of Superhappies changing themselves to appreciate literature and beauty is that when they convert you, you get to keep the part of you that appreciated literature and beauty.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

Actually, it won't be against your will, because you will have had the same values as them (you're all merged now, remember?)

Comment author: rickc 17 August 2012 10:06:16PM 0 points [-]

First-time commenter.

Perhaps I missed this in a comment to a previous part, but I don't see why we have to assume the super-happies are honoring the original plan. If their negotiations with the baby-eaters failed, the SH owe the BE nothing. They have no reason not to forcibly modify the BE, and, consequently, no reason to alter themselves or the humans to eat babies. (They could have also simply wiped out the BE, but genocide seems like a worse solution than "fixing" the BEs.)

Comment author: kybernetikos 16 July 2013 09:51:37PM *  0 points [-]

The point is that they are the kind of species to deal with situations like this in a more or less fair-minded way. That will stand them in good stead in future difficult negotiations with other aliens.

Comment author: Origin64 04 November 2012 11:07:42PM 0 points [-]

I see somewhat of an analogy now between this and Clarke's Cradle. The theme of a very physically and abruptly changing humanity.

What I do wonder, though, is why, in this entire story, nobody ever seriously considered the option of just leaving each other be. Live and let live and all that. It seemed rather obvious to me. Three species meet, exchange what they will, and go their own separate ways. After all, morality is subjective, and any species that understands the prisoner's dilemma should understand that as well. All they had to do was walk away.

Comment author: Baughn 15 February 2013 04:32:59PM *  0 points [-]

Well. Morality may be subjective, but morality encodes preferences over states of the universe.

Yours may discount states that are far away and don't reach you; mine doesn't, so I'm with the Lord Pilot here. It would be impossible for me to satisfy my sense of ethics without doing something about the Babyeaters, even if that requires splitting humanity.

Comment author: fractalman 08 June 2013 07:05:14AM 0 points [-]

because both humans and super-happies agree: baby-eating needs to STOP ASAP!

Comment author: Zephyr1011 07 July 2013 01:59:19PM 2 points [-]

Honestly, I think I prefer this ending over the other one

Comment author: EndlessStrategy 11 December 2013 12:16:51AM 2 points [-]

25% of the population suicided? I'm sorry, but that just seems... extremely unrealistic. Like it was tacked on to cement this as the bad ending.