One book suggestion. "On Intelligence" by Jeff Hawkins.
Although there is a plug for his own research model, I would summarise the book as an enjoyable read, regardless of what you think of his own preferred AI technique.
Interesting, and I must admit I am surprised.
Regardless of personal preferences, though... it seems the closest match for the topic at hand. But hey, it's your story...
"Excession; something excessive. Excessively aggressive, excessively powerful, excessively expansionist; whatever. Such things turned up or were created now and again. Encountering an example was one of the risks you ran when you went a-wandering..."
Still puzzled by the 'Player of Games' ship name reference earlier in the story... I keep thinking, surely Excession is a closer match?
"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.
Yes, I believe I already identified the story in the final sentence of my post. But thanks anyway for clarifying it for those that didn't keep reading till the end :-)
Anonymous.
Regarding ship names in the koan....
Babyeaters: http://en.wikipedia.org/wiki/Midshipman's_Hope. Haven't read, just decoded from the name in the story.
But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring. It should be known to many people on this thread, but it's been about 10 years since I last read it. Asimov, The Gods Themselves.
Anonymous.
Since this is fiction (thankfully, seeing how many might allow the superhappys the chance they need to escape the box)... an alternative ending.
The Confessor is bound by oath to allow the young to choose the path of the future no matter how morally distasteful.
The youngest in this encounter are clearly the babyeaters, technologically (and arguably morally).
Consequently the Confessor stuns everyone on board, pilots off to Baby Eater Prime and gives them the choice of how things should proceed from here.
The End
Your defection isn't. There are no longer any guarantees of anything when a vastly superior technology is definitely in the vicinity. There are no guarantees while any staff member of the ship besides the Confessor is still conscious, and it is a known fact (from the prediction markets and the people in the room) that at least some of humanity is behaving very irrationally.
Your proposal takes an unnecessary, ultimate risk (the potential freezing, capture or destruction of the human ship upon arrival, leading to the destruction of humanity, since we don't know what the superhappys will REALLY do, after all) in exchange for an unnecessary, minimal gain (so we can attempt to reduce the suffering of a species whose technological extent we don't truly know, whose value system we know to be in at least one place substantially opposed to our own, and of whom we could remain ignorant, as a species, by anaesthetised self-destruction of the human ship).
Given the unknown but clearly vastly superior technological capabilities of the superhappys, it is more rational to take action as soon as possible to guarantee a minimum acceptable level of safety for humankind and its value system than to take no immediate action at all.
If you let an AI out of the box, and it tells you its value system is opposed to humanity's and that it intends to convert all humanity to a form that it prefers, and then it FOOLISHLY trusts you and steps back inside the box for a minute, then what you do NOT do is:
"Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb."
Anonymous.
Attempting to paraphrase the known facts.
You and your family and friends go for a walk. You walk into an old building with one entrance/exit. Your friends/family are behind you.
You notice the door has an irrevocable self-locking mechanism that engages if it is closed.
You have a knife in your pocket.
As you walk in you see three people dressed in 'lunatic asylum' clothes.
Two of them are in the corner: a guy who is beating up a woman. He appears unarmed but may have a concealed weapon.
The guy shouts to you that 'god is making him do it' and suggests that you should join in and attack your family, who are still outside the door.
The third person in the room has a machine gun pointed at you. He tells you that he is going to give you and your family 1 million pounds each if you just step inside, and he says he is also going to stop the other inmate from being violent.
You can choose to close the door (which will lock). What will happen next inside the room will then be unknown to you.
Or you can allow your family and friends into the room with the lunatics, at least one of whom is armed with a machine gun.
Inside the room, as long as that machine gun exists, you have no control over what actually happens next in the room.
Outside the room, once the door is locked, you also have no control over what happens next in the room.
But if you invite your family inside, you are risking that they may be killed by the machine gun (or they may be given 1 million pounds). Either way, the matter is in the hands of the machine-gun-toting lunatic.
Your family are otherwise presently happy and well adjusted and do not appear to NEED 1 million pounds, though some might benefit from it a great deal.
Personally, in this situation I wouldn't need to think twice; I would immediately close the door. I have no control over the unfortunate situation the woman is facing either way, but at least I don't risk a huge negative outcome (the death of myself and my family at the hands of a machine-gun-armed lunatic).
It is foolish to risk what you have and need for what you do not have, do not entirely know, and do not need.
"Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb."
You guys are very trusting of a super-advanced species that has already shown a strong willingness to manipulate humanity with superstimulus and pornographic advertising.
I'm going to presume you've drunk tea, or taken medicine, and under that presumption I can say 'Yes, you did'. It's just that the drugs you chose were the ones that adults in your culture had decided were safe... things like caffeine, say. Had you grown up in a Mormon or Amish culture, you might not be able to write the same thing you just did, so isn't what you just wrote an accident of birth rather than a conscious choice about your use of particular chemical structures inside your body?
I would imagine that, by choice of locale, you may have passively taken in nicotine too, albeit in small quantities.
'...never lost control to hormones'... really? Never got angry then, or too depressed to work? Crikey. Or do you mean you only lost control in the way that your parents and culture approved of; again, nothing more than an accident of birth?