One book suggestion. "On Intelligence" by Jeff Hawkins.
Although there is a plug for his own research model, I would summarise the book as:
Enjoyable book actually, regardless of what you think of his own preferred AI technique.
Interesting, and I must admit I am surprised.
Regardless of personal preferences though... it seems the closest match for the topic at hand. But hey, it's your story...
"Excession; something excessive. Excessively aggressive, excessively powerful, excessively expansionist; whatever. Such things turned up or were created now and again. Encountering an example was one of the risks you ran when you went a-wandering..."
Still puzzled by the 'Player of Games' ship name reference earlier in the story... I keep thinking, surely Excession is a closer match?
"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.
Yes, I believe I already identified the story in the final sentence of my post. But thanks anyway for clarifying it for those that didn't keep reading till the end :-)
Anonymous.
Regarding ship names in the koan....
Babyeaters: http://en.wikipedia.org/wiki/Midshipman's_Hope. Haven't read, just decoded from the name in the story.
But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring. It should be known to many people on this thread but it's been about 10 years since I last read it. Asimov, The Gods Themselves.
Anonymous.
Since this is fiction (thankfully, seeing how many might allow the superhappys the chance they need to escape the box)... an alternative ending.
The Confessor is bound by oath to allow the young to choose the path of the future no matter how morally distasteful.
The youngest in this encounter are clearly the babyeaters, technologically (and arguably morally).
Consequently the Confessor stuns everyone on board, pilots off to Baby Eater Prime and gives them the choice of how things should proceed from here.
The End
Your defection isn't. There are no longer any guarantees of anything whenever a vastly superior technology is definitely in the vicinity. There are no guarantees while any staff member of the ship is still conscious besides the Confessor and it is a known fact (from the prediction markets and people in the room) that at least some of humanity is behaving very irrationally.
Your proposal ...
Attempting to paraphrase the known facts.
You and your family and friends go for a walk. You walk into an old building with 1 entrance/exit. Your friends/family are behind you.
You notice the door has an irrevocable self-locking mechanism if it is closed.
You have a knife in your pocket.
As you walk in you see three people dressed in 'lunatic's asylum' clothes.
Two of them are in the corner: a man beating up a woman. He appears unarmed but may have a concealed weapon.
The guy shouts to you that 'god is making him do it' and suggests that
Wait a week for a Superhappy fleet to make the jump into Babyeater space, then set off the bomb.
You guys are very trusting of a super-advanced species that has already shown a strong willingness to manipulate humanity with superstimulus and pornographic advertising.
I misread the story and thought the superhappys had flown off to deal with them first. But in fact, the superhappys are 'returning to their home planet' before going to deal with the babyeaters. "This will make it considerably easier to sweep through their starline network when we return." Oops.
In any event, if the ship's crew is ...
But standing behind his target, unnoticed, the Ship's Confessor had produced from his sleeve the tiny stunner - the weapon which he alone on the ship was authorized to use, if he made a determination of outright mental breakdown. With a sudden motion, the Confessor's arm swept out...
... and anaesthetised everyone in the room. He then went downstairs to the engine room, and caused the sun to go supernova, blocking access to earth.
Regardless of his own preferences, he takes the option for humanity to 'painlessly' defect in the interstellar prisoner's dilemma, knowing a priori that the superhappys chose to co-operate.
It's worse than you think. You have to find a counterparty that will never, or seldom, make '100-1' type bets with other people that might threaten your chances of collecting your winnings later. Yet who is offering 100-1 odds right now.
As Buffett says: it's not who you sleep with, it's who THEY'RE sleeping with, that is the problem.
Eliezer: The problem is not finding a 100-1 bet. The problem is finding a counterparty offering such a bet that is highly likely to be solvent and willing to pay up after a 30-year depression/recession.
In fact, if anything, it makes more sense to be on the 'cash in hand now' side of such bets. As Warren Buffett is.
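To put rough numbers on the counterparty point, here is a toy expected-value calculation (the numbers and the payoff model are made up for illustration; it simply assumes your stake is gone whether the event misses or the winner never gets paid):

```python
# Toy expected-value calculation: how counterparty default risk erodes a
# long-odds bet. All numbers are illustrative, not financial advice.

def expected_value(stake, odds, p_event, p_paid):
    """Expected profit on a bet at `odds`-to-1.

    stake:   amount wagered (assumed lost unless you win AND get paid)
    odds:    payout multiple if the event happens (100 for a 100-1 bet)
    p_event: probability the event occurs
    p_paid:  probability the counterparty is still solvent and pays up
    """
    win = p_event * p_paid * odds * stake
    lose = (1 - p_event * p_paid) * stake
    return win - lose

if __name__ == "__main__":
    # A 100-1 bet on a 2% event looks great against a reliable counterparty...
    print(expected_value(stake=1, odds=100, p_event=0.02, p_paid=1.0))  # ~ +1.02
    # ...and is a clear loser if there's only a 30% chance they can pay
    # after a 30-year depression.
    print(expected_value(stake=1, odds=100, p_event=0.02, p_paid=0.3))  # ~ -0.39
```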
What can I say, apart from "Progress Quest"
http://www.progressquest.com/ http://www.progressquest.com/info.php
Officially voted the Top Role Playing Game for Post-Singularity Sentient Beings.
Anonymous.
Jeff - thanks for your comment on evolutionary algorithms. Gave me a completely new perspective on these.
I had a similar problem during my PhD. Basically I had to be a workaholic in order to get through it. However, I still wanted to have some kind of life and occasionally relax my brain. I found that when I tried to watch a DVD, I would either have an idea, or I would start feeling guilty about not working. And then I'd stop...
Although it has been years, and Anonymous may never see this, I just want to point out to any future readers that have their best thoughts in the shower that decent waterproof notepads now exist. "AquaNotes" is one I have tried, and it works exactly as advertised. And the paper isn't unreasonably thick either...
An interesting modern analogy is the invention of the CDO in finance.
Its development led to a complete change of the rules of the game.
If you had asked a bank manager 100 years ago to envisage ultimate consequences assuming the availability of a formula/spreadsheet for splitting up losses over a group of financial assets, so there was a 'risky' tier and a 'safe' tier, etc., I doubt they would have said 'The end of the American Financial Empire'.
Nonetheless it happened. The ability to sell tranches of debt at arbitrary risk levels led to the banks lending ...
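For anyone who hasn't seen how the tranching trick works, here is a minimal sketch of the loss waterfall described above (the tranche names and sizes are invented for illustration; real CDO structures are far messier):

```python
# Minimal sketch of a CDO loss waterfall: losses on a pool of loans are
# absorbed bottom-up, so the equity tranche takes the first hit, the
# mezzanine tranche the next, and the senior tranche only loses once
# everything junior to it is wiped out. Numbers are made up.

def allocate_losses(pool_loss, tranches):
    """Distribute a total pool loss across tranches, most junior first.

    tranches: list of (name, size) pairs ordered junior -> senior.
    Returns a dict mapping tranche name to the loss it absorbs.
    """
    losses = {}
    remaining = pool_loss
    for name, size in tranches:
        hit = min(size, remaining)
        losses[name] = hit
        remaining -= hit
    return losses

if __name__ == "__main__":
    tranches = [("equity", 5), ("mezzanine", 15), ("senior", 80)]  # pool of 100
    print(allocate_losses(3, tranches))   # {'equity': 3, 'mezzanine': 0, 'senior': 0}
    print(allocate_losses(25, tranches))  # {'equity': 5, 'mezzanine': 15, 'senior': 5}
```

The senior tranche looks 'safe' right up until correlated losses burn through the junior buffers, which is roughly where the trouble started.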
Only if you ignore Colossus, the computer whose impact on the war was so great that in the UK, they destroyed it afterwards rather than risk it falling into enemy hands.
"By the end of the war, 10 of the computers had been built for the British War Department, and they played an extremely significant role in the defeat of Nazi Germany, by virtually eliminati...
Only if you completely ignore The Colossus.
"By the end of the war, 10 of the computers had been built for the British War Department, and they played an extremely significant role in the defeat of Nazi Germany, by virtually eliminating the ability of German Admiral Durnetz to sink American convoys, by undermining German General Irwin Rommel in Northern Afri...
And you'd have been right. (Ever try running Bit Torrent on a 9600 bps modem? Me neither. There's a reason for that.)
Not sure I see your point. All the high speed connections were built long before bittorrent came along, and they were being used for idiotic point-to-point centralised transfers.
All that potential was achieving very little before the existence of the right algorithm or approach to exploit it. I suspect a strong analogy here with future AI.
If you'd asked me in 1995 how many people it would take for the world to develop a fast, distributed system for moving films and TV episodes to people's homes on an 'when you want it, how you want it' basis, internationally, without ads, I'd have said hundreds of thousands. In practice it took one guy with the right algorithm, depending...
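To make the 'right algorithm' point concrete, here is a toy simulation of the piece-swapping idea. It is not the real BitTorrent protocol (no trackers, choking or rarest-first selection, and within-round propagation is idealised), just the core insight that downloaders who re-upload turn every peer into extra capacity:

```python
# Toy comparison: one server uploading a file piece by piece, versus a
# swarm in which every peer re-uploads the pieces it already has.
# Highly idealised; peer and piece counts are arbitrary.

NUM_PIECES = 20   # the file is split into this many pieces
NUM_PEERS = 10    # downloaders, plus one original seeder

def simulate(swarm=True):
    """Rounds until every downloader holds the whole file.

    Each data holder may upload at most one piece per round. In swarm
    mode every peer uploads; otherwise only the seeder (peer 0) does.
    """
    peers = [set(range(NUM_PIECES))] + [set() for _ in range(NUM_PEERS)]
    rounds = 0
    while not all(len(p) == NUM_PIECES for p in peers):
        rounds += 1
        uploads_left = {i: 1 for i in range(len(peers))} if swarm else {0: 1}
        for i, have in enumerate(peers):
            for piece in range(NUM_PIECES):
                if piece in have:
                    continue
                source = next((j for j, other in enumerate(peers)
                               if j != i and piece in other
                               and uploads_left.get(j, 0) > 0), None)
                if source is not None:
                    uploads_left[source] -= 1
                    have.add(piece)
                    break  # one download per peer per round
    return rounds

if __name__ == "__main__":
    print("central server:", simulate(swarm=False), "rounds")  # ~ NUM_PEERS * NUM_PIECES
    print("swarm:         ", simulate(swarm=True), "rounds")   # ~ NUM_PIECES
```

In server mode the transfer time scales with peers times pieces; in swarm mode it scales with pieces alone, which is roughly why one person with the right algorithm could do what looked like a hundred-thousand-person job.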
Eliezer, this is probably the most useful blog on the internet. Don't stop writing...
Throughout these replies there is a belief that theory 1 is 'correct through skill'. With that in mind it is hard to come to any other conclusion than 'scientist 1 is better'.
Without knowing more about the experiments, we can't determine if theory 1's 10 good predictions were simply 'good luck' or accident.
If your theory is that the next 10 humans you meet will have the same number of arms as they have legs, for example...
There's also potential for survivorship bias here. If the first scientist's results had been 5 correct, 5 wrong, we wouldn't be having t...
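A quick way to see the luck/base-rate point with made-up numbers: ten correct predictions are strong evidence of skill only if each prediction had a real chance of being wrong.

```python
# How impressive are 10 correct predictions? It depends on how likely each
# prediction was to come true regardless of the theory. Base rates are
# invented purely to illustrate the point.

def prob_all_correct(n, base_rate):
    """Probability of n-for-n correct predictions if each one would come
    true with probability `base_rate` even under a contentless theory."""
    return base_rate ** n

if __name__ == "__main__":
    # Coin-flip-hard predictions: 10/10 by luck is ~0.1%, which does look like skill.
    print(prob_all_correct(10, 0.5))    # 0.0009765625
    # 'Arms == legs' style predictions: 10/10 is nearly guaranteed,
    # so it tells us almost nothing.
    print(prob_all_correct(10, 0.999))  # ~0.99
```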
What exactly is meant here by 'believe'? I can imagine various interpretations.
a. Which do we believe to be 'a true capturing of an underlying reality'?
b. Which do we believe to be 'useful'?
c. Which do we prefer, which seems more plausible?
a. Neither. Real scientists don't believe in theories, they just test them. Engineers believe in theories :-)
b. Utility depends on what you're trying to do. If you're an economist, then a beautifully complicated post-hoc explanation of 20 experiments may get your next grant more easily than a si...
I'm going to presume you've drunk tea, or taken medicine, and under that presumption I can say 'Yes you did'. It's just that the drugs you chose were the ones that adults in your culture had decided were safe... things like caffeine, say. Had you grown up in a Mormon or Amish culture, you might not be able to write the same thing you just did, so isn't what you just wrote an accident of birth rather than a conscious choice about your use of particular chemical structures inside your body?
I would imagine that by choice ...