In case that wasn't a rhetorical question, you almost certainly did: your Introduction to Bayesian Reasoning is the fourth Google hit for "Bayesian", the third Google hit for "Bayes", and has a PageRank of 5, the same as the Cryonics Institute's main website.
"Would they take the next step, and try to eliminate the unbearable pain of broken hearts, when someone's lover stops loving them?"
We already have an (admittedly limited) counterexample to this, in that many Westerners choose to seek out and do somewhat painful things (e.g., climbing Everest), even when they are perfectly capable of choosing to avoid them, and even at considerable monetary cost.
"Some ordinary young man in college suddenly decides that everyone around them is staring at them because they're part of the conspiracy."
I don't think that this is at all crazy, assuming that "they" refers to you (people are staring at me because I'm part of the conspiracy), rather than everyone else (people are staring at me because everyone in the room is part of the conspiracy). Certainly it's happened to me.
"Poetry aside, a human being isn't the seed of a god."
A human isn't, but one could certainly argue that humanity is.
"But with a sufficient surplus of power, you could start doing things the eudaimonic way. Start rethinking the life experience as a road to internalizing new strengths, instead of just trying to keep people alive efficiently."
It should be noted that this doesn't make the phenomenon of borrowed strength go away; it just outsources it to the FAI. If anything, given the kind of perfect recall and easy access to information that an FAI would have, the ratio of cached historical information to newly created information should be much higher than it is now.
"By now, it's probably true that at least some people have eaten 162,329 potato chips in their lifetimes. That's even less novelty and challenge than carving 162,329 table legs."
Nitpick: it takes much less time and mental energy to eat a potato chip than to carve a table leg, so the total quantity of sphexishness is much smaller.
"Or, to make it somewhat less strong, as if I woke up one morning to find that banks were charging negative interest on loans?"
They already have, at least for a short while.
"We are currently living through a crisis that is in large part due to this lack of appreciation for emergent behavior. Not only people in general but trained economists, even Nobel laureates like Paul Krugman, lack the imagination to understand the emergent behavior of free monetary systems."
"Emergence", in this instance, is an empty buzzword, see http://lesswrong.com/lw/iv/the_futility_of_emergence/. "Imagination" also seems likely to be an empty buzzword, in the sense of http://lesswrong.com/lw/jb/applause_lights/.
"pre...
"It is not clear this can be shown to be true. 'Improvement' depends on what is valued, and what the context permits. In the real world, the value of an algorithm depends on not only its abstract mathematical properties but the costs of implementing it in an environment for which we have only imperfect knowledge."
Eliezer specifically noted this in the post:
"Sometimes it is too expensive to take advantage of all the knowledge that we could, in theory, acquire from previous tests. Moreover, a complete enumeration or interval-skipping algorith...
"This may not sound like a profound insight, since it is true by definition. But consider - how many comic books talk about "mutation" as if it were a source of power? Mutation is random. It's the selection part, not the mutation part, that explains the trends of evolution."
I think this is a specific case of people treating optimization power as if it just drops out of the sky at random. This is certainly true for some individual humans (e.g., winning the lottery), but as you point out, it can't be true for the system as a whole.
"...
I will not be there due to a screwup by Continental Airlines, my apologies.
See everyone there.
"As far as my childhood goes I created a lot of problems for myself by trying to force myself into a mold which conflicted strongly with the way my brain was setup."
"It's interesting that others have shared this experience, trying to distance ourselves from, control, or delete too much of ourselves - then having to undo it. I hadn't read of anyone else having this experience, until people started posting here."
For some mysterious reason, my younger self was so oblivious to the world that I never experienced (to my recollection) a massive...
"Would you kill babies if it was the right thing to do? If no, under what circumstances would you not do the right thing to do? If yes, how right would it have to be, for how many babies?"
I would have answered "yes"; e.g., I would have set off a bomb in Hitler's car in 1942, even if Hitler were surrounded by babies. This doesn't seem to be a case of corruption by unethical hardware; the expected benefit to me from setting off such a bomb is quite negative, as it greatly increases my chance of being tortured to death by the SS.
"But what if you were "optimistic" and only presented one side of the story, the better to fulfill that all-important goal of persuading people to your cause? Then you'll have a much harder time persuading them away from that idea you sold them originally - you've nailed their feet to the floor, which makes it difficult for them to follow if you yourself take another step forward."
Hmmm... if you don't need people following you, could it help you (from a rationality standpoint) to lie? Suppose that you read about AI technique X. Technique...
"Human beings, who are not gods, often fail to imagine all the facts they would need to distort to tell a truly plausible lie."
One of my pet hobbies is constructing metaphors for reality which are blatantly, factually wrong, but which share enough of the deep structure of reality to be internally consistent. Suppose that you have good evidence for facts A, B, and C. If you think about A, B, and C, you can deduce facts D, E, F, and so forth. But given how tangled reality is, it's effectively impossible to come up with a complete list of humanly-de...
"I am willing to admit of the theoretical possibility that someone could beat the temptation of power and then end up with no ethical choice left, except to grab the crown. But there would be a large burden of skepticism to overcome."
If all people, including yourself, become corrupt when given power, then why shouldn't you seize power for yourself? On average, you'd be no worse than anyone else, and probably at least somewhat better; there should be some correlation between knowing that power corrupts and not being corrupted.
I volunteer to be the Gatekeeper party. I'm reasonably confident that no human could convince me to release them; if anyone can convince me to let them out of the box, I'll send them $20. It's possible that I couldn't be convinced by a transhuman AI, but I wouldn't bet $20 on it, let alone the fate of the world.
"To accept this demand creates an awful tension in your mind, between the impossibility and the requirement to do it anyway. People will try to flee that awful tension."
More importantly, at least in me, that awful tension causes my brain to seize up and start panicking; do you have any suggestions on how to calm down, so that one can think clearly?
"Eliezer2000 lives by the rule that you should always be ready to have your thoughts broadcast to the whole world at any time, without embarrassment."
I can understand most of the paths you followed during your youth, but I don't really get this. Even if it's a good idea for Eliezer_2000 to broadcast everything, wouldn't it be stupid for Eliezer_1200, who just discovered scientific materialism, to broadcast everything?
"If everyone were to live for others all the time, life would be like a procession of ants following each other around in a ci...
"3WC would be a terrible movie. "There's too much dialogue and not enough sex and explosions", they would say, and they'd be right."
Hmmm... Maybe we should put together a play version of 3WC; plays can't have sex and explosions in any real sense, and dialogue is a much larger driver of the drama.