Nick, I'm now sitting here being inappropriately amused at the idea of Hal Finney as Dark Lord of the Matrix.
Eliezer, thanks for responding to that. I'm never sure how much to bring up this sort of morbid stuff. I agree as to what the question is.
Also, steven points out for the benefit of altruists that if it's not you who's tortured in the future dystopia, the same resources will probably be used to create and torture someone else.
It was Vladimir who pointed that out; I just said it doesn't apply to egoists. I actually don't agree that it applies to altruists...
Does nobody want to address the "how do we know U(utopia) - U(oblivion) is of the same order of magnitude as U(oblivion) - U(dystopia)" argument? (I hesitate to bring this up in the context of cryonics, because it applies to a lot of other things and because people might be more emotionally motivated than usual to argue for the conclusion that supports their cryonics opinion, but you guys are better than that, right? right?)
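To spell out why that matters (notation mine, with hypothetical probabilities p_u and p_d of waking in utopia or dystopia, and U(oblivion) normalized to zero): signing up has positive expected value only if

```latex
% Expected value of revival vs. staying dead, normalizing U(oblivion) = 0:
% the sign of EV depends on BOTH the probability ratio and the ratio of
% the two utility gaps, so comparing p_u to p_d alone settles nothing.
\[
\mathrm{EV} \;=\; p_u\,\bigl(U(\mathrm{utopia}) - U(\mathrm{oblivion})\bigr)
\;-\; p_d\,\bigl(U(\mathrm{oblivion}) - U(\mathrm{dystopia})\bigr) \;>\; 0
\]
```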
Carl, I believe the point is that until I know of a specific argument why one is more likely than the other, I have no c...
Vladimir, hell is only one bit away from heaven (minus sign in the utility function). I would hope though that any prospective heaven-instigators can find ways to somehow be intrinsically safe wrt this problem.
There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.
Expected utility is the product of two things, probability and utility. Saying the probability is smaller is not a complete argument.
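A toy illustration of that point (numbers invented purely for illustration, not anyone's actual estimates): a hundredfold-smaller probability still loses if the utility gap is a thousandfold larger.

```latex
% EU = probability x utility, so both factors matter:
%   utopia:   p = 10^{-1},  gain = +10^{3}  =>  EU = +100
%   dystopia: p = 10^{-3},  loss = -10^{6}  =>  EU = -1000
% The "exotic" low-probability branch dominates the sum.
\[
10^{-1} \times 10^{3} \;=\; 100
\qquad\text{vs.}\qquad
10^{-3} \times \bigl(-10^{6}\bigr) \;=\; -1000
\]
```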
The Superhappies can expand very quickly in principle, but it's not clear that they're doing so.
We (or "they" rather; I can't identify with your fanatically masochist humans) should have made that part of the deal, then. Also, exponential growth quickly swamps any reasonable probability penalty.
I'm probably missing something, but like others I don't get why the SHs implemented part of BE morality given that negotiations failed.
Shutting up and multiplying suggests that we should neglect all effects except those on the exponentially more powerful species.
Peter, destroying Huygens isn't obviously the best way to defect, as in that scenario the Superhappies won't create art and humor or give us their tech.
If they're going to play the game of Chicken, then symbolically speaking the Confessor should perhaps stun himself to help commit the ship to sufficient insanity to go through with destroying the solar system.
Well... would you prefer a life entirely free of pain and sorrow, having sex all day long?
False dilemma.
Can a preference against arbitrariness ever be stable? Non-arbitrariness seems like a pretty arbitrary thing to care about.
I would greatly prefer that there be Babyeaters, or even that I be a Babyeater myself, rather than the black hole scenario or a paperclipper scenario.
Seems to me it depends on the parameter values.
For what it's worth, I've always enjoyed stories where people don't get hurt more than stories where people do get hurt. I don't find previously imagined utopias that horrifying either.
I agree with Johnicholas. People should do this over IRC and call it "bloggingheadlessnesses".
In view of the Dunbar thing I wonder what people here see as a eudaimonically optimal population density. 6 billion people on Mars, if you allow for like 2/3 oceans and wilderness, means a population density of 100 per square kilometer, which sounds really really high for a cookie-gatherer civilization. It means if you live in groups of 100 you can just about see the neighbors in all directions.
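Checking the arithmetic (using Mars's surface area of roughly 1.45 x 10^8 km^2; the one-third-habitable fraction is the comment's own assumption):

```latex
% Habitable area: one third of Mars's ~1.45e8 km^2 surface, about 4.8e7 km^2.
% Density: 6e9 people / 4.8e7 km^2, about 124 people per km^2.
% Dunbar-sized groups of ~100 then occupy ~0.8 km^2 each, putting
% neighboring bands roughly a kilometer apart in every direction.
\[
\frac{6\times 10^{9}}{\tfrac{1}{3}\times 1.45\times 10^{8}\ \mathrm{km}^2}
\;\approx\; 124\ \mathrm{people/km}^2
\]
```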
"boreana"
This means "half Bolivian half Korean" according to urbandictionary. I bet I'm missing something.
Perhaps we should have a word ("mehtopia"?) for any future that's much better than our world but much worse than it could be. I don't think the world in this story qualifies; I hate to be the negative guy all the time, but if you keep human nature the same and "set guards in the air that prohibit lethal violence, and any damage less than lethal, your body shall repair", people still may abuse one another a lot, physically and emotionally. Also, I'm not keen on having to do a space race against a whole planet full of regenerating vampires.
The fact that this future takes no meaningful steps toward solving suffering strikes me as a far more important Utopia fail than the gender separation thing.
Or "what if you wake up in Dystopia?" and tossed out the window.
What is the counterargument to this? Maybe something like "waking up in Eutopia is as good as waking up in Dystopia is bad, and more probable"; but both of those statements would have to be substantiated.
So could it be said that whenever Eliezer says "video game" he really means "RPG", as opposed to strategy games, which have different principles of fun?
Probably the space you could visit at light speed in a given subjective time would be unreasonably large, depending on speedup and miniaturization.
Eliezer, "more AIs are in the hurting class than in the disassembling class" is a distinct claim from "more AIs are in the hurting class than in the successful class", which is the one I interpreted Yvain as attributing to you.