I laughed: SMBC comic.
SMBC comic: poorly programmed average-utility-maximizing AI
It's a total-utility-maximizing AI.
If it were a total-utility-maximizing AI, it would clone the utility monster (or start cloning everyone else if the utility monster's happiness is superlinear). Edit: on the other hand, if it were an average-utility-maximizing AI, it would kill everyone else, leaving just the utility monster. In any case there'd be some serious population 'adjustment'.
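A toy sketch of the distinction, assuming simple additive per-capita utilities (all population sizes and utility numbers here are made up for illustration, not taken from the comic):

```python
# Toy comparison of total- vs. average-utility maximization with a
# utility monster. All magnitudes are invented placeholders.

ordinary_person = 1.0        # utility of one ordinary citizen
utility_monster = 1_000_000  # Felix's per-capita utility

def total_utility(monsters, ordinary):
    return monsters * utility_monster + ordinary * ordinary_person

def average_utility(monsters, ordinary):
    return total_utility(monsters, ordinary) / (monsters + ordinary)

# Total maximizer: each extra monster clone raises the sum,
# so it keeps cloning Felix.
print(total_utility(2, 7_000_000_000) > total_utility(1, 7_000_000_000))  # True

# Average maximizer: every ordinary person drags the mean down,
# so it 'adjusts' the population to just Felix.
print(average_utility(1, 0) > average_utility(1, 7_000_000_000))  # True
```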
Felix means happy (or lucky) in Latin, and is the origin of the word felicity. It took me a while to realize this, so I thought I would note it. Is it obvious to all native English speakers?
The latest SMBC comic is now an illustrated children's story that brings up thoughts more or less parallel to Cynical about Cynicism.
Everyone's talking about this as if it were a hypothetical, but as far as I can tell it describes pretty accurately how hierarchical human civilizations tend to organize themselves once they hit a certain size. Isn't a divine ruler precisely someone who is more deserving and more able to absorb resources? Aren't the lower orders people who would not appreciate luxuries, and who indeed have fully internalized that fact ("Not for the likes of me")?
If you skip the equality requirement, it seems history is full of utilitarian societies.
Felix is 3^^^3 units happy, and there is no dust speck in his eyes. What is the torture of millions next to this noble goal?
I, of course, reject that "sequence" which preaches exactly this.
That's because your brain doesn't have the ability to imagine just how happy Felix is and fails to weigh his actual happiness against humanity's.
Look. You have one person under terrible torture for 50 years on one side, and a gazillion people with a slight discomfort every year or so on the other side.
It is claimed that the first is better.
Now, you have our small humanity, as is, only enslaved to build pyramids for Felix. He has eons of subjective time to enjoy these pyramids, and he is unbelievably happy, happier than any man, woman or child could ever be. The amount of happiness of Felix outweighs the misery of billions of people by a factor of a million.
What's the fundamental difference between those two cases? I don't see it, do you?
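Under a naive additive tally the two trades do look structurally alike; here is a toy sketch of that tally (every magnitude below is an invented placeholder, not a figure from either scenario):

```python
# Naive additive-utilitarian tally of both trades. All numbers are
# illustrative; only the shape of the comparison matters.

# Dust specks vs. torture: many tiny harms vs. one enormous harm.
speck_disutility   = -1e-9
num_speck_victims  = 1e30   # stand-in for an unimaginably large number
torture_disutility = -1e6

specks_total = speck_disutility * num_speck_victims   # -1e21
print(torture_disutility > specks_total)  # True: torture 'wins' the tally

# Felix's pyramids: one enormous gain vs. many large harms.
felix_happiness  = 1e15
slave_disutility = -1e3
num_slaves       = 7e9

slavery_total = slave_disutility * num_slaves          # -7e12
print(felix_happiness + slavery_total > 0)  # True: enslavement 'wins' too
```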
The only similarity between those cases is that they involve utility calculations you disagree with. Otherwise every single detail is completely different (e.g. the sort of utility considered, two negative utilities being traded against each other vs. trading utility elsewhere (positive and negative) for positive utility, which side of the trade the single person with the large individual utility difference is on, the presence of perverse incentives, etc., etc.).
If anything it would be more logical to equate Felix with the tortured person and treat this as a reductio ad absurdum of your position on the dust speck problem. (But that would be wrong too, since the numbers aren't actually the problem with Felix; the problem, among other things, is that there's an incentive to manipulate your own utility function that way.)
People are individual survival machines, that's why. Each bastard in Omelas knows at the gut level (not in some abstract way) that there's a child being miserable specifically for a tiny bit of his happiness. His, personally. He will then kill for a larger bit of his happiness. He isn't society; he's an individual. It is all between him and that child. At the very best, between him and his family, and that child. Society ain't part of the equation. (And if it were, communism should have worked perfectly in that universe.) [assuming that the individual believes he won't be caught]
Edit: also, I think you don't understand the story. They didn't take the child apart for much-needed organs to save other folks in Omelas. The child is miserable for the purpose of bringing a sense of unity to the commune, for the purpose of making them value their happiness. That is already very irrational, and not only that, it is also entirely contrary to how Homo sapiens behave when exposed to gross injustice.
Edit: to explain my use of language: we are not talking about rational agents and what they ought to decide. We are talking about irrational agents that are supposedly (this is the premise of the story) made better behaved by participation in a pointless and evil ritual, which is the opposite of the known effect of direct participation in that sort of ritual on a populace. That's why the story makes a poor case against utilitarianism: the consequence it posits is grossly invalid.
Tapping out, inferential distance too wide.