I laughed: SMBC comic.
SMBC comic: poorly programmed average-utility-maximizing AI
It's a total-utility maximising AI.
If it were a total-utility-maximizing AI it would clone the utility monster (or start cloning everyone else, if the utility monster's utility scales superlinearly). edit: On the other hand, if it were an average-utility-maximizing AI it would kill everyone else, leaving just the utility monster. In either case there'd be some serious population 'adjustment'.
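A minimal sketch of the two aggregation rules, with invented utility numbers (nothing here is from the comic), just to show why they push the population in opposite directions:

```python
# Toy comparison of total- vs. average-utility maximization.
# All numbers are invented for illustration.

def total_utility(utilities):
    return sum(utilities)

def average_utility(utilities):
    return sum(utilities) / len(utilities)

FELIX = 1_000_000   # the utility monster's utility
PLEB = 1            # an ordinary person's utility

status_quo = [FELIX] + [PLEB] * 1_000        # Felix plus everyone else
cloned_felix = [FELIX] * 2 + [PLEB] * 1_000  # what a total maximizer wants: more Felixes
only_felix = [FELIX]                         # what an average maximizer wants: no one dragging the mean down

assert total_utility(cloned_felix) > total_utility(status_quo)
assert average_utility(only_felix) > average_utility(status_quo)
print("total maximizer clones Felix; average maximizer removes everyone else")
```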
Felix means happy (or lucky), and is the origin of the word felicity. It took me a while to realize this, so I thought I would note it. Is it obvious to all native English speakers?
The latest SMBC comic is now an illustrated children's story, which more or less brings up thoughts parallel to Cynical about Cynicism.
Everyone's talking about this as if it were a hypothetical, but as far as I can tell it describes pretty accurately how hierarchical human civilizations tend to organize themselves once they hit a certain size. Isn't a divine ruler precisely someone who is more deserving and more able to absorb resources? Aren't the lower orders people who would not appreciate luxuries and indeed have fully internalized that fact ("Not for the likes of me")?
If you skip the equality requirement, it seems history is full of utilitarian societies.
Felix is 3^^^3 units happy. And not a single dust speck in his eyes. What is torturing millions, compared to such a noble goal?
I, of course, reject that "sequence" which preaches exactly this.
That's because your brain doesn't have the ability to imagine just how happy Felix is and fails to weigh his actual happiness against humanity's.
Look. You have one person under terrible torture for 50 years on one side, and a gazillion people with a slight discomfort every year or so on the other side.
It is claimed that the first is better.
Now, you have humanity, small as it is, only enslaved to build pyramids for Felix. He has eons of subjective time to enjoy these pyramids and he is unbelievably happy, happier than any man, woman or child could ever be. The amount of happiness Felix gets outweighs the misery of billions of people by a factor of a million.
What's the fundamental difference between those two cases? I don't see it, do you?
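Putting rough numbers on the claimed parallel (all figures invented for illustration; the reply below disputes whether the cases really are parallel): in both cases a straight sum over people endorses the counterintuitive option.

```python
# Invented figures, only to make the claimed parallel concrete.

# Case 1: torture vs. dust specks -- pick whichever option has less total disutility.
torture_one_person = -1_000_000               # one person, 50 years of torture
dust_speck_everyone = -0.001 * 3_000_000_000  # a tiny discomfort times a huge population
print(torture_one_person > dust_speck_everyone)  # True: the sum prefers the torture

# Case 2: Felix -- pick whichever option has more total utility.
leave_humanity_alone = 0
enslave_for_pyramids = -7_000_000_000 + 1_000_000 * 7_000_000_000  # billions miserable, Felix a million times happier
print(enslave_for_pyramids > leave_humanity_alone)  # True: the sum prefers the pyramids
```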
The only similarity between those cases is that they involve utility calculations you disagree with. Otherwise every single detail is completely different (e.g. the sort of utility considered; two negative utilities being traded against each other vs. trading utility elsewhere (positive and negative) for positive utility; which side of the trade the single person with the large individual utility difference is on; the presence of perverse incentives; etc.).
If anything it would be more logical to equate Felix with the tortured person and treat this as a reductio ad absurdum of your position on the dust speck problem. (But that would be wrong too, since the numbers aren't actually the problem with Felix; the incentive to manipulate your own utility function that way is, among other things.)
It only implies that if your AI is totally omniscient.
edit: Anyhow, I can of course think of an AI that can do better than humanity: the AI sits inside Jupiter and nudges away any incoming comets and asteroids, and that's it (then, as the sun burns up and then burns out, it moves Earth around). The problem starts when you make the AI discriminate between very similar worlds. edit: And even that asteroid-stopping AI may be a straitjacket on intelligent life, since it may be that mankind is the wrong thing entirely and should be permitted to kill itself, with the meteorite impacts then allowed so that the ants get a chance.
I don't know much about my own extrapolated preferences, but I can reason that, as my preferences are the product of noise in the evolutionary process, reality is unlikely to align with them naturally. It's possible that my preferences consider "mankind a wrong thing entirely"; but that they would align with whatever the universe happens to produce next on earth (assuming the ...