
Lukas_Gloor comments on Physicists To Test If Universe Is A Computer Simulation (link) - Less Wrong Discussion

Post author: D_Alex 17 April 2013 02:23AM (4 points)




Comment author: Lukas_Gloor 17 April 2013 08:54:41PM 1 point

Therefore, in this scenario, every human being would have a solemn duty to make the world as interesting as possible.

Great post, but this is where you lost me. I have a hard time prioritizing "interesting" over reducing suffering, and I find it repugnant that some beings created a universe where quintillions of sentient creatures have been suffering and dying for half a billion years on this planet alone. OK, maybe the creators had the decency to "shortcut" all the suffering so it wasn't actually experienced; that's the upside of the thought.

Hmm, that would make for a good religion too: you only remember the suffering, but during the actual moments you were zombified, so you're merely misremembering!

Comment author: Decius 19 April 2013 04:06:44AM 1 point

My trouble was in figuring out what "interesting" means to beings that can model a universe.

Comment author: Decius 19 April 2013 04:18:26AM 0 points

I find it repugnant that some beings created a universe where quintillions of sentient creatures have been suffering and dying for half a billion years on this planet alone.

Meanwhile, they find something utterly alien about solar fusion repugnant yet utterly fascinating.

Comment author: Lukas_Gloor 19 April 2013 07:30:03AM 0 points

Yes, in which case their evaluation doesn't correspond to any first-person evaluations other than their own (because solar fusion likely doesn't have any of those), whereas my evaluation reflects all the first-person perspectives out there. I'm being altruistic; they aren't. Sure, they might not care about that, and indeed, if the creators themselves aren't capable of suffering, they might not even realize they're being a**holes, but otherwise they'd obviously be total jerks in a very objective sense -- for whatever that's worth.

Comment author: Decius 20 April 2013 01:28:32AM -1 points

What if they have first-person perspectives which are objectively comparable to us in the same way that we are comparable to solar fusion?

What are the necessary and sufficient conditions to be "total jerks" in any objective sense?

Comment author: Lukas_Gloor 20 April 2013 06:33:19AM * 0 points

Then, if I understand the question correctly, the creators would be partially altruistic, which we'd mistake for being non-altruistic because we don't understand that solar fusion can suffer.

"Not taking other-regarding reasons for actions seriously" makes you a total jerk. "Others" are beings with a first-person perspective, the only type of entities for which things can go well or not well in a sense that is more than just metaphorical. You could say that it is "bad for a rock" if the rock is split into parts, but there isn't anything there to mind the splitting so at best you're saying that you find it bad if rocks are split.

The above view fits into LW metaethics in the following way: no matter their "terminal values", everyone can try to answer which action-guiding set of principles best reflects what is good or bad for others. So once you specify what the goalpost of ethics in this sense is, everyone can play the game. Some agents will, however, state that they don't care about ethics if defined like that, which implies that their "terminal value" doesn't include altruism (or at least that they think it doesn't, which may sometimes happen if people are too quick to declare things their "terminal value" -- it's kind of a self-fulfilling prophecy if you think about it).

Comment author: Decius 21 April 2013 03:34:05AM 0 points

Would it be immoral to fully simulate a single human with brain cancer if there were an expected return of saving more than one actual human with brain cancer? What if the expectation were of saving fewer than one actual human? (Say, a one-in-X chance of saving fewer than X patients, so that the expected number saved is below one.) What if there were no chance of saving an actual patient at all as a result of the simulation? Assume that simulating the human and the cancer well enough requires, among other things, that the simulated human say that he is self-aware.

Comment author: TheOtherDave 21 April 2013 06:52:38AM 2 points

I've never quite understood, in cases like this, how "fully simulate a single human with brain cancer" and "create a single human with brain cancer" are supposed to differ from one another. Because boy do my intuitions about the situation change when I change the verb.

Comment author: gryffinp 17 April 2013 10:51:25PM 0 points

I find it repugnant that some beings created a universe where quintillions of sentient creatures have been suffering and dying for half a billion years on this planet alone.

Isn't that an inevitable conclusion of the basic "the universe is a simulation" premise?