Prolorn

I am asking a single question: Is there (or can we build) a morality that can be derived with logic from first principles that are obvious to everyone and require no Faith?

Perhaps you've already encountered this, but your question calls to mind the following piece by Yudkowsky: No Universally Compelling Arguments, which is near the start of his broader metaethics sequence.

I think it's one of Yudkowsky's better articles.
(On a tangential note, I'm amused to find on re-reading it that I had almost the exact same reaction to The Golden Transcendence, though I had no conscious recollection of the connection when I got around to reading it myself.)

Prolorn

It's not a binary distinction. If an identical copy were made of one mind and tortured while the other instance remained untortured, they would start to differentiate into distinct individuals. Since the rate of divergence would increase with the degree of difference in experience, I imagine torture vs. non-torture would spark a fairly rapid divergence.

I haven't had the opportunity to commit to reading Bostrom's paper, but in the little I did read, Bostrom thought it was "prima facie implausible and farfetched to maintain that the wrongness of torturing somebody would be somehow ameliorated or annulled if there happens to exist somewhere an exact copy of that person's resulting brain-state." That is, it seemed obvious to Bostrom that having two identical copies of a tortured individual must be worse than having one instance of a tortured individual (twice as bad, in fact, if I interpret him correctly). That does not at all seem obvious to me, as I would consider two (synchronized) copies to be one individual in two places. The only thing worse about having two copies that occurs to me is a greater risk of divergence, leading to increasingly distinct instances.

Are you asking whether it would be better to create a copy of a mind and torture it rather than not creating a copy and just getting on with the torture? Well, yes. It's certainly worse than not torturing at all, but it's not as bad as torturing the sole, uncopied mind. Initially, the individual would half-experience torture. Fairly rapidly thereafter, the single individual would separate into two minds, one being tortured and one not. This is arguably still better from the perspective of the pre-torture mind than the single-mind, single-torture scenario, since at least half of the mind's downstream experiences are not torture, versus 100% torture in the other case.

If this doesn't sound convincing, consider a twist: would you choose to copy and rescue the mind-state of someone about to, say, be painfully sucked into a black hole, or would it be ethically meaningless to create a non-sucked-into-a-black-hole copy? Granted, it would be best not to have anyone sucked into a black hole at all, but suppose you had to choose?

Prolorn

That doesn't resolve quanticle's objection. Your cutoff still suggests that a reasonably individualistic human is just as valuable as, say, the only intelligent alien being in the universe. Would you agree with that conclusion?

Prolorn

Does this apply to legal assisted suicide within the US as well?