
Comment author: Dagon 29 March 2017 01:14:06AM 0 points [-]

> That implies that you view replacing the death penalty in US states with 'death followed by uploading into an indefinite long-term simulation of confinement' as less heinous?

Clearly it's less harsh, and most convicts would prefer to experience incarceration for an indefinite time over a simple final death. This might change after a few hundred or million subjective years, but I don't know - it probably depends on what activities the em has access to.

Whether it's "heinous" is harder to say. Incarceration is a long way from torture, and I don't know what the equilibrium effect on other criminals will be if it's known that a formerly-capital offense now enables a massively extended lifespan, albeit in jail.

Comment author: RedMan 29 March 2017 08:00:53PM 0 points [-]

The suicide rate for incarcerated Americans is three times that of the general population, and anecdotally, many death row inmates have expressed a desire to 'hurry up with it'. Werner Herzog's interviews of George Rivas and his co-conspirators are good examples of the sentiment. There is still debate about the effectiveness of the death penalty as a deterrent to crime.

I suspect that some of these people may prefer an uncertain probability of divine confinement to hell over the certain continuation of their sentences at the hands of the state.

Furthermore, an altruist working to further the cause of secure deletion may be preventing literal centuries of human misery. Why is this any less important than feeding the hungry, who at most will suffer for a proportion of a single lifetime?

Comment author: Dagon 27 March 2017 03:29:48PM 2 points [-]

I do care about his reasoning, and disagree with it (most notably the "any torture -> infinite torture" part, with no counterbalancing "any pleasure -> ?" term in the calculation).

But I'm with lahwran on the conclusion: destroying the last copy of someone is especially heinous, and nowhere near justified by your reasoning. I'll join his precommitment to punish you if you commit crimes in pursuit of these wrong beliefs (note: plain old retroactive punishment, nothing acausal here).

Comment author: RedMan 29 March 2017 12:09:57AM 0 points [-]

Under your second paragraph, destroying the last copy is especially heinous. That implies that you view replacing the death penalty in US states with 'death followed by uploading into an indefinite long-term simulation of confinement' as less heinous? The status quo is to destroy the only copy of the mind in question.

Would it be justifiable to simulate prisoners whose sentences are longer than their expected lifespans, so that they can live out their entire punitive terms and rejoin society as Ems?

Thank you for the challenging responses!

Comment author: lahwran 27 March 2017 04:58:29AM 2 points [-]

of course not, you're not destroying the primary copy of me. But that's changing the case you're making; you specifically said that killing now is preferable. I would not be ok with that.

Comment author: RedMan 29 March 2017 12:05:21AM 0 points [-]

Correct, that is different from the initial question; you made your position on that topic clear.

Would the copy on the satellite disagree about the primacy of the copy not in the torture sim? Would a copy have the right to disagree? Is it morally wrong for me to spin up a dozen copies of myself and force them to fight to the death for my amusement?

I'm guessing, based on your responses, that you would agree with the statement 'copies of the same root individual are property of the copy with the oldest creation timestamp, and may be created, destroyed, and abused at the whims of that first copy, and no one else'.

If you copy yourself, and that copy commits a crime, are all copies held responsible, just the 'root' copy, or just the 'leaf' copy?

Thank you for the challenging responses!

Comment author: RobinHanson 26 March 2017 11:18:00PM 6 points [-]

If it is the possibility of large amounts of torture that bothers you, instead of large ratios of torture experience relative to other better experience, then any growing future should bother you, and you should just want to end civilization. But if it is ratios that concern you, then since torture isn't usually profitable, most em experience won't be torture. Even if some bad folks being rich means they could afford a lot of torture, that would still be a small fraction of total experience.

Comment author: RedMan 28 March 2017 11:59:29PM 0 points [-]

Thank you for your reply to this thought experiment, professor!

I accept your assertion that the ratio of aggregate suffering to aggregate felicity has been trending in the right direction, and that this trend is likely to continue, even into the Age of Em. That said, the core argument here is that as humans convert into Ems, all present day humans who become Ems have a high probability of eventually subjectively experiencing hell. The fact that other versions of the self, or other Ems are experiencing euphoria will be cold comfort to one so confined.

Under this argument, the suffering of people in the world today can be effectively counterbalanced by offering wireheading to Americans with a lot of disposable income--it doesn't matter if people are starving, because the number of wireheaded Americans is trending upwards!

An Age of Em is probably on balance a good thing. Even though I see the possibility of an intense devaluation of human life, and the possibility of some pretty horrific scenarios, I think that mitigating the latter is important, even if the proposed (controversial!) mechanism is inappropriate.

After all, if we didn't use cars, nobody would be harmed in car accidents.

Comment author: jmh 27 March 2017 02:54:47PM 1 point [-]

The answer seems fairly simple to me. You're not in any position to decide the risks others assume. If you're concerned about the potential torture, the only mind you can really do anything about is yours -- you don't run around killing everyone else, just yourself.

Comment author: RedMan 28 March 2017 11:46:56PM 0 points [-]

The question asks if ensuring secure deletion is an example of effective altruism. If I have the power to dramatically alter someone's future risk profile (say, by funding ads encouraging smoking cessation, even if the person is uninterested in smoking cessation at present), isn't it my duty as an effective altruist to attempt to do so?

Comment author: HungryHobo 28 March 2017 03:12:48PM 0 points [-]

This sounds like the standard argument around negative utility.

If you weight negative utility quite highly, then you could also come to the conclusion that the moral thing to do is to set to work on a virus to kill all humans as fast as possible.

You don't even need mind-uploading. If you weight suffering highly enough, you could decide that the right thing to do is to take a trip to a refugee camp full of people who, on average, are likely to have hard, painful lives, and leave a sarin gas bomb.

Put another way: if you encountered an infant with epidermolysis bullosa would you try to kill them, even against their wishes?

Comment author: RedMan 28 March 2017 11:44:10PM *  0 points [-]

Negative utility needs a non-zero weight. I assert that it is possible to disagree with your scenarios (refugees, infant) and still be trapped by the OP, if negative utility is weighted to a low but non-zero level, such that avoiding the suffering of a human lifespan is never adequate to justify suicide. After all, everyone dies eventually, no need to speed up the process when there can be hope for improvement.

In this context, can death be viewed as a human right? Removing the certainty of death means that any non-zero weight on negative utility can result in an arbitrarily large aggregate negative utility over the (potentially unlimited) lifetime of an individual confined in a hell simulation.
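To make that explicit (a minimal sketch; the symbols are my own notation, not anything from the post): if a confined copy's momentary suffering satisfies $s(t) \ge s_{\min} > 0$ and negative utility carries any weight $w > 0$, then the aggregate disutility over a confinement of subjective length $T$ is

$$ U_- \;=\; w \int_0^{T} s(t)\,dt \;\ge\; w\, s_{\min}\, T \;\longrightarrow\; \infty \quad \text{as } T \to \infty. $$

Mortality is what used to keep $T$ bounded.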

Comment author: entirelyuseless 26 March 2017 07:12:56PM 1 point [-]

Presumably someone who accepted the argument would be happy with this deal.

Comment author: RedMan 26 March 2017 07:46:57PM 0 points [-]

Correct, this is very much an 'I'll pray for you' line of reasoning. To use a religious example, it is better to martyr a true believer (who will escape hell) than to permit a heretic to live, as the heretic may turn others away from truth, and thus curse them to hell. So if you're only partially sure that someone is a heretic, it is safer for the community to burn them. Anyone who accepts this line of argument would rather be burnt than allowed to fall into heresy.

Unfortunately, mind uploading gives us an actual, honest road to hell, so the argument cannot be dispelled with the statement that the risk of experiencing hell is unquantifiable or potentially zero. As I argue here, it is non-zero and potentially high, so using moral arguments that humans have used previously, it is possible to justify secure deletion in the context of 'saving souls'. This does not require a blender; a 'crisis uploading center' may do the job just as well.

Comment author: lahwran 26 March 2017 07:08:51PM 2 points [-]

morality is about acausal contracts between counterfactual agents, and I do not want my future defended in this way. I don't care what you think of my suffering; if you try to kill me to prevent my suffering, I'll try to kill you back.

Comment author: RedMan 26 March 2017 07:35:02PM *  0 points [-]

I discover evidence that some sadistic jerk has stolen copies of both our minds, uploaded them to a torture simulation, and placed that simulation on a satellite orbiting the sun, with no external communication inputs and a command to run for as long as possible at maximum speed. Rescue via spaceship is challenging and would involve tremendous resources that we do not have available to us.

I have a laser I can use to destroy the satellite, but a limited window in which to do it (would have to wait for orbits to realign to shoot again).

Would you be upset if I took the shot without consulting you?

Comment author: SquirrelInHell 25 March 2017 09:42:33PM 1 point [-]

If you take the economic perspective (such as I understand R. Hanson's version to be), the only simulations we will ever run at scale are those that generate profits.

Torture is a money-sink with no economic value other than blackmail.

So torture in simulations will necessarily be marginalized (esp. so if humanity becomes better at pre-commitment to not respond to blackmail).

Comment author: RedMan 26 March 2017 11:39:18AM 0 points [-]

As stated in a separate comment, the human brain runs on about 20W, so that's probably a reasonable design goal for the power consumption of an emulation. Keeping a few copies of minds around for torture will eventually be a cheap luxury, comparable to leaving a lightbulb on.

Comment author: Dagon 25 March 2017 11:57:43PM 1 point [-]

Umm, stop waving your hands and start putting some estimates down. Especially when you say things like:

> Over a long enough timeline, the probability of a copy of any given uploaded mind falling into the power of a sadistic jerk approaches unity.

You show an inability to actually figure out the relative frequencies that would make this true or false. There's lots of ways this could be false - most notably there may be dozens of orders of magnitude more uploaded minds than sadistic jerks, and any nonzero cost of running a mind means the SJs simply can't afford to torture most of them.

> Once an uploaded mind has fallen under the power of a sadistic jerk, there is no guarantee that it will ever be 'free', and the quantity of experienced suffering could be arbitrarily large, due in part to the embarrassingly parallel nature of torture enabled by running multiple copies of a captive mind.

More unstated assumptions (with which I think I disagree). How are you aggregating suffering (or value generally) across minds? Do you think that identical torture of two copies of a mind is different from torture of one? Why? Do you think that any amount of future potential torture can remove the value of current pleasure? Why?

Even if you try to just quantify "value * experienced-seconds" and simply multiply, it's going to be hard to think anyone is better off NOT being uploaded.
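Spelling out that crude aggregation (a sketch only; the notation is mine): with $v_i(t)$ the momentary value experienced by copy $i$ and $T_i$ its subjective lifespan,

$$ U \;=\; \sum_i \int_0^{T_i} v_i(t)\,dt, $$

and since em copy counts and lifespans can dwarf a single biological lifetime, even a modest positive average $v_i$ makes the uploaded total come out ahead.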

Feel free to make choices for yourself, and even to advocate others to securely erase their information-patterns before it's too late. But without a lot more clear probability estimates and aggregation methodology, I think I'll take my chances and seek to continue living.

Comment author: RedMan 26 March 2017 11:34:27AM *  0 points [-]

For the sake of argument, some numbers to match the assumptions you named. Let's base these assumptions on some numbers available to Americans today, rounded to even numbers in the direction least favorable to my argument.

- Percentage of the population that are psychopaths: 1% (two orders of magnitude more non-psychopaths than psychopaths exist today)
- Probability of being the victim of a violent crime: varies a lot by demographics, but 10 per 1,000 per year is reasonable, so 1%
- Power consumption of a human mind: 20W (based on the human brain; we will not hit this immediately, but it is a design goal, and may even be exceeded in efficiency as we get better)
- Power consumed by a typical American household: 900kWh per month (at 20W, roughly five subjective years of em-time per month; closer to a century if efficiency improves by another order of magnitude or so)
- Number of humans available for uploading: 10 billion
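For concreteness, a minimal back-of-the-envelope sketch of those figures in Python (every input is just one of the assumptions listed above, not data):

```python
# Rough back-of-the-envelope check of the figures above.
# All inputs are the assumptions stated in this comment, not measured data.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

brain_power_w = 20.0             # assumed design-goal power draw of one em
household_kwh_per_month = 900.0  # typical American household consumption
population = 10e9                # humans available for uploading
psychopath_rate = 0.01           # 1% of the population

# Energy in one household-month, in joules
household_joules = household_kwh_per_month * 1000 * 3600

# Subjective brain-time purchasable with one household-month of electricity
brain_seconds = household_joules / brain_power_w
brain_years = brain_seconds / SECONDS_PER_YEAR
print(f"One household-month runs a {brain_power_w:.0f} W em for ~{brain_years:.1f} subjective years")
# -> ~5.1 years at 20 W; roughly a century only if efficiency improves ~20x

# Potential sadistic jerks among the uploaded population, under the 1% assumption
print(f"Psychopaths among {population:.0e} uploads: ~{population * psychopath_rate:.0e}")
```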

Over a hundred thousand years, that's a lot of terrible people, a lot of spare capacity for evil, and a high probability of everyone eventually experiencing a violent crime, like upload-torment. Changing those numbers enough to rule this scenario out requires incredible optimism about social developments and pessimism about technical developments.

I feel like just about anyone, even without a Stanford-prison-experiment-like environment, can muster up the will to leave a lightbulb on for a while out of spite.

Arguably, once 'captured', the aggregate total time spent experiencing torture for a given future copy of you may vastly exceed the time spent on anything else.

Anyone who argues in favor of 'merciful' euthanasia for people on the way to horrific medical problems would likely argue in favor of secure deletion to avoid an eternity in hell.
