
V_V comments on 2012 Less Wrong Census Survey: Call For Critiques/Questions - Less Wrong Discussion

20 Post author: Yvain 19 October 2012 01:12AM


Comment author: V_V 20 October 2012 01:03:59PM *  5 points [-]

Lord Anthony of the House Stark has developed a technology to create copies of people. He offers to make 99,999 copies of you; in exchange, you and your copies will have to become his serfs and live the rest of your lives as medieval subsistence farmers. Assume that:

  • Living as a subsistence farmer is less desirable than your current lifestyle, but not so undesirable that you would wish to kill yourself.

  • If you refuse his offer, your lifestyle is not going to be disrupted by extreme events such as catastrophes or technological singularities.

Questions:

1) Do you accept his offer?

2) Do you believe that accepting the offer is moral, immoral, or morally neutral?

(first appeared here)

Comment author: shminux 20 October 2012 05:56:35PM 3 points [-]

Living as a subsistence farmer is less desirable than your current lifestyle, but not so undesirable that you would wish to kill yourself.

There is a lot of room between the two. It might be worth specifying something more concrete along the lines of EY's proposal of "lives barely worth celebrating".

Comment author: [deleted] 21 October 2012 02:31:50PM 1 point [-]

Or maybe specify the number p such that I'd be indifferent between becoming a subsistence farmer with probability 1 and killing myself with probability p.

Comment author: shminux 21 October 2012 04:20:01PM 0 points [-]

How many pebbles form a heap?

Comment author: RobertLumley 21 October 2012 07:42:51PM 0 points [-]

In Java, the default heap is 128 megabytes, or 1,073,741,824 bits. If you assume that half the heap will have a 1 instead of a 0, i.e. a pebble as opposed to no pebble, I would say that around 536,870,912 pebbles form a heap.
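
For what it's worth, the arithmetic in the joke checks out; a minimal sketch (the 128 MB figure and the half-set-bits assumption are the commenter's premises, not actual JVM defaults, which vary by version and machine):

```java
public class HeapPebbles {
    public static void main(String[] args) {
        long megabytes = 128;
        // 128 MB = 134,217,728 bytes = 1,073,741,824 bits
        long bits = megabytes * 1024 * 1024 * 8;
        // Half the bits set to 1 ("pebbles")
        long pebbles = bits / 2;
        System.out.println(bits);     // 1073741824
        System.out.println(pebbles);  // 536870912
    }
}
```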

Comment author: V_V 20 October 2012 07:24:35PM 1 point [-]

"Lives barely worth celebrating" doesn't sound very concrete to me. Do you have a better proposal?

Comment author: ArisKatsaris 21 October 2012 11:19:21AM 3 points [-]

I think the difference between "Lives worth living" and "Lives worth celebrating" is basically a difference between "I opt to not mercy-kill this person." and "I opt to bring this person into existence" -- the precise levels of happiness/utility required are of course subjective, but the latter is generally considered to be higher than the former...

Comment author: [deleted] 21 October 2012 06:30:43PM 2 points [-]

The main reason I don't kill miserable people in the real world (other than ethical injunctions) is that it would sadden/have negative externalities on other people. ISTM that certain thought experiments yield preposterous results as a result of neglecting this.

Comment author: V_V 21 October 2012 01:44:57PM 1 point [-]

The question asks if you opt to bring 99,999 people into existence. Adding the assumption that bringing those people into existence is worthwhile would beg the question.

Comment author: ArisKatsaris 21 October 2012 03:29:05PM *  2 points [-]

Yeah, my formulation of this was a bit clumsy. Perhaps instead of
a1) "I opt to not mercy-kill this person." and
b1) "I opt to bring this person into existence"
we could have
a2) "I prefer it that this person continues living." and
b2) "I prefer that this person existed in the first place from the counterfactual in which they never existed."

This detaches slightly the decision (the verb "opt") from the statement-of-preferences.

Also, even with the earlier formulation, there are, I guess, nitpicks which can be made: bringing the same person into existence 99,999 times may not be valued in the same way that bringing 99,999 different persons into existence would.

Comment author: [deleted] 21 October 2012 02:33:52PM 0 points [-]

No, as you'd also be taking your current life as a person-better-off-than-a-subsistence-farmer out of existence.

Comment author: shminux 20 October 2012 09:30:09PM 2 points [-]

Comment author: NancyLebovitz 21 October 2012 03:14:59PM 2 points [-]

What's Lord Anthony of House Stark up to? I bet there's a utilitarian loss somewhere in his plans.

Comment author: [deleted] 21 October 2012 06:10:01PM 2 points [-]

That's what I immediately thought about, too, but for the sake of the hypothetical I assumed he isn't doing anything extraordinarily good or extraordinarily evil.

Comment author: V_V 21 October 2012 10:43:11PM *  1 point [-]

Assume that the utility Lord Stark gains from the servitude of 100,000 instances of you approximately balances the costs he incurs in order to create the 99,999 copies, although he gets a small net gain. He would not break even if he offered to create 99,998 copies.

The utility of people other than you, your copies and Lord Stark is not affected by the transaction (there are no externalities).

Comment author: ArisKatsaris 21 October 2012 02:15:12AM *  2 points [-]

1) No.

2) Probably morally neutral, at least in the sense that all self-inflicted harm can be considered morally neutral.

Comment author: [deleted] 21 October 2012 01:29:57AM 2 points [-]

Of course not. Why the hell would I?

Comment author: V_V 21 October 2012 12:45:19PM 1 point [-]

If you were a total utilitarianist you would likely believe that accepting the offer is the only moral option.

Comment author: wedrifid 21 October 2012 01:31:31PM *  2 points [-]

If you were a total utilitarianist you would likely believe that accepting the offer is the only moral option.

Your specification doesn't make this necessarily true. You set the bounds on the utility of the subsistence farmers to "> 0", rather than "> current_you/100,000". Of course, total utilitarians being what they are (crazy), it is actually only required that "bonus_utility_for_Stark + subsistence_utility * 100,000 > current_you_utility". i.e. The total utilitarian would willingly submit 100,000 instances of himself to a negative utility fate worse than death if it made Stark (sufficiently) happy.
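
The inequality above can be sketched with hypothetical numbers (all the utility figures below are invented for illustration; nothing here comes from the thread beyond the decision rule itself):

```java
public class TotalUtilCheck {
    // The total utilitarian accepts iff Stark's bonus utility plus the
    // farmers' summed utility exceeds your current solitary utility.
    static boolean accepts(double starkBonus, double farmerUtility,
                           int copies, double currentUtility) {
        return starkBonus + farmerUtility * copies > currentUtility;
    }

    public static void main(String[] args) {
        // Even a tiny positive per-farmer utility dominates once
        // multiplied by 100,000: 0 + 0.001 * 100,000 = 100 > 50.
        System.out.println(accepts(0.0, 0.001, 100_000, 50.0));      // true
        // With negative per-farmer utility, Stark's gain must cover the
        // whole deficit: 100,150 - 100,000 = 150 > 50.
        System.out.println(accepts(100_150.0, -1.0, 100_000, 50.0)); // true
    }
}
```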

(Note the usage "total utilitarian" rather than "total utilitarianist".)

Comment author: prase 21 October 2012 03:32:01PM 0 points [-]

(Note the usage "total utilitarian" rather than "total utilitarianist".)

Is "total utilitarianist" a thing (distinct from "total utilitarian")?

Comment author: Kindly 21 October 2012 06:55:18PM 2 points [-]

The word "utilitarian" is already terrible (everything past the first four letters is a jumble of suffix); even if "utilitarianist" were a real word, it would be better not to use it.

I wonder how hard it would be to convince everyone (or at least a substantial minority of everyone) to switch to "utilist" or something equally concise.

Comment author: wedrifid 22 October 2012 10:09:39PM *  2 points [-]

I wonder how hard it would be to convince everyone (or at least a substantial minority of everyone) to switch to "utilist" or something equally concise.

I'd prefer to switch everyone to abandoning "utilitarian" entirely as a ridiculous (and abhorrent) value system that doesn't deserve the privilege it seems to be granted by frequent reference.

Comment author: wedrifid 21 October 2012 04:50:25PM 1 point [-]

Is "total utilitarianist" a thing (distinct from "total utilitarian")?

Not that either I or Google has heard of.

Comment author: [deleted] 21 October 2012 01:01:28PM *  1 point [-]

I'm not sure copies of the same person would count. Yes, they would diverge in a while, but one of them would still have very much less relative complexity given another than different people raised as different people would.

Comment author: drethelin 20 October 2012 01:35:07PM 2 points [-]

I don't care about total utility, so arbitrarily many copies of myself with a worse life is strictly worse to me than one copy with a better life. The subjective experience of each one will be that they exchanged a better life for a worse one, and each one will be identical. I do not accept the offer. I think the morality of accepting this offer varies from person to person.

On the other hand, I think a lot of people would take this offer if they themselves were paid handsomely and did not have to become a serf, but their copies did.

Comment author: Vladimir_Nesov 20 October 2012 04:39:18PM *  1 point [-]

I don't care about total utility

It's not clear what this means, and for reasonable guesses about that there seems to be no way for you to know the truth or falsity of this statement with significant certainty.

(Unless you mean that your emotional response or cached opinion is this way, which answers the original question to some extent, but in that case the specific phrase "I don't care about total utility" seems to be pretending to be an additional argument that justifies the emotion/opinion, which it doesn't seem to be doing.)

Comment author: drethelin 20 October 2012 07:23:58PM 1 point [-]

It's an emotional claim, but not unthought about.

But what I mean is I do not see adding entities that slightly prefer being alive to dying as worth doing. I don't think the total count of utility that exists is important. I value utility for existing entities. I would prefer a world of 10 thousand very happy people to 10 billion slightly happy people.