Comment author: Lumifer 21 January 2015 04:04:42PM *  2 points

putting fake data into a computer system feels disvirtuous enough to put me off doing it further.

Interesting. I consider poisoning big surveillance/marketing databases to be virtuous X-D

Comment author: Leonhart 21 January 2015 11:08:14PM *  1 point

I don't like to frustrate the poor database's telos; it is not at fault for the use humans put its data to.

(Yes, I realise this is silly. It's still an actual weight in the mess I call a morality; just a small one.)

Comment author: Elo 21 January 2015 09:20:21AM 4 points

Facebook still does not have my phone number. Not sure what you did to need phone number verification...

Comment author: Leonhart 21 January 2015 10:57:57PM 2 points

I misremembered; you are correct. I was possibly instead frustrated with finding a temporary email address that it would accept (they block the most common disposables, I think).

Comment author: mwengler 20 January 2015 01:04:19PM 8 points

Why would you not create a sockpuppet Facebook account for the purpose of reading the posts you want to read?

Comment author: Leonhart 21 January 2015 08:54:48AM 0 points

Not speaking for the above poster: because that's not actually trivial - you need a real fake phone number to receive validation on, etc. Also, putting fake data into a computer system feels disvirtuous enough to put me off doing it further.

Comment author: ike 15 January 2015 09:53:04PM 0 points

OP is assuming selfishness, which makes this True. Any PD is TPD for a selfish person. Is it still the obvious thing to do if you're selfish?

Comment author: Leonhart 15 January 2015 10:04:59PM 0 points

Yes, for a copy close enough that he will do everything that I will do and nothing that I won't. In simple resource-gain scenarios like the OP's, I'm selfish relative to my value system, not relative to my locus of consciousness.

Comment author: Manfred 15 January 2015 08:59:30PM 0 points

I think that's a totally okay preference structure to have (or to prefer with metapreferences or whatever).

Comment author: Leonhart 15 January 2015 09:46:35PM 0 points

Delicious reinforcement! Thank you, friend.

Comment author: ike 15 January 2015 08:57:49PM 0 points

If we raised different hands, I do think it would quickly cause us to diverge completely in terms of which body movements match. That doesn't mean we would be very different, or that I'm fragile. I'm pretty much the same as I was a week ago, but my movements now are different. I was just pointing out that "decisions" isn't much better defined than the divergence it was being used to define.

I would automatically cooperate

In a True Prisoner's Dilemma, or even in situations like the OP? The divergence there is that one person knows they are "A" and the other "B", in ways relevant to their actions.

Comment author: Leonhart 15 January 2015 09:44:08PM *  1 point

Ah, I see. We may not disagree, then. My angle was simply that "continuing to agree on all decisions" might be quite robust versus environmental noise, assuming the decision is felt to be impacted by my values (i.e. not chocolate versus vanilla, which I might settle with a coinflip anyway!)

In the OP's scenario, yes, I cooperate without bothering to reflect. It's clearly, obviously, the thing to do, says my brain.

I don't understand the relevance of the TPD. How can I possibly be in a True Prisoner's Dilemma against myself, when I can't even be in a TPD against a randomly chosen human?

Comment author: ike 15 January 2015 04:06:50PM 0 points

10% or more of all decisions

Then we have the problem of deciding what counts as a decision. Even very minor changes will invalidate a broad definition like "body movements", as most body movements will be different after the two diverge.

My preferred divergence point is as soon as the cloning happens. I'm open to accepting that as long as they are identical, they can cooperate, but that can be justified by pure TDT without invoking "caring for the other". But any divergence stops this; that's my Schelling point.

Comment author: Leonhart 15 January 2015 08:51:48PM 1 point

Do you really think your own nature that fragile?

(Please don't read that line in a judgemental tone. I'm simply curious.)

I would automatically cooperate with a me-fork for quite a while if the only "divergence" that took place was on the order of raising a different hand, or seeing the same room from a different angle. It doesn't seem like value divergence would come of that.

I'd probably start getting suspicious in the event that "he" read an emotionally compelling novel or work of moral philosophy I hadn't read.

Comment author: Manfred 15 January 2015 03:52:01PM 1 point

Hm, this points out to me that I could have made this post more stand-alone. The idea was that you eat the candy and experience a non-transferrable reward. But let me give an example of what I mean by selfish preferences.

If someone made a copy of me and said they could either take me hang-gliding, or take my copy, I'd prefer that I go hang-gliding. Selfishly :P

Comment author: Leonhart 15 January 2015 05:49:48PM 0 points

Assuming we substitute something I actually want to do for hang-gliding...

("Not the most fun way to lose 1/116,000th of my measure, thanks!" say both copies, in stereo)

...and that I don't specifically want to avoid non-shared experiences, which I probably do...

("Why would we want to diverge faster, anyway?" say the copies, raising simultaneous eyebrows at Manfred)

...that's what coinflips are for!

(I take your point about non-transferability, but I claim that B-me would press the button even if it was impossible to share the profits.)

Comment author: Leonhart 15 January 2015 09:42:57AM *  3 points

I am confident that, in this experiment, my B-copy would push the button, my A-copy would walk away with 60 candies, and shortly thereafter, if allowed to confer, they would both have 30. And that this would happen with almost no angst.

I'm puzzled as to why you think this is difficult. Are people being primed by fiction where they invariably struggle against their clones to create drama?

Comment author: FrameBenignly 05 January 2015 08:38:12PM 1 point

It depends on your definition of supernatural, and most people on LessWrong seem to have a very narrow definition of supernatural. I think Eliezer once wrote a post about it, but I don't believe he cited any references. Some definitions of supernatural would require many people on here to revise their estimate significantly upward. I took the lack of a definition to mean we should use any and all possible definitions of supernatural when considering the question, which is why I picked 100 percent. There's actually been a discussion on whether simulations imply God, and most answered no. I thought the reasoning some used for that was rather peculiar. That discussion of course didn't include any citations either.

Comment author: Leonhart 06 January 2015 12:02:41AM 3 points

You're thinking of this one, and he cited Carrier, and we have this argument after every survey. At this point it's a Tradition, and putting "ARGH LOOK JUST USE CARRIER'S DEFINITION" on the survey itself would just spoil it :)
