Facebook still does not have my phone number. Not sure what you did to trigger phone number verification...
I misremembered; you are correct. I may instead have been frustrated with finding a temporary email address it would accept (they block the most common disposable providers, I think).
Why would you not create a sockpuppet Facebook account for the purpose of reading the posts you want to read?
Not speaking for the above poster: because that's not actually trivial. You need a working (fake) phone number to receive the verification code on, etc. Also, putting fake data into a computer system feels unvirtuous enough to put me off doing it.
OP is assuming selfishness, which makes this a True Prisoner's Dilemma: any PD is a TPD for a selfish person. Is cooperating still the obvious thing to do if you're selfish?
Yes, for a copy close enough that he will do everything that I will do and nothing that I won't. In simple resource-gain scenarios like the OP's, I'm selfish relative to my value system, not relative to my locus of consciousness.
I think that's a totally okay preference structure to have (or to prefer with metapreferences or whatever).
Delicious reinforcement! Thank you, friend.
If we raised different hands, I do think we would quickly diverge completely as measured by how many of our body movements match. That doesn't mean we would be very different, or that I'm fragile: I'm pretty much the same as I was a week ago, even though my movements now are different. I was just pointing out that "decisions" isn't much better defined than the divergence it was being used to define.
I would automatically cooperate
In a True Prisoner's Dilemma, or even in situations like the OP? The divergence there is that one person knows they are "A" and the other "B", in ways relevant to their actions.
Ah, I see. We may not disagree, then. My angle was simply that "continuing to agree on all decisions" might be quite robust to environmental noise, as long as the decision engages my values (i.e. not chocolate versus vanilla, which I might settle with a coin flip anyway!).
In the OP's scenario, yes, I cooperate without bothering to reflect. It's clearly, obviously, the thing to do, says my brain.
I don't understand the relevance of the TPD. How can I possibly be in a True Prisoner's Dilemma against myself, when I can't even be in a TPD against a randomly chosen human?
10% or more of all decisions
Then we have the problem of deciding what counts as a decision. Even very minor changes will invalidate a broad definition like "body movements", since most body movements will differ after the two copies diverge.
My preferred divergence point is as soon as the cloning happens. I'm open to accepting that as long as they are identical, they can cooperate, but that can be justified by pure TDT without invoking "caring for the other". Any divergence stops this; that's my Schelling point.
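To make the pure-TDT justification concrete, here is a minimal sketch (the payoff numbers are hypothetical, just a standard PD matrix; none of this is from the thread): because an identical copy runs the same decision procedure on the same inputs, the off-diagonal outcomes are unreachable, so cooperating beats defecting without any appeal to caring about the copy.

```python
# Hypothetical PD payoffs (to me), indexed by (my move, copy's move).
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, copy defects
    ("D", "C"): 5,  # I defect, copy cooperates
    ("D", "D"): 1,  # mutual defection
}

def outcome_with_identical_copy(my_move: str) -> int:
    # An identical copy's move is guaranteed to equal mine, so the
    # off-diagonal cells of the payoff matrix are never reached.
    return PAYOFF[(my_move, my_move)]

assert outcome_with_identical_copy("C") > outcome_with_identical_copy("D")  # 3 > 1
```

Once the copies diverge, the correlation between the two decision procedures is no longer guaranteed, and this argument stops applying; that is exactly the Schelling point above.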
Do you really think your own nature that fragile?
(Please don't read that line in a judgemental tone. I'm simply curious.)
I would automatically cooperate with a me-fork for quite a while if the only "divergence" that took place was on the order of raising a different hand, or seeing the same room from a different angle. It doesn't seem like value divergence would come of that.
I'd probably start getting suspicious in the event that "he" read an emotionally compelling novel or work of moral philosophy I hadn't read.
Hm, this makes me realize I could have made this post more stand-alone. The idea was that you eat the candy and experience a non-transferable reward. But let me give an example of what I mean by selfish preferences.
If someone made a copy of me and said they could either take me hang-gliding, or take my copy, I'd prefer that I go hang-gliding. Selfishly :P
Assuming we substitute something I actually want to do for hang-gliding...
("Not the most fun way to lose 1/116,000th of my measure, thanks!" say both copies, in stereo)
...and that I don't specifically want to avoid non-shared experiences, which I probably do...
("Why would we want to diverge faster, anyway?" say the copies, raising simultaneous eyebrows at Manfred)
...that's what coinflips are for!
(I take your point about non-transferability, but I claim that B-me would press the button even if it was impossible to share the profits.)
I am confident that, in this experiment, my B-copy would push the button, my A-copy would walk away with 60 candies, and shortly thereafter, if allowed to confer, they would both have 30. And that this would happen with almost no angst.
I'm puzzled as to why you think this is difficult. Are people being primed by fiction, where characters invariably struggle against their clones to create drama?
It depends on your definition of supernatural, and most people on LessWrong seem to have a very narrow definition of supernatural. I think Eliezer once wrote a post about it, but I don't believe he cited any references. Some definitions of supernatural would require many people on here to revise their estimate significantly upward. I took the lack of a definition to mean we should use any and all possible definitions of supernatural when considering the question, which is why I picked 100 percent. There's actually been a discussion on whether simulations imply God, and most answered no. I thought the reasoning some used for that was rather peculiar. That discussion of course didn't include any citations either.
Interesting. I consider poisoning big surveillance/marketing databases to be virtuous X-D
I don't like to frustrate the poor databases' telos; they are not at fault for the uses humans put their data to.
(Yes, I realise this is silly. It's still an actual weight in the mess I call a morality; just a small one.)