Comment author: MichaelVassar 23 July 2010 03:55:04PM 0 points [-]

I don't actually feel hostile. I'm not offended at all. I don't think you are being dishonest, hostile, lazy, or in any sense a jerk. What I feel is a desire for you to practice critical thinking somewhere with lower standards (and especially by reading discussions where people actually change their mind, admittedly those are difficult to find) before posting here. I'd like you to actually learn to think and write more skillfully and then come back.

My comment was lazy and unhelpful. It deserves downvotes (given that opinion, should I just delete it?). As noted though, what CousinIt said is only the tip of the iceberg. I really don't want to try to explain all of what I think is wrong with it.

Comment author: Tenek 23 July 2010 05:41:51PM 5 points [-]

How is WrongBot going to learn to think and write more skillfully by moving to a place that's collectively worse at doing so?

Comment author: Clippy 22 July 2010 05:56:19PM *  13 points [-]

Well, it is sort of appealing, to be able to carefully contemplate my actions without the influence of emotion, and to get a paperclip on top of that! But then, I don't want to become some horrible robot that doesn't truly care about paperclips.

Comment author: Tenek 22 July 2010 08:43:04PM 8 points [-]

That doesn't help maximize paperclips, though. If you make all decisions based on two criteria - paperclip count and emotions - then the only situation in which those decisions differ from what you would have decided based solely on paperclip count is one in which you choose an outcome with fewer paperclips but a better emotional result.

If you were to refuse my offer, you would not only be losing a paperclip now, but also increasing the likelihood that in the future, you will decide to sacrifice paperclips for emotion's sake. Perhaps you will one day build a paperclip-creator that creates one paperclip per second, and I will threaten to destroy a paperclip unless you shut it down. If you care too much about the threatened paperclip you might comply, and then where would you be? Sitting in an empty room where paperclips should have been.

Comment author: Clippy 22 July 2010 01:53:25PM 11 points [-]

I understand that, for the foreseeable future, reasonable humans and clippys will disagree about the relative merit of different amounts of paperclips. But that does not justify such trollish article titles, which seem designed to do nothing but inflame our base emotions.

Comment author: Tenek 22 July 2010 03:28:31PM 8 points [-]

Would you trade those base emotions for a paperclip?

Comment author: Tenek 15 July 2010 02:21:02PM 8 points [-]

Well, I would have done some research and gotten a warm fuzzy feeling out of expanding my knowledge, but if you're going to displace that motivation with only a chance at a measly $10 I guess it's not worth my time.

http://naggum.no/motivation.html

Comment author: kraryal 07 July 2010 04:00:32AM 5 points [-]

I think that the idea is good, and the engineering is fine for back-of-the-envelope, but can we please call it a "vault" or something instead of a grave? Cryonics already has an image problem, and we don't want to suggest the people in the grave are permanently dead.

Comment author: Tenek 07 July 2010 06:04:26PM 2 points [-]

Then we can suggest that they're temporarily dead, but they're still dead, so it's a "grave". Religions have been saying that death is temporary for thousands of years anyways; it wouldn't be anything new.

Comment author: John_Maxwell_IV 22 May 2010 07:24:48PM 2 points [-]

If you both pre-commit simultaneously, you both lose.

How about making a pre-commitment that only applies if the other person hasn't made one?

Comment author: Tenek 23 May 2010 06:13:58AM 0 points [-]

Because you need to know if they've made a commitment, and using old information can get you burned if, as stated, you pre-commit simultaneously.

Comment author: billswift 22 May 2010 09:17:53AM 2 points [-]

The statement that you trust someone absolutely, more often heard of future spouses than any other time, is one of the most arrogant things you can say. You are not only saying you trust the other person, which is quite reasonable, but you are also saying you could not possibly be mistaken. Given the rate at which people actually do make mistakes, especially when their emotions are running high, a pre-nup strikes me as quite reasonable insurance.

Comment author: Tenek 23 May 2010 06:10:44AM *  0 points [-]

Then the statement of absolute trust is accounted for by the significant rate of mistakes people make.

Alternatively, you can make that statement as part of a strategy to maximize your expected return on a marriage - if the increase in marriage quality from placing absolute trust in your spouse is greater than the expected cost of being disadvantaged in the divorce negotiations (if your spouse turns out to be untrustworthy), then you might rationally do it anyways.
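As a toy illustration of that expected-value comparison (all numbers here are hypothetical placeholders, not estimates from the thread):

```python
# Toy sketch of the trust-vs-prenup expected-value argument.
# All values are made-up illustrations.
quality_gain = 10.0    # value gained if you extend absolute trust
p_untrustworthy = 0.1  # chance your spouse turns out untrustworthy
divorce_cost = 50.0    # extra cost of a disadvantaged divorce negotiation

# Trust pays off in expectation iff the quality gain exceeds the
# probability-weighted downside.
expected_gain = quality_gain - p_untrustworthy * divorce_cost
print(expected_gain)  # positive -> trusting is the better bet under these numbers
```

Under these particular numbers the statement of absolute trust comes out ahead; with a higher probability of untrustworthiness the prenup would win instead.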

In response to Blame Theory
Comment author: mattnewport 19 May 2010 10:42:18PM *  0 points [-]

What is this 'guilt' you speak of? Are you a Catholic?

In response to comment by mattnewport on Blame Theory
Comment author: Tenek 20 May 2010 03:01:37PM 3 points [-]

Guilt is an added cost to making decisions that benefit you at the expense of others. (Ideally, anyways.) It encourages people to cooperate to everyone's benefit. Suppose we have a PD matrix where the payoffs (you, them) are:

- (defect, cooperate) = (3, 0)
- (defect, defect) = (1, 1)
- (cooperate, cooperate) = (2, 2)
- (cooperate, defect) = (0, 3)

Normally we say that 'defect' is the dominant strategy, since regardless of the other person's decision, your 'defect' option payoff is 1 higher than 'cooperate'.

Now suppose you (both) feel guilty about betrayal to the tune of 2 units:

- (defect, cooperate) = (1, 0)
- (defect, defect) = (-1, -1)
- (cooperate, cooperate) = (2, 2)
- (cooperate, defect) = (0, 1)

The situation is reversed - 'cooperate' is the dominant strategy. Total payoff in this situation is 4. Total payoff in the guiltless case is 2, since both will defect. In the OP $10-button example the total payoff is -$90, so people as a group lose out if anyone pushes the button. Guilt discourages you from pushing the button and society is better for it.
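The dominance check above can be verified mechanically. A small sketch (the `with_guilt` and `dominant_strategy` helpers are mine, not from the original comment):

```python
# Payoff matrix from the comment: keys are (my move, their move),
# values are (my payoff, their payoff). C = cooperate, D = defect.
base = {
    ("D", "C"): (3, 0),
    ("D", "D"): (1, 1),
    ("C", "C"): (2, 2),
    ("C", "D"): (0, 3),
}

GUILT = 2  # cost a player pays whenever they defect

def with_guilt(matrix):
    """Subtract the guilt penalty from each player's payoff when that player defects."""
    return {
        (a, b): (pa - GUILT * (a == "D"), pb - GUILT * (b == "D"))
        for (a, b), (pa, pb) in matrix.items()
    }

def dominant_strategy(matrix):
    """Return the row player's strictly dominant strategy, or None."""
    for mine in ("C", "D"):
        other = "D" if mine == "C" else "C"
        if all(matrix[(mine, theirs)][0] > matrix[(other, theirs)][0]
               for theirs in ("C", "D")):
            return mine
    return None

print(dominant_strategy(base))              # D (defect dominates without guilt)
print(dominant_strategy(with_guilt(base)))  # C (cooperate dominates with guilt)
```

With the guilt penalty applied, (cooperate, cooperate) yields the total payoff of 4 mentioned above, versus 2 for mutual defection in the guiltless case.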

Comment author: ata 07 May 2010 03:10:26PM *  0 points [-]

I'm not convinced that 1/2 is the right answer. I actually started out thinking it was obviously 1/2, and then switched to 1/3 after thinking about it for a while (I had thought of Bostrom's variant (without the disclosure bit) before I got to that part).

Let's say we're doing the Extreme version, no disclosure. You're Sleeping Beauty, you just woke up, that's all the new information you have. You know that there are 1,000,001 different ways this could have happened. It seems clear that you should assign tails a probability of 1,000,000/1,000,001.

Now I'll go think about this some more and probably change my mind a few more times.

Comment author: Tenek 07 May 2010 03:34:54PM 6 points [-]

We can tweak the experiment a bit to clarify this. Suppose the coin is flipped before she goes to sleep, but the result is hidden. If she's interviewed immediately, she has no reason to answer other than 1/2 - at this point it's just "flip a fair coin and estimate P(heads)". What information does she get the next time she's asked that would cause her to update her estimate? She's woken up, yes, but she already knew that would happen before going under and still answered 1/2. With no new information she should still guess 1/2 when woken up.
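For what it's worth, the 1/2 and 1/3 answers can be seen as two different frequencies in a simulation of the standard setup (this sketch and its counting conventions are mine, not from the thread): 1/2 is how often heads comes up per experiment, 1/3 is how often it comes up per awakening.

```python
import random

# Standard Sleeping Beauty: heads -> one awakening, tails -> two awakenings.
random.seed(0)
trials = 100_000
heads_experiments = 0
heads_awakenings = 0
total_awakenings = 0

for _ in range(trials):
    heads = random.random() < 0.5
    awakenings = 1 if heads else 2
    total_awakenings += awakenings
    if heads:
        heads_experiments += 1
        heads_awakenings += 1  # heads contributes its single awakening

print(heads_experiments / trials)           # approx. 0.5  (per experiment)
print(heads_awakenings / total_awakenings)  # approx. 1/3  (per awakening)
```

The dispute is then over which of these ratios "P(heads)" should refer to when Beauty is asked upon waking.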
