Comment author: phob 10 January 2011 02:21:17AM 2 points [-]

Thank you so much for posting this! I use anki a lot, and your Mysterious Questions deck has been a great help =]

In response to Circular Altruism
Comment author: Brian_Hollar 26 January 2008 07:34:00AM 2 points [-]

I don't think this is too difficult to understand. In both situations, the deciders don't want to think of themselves as possibly responsible for avoidable death. In the first scenario, you don't want to be the guy who made a gamble and everyone died. In the second, you don't want to choose for 100 people to die. People make different choices in the two situations because they want to minimize moral culpability.

Is that rational? Strictly speaking, maybe not. Is it human? Absolutely!

Comment author: phob 04 January 2011 08:11:38PM *  0 points [-]

Rational, yes, if other people know of the decision. If you never find out the result of the gamble, are not held responsible, and have your memory wiped, then all confounding interests are removed except the desire for people not to die. Only then are the irrational options actually irrational.

In response to Circular Altruism
Comment author: Roland2 23 January 2008 01:07:42AM 1 point [-]

I'm betting 10 credibility units on Yudkowsky publicly admitting that he was wrong on this one.

In response to comment by Roland2 on Circular Altruism
Comment author: phob 04 January 2011 05:58:40PM *  3 points [-]

Want to put a time scale on that?

In response to Circular Altruism
Comment author: Ben_Jones 22 January 2008 11:37:22PM 19 points [-]

So we can keep doing this, gradually - very gradually - diminishing the degree of discomfort...

Eliezer, your readiness to assume that all 'bad things' are on a continuous scale, linear or no, really surprises me. Put your enormous numbers away, they're not what people are taking umbrage at. Do you think that if a googol doesn't convince us, perhaps a googolplex will? Or maybe 3^^^3? If x and y are finite, there will always be a quantity of x that exceeds y, and vice versa. We get the maths, we just don't agree that the phenomena are comparable. Broken ankle? Stubbing your toe? Possibly, there is certainly more of a tangible link there, but you're still imposing your judgment on how the mind experiences and deals with discomfort on us all and calling it rationality. It isn't.

Put simply - a dust mote registers exactly zero on my torture scale, and torture registers fundamentally off the scale (not just off the top, off) on my dust mote scale.

You're asking how many biscuits equal one steak, and then when one says 'there is no number', accusing him of scope insensitivity.

Comment author: phob 04 January 2011 05:55:15PM *  2 points [-]

So you wouldn't pay one cent to prevent 3^^^3 people from getting a dust speck in their eye?

In response to Circular Altruism
Comment author: Adam_Safron 22 January 2008 09:37:56PM 9 points [-]

Eliezer, as I'm sure you know, not everything can be put on a linear scale. Momentary eye irritation is not the same thing as torture. Momentary eye irritations should be negligible in the moral calculus, even when multiplied by googolplex^^^googolplex. 50 years of torture could break someone's mind and lead to their destruction. You're usually right on the mark, but not this time.

Comment author: phob 04 January 2011 05:53:47PM 9 points [-]

Would you pay one cent to prevent one googolplex of people from having a momentary eye irritation?

Torture can be put on a money scale as well: many countries use torture in war, yet we don't spend huge amounts of money publicizing and shaming them (which would reduce the amount of torture in the world).

In order to maximize the benefit of spending money, you must weigh sacred against unsacred.

In response to comment by [deleted] on Rationality Quotes: December 2010
Comment author: Tiiba 03 December 2010 06:46:28PM 12 points [-]

And the answer is, "Yes! I run the world's biggest honeypot for teenage idiots who want to post pics of themselves racing on a freeway with a suspended license and a beer in the cupholder."

Comment author: phob 04 December 2010 04:37:15PM 8 points [-]

I suspect the answer is "making as much money as I possibly can", and he's doing much better than all of us. He can convert that to other forms of value later.

Comment author: NancyLebovitz 25 April 2010 09:25:58AM 7 points [-]

A bias I've noticed: People are a lot more likely to believe a bad event which was claimed to be an accident actually was an accident if it was done by someone they feel allied with, and to believe it was malice or culpable negligence if it was done by someone they already mistrust.

It's actually rather a hard call if you don't have solid information.

Comment author: phob 29 September 2010 04:45:12PM 10 points [-]

Is that really a bias? The fact that they are allied or not with you is some information about what they are likely to do.

Comment author: Daniel_Burfoot 27 July 2010 04:54:58PM *  6 points [-]

unreasonable effectiveness of mathematics in the natural sciences, especially in physics

Note that with respect to the power of mathematics, it's as easy to view the cup as half-empty as half-full. Here's Jaynes on the issue:

Phenomena whose very existence is unknown to the vast majority of the human race (such as the difference in ultraviolet spectra of Iron and Nickel) can be explained in exhaustive mathematical detail but all of modern science is practically helpless when faced with the complications of such a commonplace fact as the growth of a blade of grass.

Comment author: phob 27 July 2010 05:01:23PM 10 points [-]

A priori, as intelligent beings, we expect the universe at our scale to be immensely complex, since it produced us. I don't view our inability to fully explain phenomena at our scale as unreasonable ineffectiveness.

Comment author: Tom_McCabe 30 October 2007 08:21:00PM 4 points [-]

"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"

Yes. Note that, for the obvious next question, I cannot think of an amount of money large enough such that I would rather keep it than use it to save a person from torture. Assuming that this is post-Singularity money which I cannot spend on other life-saving or torture-stopping efforts.

"You probably wouldn't blind everyone on earth to save that one person from being tortured, and yet, there are (3^^^3)/(10^17) >> 7*10^9 people being blinded for each person you have saved from torture."

This is cheating, to put it bluntly- my utility function does not assign the same value to blinding someone and putting six billion dust specks in everyone's eye, even though six billion specks are enough to blind people if you force them into their eyes all at once.

"I'd still take the former. (10**(10**100))/(3^^^3) is still so close to zero that there's no way I can tell the difference without getting a larger universe for storing my memory first."

The probability is effectively much greater than that, because of complexity compression. If you have 3^^^^3 people with dust specks, almost all of them will be identical copies of each other, greatly reducing abs(U(specks)). abs(U(torture)) would also get reduced, but by a much smaller factor, because the number is much smaller to begin with.

Comment author: phob 27 July 2010 04:46:02PM 2 points [-]

People are being tortured, and it wouldn't take too much money to prevent some of it. Obviously, there is already a price on torture.

Comment author: Eliezer_Yudkowsky 30 October 2007 06:58:00PM 2 points [-]

By "pay a penny to avoid the dust specks" I meant "avoid all dust specks", not just one dust speck. Obviously for one speck I'd rather have the penny.

Comment author: phob 27 July 2010 04:43:27PM 2 points [-]

So if someone would pay a penny, they should pick torture if it were 3^^^^3 people getting dust specks, which casts doubt on whether they understood 3^^^3 in the first place.
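[Editor's note, not part of the original thread: 3^^^3 throughout these comments is Knuth's up-arrow notation, where a^^b is an exponent tower of b copies of a, and each extra arrow iterates the previous operation. A minimal recursive sketch (my own illustration, not anyone's code from the thread) shows how quickly it explodes; anything beyond the smallest arguments is uncomputable in practice:]

```python
def knuth_up(a, n, b):
    """Knuth's up-arrow a ↑^n b: n=1 is plain exponentiation,
    and each additional arrow iterates the operation below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1  # base case of the iteration
    return knuth_up(a, n - 1, knuth_up(a, n, b - 1))

# Small cases only: 3↑3 = 27, 3↑↑3 = 3^(3^3) = 7625597484987.
# 3↑↑↑3 = 3↑↑(3↑↑3), a power tower of ~7.6 trillion threes --
# already far beyond anything computable or physically storable.
```

This is why the comments treat 3^^^3 versus 3^^^^3 as a meaningful escalation: each extra arrow dwarfs the previous number incomparably more than a googolplex dwarfs a googol.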
