Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: Adam_Safron 21 February 2008 03:34:26PM 0 points

And why Bayesian Positivism? Why now? For what purpose?

Eliezer recognizes the necessity of clear thought in light of the unique challenges facing us in the years to come. We are going to need to make complex judgments about matters of existential risk, and a common ground will help us avoid a quagmire of unproductive discussions.

Comment author: Adam_Safron 21 February 2008 03:17:40PM 0 points

I believe Eliezer is developing a kind of Bayesian Positivism. He is attempting to describe a way of talking/thinking about things that is thoroughgoing, radically empirical, and thus grounded. I think these posts on language and definitions are essential for what he is trying to do. In this fashion, he should be able to cut through many of the Gordian knots of philosophy. If Eliezer succeeds, and if people take notice, this could be an important moment in the history of thought.

Comment author: Adam_Safron 11 February 2008 03:42:53PM 2 points

Once again, great post.

Eliezer: "We know where Pluto is, and where it's going; we know Pluto's shape, and Pluto's mass - but is it a planet? And yes, there were people who said this was a fight over definitions..."

It was a fight over definitions. Astronomers were trying to update their nomenclature to better handle new data (large bodies in the Kuiper belt). Pluto wasn't quite like the other planets, but it wasn't like the other asteroids either. So they called it a dwarf planet. Seems pretty reasonable to me. http://en.wikipedia.org/wiki/Dwarf_planet

Comment author: Adam_Safron 30 January 2008 06:43:00AM 0 points

But I guess the utility could be considered nonzero, and still without further impact, if some individual would choose for it not to happen to them. All else being equal, I would rather not have my eye irritated (even if there were no further consequences). And even if the cost is super-astronomically small, Eliezer could think up a super-duper-astronomically large number by which it could be multiplied. I guess he was right.
I'm confused.
I think I'm done.

Comment author: Adam_Safron 30 January 2008 03:45:00AM -1 points

The answer is simple. If you accept the bounds of the dust-speck argument where there is no further consequence of the dust-speck beyond the moment of irritation, then the cost of the irritation cannot be distinguished from 0 cost. If I can be assured that an event will have no negative consequences in my life beyond the quality of a moment of experience, then I wouldn't even think that the event is worth my consideration. Utility = 0. Multiply any number by 0, and you get 0. The only way for the dust-speck to have negative utility is if it has some sort of impact on my life beyond that moment. The dust-speck argument can't work without violating its own assumptions. Torture is worse. Case closed.

Comment author: Adam_Safron 28 January 2008 09:03:25PM 1 point

Correction: What I said: "one-second of irritation is less than 3^^^3-times as bad as the 50 years of torture." What I meant: "50 years of torture is more than 3^^^3-times as bad as 1-second of eye-irritation." Apologies for the mis-type (as well as for saying "you're" when I meant "your").

But the point is, if there are no additional consequences to the suffering, then it's irrelevant. I don't care how many people experience the 1-second of suffering. There is no number large enough to make it matter.

Eliezer had a good point. It works if we're considering lives saved. It doesn't work with dust-specks and torture. It's not because torture is a taboo that only hard-headed rationalists are able to consider with a clear mind. It's because something that's non-consequential is non-consequential, even when you multiply it by unimaginably large numbers. But we can't imagine torture and 50 years of lost time as being non-consequential, for good reason. The example was bad. We should move on to more productive endeavors.

Comment author: Adam_Safron 28 January 2008 07:07:11PM 6 points

"Well, when you're dealing with a number like 3^^^3 in a thought experiment, you can toss out the event descriptions. If the thing being multiplied by 3^^^3 is good, it wins. If the thing being multiplied by 3^^^3 is bad, it loses. Period. End of discussion. There are no natural utility differences that large."

Let's assume the eye-irritation lasts 1 second (with no further negative consequences). I would agree that 3^^^3 people suffering this 1-second irritation is 3^^^3-times worse than 1 person suffering it. But this irritation should not be considered equal to 3^^^3 seconds of wasted lives. In fact, this scenario is so negligibly bad as not to be worth the mental effort of considering it.

And for the torture option, let's assume that the suffering stops the instant the person finishes their 50 years of pain (the person leaves in exactly the same psychological state they were in before they found out that they would be tortured). However, in this case, 50 years of being tortured is not (50 years * 365 days * 24 hours * 3600 seconds)-times worse than 1 second of torture. It is much (non-linearly) worse than that. There are other variables to consider. In those 50 years, the person will miss 50 years of life. Unlike the dust-speck irritation distributed across 3^^^3 people, 50 years of torture is worth considering.
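The parenthetical conversion can be spelled out explicitly; a minimal sketch of the naive linear count (leap years ignored) that, on this comment's view, still understates how bad the torture is:

```python
# Naive linear count of 1-second units in 50 years of torture --
# the multiplier the comment argues is still too small.
seconds_per_year = 365 * 24 * 3600      # leap years ignored
seconds_in_50_years = 50 * seconds_per_year
print(seconds_in_50_years)              # 1,576,800,000 -- about 1.6 billion
```

Even this roughly 1.6-billion multiplier is, per the comment, only a lower bound, since the harm within a single life is claimed to grow non-linearly with duration.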

Adding experiences across people should linearly impact the estimated utility, but things do not add linearly when considering the experiences of a single person. Even if it doesn't lead to further negative consequences, the one-second of irritation is less than 3^^^3-times as bad as the 50 years of torture.
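The aggregation rule proposed here can be sketched in code; the function names and the exponent are illustrative assumptions of mine, not anything stated in the comment, which only claims non-linearity:

```python
def across_people(per_person_cost, num_people):
    # Linear: distinct people's experiences add straightforwardly.
    return per_person_cost * num_people

def within_person(duration_seconds, alpha=1.5):
    # Superlinear in duration for a single person. The value alpha > 1
    # is a hypothetical modeling choice; the comment does not name one.
    return duration_seconds ** alpha

# Two seconds endured by one person outweighs one second endured by two.
assert within_person(2) > across_people(within_person(1), 2)
```

Under any such rule, concentrating suffering in one life costs more than spreading the same total duration across many lives.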

If you're multiplication has taken you so far afield of your intuitions, re-check the math. If it still comes out the same way, check your assumptions. If it still comes out the same way, go with the calculations.

Comment author: Adam_Safron 28 January 2008 05:51:12PM 0 points

Eliezer, to be clear, do you still think that 3^^^3 people having momentary eye irritations--from dust-specks--is worth torturing a single person for 50 years, or is there a possibility that you did the math incorrectly for that example?

A proper utilitarian needs to consider the full range of outcomes--and their probabilities--associated with different alternatives. If the momentary eye irritation leads to a greater than 1/3^^^3 probability that someone will have an accident that leads to an outcome worse than 50 years of torture, then the torture is preferable. But if the chance of further negative consequences from momentary eye-irritation is so small as to be negligible, then we can consider the cost of the dust-specks to be the linear sum of the hedonic loss across all of the people afflicted.

The torture, on the other hand, has a significant probability of leading to further negative consequences that could persist across a life-span and impact those who care about that individual. If the tortured individual has a significant probability of committing suicide, then we need to consider all of the potential experiences and accomplishments that the person would have had over the course of their lifetime--which could be indefinitely long, depending on how technology progresses--and the impact that the person would have had on others.

And finally, as I think you would agree, we wouldn't want to use an ethical utility function that only considered basic hedonic experience and ignored higher-level meaning. If you merely integrated all of the moments of pleasure/pain across a life-span, you wouldn't have come close to calculating the value of that life. Music is worth more than the sum of the notes that went into the song. While your basic argument is valid and important, you probably--depending on the details of the argument--came to the wrong conclusion with respect to dust-specks and torture.
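The expected-cost bookkeeping described above can be sketched as follows. Every number here is an illustrative placeholder of mine (3^^^3 itself is far too large to represent directly), not an estimate from the comment:

```python
def expected_cost(immediate, p_followup, followup):
    # Immediate hedonic loss plus probability-weighted downstream harm.
    return immediate + p_followup * followup

num_people = 10 ** 12                      # stand-in for 3^^^3
speck_each = expected_cost(immediate=1e-9, p_followup=0.0, followup=0.0)
speck_total = num_people * speck_each      # linear sum across people

torture = expected_cost(immediate=1e6, p_followup=0.5, followup=1e7)

# With no downstream consequences for specks, the totals diverge:
# roughly 1e3 for all the specks combined versus 6e6 for the torture.
print(speck_total, torture)
```

The comparison flips only if the per-speck follow-up probability is large enough that, summed over everyone afflicted, the expected downstream harm exceeds the torture's total cost, which is the comment's point about the 1/3^^^3 threshold.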

In response to Circular Altruism
Comment author: Adam_Safron 22 January 2008 10:10:21PM 9 points

Eliezer's point would have been valid had he chosen almost anything other than momentary eye irritation. Even the momentary eye-irritation example would work if the irritation led to serious harm (e.g. eye inflammation and blindness) in a small proportion of those afflicted with the speck of dust. If the predicted outcome was millions of people going blind (and then you have to consider the resulting costs to society), then Eliezer is absolutely right: shut up and do the math.

In response to Circular Altruism
Comment author: Adam_Safron 22 January 2008 09:37:56PM 9 points

Eliezer, as I'm sure you know, not everything can be put on a linear scale. Momentary eye irritation is not the same thing as torture. Momentary eye irritations should be negligible in the moral calculus, even when multiplied by googolplex^^^googolplex. 50 years of torture could break someone's mind and lead to their destruction. You're usually right on the mark, but not this time.
