All of Unknown3's Comments + Replies

Eliezer, why didn't you answer the question I asked at the beginning of the comment section of this post?

Unknown3-20

I would greatly prefer that there be Babyeaters, or even to be a Babyeater myself, to the black hole scenario or a paperclipper scenario. This strongly suggests that human morality is not as unified as Eliezer believes it is... like I've said before, he will be horrified by the results of CEV.

Or the other possibility is just that I'm not human.

Unknown3-10

About the comments on compromise: that's why I changed my mind. The functions are so complex that they are bound to be different in the complex portions, but they also have simplifying terms in favor of compromise, so it is possible that everyone's morality will end up the same when this is taken into account.

As for the probability that Eliezer will program an AI, it might not be very low, but the probability that his will be the first is extremely low, simply because so many other people are trying.

I wonder if Eliezer is planning to say that morality is just an extrapolation of our own desires? If so, then my morality would be an extrapolation of my desires, and your morality would be an extrapolation of yours. This is disturbing, because if our extrapolated desires don't turn out to be EXACTLY the same, something might be immoral for me to do which is moral for you to do, or moral for me and immoral for you.

If this is so, then if I programmed an AI, I would be morally obligated to program it to extrapolate my personal desires-- i.e. my personal desi... (read more)

For all those who have said that morality makes no difference to them, I have another question: if you had the ring of Gyges (a ring of invisibility) would that make any difference to your behavior?

Some people on this blog have said that they would do something different. Others have said that they actually came to that conclusion, and actually did something different. Despite these facts, we have commenters projecting themselves onto other people, saying that NO ONE would do anything different under this scenario.

Of course, people who don't think that anything is right or wrong also don't think it's wrong to accuse other people of lying, without any evidence.

Once again, I most certainly would act differently if I thought that nothi... (read more)

Pablo, according to many worlds, even if it is now raining in Oxford, yesterday "it will rain in Oxford tomorrow" and "it will not rain in Oxford tomorrow" were both equally true, or both equally false, or whatever. In any case, according to many worlds, there is no such thing as "what will happen", if this is meant to pick some particular possibility like rain in Oxford.

Nick Tarleton, what is your definition of free will? You can't even say the concept is incoherent without a definition. According to my definition, randomness definitely gives free will.

Z.M. Davis, "I am consciously aware that 2 and 2 make 4" is not a different claim from "I am aware that 2 and 2 make 4." One can't make one claim without making the other. In other words, "I am unconsciously aware that 2 and 2 make 4" is a contradiction in terms.

If an AI were unconscious, it presumably would be a follower of Daniel Dennett; i.e. it would admit that it had no qualia, but would say that the same was true of human beings. But then it would say that it is conscious in the same sense that human beings are. Likewise... (read more)

0pnrjulius
If it says I have no qualia, it's wrong. Headaches fucking HURT, dammit. That's a quale. And here's the point you seem to be missing: Yes, it can make the statement; but no, that does not mean it actually has the required capacities to make the statement true. It's trivially easy to write a computer program that prints out all manner of statements.

Ben, what do you mean by "measurable"? In the zombie world, Ben Jones posts a comment on this blog, but he never notices what he is posting. In the real world, he knows what he is posting. So the difference is certainly noticeable, even if it isn't measurable. Why isn't "noticeable" enough for the situation to be a useful consideration?

Paul: "we are morally obliged to kill everyone we meet" has no scientific implications, but it definitely has moral implications. To speak plainly, your position is false, and obviously so.

Some children (2-4 years of age) assume that other human beings are zombies, because those others do not fit their model of a conscious observer: e.g. adults don't go and eat the ice cream in the freezer, even though no one can stop them, and even though any conscious being would of course eat that ice cream if it were in its power.

This fact actually is one proof tha... (read more)

Roko, I strongly suspect that a limitedly caring universe just reduces to a materialist universe with very complex laws. For example, isn't it kind of like magic that when I want to lift my hand, it actually moves? What would be the difference if I could levitate or change lead into gold? If the universe obeys my will about lifting my hand, why shouldn't it obey in other things, and if it did, why would this be an essential difference?

Jeffrey, do you really think serial killing is no worse than murdering a single individual, since "Subjective experience is restricted to individuals"?

In fact, if you kill someone fast enough, he may not subjectively experience it at all. In that case, is it no worse than a dust speck?

Jeffrey, on one of the other threads, I volunteered to be the one tortured to save the others from the specks.

As for "Real decisions have real effects on real people," that's absolutely correct, and that's the reason to prefer the torture. The utility function implied by preferring the specks would also prefer lowering all the speed limits in the world in order to save lives, and ultimately would ban the use of cars. It would promote raising taxes by a small amount in order to reduce the amount of violent crime (including crimes involving torture... (read more)

Eliezer, I have a question about this: "There is no finite amount of life lived N where I would prefer a 80.0001% probability of living N years to an 0.0001% chance of living a googolplex years and an 80% chance of living forever. This is a sufficient condition to imply that my utility function is unbounded."

I can see that this preference implies an unbounded utility function, given that a longer life has a greater utility. However, simply stated in that way, most people might agree with the preference. But consider this gamble instead:

A: Live 5... (read more)

0thrawnca
If this were the only chance you ever get to determine your lifespan, then choose B. In the real world, it would probably be a better idea to discard both options and use your natural lifespan to search for alternative paths to immortality.

About the slugs, there is nothing strange in asserting that the utility of the existence of something depends partly on what else exists. Consider chapters in a book: one chapter might be useless without the others, and one chapter repeated several times would actually add disutility.

So I agree that a world with human beings in it is better than one with only slugs: but this says nothing about the torture and dust specks.

Eisegetes, we had that discussion previously in regard to the difference between comparing actions and comparing outcomes. I am fairly su... (read more)

Z.M. Davis, that's an interesting point about the slugs, I might get to it later. However, I suspect it has little to do with the torture and dust specks.

Doug, here's another problem for your proposed function: according to it, it doesn't matter whether a single person takes all the pain or whether it is distributed, as long as it sums to the same amount.

So let's suppose that the pain of solitary confinement without anything interesting to do can never add up to the pain of 50 years torture. According to this, would you hon... (read more)

Notice that in Doug's function, suffering with intensity less than 0.393 can never add up to 50 years of torture, even when multiplied infinitely, while suffering of 0.394 will be worse than torture if it is sufficiently multiplied. So there is some number of 0.394 intensity pains such that no number of 0.393 intensity pains can ever be worse, despite the fact that these pains differ by 0.001, stipulated by Doug to be the pain of a dust speck. This is the conclusion that I pointed out follows with mathematical necessity from the position of those who prefer the specks.
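Doug's function itself isn't reproduced in this thread, so here is a minimal sketch, with a made-up cutoff and a made-up torture value, of an aggregation rule that has exactly the property just described: pains below the cutoff never sum to the torture no matter how many people suffer them, while pains just above it eventually do.

```python
# Hypothetical aggregation rule illustrating the 0.393/0.394 discontinuity.
# The cutoff and the torture value are invented for illustration; they are
# not Doug's actual numbers.
CUTOFF = 0.3935           # assumed threshold between the two regimes
TORTURE = 50_000.0        # assumed disutility of 50 years of torture

def total_disutility(intensity: float, n_people: int) -> float:
    """Aggregate disutility when n_people each suffer a pain of the given intensity."""
    return n_people * max(0.0, intensity - CUTOFF)

print(total_disutility(0.393, 10**100) > TORTURE)  # False: can never add up
print(total_disutility(0.394, 10**9) > TORTURE)    # True: 10^9 people suffice
```

However the numbers are chosen, any function with this shape pays the same price: two pains differing by one dust speck (0.001) fall on opposite sides of an infinitely important line.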

Doug, do you actually accept this conclusion (about the 0.393 and 0.394 pains), or are you just trying to show that the position is not logically impossible?

Ben P: the arrangement of the scale is meant to show that the further you move the lever toward 3^^^3 dust specks, the worse things get. The torture decreases linearly simply because there's no reason to decrease it by more; the number of people increases in the way that it does because of the nature of 3^^^3 (i.e. the number is large enough to allow for this). The more we can increase it at each stop, the more obvious it is that we shouldn't move the lever at all, but rather we should leave it at torturing 1 person 50 years.

Ben, here's my new and improved lever. It has 7,625,597,484,987 settings. On setting 1, 1 person is tortured for 50 years plus the pain of one dust speck. On setting 2, 3 persons are tortured for 50 years minus the pain of (50-year torture/7,625,597,484,987), i.e. they are tortured for a minute fraction of a second less than 50 years, again plus the pain of one dust speck. On setting 3, 3^3 persons, i.e. 27 persons, are tortured for 50 years minus two such fractions of a second, plus the pain of one dust speck. On setting 4, 3^27, i.e. 7,625,597,484,987 pe... (read more)
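For concreteness, here is a minimal sketch of this lever's schedule (a reconstruction of the arithmetic above, not anyone's canonical code): the number of people at setting k is a height-(k-1) tower of 3s, and each setting shaves one tick, 50 years divided by 3^27, off the torture's duration.

```python
# Reconstruction of the lever's schedule. tower(n) is a height-n tower of
# 3s (tower(0)=1, tower(1)=3, tower(2)=27, ...), heading toward 3^^^3.
SETTINGS = 3**27                          # 7,625,597,484,987 positions
FIFTY_YEARS_S = 50 * 365.25 * 24 * 3600   # ~1.58e9 seconds
TICK_S = FIFTY_YEARS_S / SETTINGS         # ~0.0002 s shaved per setting

def tower(n: int) -> int:
    """Height-n tower of 3s."""
    result = 1
    for _ in range(n):
        result = 3 ** result
    return result

def setting(k: int) -> tuple[int, float]:
    """(people tortured, torture duration in seconds) at lever setting k;
    each victim also gets one dust speck."""
    return tower(k - 1), FIFTY_YEARS_S - (k - 1) * TICK_S

for k in range(1, 5):  # k=5 would need tower(4) = 3**(3**27), far too large
    people, seconds = setting(k)
    print(k, people, seconds)
```

Each step multiplies the number of victims astronomically while relieving each of them of only a fifth of a millisecond of torture, which is the point: every move of the lever trades a negligible per-person relief for an enormous increase in victims, so it should stay at setting 1.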

As for your voting scenario: I vote to torture the terrorist who proposes this choice to everyone. In other words, asking each person individually, "Would you rather be dust specked or have someone randomly tortured?" would be much like a terrorist demanding $1 from every person in the world, threatening to kill someone otherwise. In this case, of course, one would kill the terrorist.

I'm still thinking about the best way to set up the lever to make the point the most obvious.

Ben: suppose the lever has a continuous scale of values between 1 and 3^^^3. When the lever is set to 1, 1 person is being tortured (and the torture will last for 50 years). If you set it to 2, two people will be tortured by an amount less than the first person's by 1/3^^^3 of the difference between the 50 years and a dust speck. If you set it to 3, three people will be tortured by an amount less than the first person's by 2/3^^^3 of the difference between the 50 years and the dust speck. Naturally, if you pull the lever all the way to 3^^^3, that number of people... (read more)

Adam, by that argument the torture is worth 0 as well, since after 1,000,000 years, no one will remember the torture or any of its consequences. So you should be entirely indifferent between the two, since each is worth zero.

Also: I wonder if Robin Hanson's comment shows concern about the lack of comments on his posts?

Eisegetes, would you pull the lever if it would stop someone from being tortured for 50 years, but inflict one day of torture on each human being in the world? And if so, how about one year? or 10 years, or 25? In other words, the same problem arises as with the specks. Perhaps you can defend one punch per human being, but there must be some number of human beings for whom one punch each would outweigh torture.

Salutator, I never said utilitarianism is completely true.

Ben: as I said when I brought up the sand example, Eliezer used dust specks to illustrate the "least bad" bad thing that can happen. If you think that it is not even a bad thing, then of course the point will not apply. In this case you should simply move to the smallest thing which you consider to be actually bad.

Paul : "Slapping each of 100 people once each is not the same as slapping one person 100 times."

This is absolutely true. But no one has said that these two things are equal. The point is that it is possible to assign each case a value, and these values are comparable: either you prefer to slap each of 100 people once, or you prefer to slap one person 100 times. And once you begin assigning preferences, in the end you must admit that the dust specks, distributed over multiple people, are preferable to the torture in one individual. Your only alter... (read more)

Caledonian, offering an alternative explanation for the evidence does not imply that it is not evidence that Eliezer expends some resources overcoming bias: it simply shows that the evidence is not conclusive. In fact, evidence usually can be explained in several different ways.

Eliezer's question for Paul is not particularly subtle, so I presume he won't mind if I give away where it is leading. If Paul says yes, there is some number of dust specks which add up to a toe stubbing, then Eliezer can ask if there is some number of toe stubbings that add up to a nipple piercing. If he says yes to this, he will ultimately have to admit that there is some number of dust specks which add up to 50 years of torture.
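To see where the road leads, here is a sketch of the multiplied-out chain with entirely made-up exchange rates (none of these numbers were specified in the thread): if some number of specks is at least as bad as a toe stubbing, some number of toe stubbings at least as bad as a nipple piercing, and so on up the ladder, the rates multiply into a finite number of specks at least as bad as the torture.

```python
# Transitivity chain with hypothetical exchange rates; only the structure
# of the argument matters, not these particular numbers.
hypothetical_rates = [
    ("dust specks per toe stubbing", 10**6),
    ("toe stubbings per nipple piercing", 10**4),
    ("nipple piercings per year of torture", 10**5),
    ("years of torture", 50),
]

specks = 1
for _, rate in hypothetical_rates:
    specks *= rate
print(f"{specks:.1e} dust specks")  # 5.0e+16: at least as bad as 50 years of torture
```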

Rather than actually going down this road, however, perhaps it would be as well if those who wish to say that the dust specks a... (read more)

1RST
But consider this: the last exemplars of each species of hominids could reproduce with the first exemplars of the following species. However, we probably wouldn't be able to reproduce with Homo habilis. This shows that small differences add up as the distance between the examined subjects increases, until we can clearly see that the two subjects are not part of the same category anymore. Pains that are similar in intensity are still comparable. But there is too much difference between dust specks in the eye/stubbed toes and torture to consider them as part of the same category.

The fact that Eliezer has changed his mind several times on Overcoming Bias is evidence that he expends some resources overcoming bias; if he didn't, we would expect exactly what you say. It is true that he hasn't changed his mind often, so this fact (at least by itself) is not evidence that he expends many resources in this way.

Ben and Mitchell: the problem is that "meaningless inconvenience" and "agony" do not seem to have a common boundary. But this is only because there could be many transitional stages such as "fairly inconvenient" and "seriously inconvenient," and so on. But sooner or later, you must come to stages which have a common boundary. Then the problem I mentioned will arise: in order to maintain your position, you will be forced to maintain that pain of a certain degree, suffered by any number of people and for any length of ... (read more)

Unknown3270

I agree that as you defined the problems, both have problems. But I don't agree that the problems are equal, for the reason stated earlier. Suppose someone says that the boundary is that 1,526,216,123,000,252 dust specks is exactly equal to 50 years of torture (in fact, it's likely to be some relatively low number like this rather than anything like a googolplex). It is true that proving this would be a problem. But it is no particular problem that 1,526,216,123,000,251 dust specks would be preferable to the torture, while the torture would be preferable t... (read more)

0RST_duplicate0.8641504549196835
Suppose that the qualitative difference is between bearable and unbearable, in other words between things that are above or below the pain tolerance. A pain just below the pain tolerance will remain bearable when experienced for a small quantity of time; however, if it is prolonged for a long time it will become unbearable, because human patience is limited. So, even if we give importance to qualitative differences, we can still choose to avoid torture and your scenario without going against our intuitions or being incoherent. Now, let's assume that the time will be quite short (5 seconds, for example); in this case I think it is really better to let billions of people suffer 5 seconds of bearable pain than to let one person suffer 5 seconds of unbearable pain. After all, people can stand a bearable pain by definition. However, pain tolerance is subjective, and in real life we don't know exactly where the threshold is in every person, so we can prefer, as a heuristic rule, the option with fewer people involved when the pains are similar to each other (maybe we have evolved some system to make such approximations, a sort of threshold insensitivity).
1RST
Suppose that the qualitative difference is between bearable and unbearable, in other words between things that are above or below the pain tolerance. A pain just below the pain tolerance will remain bearable when experienced for a small quantity of time; however, if it is prolonged for a long time it will become unbearable, because human patience is limited. So, even if we give importance to qualitative differences, we can still choose to avoid torture and your second scenario without going against our intuitions or being incoherent. Moreover, we can describe qualitative differences as being like the colors in the spectrum of visible light: their edges are nebulous, but we can still agree that the grass is green and the sea is blue. This means that two very close points on the spectrum appear as part of the same color, but when their distance increases they become part of two different colors. 1,525,122 and 1,525,123 are so close that we can see them as shades of the same qualitative category. On the other hand, dust specks and torture are very distant from each other, and we can consider them as part of two different qualitative categories.
3Voltairina
I'm not totally convinced - there may be other factors that make such qualitative distinctions important. Such as exceeding the threshold to boiling. Or putting enough bricks in a sack to burst the bottom. Or allowing someone to go long enough without air that they cannot be resuscitated. It probably doesn't do any good to pose /arbitrary/ boundaries, for sure, but not all such qualitative distinctions are arbitrary...

Ben, I think you might not have understood what I was saying about the poll. My point was that each individual is simply saying that he does not have a problem with suffering a dust speck to save someone from torture. But the issue isn't whether one individual should suffer a dust speck to save someone, but whether the whole group should suffer dust specks for this purpose. And it isn't true that the whole group thinks that the whole group should suffer dust specks for this purpose. If it were, there wouldn't be any disagreement about this question, since ... (read more)

Caledonian, of course that cannot be demonstrated. But who needs a demonstration? Larry D'anna said, "A googolplex of dusty eyes has the same tiny negative utility as one dusty eye as far as I'm concerned." If this is the case, does a billion deaths have the same negative utility as one death?

To put it another way, everyone knows that harms are additive.

Caledonian, can you give an example of someone who has never held a daft belief?

Other than yourself, of course, since such a suggestion would seem to indicate bias. On the other hand, your disrespect towards all others with whom you disagree (which seems to be everyone on some topic or other) seems to suggest that you believe that they all hold daft beliefs.

Caledonian, if you add the premise that some people should be respected, it is a logically valid and necessary conclusion (given that all people at some point have daft beliefs) that not all people who hold daft beliefs should be disrespected.

However, that is certainly not the best I can do. I could think of a long list of reasons for respecting such people: much the same reasons why you would do much better to show some respect for the authors and readers of Overcoming Bias. For one thing, you would have a much better chance of persuading them of your pos... (read more)

Caledonian, one reason to do that is that everyone has daft beliefs once in a while. It isn't surprising that you ask the question, however, since you show no respect for those with whom you disagree on Overcoming Bias. Since you disagree with them, you presumably think that their beliefs are false, and consequently (according to your logic) that they themselves are unworthy of respect.

Steve, maybe this was your point anyway, but the incidents you mention indicate that the existence of flying saucer cults is evidence for the existence of aliens (namely, by showing that the cults were based on seeing something in the real world). No doubt they aren't much evidence, especially given the prior improbability, but they are certainly evidence.

I should add that this is true about self-contradictory religions as well. For the probability that I mistakenly interpret the religion to be self-contradictory is greater than the probability that the chocolate cake is out there.

In general, any claim maintained by even a single human being to be true will be more probable, simply based on the authority of that human being, than some random claim such as the chocolate cake claim, which is not believed by anyone.

There are possibly some exceptions to this (and possibly not), but in general there is no particular reason to include religions as exceptions.