Comment author: Jiro 28 March 2015 07:54:12PM *  20 points [-]

I'm going to repost something I posted there:

I think that Scott is looking at Phil Robertson’s literal words and ignoring context, implication, and connotation. It is possible to parse what Phil Robertson said as a thought experiment which questions the logical consequences of an atheistic position.

But even though his literal words have the form of such a thought experiment, that’s not what he’s doing. He’s stringing together a set of applause lights meant to tell his audience that he fantasizes about the outgroup getting punished for being the outgroup in a way that is their own fault.

It is a scourge of the Internet that people are too literal, and Scott is falling victim to this trend here. The way Phil Robertson phrased it, and the circumstances surrounding it, make it very clear that it is not just a thought experiment, even if you can take it apart and say "well, a thought experiment has A, and B, and C, and Phil is also using A, and B, and C, and in exactly the right order."

Yes, people can use extreme scenarios when they are legitimately trying to argue a point. No, this is not a case of that. It's not even a case of atheists in the audience getting mindkilled. It's a case of atheists in the audience correctly understanding what he's saying. In the real world outside LW, most hypotheticals of this sort are attacks and not sincere attempts to make a philosophical point.

Comment author: private_messaging 29 March 2015 07:06:04AM 4 points [-]

Precisely. It's also implying that atheists are moral nihilists, which is BS. Plenty of religious people believe in a god who will grant them passage to heaven irrespective of their moral conduct, just as long as they repent and accept Jesus; and plenty of atheists are not moral nihilists.

Comment author: Jiro 29 March 2015 02:33:36AM 6 points [-]

Isn't this also confounded by the fact that judges and juries like to go easy on women, so that women who do commit murder are less likely to be convicted? It may be that measures of what fraction of women are convicted of murder are not the same as what fraction of women are actually murderers.

Comment author: private_messaging 29 March 2015 06:18:01AM *  1 point [-]

Prosecutors may also be less likely to accuse women. I wonder what the female rate of being accused of murder is; if it is 1/10, just as the murder rate is, then this 1/10 can cancel out in the courtroom.

The prosecutor is already using whatever priors they wish, including racist and sexist priors, when they select the suspects to bring to court; if the court does the same, they'll be double-counting.

Ultimately it all comes out in the wash once you start accounting for things like her trying to frame Lumumba.

Keep in mind also that there's evidence available to the prosecution but unavailable to you: Knox's claim that she got slapped during interrogation, and other claims that those present at the interrogation know for certain to be true or false.

I can see it going either way: if I were the police present at the interrogation and then saw her completely lying about how the interrogation went, then the reference class is not cute girls, it's psychopaths, and not very smart ones either. On the other hand, maybe she didn't lie about the interrogation. I can't know, but those present at the interrogation would know.

edit: also the thing is that a lot of the physical evidence was not reported on by the US media.

Basically there is a lot of physical evidence that if valid would massively overpower any "cute girl" priors. So the question is not about those priors but about the possible alternative explanations for said evidence and said evidence's validity.

In response to 2014 Survey Results
Comment author: private_messaging 27 March 2015 11:05:00PM 0 points [-]

I think it's interesting to note the lack of significant correlation between either IQ or calibration (as a proxy for rationality and/or sanity) and various beliefs such as many-worlds. It's a common sentiment here that beliefs are a gauge of intelligence and rationality, but that doesn't seem to be true.

It would be interesting to include a small set of IQ-test-like questions, to confirm that there is a strong correlation between IQ and correct answers in general.

Comment author: [deleted] 27 March 2015 02:40:59PM 1 point [-]

I consider entities in computationally distinct universes to also be distinct entities, even if the arrangements of their neurons are the same. If I have an infinite (or sufficiently large) set of physical constants under which human beings could emerge, I will also have enough human beings.

edit: also again, pseudomath, because you could have C(dustspeck, n) = 1 - 1/(n+1); your property holds, but it is bounded, so if C(torture, 1) = 2 then you'll never exceed it with dust specks.

No. I will always find a larger number which is at least ε greater. I fixed ε before I talked about n and m. So I find numbers m_1, m_2, … such that C(dustspeck, m_j) > jε.

Besides which, even if I had somehow messed up, you're not here (I hope) to score easy points off a flawed formalization when it is perfectly obvious where I want to go.

In response to comment by [deleted] on Torture vs. Dust Specks
Comment author: private_messaging 27 March 2015 06:40:57PM *  0 points [-]

Well, in my view, some details of implementation of a computation are totally indiscernible 'from the inside' and thus make no difference to the subjective experiences, qualia, and the like.

I definitely don't care whether there's 1 me, 3^^^3 copies of me, or 3^^^^3, or 3^^^^^^3, or an actual infinity (as the physics of our universe would suggest), where the copies think and perceive everything exactly the same over their lifetimes. I'm not sure how counting copies as distinct would cope with an infinity of copies anyway: you have torture of ∞ persons vs. dust specks in ∞·3^^^3 persons, then what?

Though it would be quite hilarious to see someone here pick up the idea and start arguing that because they're 'important', there must be a lot of copies of them in the future, and thus they are rightfully a utility monster.

Comment author: Quill_McGee 27 March 2015 03:09:14AM 0 points [-]

Exactly! No knock-on effects. Perhaps you meant to comment on the grandparent (great-grandparent? do I measure from this post or from yours?) instead?

Comment author: private_messaging 27 March 2015 12:34:48PM 0 points [-]

Yeah, clicked the wrong button.

Comment author: Kindly 27 March 2015 12:17:18AM 0 points [-]

For one thing N=1 T=1 trivially satisfies your condition...

Obviously I only meant to consider values of T and N that actually occur in the argument we were both talking about.

Comment author: private_messaging 27 March 2015 12:33:53PM 0 points [-]

Well, I'm not sure what the point is, then, or what you're trying to conclude from it.

Comment author: [deleted] 26 March 2015 11:31:26PM 0 points [-]

You're right, I don't. And I do not really need it in this case.

What I need is a cost function C(e,n), where e is some event and n is the number of people being subjected to said event (i.e. everyone gets their own), such that for some fixed ε > 0: for every n there is some m with C(e, n+m) > C(e, n) + ε. I guess we can limit e to "torture for 50 years" and "dust specks" so that this makes sense at all.

The reason why I would want to have such a cost function is because I believe that it should be more than infinitesimally worse for 3^^^^3 people to suffer than for 3^^^3 people to suffer. I don't think there should ever be a point where you can go "Meh, not much of a big deal, no matter how many more people suffer."

If, however, the number of possible distinct people is finite, even after taking into account level II and level III multiverses, due to discreteness of space and discreteness of the permitted physical constants, then yes, this is all null and void. But I currently have no particular reason to believe that there is such a bound, while I do have reason to believe that the permitted physical constants come from a non-discrete set.
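A sketch of why this property, with ε fixed before n and m, rules out bounded cost functions (notation mine):

```latex
% Property, with \varepsilon fixed first:
\exists\, \varepsilon > 0 \;\; \forall n \;\; \exists m : \quad
  C(e,\, n+m) \;>\; C(e,\, n) + \varepsilon
% Applying it repeatedly from n_0 = 1 gives n_0 < n_1 < n_2 < \dots with
C(e,\, n_j) \;>\; C(e,\, n_0) + j\varepsilon \;\longrightarrow\; \infty ,
% so no bounded C(e,\cdot) can satisfy the property.
```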

In response to comment by [deleted] on Torture vs. Dust Specks
Comment author: private_messaging 27 March 2015 12:22:10PM *  -1 points [-]

Well, within the 3^^^3 people you already have every single possible brain replicated a gazillion times (there are only so many ways to arrange the atoms in the volume of a human head so as to compute something subjectively different, after all, and the number of such arrangements is unimaginably smaller than 3^^^3).

I don't think that, e.g., I must massively prioritize the happiness of a brain upload of me running on multiply redundant hardware (which subjectively feels the same as if it were running in one instance; it doesn't feel any stronger because there are more 'copies' of it running in perfect unison, and it can't even tell the difference. The subjective experience won't be affected if the CPUs running the same computation are slightly physically different).

edit: also again, pseudomath, because you could have C(dustspeck, n) = 1 - 1/(n+1); your property holds, but it is bounded, so if C(torture, 1) = 2 then you'll never exceed it with dust specks.

Seriously, you people (the LW crowd in general) need to take more calculus or something before your mathematical intuitions become relevant to anything. It feels intuitively as if, with your epsilon, the sum will keep growing without limit, but that's simply not true.
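The disagreement here turns on quantifier order; a quick numeric sketch of both readings of the property, using the bounded function above (variable names are mine):

```python
# The disputed counterexample: C(dustspeck, n) = 1 - 1/(n+1), bounded above by 1.
def C(n):
    return 1 - 1 / (n + 1)

# Reading 1 (epsilon fixed before n and m): with eps = 0.1, once C(n)
# is within eps of the bound, no m gives C(n+m) > C(n) + eps, so the
# bounded function fails this reading of the property.
eps = 0.1
n = 100  # C(100) is about 0.99, already above 1 - eps
assert not any(C(n + m) > C(n) + eps for m in range(1, 10_000))

# Reading 2 (a gain of *some* size for every n): C keeps increasing,
# just never past its bound of 1, so it never exceeds C(torture, 1) = 2.
assert all(C(k + 1) > C(k) for k in range(1_000))
assert all(C(k) < 2 for k in range(1_000))
```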

Comment author: Kindly 26 March 2015 10:30:03PM 1 point [-]

It's not a continuum fallacy because I would accept "There is some pair (N,T) such that (N people tortured for T seconds) is worse than (10^100 N people tortured for T-1 seconds), but I don't know the exact values of N and T" as an answer. If, on the other hand, the comparison goes the other way for any values of N and T, then you have to accept the transitive closure of those comparisons as well.

Also, why are you so sure that the number of people increases suffering in a linear way for even very large numbers? What is a number of people anyway?

I'm not sure what you mean by this. I don't believe in linearity of suffering: that would be the claim that 2 people tortured for 1 day is the same as 1 person tortured for 2 days, and that's ridiculous. I believe in comparability of suffering, which is the claim that for some value of N, N people tortured for 1 day is worse than 1 person tortured for 2 days.

Regarding anaesthetics: I would prefer a memory inhibitor for a painful surgery to the absence of one, but I would still strongly prefer to feel less pain during the surgery even if I know I will not remember it one way or the other. Is this preference unusual?

Comment author: private_messaging 26 March 2015 11:01:46PM *  0 points [-]

don't know the exact values of N and T

For one thing N=1 T=1 trivially satisfies your condition...

I'm not sure what you mean by this.

I mean, suppose that you got yourself a function that takes in a description of what's going on in a region of spacetime, and it spits out a real number of how bad it is.

Now, that function can do all sorts of perfectly reasonable things that could make it asymptotic for large numbers of people; for example, it could be counting distinct subjective experiences (otherwise a mind upload on massively redundant hardware is a utility monster, despite having a subjective experience identical to the same upload running once; that's much sillier than the usual utility monster, which at least feels much stronger feelings). This would impose a finite limit (for brains of finite complexity).

One thing that function can't do is have the general property that f(a ∪ b) = f(a) + f(b), because then we could just subdivide our space into individual atoms, none of which is feeling anything.
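The objection can be made precise: if f were additive over disjoint regions, and each atom-sized region has zero badness, then every region would have zero badness. A sketch (notation mine):

```latex
f(a \cup b) = f(a) + f(b) \quad\text{for disjoint } a, b
\;\;\Longrightarrow\;\;
f(\text{region}) \;=\; \sum_{i=1}^{k} f(\text{atom}_i) \;=\; \sum_{i=1}^{k} 0 \;=\; 0 .
```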

In response to comment by [deleted] on Torture vs. Dust Specks
Comment author: [deleted] 25 March 2015 09:17:50PM 2 points [-]

It's not (necessarily) about dust specks accidentally leading to major accidents. But if you think that having a dust speck in your eye may be even slightly annoying (whether you consciously know that or not), the cost you have from having it fly into your eye is not zero.

Now, something nonzero multiplied by a sufficiently large number will necessarily be larger than the cost of one human being spending their life in torture.

In response to comment by [deleted] on Torture vs. Dust Specks
Comment author: private_messaging 26 March 2015 10:28:25PM *  0 points [-]

Now, do you have any actual argument as to why the 'badness' function computed over a box containing two persons with a dust speck is exactly twice the badness of a box containing one person with a dust speck, all the way up to very large numbers (by which point you may even have exhausted the number of possible distinct people)?

I don't think you do. This is why this stuff strikes me as pseudomath: you don't even state your premises, let alone justify them.

Comment author: Kindly 26 March 2015 01:44:36PM 3 points [-]

We could go from a day to a minute more slowly; for example, by increasing the number of people by a factor of a googolplex every time the torture time decreases by 1 second.

I absolutely agree that the length of torture increases how bad it is in nonlinear ways, but this doesn't mean we can't find exponential factors that dominate it at every point at least along the "less than 50 years" range.
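Chaining the comparisons step by step, writing (N, T) for "N people tortured for T seconds" and ≺ for "less bad than" (notation mine), gives the intended transitivity argument:

```latex
(N,\, T) \;\prec\; (10^{100} N,\; T-1) \;\prec\; (10^{200} N,\; T-2)
\;\prec\; \cdots \;\prec\; (10^{100k} N,\; T-k) ,
```

so accepting each one-second comparison, plus transitivity, carries the conclusion all the way down the "less than 50 years" range.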

Comment author: private_messaging 26 March 2015 08:21:38PM *  1 point [-]

That strikes me as a deliberate setup for a continuum fallacy.

Also, why are you so sure that the number of people increases suffering in a linear way for even very large numbers? What is a number of people anyway?

I'd much prefer to have a [large number of exact copies of me] experience 1 second of headache than for one me to suffer it for a whole day, because those copies don't have any mechanism that could compound their suffering; they aren't even different subjectivities. I don't see any reason why a hypothetical mind upload of me running on multiply redundant hardware should be a utility monster if it can't even tell subjectively how redundant its hardware is.

Some anaesthetics do something similar, preventing any new long-term memories, and people have no problem taking those for surgery. Something still experiences the pain, but it doesn't compound into anything really bad (unless the drugs fail to work, or some form of long-term memory still works). A real example of a very strong preference for N independent experiences of 30 seconds of pain over one experience of 30·N seconds of pain.
