MagnetoHydroDynamics comments on Rationality Quotes August 2013 - Less Wrong

7 Post author: Vaniver 02 August 2013 08:59PM


Comment author: [deleted] 13 August 2013 11:44:07PM *  0 points [-]

Simple: under conditions of dubious ethics and dubious physical possibility, you turn their internal world model into a formal Bayesian network and, for every possible physical and mathematical observation and outcome, do the above calculation. Sum, print, idle.

It's impossible in practice, but the formal definition is only about four lines.

Comment author: Decius 14 August 2013 05:40:36AM 2 points [-]

How do you measure someone whose internal world model is not isomorphic to one formal Bayesian network (for example, someone who is completely certain of something)? Should it be the case that someone whose world model contains fewer possible observations has a major advantage in being closer to the truth?

Note also that a perfect Bayesian will score lower than some gamblers using this scheme. Betting everything on black does better than a fair distribution almost half the time.
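A quick simulation illustrates the point. The thread doesn't pin down the scoring scheme, so the logarithmic scoring rule and roulette-style odds below are assumptions for illustration:

```python
import math
import random

random.seed(0)
TRIALS = 100_000
P_BLACK = 18 / 38  # American roulette: 18 black pockets out of 38

gambler_wins = 0
for _ in range(TRIALS):
    black = random.random() < P_BLACK
    # the perfect Bayesian assigns the true probability to each outcome
    bayesian_score = math.log(P_BLACK if black else 1 - P_BLACK)
    # the gambler puts (nearly) everything on black
    gambler_score = math.log(0.99 if black else 0.01)
    if gambler_score > bayesian_score:
        gambler_wins += 1

print(gambler_wins / TRIALS)  # ~0.47: the gambler outscores the Bayesian almost half the time
```

Whenever black actually comes up (a bit under half the time), the overconfident bet earns the better log score on that round, even though its expected score is far worse.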

Comment author: [deleted] 16 August 2013 01:23:35PM 1 point [-]

I am not very certain that humans actually can have an internal belief model that isn't isomorphic to some Bayesian network. Anyone who proclaims to be absolutely certain, I suspect, is in fact not.

Comment author: pragmatist 16 August 2013 09:39:07PM 2 points [-]

How do you account for people falling prey to things like the conjunction fallacy?

Comment author: private_messaging 23 August 2013 09:48:59AM *  3 points [-]

I don't think people just miscalculate conjunctions. Everyone will tell you that HFFHF is less probable than H, HF, or even HFF. Errors appear only when the strings get long, the difference is small, and the strings are quite specially crafted. And with the scenarios, a more detailed scenario looks more plausibly like the product of some deliberate reasoning; plus, the existence of one detailed scenario is information about the existence of other detailed scenarios leading to the same outcome (and it must be made clear in the question that we are not asking about the outcome but about everything happening precisely as the scenario specifies).
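For a fair coin, the arithmetic behind that intuition is just repeated halving; a minimal sketch:

```python
# probability of one specific sequence of independent fair-coin results
def seq_prob(s: str) -> float:
    return 0.5 ** len(s)

for s in ["H", "HF", "HFF", "HFFHF"]:
    print(s, seq_prob(s))
# H 0.5, HF 0.25, HFF 0.125, HFFHF 0.03125: each added symbol halves the
# probability, so no extension of a string can be more probable than the string itself
```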

On top of that, the meaning of the word "probable" in everyday context is somewhat different - a proper study should ask people to actually make bets. All around it's not clear why people make this mistake, but it is clear that it is not some fully general failure to account for conjunctions.

edit: actually, I just read the Wikipedia article on the conjunction fallacy. When people were asked "how many people out of 100", nobody gave a wrong answer. This immediately implies that the understanding of "probable" (or some other cause) was the issue, not some general failure to apply conjunctions.

Comment author: pragmatist 23 August 2013 10:24:28AM *  0 points [-]

There have been studies that asked people to make bets. Here's an example. It makes no difference -- subjects still arrive at fallacious conclusions. That study also goes some way towards answering your concern about ambiguity in the question. The conjunction fallacy is a pretty robust phenomenon.

Comment author: private_messaging 23 August 2013 11:14:45AM *  2 points [-]

I've just read the example beyond its abstract. Typical psychology: the actual finding was that there were fewer errors with the bet (even though the expected winnings were tiny, and the sample sizes were small, so the difference was only marginally significant); also, approximately half of the questions were answered correctly, and the high prevalence of the "conjunction fallacy" was attained by counting subjects who made at least one error across many questions.

Comment author: private_messaging 23 August 2013 10:38:05AM *  2 points [-]

How is it a "robust phenomenon" if it is negated by using strings of larger length difference in the head-tail example or by asking people to answer in the N out of 100 format?

I am thinking that people have to learn reasoning to answer questions correctly, including questions about probability, and the feedback they receive from the world is fairly noisy. Consequently they learn it fairly badly, or mislearn it altogether, because more detailed accounts are more frequently the correct ones in their "training dataset" (which consists of detailed correct accounts of actual facts and fuzzy speculations).

edit: Let's say the notion that people are just generally not accounting for conjunctions is sort of like Newtonian mechanics. In a hard science - physics - Newtonian mechanics was done for as a fundamental account of reality once conditions were found where it did not work. It didn't matter how "robust" it was. In a soft science - psychology - an approximate notion persists in spite of this, as if the matter should be decided by some sort of tug of war between experiments for and against the notion. If we did physics like this, we would never have moved beyond Newtonian mechanics.

Comment author: pragmatist 23 August 2013 11:21:24AM *  0 points [-]

Framing the problem in terms of frequencies mitigates a number of probabilistic fallacies, not just the conjunction fallacy. It also mitigates, for instance, base rate neglect. So whatever explanation you have for the difference between the probability and frequency framings shouldn't rely on peculiarities of the conjunction fallacy case. A plausible hypothesis is that presenting frequency information simply makes algorithmic calculation of the result easier, and so subjects are no longer reliant on fallible heuristics in order to arrive at the conclusion.

The claim of the heuristics and biases program is that the conjunction fallacy is a manifestation of the representativeness heuristic. One does not need to suppose that there is a misunderstanding about the word "probability" involved (if there is, how do you account for the betting experiments?). The difference in the frequency framing is not that it makes it clear what the experimenter means by "probability", it's that the ease of algorithmic reasoning in that case reduces reliance on the representativeness heuristic. Further evidence for this is that the fallacy is also mitigated if the question is framed in terms of single-case probabilities, but with a diagram clarifying the relationship between properties in the problem. If the effect were merely due to a misunderstanding about what is meant by "probability", why would there be a mitigation of the fallacy in this case? Does the diagram somehow make it clear what the experimenter means by "probability"?

In response to your Newtonian physics example, it's simply not true that scientists abandoned Newtonian mechanics as soon as they found conditions under which it appeared not to work. Rather, they tried to find alternative explanations that preserved Newtonian mechanics, such as positing the existence of Neptune to account for discrepancies in the orbit of Uranus. It was only once there was a better theory available that Newtonian mechanics was abandoned. Is there currently a better account of probabilistic fallacies than that offered by the heuristics and biases program? And do you think that there is anything about the conjunction fallacy research that makes it impossible to fit the effect within the framework of the heuristics and biases program?

I'm not familiar with the effect of variable string length difference, and quick Googling isn't helping. If you could direct me to some research on this, I'd appreciate it.

Comment author: private_messaging 23 August 2013 11:34:38AM *  0 points [-]

A plausible hypothesis is that presenting frequency information simply makes algorithmic calculation of the result easier, and so subjects are no longer reliant on fallible heuristics in order to arrive at the conclusion.

There's only room for making it easier when the word "probable" is not synonymous with "larger N out of 100". So I maintain that alternate understanding of the word "probable" (and perhaps also an invalid idea of what one should bet on) are relevant. edit: to clarify, I can easily imagine an alternate cultural context where "blerg" is always, universally, invariably, a shorthand for "N out of 100". In such context, asking about "N out of 100" or about "blerg" should produce nearly identical results.

Also, in your study, about half of the questions were answered correctly.

The claim of the heuristics and biases program is that the conjunction fallacy is a manifestation of the representativeness heuristic.

I guess that's fair enough, although it's not clear how that works on Linda-like examples.

In my opinion it's just that, throughout their lives, people are exposed to a training dataset which consists of

  1. Detailed accounts of real events.

  2. Speculative guesses.

and (1) is much more commonly correct than (2) even though (1) is more conjunctive. So people get mis-trained through a biased training set. A very wide class of learning AIs would get mis-trained by this sort of thing too.
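Here is a toy version of that training-bias story; the 50/50 split and the 80%/30% correctness rates are invented purely for illustration:

```python
import random

random.seed(1)

# hypothetical training set: detailed accounts are mostly reports of real
# events, while speculative guesses are mostly wrong
data = []
for _ in range(10_000):
    detailed = random.random() < 0.5
    correct = random.random() < (0.8 if detailed else 0.3)
    data.append((detailed, correct))

def p_correct(detailed_flag):
    # empirical frequency of correctness among items with this detail level
    outcomes = [c for d, c in data if d == detailed_flag]
    return sum(outcomes) / len(outcomes)

print(p_correct(True), p_correct(False))  # roughly 0.8 vs 0.3
```

A learner fitting this data acquires the heuristic "more detail, more probably true", which then misfires on deliberately constructed Linda-style conjunctions.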

I'm not familiar with the effect of variable string length difference, and quick Googling isn't helping. If you could direct me to some research on this, I'd appreciate it.

The point is that you can't pull the representativeness trick with, e.g., R vs RGGRRGRRRGG. All the research I've ever seen used strings with a small percentage difference in length. I am assuming that the research is strongly biased towards studying something un-obvious, while it is fairly obvious that R is more probable than RGGRRGRRRGG; frankly, we do not expect to find anyone who thinks that RGGRRGRRRGG is more probable than R.

Comment author: pragmatist 23 August 2013 12:05:42PM *  0 points [-]

There's only room for making it easier when the word "probable" is not synonymous with "larger N out of 100". So I maintain that alternate understanding of the word "probable" (and perhaps also an invalid idea of what one should bet on) are relevant.

Maybe a misunderstanding about the word is relevant, but it clearly isn't entirely responsible for the effect. Like I said, the conjunction fallacy is much less common if the structure of the question is made clear to the subject using a diagram (e.g. if it is made obvious that feminist bank tellers are a proper subset of bank tellers). It seems implausible that providing this extra information will change the subject's judgment about what the experimenter means by "probable".

I guess that's fair enough, albeit its not clear how that works on Linda-like examples.

The description given of Linda in the problem statement (outspoken philosophy major, social justice activist) is much more representative of feminist bank tellers than it is of bank tellers.

Comment author: [deleted] 23 August 2013 07:38:54AM 1 point [-]

Poor brain design.

Honestly, I could do way better if you gave me a millennium.

Comment author: linkhyrule5 23 August 2013 09:29:46AM 3 points [-]

You know, at some point, whoever's still alive when that becomes not-a-joke needs to actually test this.

Because I'm just curious what a human-designed human would look like.

Comment author: Decius 17 August 2013 04:05:04AM 1 point [-]

How likely do you believe it is that there exists a human who is absolutely certain of something?

Comment author: Lumifer 16 August 2013 03:09:23PM 1 point [-]

Anyone who proclaims to be absolutely certain; I suspect that they are in fact not.

Is this a testable assertion? How do you determine whether someone is, in fact, absolutely certain?

It's not unheard of for people to bet their lives on some belief of theirs.

Comment author: Randaly 16 August 2013 03:22:19PM 1 point [-]

It's not unheard of for people to bet their lives on some belief of theirs.

That doesn't show that they're absolutely certain; it just shows that the expected value of the payoff outweighs the chance of them dying.

The real issue with this claim is that people don't actually model everything using probabilities, nor do they actually use Bayesian belief updating. However, the closest analogue would be people who will not change their beliefs in literally any circumstances, which is clearly false. (Definitely false if you're considering, e.g. surgery or cosmic rays; almost certainly false if you only include hypotheticals like cult leaders disbanding the cult or personally attacking the individual.)

Comment author: Lumifer 16 August 2013 03:26:56PM 0 points [-]

the closest analogue would be people who will not change their beliefs in literally any circumstances

Nope. "I'm certain that X is true now" is different from "I am certain that X is true and will be true forever and ever".

I am absolutely certain today is Friday. Ask me tomorrow whether my belief has changed.

Comment author: Randaly 16 August 2013 06:19:46PM 1 point [-]

In fact, unless you're insane, you probably already believe that tomorrow will not be Friday!

(That belief is underspecified- "today" is a notion that varies independently, it doesn't point to a specific date. Today you believe that August 16th, 2013 is a Friday; tomorrow, you will presumably continue to believe that August 16th, 2013 was a Friday.)

Comment author: Lumifer 16 August 2013 06:58:21PM 1 point [-]

That belief is underspecified

Not exactly that but yes, there is the reference issue which makes this example less than totally convincing.

The main point still stands, though -- certainty of a belief and its time-invariance are different things.

Comment author: AndHisHorse 16 August 2013 06:49:02PM *  0 points [-]

I very much doubt that you are absolutely certain. There are a number of outlandish but not impossible worlds in which you could believe that it is Friday, yet it might not be Friday; something akin to the world of The Truman Show comes to mind.

Unless you believe that all such alternatives are impossible, in which case you may be absolutely certain, but incorrectly so.

Comment author: Lumifer 16 August 2013 07:00:36PM 1 point [-]

I very much doubt that you are absolutely certain.

Define "absolute certainty".

In the brain-in-the-vat scenario which is not impossible I cannot be certain of anything at all. So what?

Comment author: linkhyrule5 16 August 2013 07:09:21PM 1 point [-]

So you're not absolutely certain. The probability you assign to "Today is Friday" is, oh, nine nines, not 1.

Comment author: Lumifer 16 August 2013 07:35:59PM *  0 points [-]

Nope. I assign it the probability of 1.

On the other hand, you think I'm mistaken about that.

On the third tentacle I think you are mistaken because, among other things, my mind does not assign probabilities like 0.999999999 -- it's not capable of such granularity. My wetware rounds such numbers and so assigns the probability of 1 to the statement that today is Friday.

Comment author: AndHisHorse 16 August 2013 07:10:48PM 0 points [-]

Taking a (modified) page from Randaly's book, I would define absolute certainty as "so certain that one cannot conceive of any possible evidence which might convince one that the belief in question is false". Since you can conceive of the brain-in-the-vat scenario and believe that it is not impossible, I would say that you cannot be absolutely certain of anything, including the axioms and logic of the world you know (even the rejection of absolute certainty).

Comment author: Decius 18 August 2013 12:42:18AM *  0 points [-]

I don't have to believe that the alternatives are impossible; I just have to be certain that the alternatives are not exemplified.

Comment author: AndHisHorse 16 August 2013 07:06:58PM 0 points [-]

Is someone absolutely certain if they say that they cannot imagine any circumstances under which they might change their beliefs (or, alternately, can imagine only circumstances which they are absolutely certain will not happen)? It would seem to be a better definition, as it defines probability (and certainty) as a thing in the mind, rather than outside.

In this case, I would see no contradiction as declaring someone to be absolutely certain of their beliefs, though I would say (with non-absolute certainty) that they are incorrect. Someone who believes that the Earth is 6000 years old, for example, may not be swayed by any evidence short of the Christian god coming down and telling them otherwise, an event to which they may assign 0.0 probability (because they believe that it's impossible for their god to contradict himself, or something like that).

Further, I would exclude methods of changing someone's mind without using evidence (surgery or cosmic rays). I can't quite put it into words, but it seems like the fact that it isn't evidence and instead changes probabilities directly means that it doesn't so much affect beliefs as it replaces them.

Comment author: Randaly 16 August 2013 07:56:42PM *  2 points [-]

Is someone absolutely certain if they say that they cannot imagine any circumstances under which they might change their beliefs (or, alternately, can imagine only circumstances which they are absolutely certain will not happen)?

Disagree. This would be a statement about their imagination, not about reality.

Also, people are not well calibrated on this sort of thing. People are especially poorly calibrated on this sort of thing in a social context, where others are considering their beliefs.

ETA: An example: While I haven't actually done this, I would expect that a significant fraction of religious people would reply to such a question by saying that they would never change their beliefs because of their absolute faith. I can't be bothered to do enough googling to find a specific interviewee about faith who then became an atheist, but I strongly suspect that some such people actually exist.

I can't quite put it into words, but it seems like the fact that it isn't evidence and instead changes probabilities directly means that it doesn't so much affect beliefs as it replaces them.

Yeah, fair enough.

Comment author: AndHisHorse 16 August 2013 08:15:02PM 0 points [-]

Disagree. This would be a statement about their imagination, not about reality.

You are correct. I am making my statements on the basis that probability is in the mind, and as such it is perfectly possible for someone to have a probability which is incorrect. I would distinguish between a belief which it is impossible to disprove, and one which someone believes it is impossible to disprove, and as "absolutely certain" seems to refer to a mental state, I would give it the definition of the latter.

Comment author: Randaly 16 August 2013 08:42:09PM 1 point [-]

(I suspect that we don't actually disagree about anything in reality. I further suspect that the phrase I used regarding imagination and reality was misleading; sorry, it's my standard response to thought experiments based on people's ability to imagine things.)

I'm not claiming that there is a difference between their stated probabilities and the actual, objective probabilities. I'm claiming that there is a difference between their stated probabilities and the probabilities that they actually hold. The relevant mental states are the implicit probabilities from their internal belief system; while words can be some evidence about this, I highly suspect, for reasons given above, that anybody who claims to be 100% confident of something is simply wrong in mapping their own internal beliefs, which they don't have explicit access to and aren't even stored as probabilities (?), over onto explicitly stated probabilities.

Suppose that somebody stated that they cannot imagine any circumstances under which they might change their beliefs. This is a statement about their ability to imagine situations; it is not a proof that no such situation could possibly exist in reality. The fact that it is not is demonstrated by my claim that there are people who did make that statement, but then actually encountered a situation that caused them to change their belief. Clearly, these people's statement that they were absolutely, 100% confident of their belief was incorrect.

Comment author: AndHisHorse 16 August 2013 08:49:33PM 1 point [-]

I would still say that while belief-altering experiences are certainly possible, even for people with stated absolute certainty, I am not convinced that they can imagine them occurring with nonzero probability. In fact, if I had absolute certainty about something, I would as a logical consequence be absolutely certain that any disproof of that belief could not occur.

However, it is also not unreasonable that someone does not believe what they profess to believe in some practically testable manner. For example, someone who states that they have absolute certainty that their deity will protect them from harm, but still declines to walk through a fire, would fall into such a category - even if they are not intentionally lying, on some level they are not absolutely certain.

I think that some of our disagreement arises from the fact that I, being relatively uneducated (for this particular community) about Bayesian networks, am not convinced that all human belief systems are isomorphic to one. This is, however, a fault in my own knowledge, and not a strong critique of the assertion.

Comment author: Lumifer 16 August 2013 08:06:24PM -1 points [-]

I would expect that most religious fundamentalists would reply to such a question by saying that they would never change their beliefs because of their absolute faith.

First, fundamentalism is a matter of theology, not of intensity of faith.

Second, what would these people do if their God appeared before them and flat out told them they're wrong? :-D

Comment author: shminux 19 August 2013 06:57:55PM 1 point [-]

Second, what would these people do if their God appeared before them and flat out told them they're wrong?

Clearly they would consider this entity a false God/Satan.

Comment author: Lumifer 19 August 2013 07:03:34PM *  0 points [-]

This is starting to veer into free-will territory, but I don't think God would have much problem convincing these people that He is the Real Deal. Wouldn't be much of a god otherwise :-)

Comment author: Randaly 16 August 2013 08:17:06PM 1 point [-]

First, fundamentalism is a matter of theology, not of intensity of faith.

Fixed, thanks.

Second, what would these people do if their God appeared before them and flat out told them they're wrong? :-D

Their verbal response would be that this would be impossible.

(I agree that such a situation would likely lead to them actually changing their beliefs.)

Comment author: Lumifer 16 August 2013 08:36:14PM 0 points [-]

Their verbal response would be that this would be impossible.

At which point you can point out to them that God can do WTF He wants and is certainly not limited by ideas of pathetic mortals about what's impossible and what's not.

Oh, and step back, exploding heads can be messy :-)

Comment author: Protagoras 19 August 2013 07:55:35PM 1 point [-]

I cannot imagine circumstances under which I would come to believe that the Christian God exists. All of the evidence I can imagine encountering which could push me in that direction if I found it seems even better explained by various deceptive possibilities, e.g. that I'm a simulation or I've gone insane or what have you. But I suspect that there is some sequence of experience such that if I had it I would be convinced; it's just too complicated for me to work out in advance what it would be. Which perhaps means I can imagine it in an abstract, meta sort of way, just not in a concrete way? Am I certain that the Christian God doesn't exist? I admit that I'm not certain about that (heh!), which is part of the reason I'm curious about your test.

Comment author: RichardKennaway 19 August 2013 09:00:59PM 4 points [-]

If imagination fails, consult reality for inspiration. You could look into the conversion experiences of materialist, rationalist atheists. John C Wright, for example.

Comment author: Lumifer 19 August 2013 08:10:59PM 1 point [-]

So you're effectively saying that your prior is zero and will not be budged by ANY evidence.

Hmm... smells of heresy to me... :-D

Comment author: Lumifer 16 August 2013 07:10:45PM *  0 points [-]

I would argue that this definition of absolute certainty is completely useless as nothing could possibly satisfy it. It results in an empty set.

If you "cannot imagine under any circumstances" your imagination is deficient.

Comment author: AndHisHorse 16 August 2013 07:15:52PM 1 point [-]

I am not arguing that it is not an empty set. Consider it akin to the intersection of the set of natural numbers, and the set of infinities; the fact that it is the empty set is meaningful. It means that by following the rules of simple, additive arithmetic, one cannot reach infinity, and if one does reach infinity, that is a good sign of an error somewhere in the calculation.

Similarly, one should not be absolutely certain when updating from finite evidence. Barring omniscience (infinite evidence), one cannot become absolutely/infinitely certain.
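In Bayesian terms the analogy can be made literal: each piece of evidence adds a finite amount of log-odds, and probability 1 corresponds to infinite log-odds. A sketch (the 1000:1 likelihood ratios are made up):

```python
import math

def posterior_log_odds(prior_log_odds, log_likelihood_ratios):
    # Bayes' rule in odds form: posterior odds = prior odds * product of likelihood ratios,
    # i.e. log-odds add
    return prior_log_odds + sum(log_likelihood_ratios)

def prob(log_odds):
    # convert log-odds back to a probability
    return 1 / (1 + math.exp(-log_odds))

# three independent pieces of very strong (1000:1) evidence, starting from even odds
lo = posterior_log_odds(0.0, [math.log(1000)] * 3)
print(prob(lo))  # nine nines, but still strictly less than 1
```

However much finite evidence arrives, the sum of log-odds stays finite, so the posterior never reaches exactly 1.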

What definition of absolute certainty would you propose?

Comment author: Lumifer 16 August 2013 07:29:24PM -2 points [-]

I am not arguing that it is not an empty set.

So you are proposing a definition that nothing can satisfy. That doesn't seem like a useful activity. If you want to say that no belief can stand up to the powers of imagination, sure, I'll agree with you. However if we want to talk about what people call "absolute certainty" it would be nice to have some agreed-on terms to use in discussing it. Saying "oh, there just ain't no such animal" doesn't lead anywhere.

As to what I propose, I believe that definitions serve a purpose and the same thing can be defined differently in different contexts. You want a definition of "absolute certainty" for which purpose and in which context?

Comment author: AndHisHorse 16 August 2013 08:11:44PM 1 point [-]

You are correct, I have contradicted myself. I failed to mention the possibility of people who are not reasoning perfectly (and in fact are not close), to the point where they can mistakenly arrive at absolute certainty. I am not arguing that their certainty is fake - it is a mental state, after all - but rather that it cannot be reached using proper rational thought.

What you have pointed out to me is that absolute certainty is not, in fact, a useful thing. It is the result of a mistake in the reasoning process. An inept mathematician can add together a large but finite series of natural numbers, write down "infinity" after the equals sign, and thereafter go about believing that the sum of a certain series is infinite.

The sum is not, in fact, infinite; no finite set of finite things can add up to an infinity, just as no finite set of finite pieces of evidence can produce absolute, infinitely strong certainty. But if we use some process other than the "correct" one, as the mathematician's brain has to somehow output "infinity" from the finite inputs it has been given, we can generate absolute certainty from finite evidence - it simply isn't correct. It doesn't correspond to something which is either impossible or inevitable in the real world, just as the inept mathematician's infinity does not correspond to a real infinity. Rather, they both correspond to beliefs about the real world.

While I do not believe that there are any rationally acquired beliefs which can stand up to the powers of imagination (though I am not absolutely certain of this belief), I do believe that irrational beliefs can. See my above description of the hypothetical young-earther; they may be able to conceive of a circumstance which would falsify their belief (i.e. their god telling them that it isn't so), but they cannot conceive of that circumstance actually occurring (they are absolutely certain that their god does not contradict himself, which may have its roots in other absolutely certain beliefs or may be simply taken as a given).

Comment author: linkhyrule5 17 August 2013 05:08:47AM 0 points [-]

Well, yes.

That is the point.

Nothing is absolutely certain.

Comment author: Decius 17 August 2013 04:08:04AM 0 points [-]

Why does a deficient imagination disqualify a brain from being certain?

Comment author: Lumifer 17 August 2013 04:45:43AM 1 point [-]

Vice versa. Deficient imagination allows a brain to be certain.

Comment author: Decius 18 August 2013 12:18:31AM 1 point [-]

... ergo there exist human brains that are certain.

if people exist that are absolutely certain of something, I want to believe that they exist.

Comment author: linkhyrule5 17 August 2013 05:08:15AM 0 points [-]

So... a brain is allowed to be certain because it can't tell it's wrong?

Comment author: Document 16 August 2013 04:06:43PM 0 points [-]

cult leaders disbanding the cult

Tangent: Does that work?