Watercressed comments on Only You Can Prevent Your Mind From Getting Killed By Politics - Less Wrong

39 Post author: ChrisHallquist 26 October 2013 01:59PM


Comment author: Jack 26 October 2013 10:13:14PM *  16 points [-]

The whole idea of having a belief as a litmus test for rationality seems totally backward. The whole point is how you change your beliefs in response to new evidence.

Meanwhile, if a lot of people have a belief that isn't true, it is almost necessarily politically salient. The existence of God isn't an issue that is debated in the halls of government, but it is still hugely about group identity, which means that people can get mind-killed about it. The only reason it works as any kind of litmus test is that everyone here is/was already a part of the same group when it comes to theism.

I think the true objection to Stuart's post was less about climate change and more about branding Less Wrong with an issue that has ideological salience. And that seems totally fair to me. If you have a one-issue litmus test, it's sort of weird to make it one that isn't specific enough to screen out even the most irrational liberals. At the very least, add a sub-test asking whether a person thinks carbon emissions are responsible for the Hurricane Sandy disaster, what their confidence is that climate change causes more hurricanes, and what (if any) existential risk they assign to it. Catch the folks who think the moon is made out of gold in the filter.

Comment author: Watercressed 27 October 2013 12:04:51AM 0 points [-]

I generally agree with this post, but since people's beliefs are evidence for how they change their beliefs in response to evidence, I would call it bias-inducing and usually tribal cheering instead of totally backwards.

Comment author: Jack 27 October 2013 12:11:56AM 3 points [-]

If not "totally backwards" surely "orthogonal". Why not a test that supplies it's own evidence and asks the one being tested to come to a conclusion? Like the Amanda Knox case was for people here who hadn't heard of it before reading about it here.

Comment author: hyporational 28 October 2013 10:31:30AM 1 point [-]

There are several situations where that's not possible. Also it takes effort to test someone like that.

Comment author: Watercressed 27 October 2013 12:27:24AM 1 point [-]

I wouldn't call it orthogonal either. Rationality is about having correct beliefs, and I would label a belief-based litmus test rational to the extent it's correct.

Writing a post about how $political_belief is a litmus test is probably a bad idea because of the reasons you mentioned.

Comment author: Jack 27 October 2013 01:09:34AM 3 points [-]

Rationality is about having correct beliefs. But a single belief that has only two possible answers is never going to stand in for the entirety of a person's belief structure. That's why you have to look at the process by which a person forms beliefs to have any idea if they are rational.

Comment author: Viliam_Bur 28 October 2013 11:12:55AM *  4 points [-]

a single belief that has only two possible answers is never going to stand in for the entirety of a person's belief structure.

Exactly. If there is any hope in using a list of beliefs as a test of rationality, it will need multiple items.

You know, IQ tests also don't have a single question. Neither do any other personality tests.

Comment author: [deleted] 28 October 2013 07:39:34PM 3 points [-]

OTOH the Cognitive Reflection Test has only three questions, a shockingly low number, and I've been told it's surprisingly accurate.

Comment author: Viliam_Bur 29 October 2013 09:44:04AM *  1 point [-]

I'd call it the "Paying-Good-Attention-While-Doing-Simple-Math Test". :D

But yeah... I can imagine that something similarly simple could be an important part of rationality. Some simple task that predicts the ability to do more complex tasks of a similar type.

However, in that case the test will resemble a kind of puzzle, instead of pattern-matching "Do you agree with Greens?"

Specifically for updating, I can imagine a test where the person is gradually given more and more information; the initial information is evidence for an outcome "A", but most of the later information is evidence for an outcome "B". The person is informally asked to make a guess soon after the beginning (when the reasonable answer is "A"), and at the end they are asked to provide a final answer. Some people would probably get stuck at "A", and some would update to "B". But the test would involve some small numbers, shapes, coins, etc.; not real-life examples.
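
A toy version of this is easy to sketch. In the sketch below (entirely my own illustrative framing: an urn that is either type A or type B, an 80% bias, and a hand-picked draw sequence), the scoring key is the ideal Bayesian posterior after each draw, which a well-calibrated test-taker's guesses should roughly track:

```python
import math

# Toy instance of the proposed test (all parameters are illustrative):
# an urn is either type A (80% 'a' draws) or type B (80% 'b' draws),
# with a 50/50 prior. The displayed sequence starts out A-ish and ends
# up B-ish; the scoring key is the ideal posterior after each draw.

P_MATCH = 0.8  # chance that an urn produces its own letter
STEP = math.log(P_MATCH / (1 - P_MATCH))  # log-odds moved by one draw

sequence = "aaabbabbbbabbbbb"  # early evidence for A, late evidence for B

log_odds_A = 0.0  # log P(A)/P(B); 0 encodes the 50/50 prior
for i, draw in enumerate(sequence, 1):
    log_odds_A += STEP if draw == "a" else -STEP
    p_A = 1 / (1 + math.exp(-log_odds_A))
    print(f"after draw {i:2d} ({draw}): P(A) = {p_A:.3f}")

# Early on P(A) is high; by the end it is far below 0.5. A subject
# still answering "A" at the end has failed to update.
```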

Comment author: Vaniver 03 November 2013 06:33:57PM 4 points [-]

Specifically for updating, I can imagine a test where the person is gradually given more and more information; the initial information is evidence for an outcome "A", but most of the later information is evidence for an outcome "B". The person is informally asked to make a guess soon after the beginning (when the reasonable answer is "A"), and at the end they are asked to provide a final answer. Some people would probably get stuck at "A", and some would update to "B". But the test would involve some small numbers, shapes, coins, etc.; not real-life examples.

I've seen experiments that tested this; I thought they were mentioned in Thinking and Deciding or Thinking, Fast and Slow, but I didn't see it in a quick check of either of those. If I recall the experimental setup correctly (I doubt I got the numbers right), they began with a sequence that was 80% red and 20% blue, which switched to being 80% blue and 20% red after n draws. The subjects' estimate that the next draw would be red stayed above 50% for significantly longer than n draws from the second distribution, and some took until 2n or 3n draws from the second distribution to assign 50% chance to each, at which point almost two thirds of the examples they had seen were blue!
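
Assuming Vaniver's hedged numbers, the conservatism is easy to quantify. Under a static two-urn model with a 50/50 prior, and no hypothesis that the urn was switched, each red draw multiplies the odds of "mostly red" by 4 and each blue draw divides them by 4, so an ideal Bayesian is back to even odds after roughly n draws from the second distribution, not 2n or 3n. A sketch (the value of n and the use of expected counts are my simplifications):

```python
import math

LR = math.log(0.8 / 0.2)  # log-likelihood ratio of a single draw

def p_mostly_red(reds, blues):
    """Posterior that the urn is the 80%-red one, from a 50/50 prior,
    assuming one fixed urn (no hypothesis that it was switched)."""
    return 1 / (1 + math.exp(-(reds - blues) * LR))

n = 10  # illustrative; I don't know the experiment's actual n
reds, blues = 0.8 * n, 0.2 * n                 # expected counts, phase 1
print(p_mostly_red(reds, blues))               # ~0.9998: confident in red
reds, blues = reds + 0.2 * n, blues + 0.8 * n  # n more draws at 80% blue
print(p_mostly_red(reds, blues))               # 0.5: surpluses cancel
```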

Comment author: [deleted] 02 November 2013 08:10:40PM 0 points [-]

But the test would involve some small numbers, shapes, coins, etc.; not real-life examples.

I dunno... people who do fine at the Wason selection task with ages and drinks get it wrong with numbers and colours. (I'm not sure whether that's a bug or a feature.)

Comment author: Viliam_Bur 03 November 2013 04:32:16PM *  4 points [-]

That seems to me like a reason not to test the skill on real-life examples.

We wouldn't want a rationality test that a person can pass with the original wording, but fail if we replace "Republicans" with "Democrats"... or with green aliens. We wouldn't want the person to merely recognize logical fallacies when spoken by Republicans. This is, in my opinion, a risk with real-life examples. Is the example with drinking age easier because it is easier to imagine, or because it is something we already agree with?

Okay, I am curious here... what exactly would happen if we replaced the Wason selection task with something that uses words from real life (so it is less abstract), but is not an actual rule (therefore it cannot be answered using only previous experience)? For example: "Only dogs are allowed at jumping competitions; cats are not allowed. We have a) a dog going to an unknown competition; b) a cat going to an unknown competition; c) an unknown animal going to a swimming competition; and d) an unknown animal going to a jumping competition -- which of these cases do you have to check thoroughly to make sure the rule is not broken?"
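
For what it's worth, the dog/cat variant has exactly the structure of the abstract task, which can be checked mechanically: a case needs a thorough check precisely when some possible value of its hidden side would violate the rule. A small sketch (the encoding is mine):

```python
from itertools import product

ANIMALS = ["dog", "cat"]
COMPETITIONS = ["jumping", "swimming"]

def violates(animal, competition):
    # The rule: only dogs are allowed at jumping competitions.
    return competition == "jumping" and animal != "dog"

cases = {
    "a) dog, unknown competition": ("dog", None),
    "b) cat, unknown competition": ("cat", None),
    "c) unknown animal, swimming": (None, "swimming"),
    "d) unknown animal, jumping": (None, "jumping"),
}

for label, (animal, comp) in cases.items():
    hidden = product(
        ANIMALS if animal is None else [animal],
        COMPETITIONS if comp is None else [comp],
    )
    needs_check = any(violates(a, c) for a, c in hidden)
    print(label, "->", "check" if needs_check else "no check needed")

# Only b) and d) need checking -- the same answer as the abstract task:
# for "P implies Q", check the P card and the not-Q card.
```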

Comment author: ChristianKl 27 October 2013 02:39:00AM 1 point [-]

I generally agree with this post, but since people's beliefs are evidence for how they change their beliefs in response to evidence, I would call it bias-inducing and usually tribal cheering instead of totally backwards.

If I wanted to estimate people's rationality from their beliefs, I would look at whether the beliefs are nuanced. There are a lot of people who say irrational stuff, such as that the evidence we have for global warming is comparable to the evidence we have for evolution. In reality the p-value doesn't even approach the 5-sigma level that you need to validate a new result in particle physics.
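
For readers who don't have the conversions memorized: "k sigma" in the particle physics convention is the one-sided tail probability of a standard normal, which takes only a few lines to compute:

```python
import math

def sigma_to_p(k):
    """One-sided tail probability of a standard normal at k sigma."""
    return 0.5 * math.erfc(k / math.sqrt(2))

for k in (2, 3, 5):
    print(f"{k} sigma -> p = {sigma_to_p(k):.2e}")
# 2 sigma -> p = 2.28e-02
# 3 sigma -> p = 1.35e-03
# 5 sigma -> p = 2.87e-07
```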

It's just as irrational as being a global warming denier who thinks that p(global warming)<0.5.

Yet we do see smart people making both mistakes. You have smart people who claim that the evidence for global warming is comparable to that for evolution, and you have smart people who are global warming deniers.

People don't get mindkilled by political issues because they are dumb. It might be completely rational for them, because signaling is more important to them. If you want a useful metric to judge someone's rationality, don't pick something where group identities matter a good deal.

The metric is just too noisy, because the person might get something from signaling group identity. I think the only reason to choose such a metric is that you got yourself mindkilled, want to label people who don't belong to your tribe as irrational, and are seeking some rationalisation for it.

As far as empirics go, college-educated Republicans have a higher rate of climate change denial than Republicans who didn't go to college.

While we can debate whether college causes people to be more rational, it certainly correlates with it.

If you want to use beliefs to judge people's rationality, calibrate the test: give people rationality quizzes and also survey their beliefs. If you get strong correlations, you have something you can use. If you want an effective metric, don't intellectually analyse the content of the beliefs and reason about what rational people should believe.
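
A minimal sketch of that calibration step (the subjects, scores, and the single belief item below are made up purely for illustration): give the same people a rationality quiz and a belief survey, and keep a belief item as a filter only if it actually correlates with quiz score:

```python
import statistics

# Hypothetical data: each subject's rationality-quiz score, paired
# with a 0/1 answer on one candidate belief item.
quiz_scores = [12, 18, 9, 15, 20, 7, 14, 16, 11, 19]
belief_item = [1, 1, 0, 1, 1, 0, 0, 1, 0, 1]

# Pearson's r (statistics.correlation needs Python 3.10+).
r = statistics.correlation(quiz_scores, belief_item)
print(f"r = {r:.2f}")

# Keep the item only if |r| is large and replicates out of sample;
# reasoning about the belief's content is no substitute for this check.
```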

Comment author: hyporational 27 October 2013 08:07:05AM *  -2 points [-]

RETRACTED: It wasn't my intention to start another global warming debate.

If I wanted to estimate people's rationality from their beliefs, I would look at whether the beliefs are nuanced.

Lots of insane beliefs are nuanced.

In reality the p-value doesn't even approach the 5-sigma level that you need to validate a new result in particle physics.

Requiring the same strength of evidence from climate science as from particle physics would be insane.

There are a lot of people who say irrational stuff, such as that the evidence we have for global warming is comparable to the evidence we have for evolution.

From Stuart's post: "Of course, reverse stupidity isn't intelligence: simply because one accepts AGW, doesn't make one more rational."

People don't get mindkilled by political issues because they are dumb. It might be completely rational for them, because signaling is more important to them.

Choosing to signal wouldn't be mindkill as it's usually understood here.

I think the only reason to choose such a metric is that you got yourself mindkilled, want to label people who don't belong to your tribe as irrational, and are seeking some rationalisation for it.

Labeling people seems to be exactly what you're doing yourself here. I can think of at least three more reasons.

I think Stuart simply underestimated the local mindkill the global warming debate causes in other people, or failed to understand that local mindkill isn't necessarily a good metric for irrationality. Neither of those requires him to be mindkilled about the topic himself. One possibility is that he failed to evaluate the evidence on global warming himself and overestimated the probability of the relevant propositions.

You seem to be conflating intelligence and rationality in this comment. You probably know they're not the same thing.

All this being said, I don't agree with what Stuart was saying in his post. I have no opinion on global warming and haven't read much about it.

Comment author: ChristianKl 27 October 2013 02:31:56PM *  1 point [-]

Requiring the same strength of evidence from climate science as from particle physics would be insane.

What do you mean by "require"? If I say that climate science has the same strength of evidence as evolution, then we can debate whether climate change fulfills the 5-sigma criterion.

I don't think it does; therefore the strength of evidence for climate change is not the same as the strength of evidence for evolution.

Why does it matter? It's an X-risk that global warming doesn't really exist and we do geoengineering that seriously wrecks our planet. That risk might be something like p=0.001, but it does exist. It's greater than the risk of an asteroid destroying our civilisation in the next 100 years.

To the extent that one cares about X-risks, it's important to distinguish claims at 2-3 sigma from those that pass 5 sigma. It's just not the same level of evidence.

If we want to stay alive over the next hundred years, it's important that decision makers in our society don't maneuver us into an X-risk because they treat 2-3 sigma the same way they treat 5 sigma.

You seem to be conflating intelligence and rationality in this comment.

I don't use the word "intelligence" in the comment you quote. I use it in another post as a proxy variable. I equate rationality with the ability to update your beliefs in order to win.

Comment author: hyporational 27 October 2013 03:19:21PM *  0 points [-]

You used the words "smart" and "dumb"; I suppose that counts. I failed to understand most of your reply.

What do you mean by "require"?

I mean that you don't need to be anywhere near that certain for the findings to be actionable.

It's an X-risk that global warming doesn't really exist and we do geoengineering

What's the expected utility of that compared to the expected utility of AGW? If you're too uncertain, why not just try to drastically reduce emissions instead of doing major geoengineering? What's the expected utility of reducing emissions?

Comment author: Moss_Piglet 27 October 2013 06:28:37PM 1 point [-]

What's the expected utility of that compared to the expected utility of AGW? If you're too uncertain, why not just try to drastically reduce emissions instead of doing major geoengineering? What's the expected utility of reducing emissions?

The current understanding of climate sensitivity is that, since carbon dioxide will remain in the upper atmosphere for decades (and possibly centuries), even a complete halt on emissions will not avert the warming predicted for the next century or so. And the models currently favored have pretty dire predictions for that level of warming, even if they're less severe than the alternative.

The only realistic solution, and naturally the one most strongly opposed by environmental groups, is solar radiation management. This would be very expensive, about $700M a year according to David Keith, and has potential risks that should be investigated before any implementation plan is adopted. So not a silver bullet, but still much cheaper and safer in the long run than the standard environmental agenda, even according to their own data.

(Note: I am assuming for the sake of argument that current climate models are accurate, but that is an assumption which should be questioned. Climate modeling is still in its infancy, and most existing models have difficulty with predictions even as close as a decade out. Warming is probably happening, but that does not mean that any given prediction of warming is accurate, for reasons which should be obvious.)

Comment author: [deleted] 28 October 2013 09:07:16AM 0 points [-]

The current understanding of climate sensitivity is that, since carbon dioxide will remain in the upper atmosphere for decades (and possibly centuries), even a complete halt on emissions will not avert the warming predicted for the next century or so.

Methane has a shorter lifetime, though (although my five minutes' research tells me we've already stopped increasing methane emissions).

Comment author: Jack 27 October 2013 08:20:02PM *  0 points [-]

Are you saying that solar radiation management is an alternative to long-term emissions reduction? Or that, in addition to eventually tapering off greenhouse gas emissions, we're going to have to do something to keep temperatures down, and the best option is solar radiation management?

(edit: apparently I wrote social radiation management)

Comment author: Moss_Piglet 27 October 2013 10:53:41PM *  6 points [-]

Reducing emissions is a good goal, but energy needs will continue to increase even as we decrease the number of tons of carbon dioxide per kWh. As the population increases and becomes wealthier, there's not much we can do but put out more carbon dioxide; that's one of the reasons people bent on lowering world population and wealth have attached themselves to the environmental movement.

If the stigma against nuclear power goes away, or the technological issues which make speculative energy sources like wind/solar/fusion unprofitable are resolved, we could see a bigger dip, but even then the century-long trend will probably be one of increase. SRM is the most realistic way I can think of to head off serious disasters until then.

Comment author: ChristianKl 27 October 2013 04:37:50PM *  1 point [-]

I mean that you don't need to be anywhere near that certain for the findings to be actionable.

If I ask "What's the evidence for global warming being real?" in searching for an accurate description of the world. Having accurate maps of the world is useful.

In the above example, saying that the evidence for global warming is like that for evolution is like claiming the moon is made of cheese.

The belief might help you to convince people to reduce emissions. Believing that the moon is made of cheese might help you to discourage people from going to the moon.

If the reason someone advocates the ridiculous claim that the evidence for global warming is comparable to that for evolution is that it helps him convince people to lower emissions, then that person is mindkilled by his politics.

What's the expected utility of that compared to the expected utility of AGW? If you're too uncertain, why not just try to drastically reduce emissions instead of doing major geoengineering? What's the expected utility of reducing emissions?

Right, because our political leaders excel at making rational expected-utility comparisons... Memes exist in the real world. They have effects. Promoting false beliefs about the certainty of science has dangers.

I'm not in a position to choose whether the world drastically reduces emissions or does major geoengineering, and scientists aren't either. Scientists do have a social responsibility to promote accurate beliefs about the world.

Whether or not we should reduce emissions is a different question. If you can't mentally separate "Should we reduce emissions?" from "What's the evidence for global warming?", you are likely mindkilled about the second question and hold beliefs that aren't accurate descriptions of reality.