Alexandros comments on Open Thread: July 2010 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Is there an on-line 'rationality test' anywhere, and if not, would it be worth making one?
The idea would be to have some type of on-line questionnaire, testing for various types of biases, etc. Initially I thought of it as a way of getting data on the rationality of different demographics, but it could also be a fantastic promotional tool for LessWrong (taking a page out of the Scientology playbook tee-hee). People love tests, just look at the cottage industry around IQ-testing. This could help raise the sanity waterline, if only by making people aware of their blind spots.
There are of course the typical problems with 'putting a number on a person's rationality' and perhaps it would need some focused expertise to pull off plausibly, but I do think it's a useful thing to have around, even just to iterate on.
My kind of test would be like this:
1) Do you always seem to be able to predict the future, even as others doubt your predictions?
If they say yes ---> "That's because of confirmation bias, moron. You're not special."
In their defense, it might be hindsight bias instead. :P
There's an online test for calibration of subjective probabilities.
That was pretty awesome, thanks. Not precisely what I had in mind, but close enough to be an inspiration. Cheers.
The test should include questions about applying rationality in one's life, not just abstract problems.
I would love for this to exist! I have some notes on easily-tested aspects of rationality which I will share:
The Conjunction Fallacy easily fits into a short multiple-choice question.
I'm not sure what the error is called, but you can do the test described in Lawful Uncertainty:
You could do the positive bias test where you tell someone the triplet "2-4-6" conforms to a rule and have them figure out the rule.
You might be able to come up with some questions that test resistance to anchoring.
It might be out of scope of rationality and getting closer to an intelligence test, but you could take some "cognitive reflection" questions from here, which were discussed at LessWrong here.
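The 2-4-6 task from the list above is easy to make concrete. Here's a minimal sketch of Wason's original setup (the hidden rule really is "any strictly increasing triplet"; the function and variable names are just made up for illustration):

```python
def conforms(triplet):
    """Wason's hidden rule: any strictly increasing triplet."""
    a, b, c = triplet
    return a < b < c

# A positive-bias tester probes only triplets that fit their narrow
# hypothesis ("numbers increasing by 2"), so every probe confirms it
# and the hypothesis is never falsified.
confirming_probes = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print(all(conforms(t) for t in confirming_probes))  # True

# Disconfirming probes are what actually distinguish the hypotheses.
print(conforms((1, 2, 50)))  # True  -- the rule is broader than "step of 2"
print(conforms((6, 4, 2)))   # False -- decreasing triplets fail
```

Scoring could be as simple as counting how many of a subject's probes were attempts to falsify rather than confirm their stated hypothesis.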
That Virginia Postrel article was interesting.
I was wondering why more reflective people were both more patient and less risk-averse -- she doesn't make this speculation, but it occurs to me that non-reflective people don't trust themselves and don't trust the future. If you aren't good at math and you know it, you won't take a gamble, because you know that good gamblers have to be clever. If you aren't good at predicting the future, you won't feel safe waiting for money to arrive later. Tomorrow the gods might send you an earthquake.
Risk aversion and time preference are both sensible adaptations for people who know they're not clever. People who are good at math and science don't retain such protections because they can estimate probabilities, and because their world appears intelligible and predictable.
Um, that should make them more risk-averse, shouldn't it? Or do you mean reflective people don't trust themselves or the future?
oops. Reflective people are LESS risk averse. Corrected above.
That's even more confusing. I would expect a reflective person to be more self-doubtful and more risk-averse than a non-reflective person, all else being equal. But perhaps a different definition of "reflective" is involved here.
Possibly. A reflective person can use expected utility to make choices that regular people would simply avoid categorically. (One might say in game-theoretic terms that a rational player can use mixed strategies, while irrational ones cannot and so may do worse. But that's probably pushing it too far.)
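To make the game-theory aside concrete, here's the standard matching-pennies example (the payoff numbers are the textbook ones, not anything from this thread): any pure strategy is exploitable, but the 50/50 mix guarantees expected payoff 0 against either response.

```python
# Matching-pennies payoffs for the row player.
# Key: (our action, opponent's action); H = heads, T = tails.
payoff = {('H', 'H'): 1, ('H', 'T'): -1,
          ('T', 'H'): -1, ('T', 'T'): 1}

def expected_payoff(p_heads, opp_action):
    """Our expected payoff when we play H with probability p_heads."""
    return (p_heads * payoff[('H', opp_action)]
            + (1 - p_heads) * payoff[('T', opp_action)])

# A pure strategy (always H) can be exploited: the opponent just plays T.
print(expected_payoff(1.0, 'T'))  # -1.0
# The 50/50 mix is safe against either opponent response.
print(expected_payoff(0.5, 'H'), expected_payoff(0.5, 'T'))  # 0.0 0.0
```

Someone who categorically refuses to "gamble" is effectively restricted to pure strategies, which is the analogy being drawn.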
I recall reading one anecdote on an economics blog. The economist lived in an apartment, and the nearest safe parking for his car was quite a ways away; parking on the street risked tickets. He worked out the probability of being ticketed and the size of the fine, and compared the expected disutility of street parking (probability times fine) against the disutility of walking all the way to safe parking and back. It came out in favor of just eating the occasional ticket. His wife was horrified at him deliberately risking the fines.
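The anecdote doesn't give the economist's actual figures, so the numbers below are purely hypothetical, but the arithmetic he did is just this:

```python
# Hypothetical numbers -- the anecdote doesn't report the real ones.
p_ticket = 0.10   # chance of being ticketed on a given day
fine = 40.0       # dollars per ticket
walk_cost = 6.0   # disutility, in dollars, of the long round-trip walk

# Expected daily cost of parking illegally on the street.
ev_street = p_ticket * fine
print(ev_street)               # 4.0
print(ev_street < walk_cost)   # True: cheaper to eat the occasional ticket
```

With these numbers the occasional ticket wins; double the fine or the enforcement rate and the walk wins instead, which is exactly the kind of sensitivity a non-reflective decider never checks.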
Isn't this a case of rational reflection leading to an acceptance of risk which his less-reflective wife was averse to?
In a serendipitous and quite germane piece of research, Marginal Revolution links to a study on IQ and risk-aversion:
I don't believe the article says "reflective":
The problem with the temperament checks in the last two paragraphs is that they're still testing roughly the same thing that's tested earlier on-- competence at word problems.
And possibly interest in word problems-- I know I've seen versions of the three problems before. I wouldn't be going at them completely cold, but I wouldn't have noticed and remembered having seen them decades ago if word problems weren't part of my mental universe.
Somewhat offtopic:
I recall reading a study once that used a test, which I am almost certain was this one, to address the cause/correlation question: does philosophical training improve one's critical thinking, or do people who already have good critical-thinking skills simply self-select into philosophy? When I recently tried to re-find it for some point or other, I was unable to. If anyone else remembers this study, I'd appreciate any pointers.
(About all I can remember about it was that it concluded, after using Bayesian networks, that training probably caused the improvements and didn't just correlate.)
They are more risk-averse - that was a typo.
Thanks for the ideas. It's good to have something concrete. Let's see how it goes.
The test's questions may need to be fairly dynamic, to prevent people from merely conditioning on the specific problems without shedding the underlying biased heuristic. Someone who had read Less Wrong a few times, but hadn't truly internalized the material, might test negative for certain biases while retaining them in real-life situations. We don't want the test to reward guessing the teacher's password.
I'd suggest starting with a list of common biases and producing a question (or a few?) for each. The questions could test the biases and you could have an explanation of why the biased reasoning is bad, with examples.
It would also be useful to group the biases together in natural clusters, if possible.
Sounds like a good idea. Doesn't have to be invented from scratch; adapt a few psychological or behavioral-economics experiments. It's hard to ask about rationality in one's own life because of self-reporting problems; if we're going to do it, I think it's better to use questions of the form "Scenario: would you do a, b, c, or d?" rather than self-descriptive questions of the form "Are you more: a or b?"