Even considering that, the 3% figure still seems wildly implausible. This would require something like 90% of the population thinking they pay 0% taxes, and the remaining 10% thinking they pay 30% taxes (which is still an underestimate).
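(As an arithmetic check on those illustrative figures, using only the numbers above:

$$0.9 \times 0\% + 0.1 \times 30\% = 3\%$$

so that really is the kind of split the 3% figure would require.)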
The PDF that Louie linked to doesn't explain what the numbers mean. If people really did underestimate their taxes this badly, surely there would be lots of articles about it. Can anyone provide more evidence?
This is a great article, but it only lists studies where SPRs have succeeded. For balance, it would be good to know whether there were any studies showing SPRs failing (and also to consider publication bias, etc.).
Here is a very similar post on Ask Metafilter. (It is actually Ask Metafilter's most favorited post of all time.)
Here's an insightful comment on the article:
http://www.reddit.com/r/math/comments/ezm6s/the_mathematics_of_beauty/c1c87ts
This is the same reason that, when shopping on Amazon, I ignore reviews from people who rated the product 1 or 5 stars. They often have an ulterior motive: to damage or boost the product's image as much as possible.
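As a toy sketch of that heuristic (the ratings below are invented, and this is just a trimmed average, not anything Amazon actually computes):

```python
# Toy version of the heuristic: drop 1- and 5-star reviews, then average
# the rest. The ratings below are invented for illustration.
ratings = [5, 5, 1, 4, 3, 5, 2, 4, 1, 4]

moderate = [r for r in ratings if 1 < r < 5]   # keep only 2-4 star reviews
trimmed_average = sum(moderate) / len(moderate)

print(round(trimmed_average, 2))  # 3.4
```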
It's a useful exercise for aspiring economists and rationalists to dissect charity into separate components of warm fuzzies vs. efficiency. However, maybe it's best for the general population not to be fully conscious that these are separate components, since the spirit of giving is like a frog: you can dissect it, but it dies in the process (adaptation of an E.B. White quote).
Lemma: we want charity to be enjoyable, so that more people are motivated to do it. (Analogy: capitalist countries let rich people keep their riches, to create an incentive for econo...
This is a genuine problem you're presenting, and I think it requires a third solution besides the presented options of "Let the lawyer do what he wants" and "Give the lawyer a buzzkill". What we need to do is find a way of getting the lawyer to understand what the right thing to do is, without making them feel defensive or like a jerk. If we make the bullet tasty enough, it'll get easier to swallow.
Rationalist marketing FTU (For The Utilons).
These people comment only on difficult, controversial issues, which are selected precisely because they are issues where people perform worse than random.
Relatedly, maybe they only comment when they have something original and unorthodox to say (selection bias). It's easy to echo conventional wisdom and be right most of the time; for a smart person it's more exciting to challenge conventional wisdom, even at a higher risk of being wrong. In other words, maybe they place a lower priority on karma points and a higher priority on building their muscles for original thought.
Examp...
I had the same issue with the Schwartz test. It seems not to correct for people who rate everything high (or low).
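One standard way to adjust for that kind of response style (not something the Schwartz test itself does, as far as I can tell) is to center each respondent's ratings on their own mean, so only the relative ordering of their answers matters. A minimal sketch with invented numbers:

```python
# Toy adjustment for response style: subtract each respondent's own mean
# rating from each of their ratings, so only relative preferences remain.
# All ratings are invented.
ratings = {
    "rates_everything_high": [6, 6, 7, 5, 6],
    "rates_everything_low":  [2, 2, 3, 1, 2],
}

centered = {
    name: [r - sum(rs) / len(rs) for r in rs]
    for name, rs in ratings.items()
}

for name, rs in centered.items():
    print(name, rs)
# Both respondents end up with the same profile: [0.0, 0.0, 1.0, -1.0, 0.0]
```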
Talib Kweli is nonreligious, so I'm not changing the meaning of the quotation. "God" is often used poetically. Example:
"Subtle is the Lord, but malicious He is not."
-- Albert Einstein
Even if Kweli were religious the point would not be to put words in his mouth, but to reapply a beautiful quotation to another context where it is meaningful.
All my confidence comes from knowing God's laws.
-- Talib Kweli (substitute "nature" for "God")
Thanks Nick. That paper looks very interesting.
Oops, yes, I misread the original post. Thanks for pointing that out.
The items on that list of appeals can also be ranked. According to mainstream US values, "Appeal to egalitarianism" trumps "Appeal to unquestionable authority"; "Appeal to personal freedom" trumps "Appeal to egalitarianism"; and so on. The standard political talk show debate consists of a back-and-forth escalation up this ladder.
For example, in a televised debate on regulation:
Person 1: "The National Bureau of Economics Research published a study showing conclusively that regulation of X is harmful" (author...
I think Eliezer's using these terms in a more specific sense than you are. For instance, your Person 2 is making an appeal to egalitarianism (in the conventional sense) as an argument for their position; while it still may be invalid, it's not an argument for why the debate should stop, which is what this post is about, if I'm reading it correctly. The appeal to egalitarianism is something like "Both of us have equally valid opinions, so who's to say which of us is right or wrong? Let's agree to disagree." The appeal to personal freedom is "...
Exercising "rational" self-control can be very unpleasant, therefore resulting in disutility.
Example 1: When I come across an interesting-looking book on Amazon, I can either have it shipped to me in 8 days for free, or in 2 days for a few bucks. The naive rational thing to do is to select the free shipping, but you know what? That 8-day wait is more unpleasant than spending a few bucks.
Example 2: When I come home from the grocery store I'm tempted to eat all the tastiest food first. It would be more "emotionally intelligent" to spread it ou...
I think clever people are especially susceptible to the belief that their perceptions are typical. Let's say you can't visualize images in your mind, but your coworker insists that he can. Since you're not a brain scientist, you can't verify whether he's right or whether he's just misinterpreted the question. However, the last few times you had a disagreement with him on a verifiable subject, you were vindicated by the facts, so you can only assume that you are right this time as well. Add to that the fact that people's stated perceptions and preferences a...
I think this disagreement comes down to the definition of "bias", which Wikipedia defines as "a tendency or preference towards a particular perspective, ideology or result, when the tendency interferes with the ability to be impartial, unprejudiced, or objective." If a bias helps you make fewer errors, I would argue it's not a bias.
Maybe it is clearer if we speak of behaviors rather than biases. A given behavior (e.g. tendency to perceive what you were expecting to perceive) may make you more biased in certain contexts, and more rationa...
How do people here consume Less Wrong? I just started reading and am looking for a good way to stay on top of posts and comments. Do you periodically check the website? Do you use an RSS feed? (which?) Or something else?
Imagine an experiment where we randomize subjects into two groups. All subjects are given a 20-question quiz that asks them to provide a confidence interval on the temperatures in various cities around the world on various dates in the past year. However, the cities and dates for group 1 are chosen at random, whereas the cities and dates for group 2 are chosen because they were record highs or lows.
This will result in two radically different estimates of overconfidence. The fact that the result of a calibration test depends heavily on the questions being a...
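Here's a toy simulation of that setup (all numbers and the model of subject behavior are invented): every subject reports the same 90% interval, correctly calibrated for typical temperatures, yet the two groups yield wildly different measured hit rates.

```python
import random

random.seed(0)

def true_values(n, extreme=False):
    # Draw a large pool of "temperatures"; group 2's questions are picked
    # from the pool precisely because they are record highs/lows.
    pool = [random.gauss(20, 10) for _ in range(10 * n)]
    if extreme:
        pool.sort()
        return pool[: n // 2] + pool[-(n // 2):]
    return pool[:n]

def hit_rate(values):
    # Every subject gives the same interval, 20 +/- 16.4, which covers
    # roughly 90% of a Normal(20, 10) distribution -- i.e. the subjects
    # are well calibrated for randomly chosen questions.
    lo, hi = 20 - 16.4, 20 + 16.4
    return sum(lo <= v <= hi for v in values) / len(values)

print("group 1 (random questions): ", hit_rate(true_values(1000)))   # ~0.9
print("group 2 (extreme questions):", hit_rate(true_values(1000, extreme=True)))  # ~0.0
```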
I have seen a problem with selection bias in calibration tests, where trick questions are overrepresented. For example, in this PDF article, the authors ask subjects to provide a 90% confidence interval estimating the number of employees IBM has. They find that fewer than 90% of subjects give a range containing the true value, which they conclude results from overconfidence. However, IBM has almost 400,000 employees, which is atypically high (more than 4x Microsoft). The results of this study have just as much to do with the question asked as with the overconfidence of t...
Another reason converts are more zealous than people who grew up with a religion is that conversion is a voluntary act, whereas being born into a religious family is not. Converting to a religion late in life is a radical move, one that generally requires a certain amount of zeal and motivation to begin with, so converts are pre-selected to be zealous.
Regarding the "Repent" example: as conformists, human beings are more likely to make particular decisions (like wear a "Repent" sign) if they believe others would do the same. So instead of framing this study as showing that "sign-wearing volunteers overestimate the probability others would volunteer", one could flip the implied causality and say "people who think others would volunteer are more likely to volunteer themselves", a much more banal claim. One could test the effect by re-running the experiment on self-id...
I wonder how long-lasting this "quota" effect is. The study only looked at the immediate effects of moral behavior, not the more important long-term effects.
To make an analogy with physical exercise, maybe flexing your moral muscles exhausts your ability to be moral for the rest of the day, but when you wake up tomorrow your moral strength will be not only restored but actually strengthened. Most forms of exertion I can think of (e.g. learning, writing, working) work like this, so I wouldn't be surprised if the same held for doing good deeds.
CiteULike is quite nice for this.
Connotea is a similar "personal research library" service but it doesn't let you store PDFs, just links to articles.