All of SK2's Comments + Replies

SK200

CiteULike is quite nice for this.

Connotea is a similar "personal research library" service but it doesn't let you store PDFs, just links to articles.

SK260

Even considering that, the 3% figure still seems wildly implausible. This would require something like 90% of the population thinking they pay 0% taxes, and the remaining 10% thinking they pay 30% taxes (which is still an underestimate).
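To spell out the arithmetic on that hypothetical split:

0.90 × 0% + 0.10 × 30% = 3%

so even a distribution that extreme only just averages out to the reported figure.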

The PDF that Louie linked to doesn't explain what the numbers mean. Surely there would be lots of articles about this epidemic of grossly underestimating taxes. Can anyone provide more evidence?

3SilasBarta
True. A few other possible factors:

* Consider the impact of interpreting "I got some back" answers as being negative entries in the summation (though I hope the survey would put up a big asterisk about this when reporting the results!).
* People took the question as being about federal income taxes, and that value is (incorrectly) compared to all taxes at all levels: social security taxes, state sales taxes, etc.
SK290

This is a great article, but it only lists studies where SPRs have succeeded. In fairness, it would be good to know if there were any studies that showed SPRs failing (and also consider publication bias, etc.).

1lukeprog
Definitely.
SK2110

Here is a very similar post on Ask Metafilter. (It is actually Ask Metafilter's most favorited post of all time.)

3lukeprog
Ah, yes, that list is one of my favorites. But it doesn't enforce anything like the rules I've given above, which I think are useful.
SK200

Here's an insightful comment on the article:

http://www.reddit.com/r/math/comments/ezm6s/the_mathematics_of_beauty/c1c87ts

This is the same reason that when shopping on Amazon I ignore the reviews from people who rated the product 1 or 5 stars. They often have an ulterior motive of trying to damage/help the image of the product as much as possible.

SK200

Related positions include operations research analysts and quants at finance firms.

SK2260

It's a useful exercise for aspiring economists and rationalists to dissect charity into separate components of warm fuzzies vs. efficiency. However, maybe it's best for the general population not to be fully conscious that these are separate components, since the spirit of giving is like a frog: you can dissect it, but it dies in the process (adaptation of an E.B. White quote).

Lemma: we want charity to be enjoyable, so that more people are motivated to do it. (Analogy: capitalist countries let rich people keep their riches, to create an incentive for econo...

2DanielLC
If you learn about how to give right, some of the warm fuzzies will go away, and fewer people will donate, but the people who do donate will donate better. If all you're going to be doing is picking up litter at a beach, it really doesn't matter if you stop when you find out it's not helping people. You can find another hobby.
2Mqrius
Not quite the same scenario, but close: often when I'm considering donating to some charity, there's a reminder in the back of my head that if I were to truly support this charity I would donate a much larger amount. This isn't a happy thought; it generates conflict: there's another part of me that doesn't like spending large amounts of money. Thus, I often donate nothing at all. I'm still working on this conflict.
DSimon210

This is a genuine problem you're presenting, and I think it requires a third solution besides the presented options of "Let the lawyer do what he wants" and "Give the lawyer a buzzkill". What we need to do is find a way of getting the lawyer to understand what the right thing to do is, without making them feel defensive or like a jerk. If we make the bullet tasty enough, it'll get easier to swallow.

Rationalist marketing FTU (For The Utilons).

SK210

These people comment only on difficult, controversial issues which are selected as issues where people perform worse than random.

Relatedly, maybe they only comment when they have something original and unorthodox to say (selection bias). It's easy to echo conventional wisdom and be right most of the time; for a smart person it's more exciting to challenge conventional wisdom, even if this gives them a higher risk of being wrong. In other words, maybe they place a lower priority on karma points and a higher one on building their muscles for original thought.

Examp...

SK220

I had the same issue with the Schwartz test. It seems not to correct for people who rate everything high (or low).
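For what it's worth, one common way to adjust for respondents who rate everything high (or low) is to center each person's ratings on their own mean before comparing them. A minimal sketch, with entirely made-up data and column names:

```python
# Minimal sketch: within-person centering to adjust for respondents who
# rate everything high (or low). Data and column names are made up.
import pandas as pd

ratings = pd.DataFrame({
    "respondent": ["a", "a", "a", "b", "b", "b"],
    "item":       ["power", "benevolence", "security"] * 2,
    "score":      [6, 6, 5, 2, 5, 3],  # respondent 'a' rates everything high
})

# Subtract each respondent's own mean score, so only relative priorities remain.
ratings["centered"] = (
    ratings["score"] - ratings.groupby("respondent")["score"].transform("mean")
)

print(ratings)
```

After centering, respondent "a" no longer looks like they value everything maximally; only their relative priorities remain.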

SK240

Talib Kweli is nonreligious, so I'm not changing the meaning of the quotation. "God" is often used poetically. Example:

"Subtle is the Lord, but malicious He is not."

-- Albert Einstein

Even if Kweli were religious, the point would not be to put words in his mouth, but to reapply a beautiful quotation to another context where it is meaningful.

0smdaniel2
Reapplying it to another context changes the meaning. Because of Einstein's explicitly stated opinions on the meaning of God (and the Lord), we can understand his meaning to be synonymous with that of nature and its order. "I believe in Spinoza's God who reveals himself in the orderly harmony of what exists, not in a God who concerns himself with the fates and actions of human beings." "I do not believe in a personal God and I have never denied this but have expressed it clearly. If something is in me which can be called religious then it is the unbounded admiration for the structure of the world so far as our science can reveal it." (1936) Talib Kweli, on the other hand, hasn't given us a clear statement of his thoughts on the term God. There is no evidence for us to assume that the meaning he gives to the term God would fit in the context of this quote.
SK200

All my confidence comes from knowing God's laws.

-- Talib Kweli (substitute "nature" for "God")

8Risto_Saarelma
I don't think it would be a good idea to take a Carl Sagan quote and add a 'substitute "God" for "nature"' postscript. I don't think this is a good idea either.
SK200

Thanks Nick. That paper looks very interesting.

SK270

Oops, yes, I misread the original post. Thanks for pointing that out.

SK2140

The items on that list of appeals can also be ranked. According to mainstream US values, "Appeal to egalitarianism" trumps "Appeal to unquestionable authority"; "Appeal to personal freedom" trumps "Appeal to egalitarianism"; and so on. The standard political talk show debate consists of a back-and-forth escalation up this ladder.

For example, in a televised debate on regulation:

Person 1: "The National Bureau of Economics Research published a study showing conclusively that regulation of X is harmful" (author... (read more)

3AdeleneDawner
Meta: Why was this voted down? (I voted it up earlier, and it's at 0 karma at the moment.) I understand that the actual point in the comment is tangential to the original article, and thus could be taken as off-topic or wrong, but I find it valuable to read such comments and the reactions that they evoke; such exchanges help point out the limitations of the tools and frameworks being discussed.
3Psy-Kosh
I'm not entirely sure it's the same. I mean, what you're describing is more a policy/decision debate. That is where principles like egalitarianism, personal freedom, and such are actually valid to appeal to since they're part of that-which-we-value. It's not exactly the same thing as what the OP is talking about, is it? (unless person 2 is saying "because it is unfair, the study that implied those consequences is, in fact, invalid" rather than "even given those consequences, it's still worthwhile because this value here is so important")
ata210

I think Eliezer's using these terms in a more specific sense than you are. For instance, your Person 2 is making an appeal to egalitarianism (in the conventional sense) as an argument for their position; while it still may be invalid, it's not an argument for why the debate should stop, which is what this post is about, if I'm reading it correctly. The appeal to egalitarianism is something like "Both of us have equally valid opinions, so who's to say which of us is right or wrong? Let's agree to disagree." The appeal to personal freedom is "...

SK200

Exercising "rational" self-control can be very unpleasant, therefore resulting in disutility.

Example 1: When I buy an interesting-looking book on Amazon, I can either have it shipped to me in 8 days for free or in 2 days for a few bucks. The naive rational thing to do is to select the free shipping, but you know what? That 8-day wait is more unpleasant than spending a few bucks.

Example 2: When I come home from the grocery store, I'm tempted to eat all the tastiest food first. It would be more "emotionally intelligent" to spread it ou...

1Nick_Tarleton
Relevant paper: Lay Rationalism and Inconsistency between Predicted Experience and Decision
SK2200

I think clever people are especially susceptible to the belief that their perceptions are typical. Let's say you can't visualize images in your mind, but your coworker insists that he can. Since you're not a brain scientist, you can't verify whether he's right or whether he's just misinterpreted the question. However, the last few times you had a disagreement with him on a verifiable subject, you were vindicated by the facts, so you can only assume that you are right this time as well. Add to that the fact that people's stated perceptions and preferences a...

3handoflixue
I'd assume blue collar, artist, and depression are pretty trivial to experience, if you're curious... Female is also eminently doable, although it'd take a lot more time and energy (and if you're set on "temporary" it's going to be even slower). Admittedly, I seem to be vastly above-average in my ability to perceive the world through alternate lenses. (Indeed, I find it baffling that you haven't experienced at least a few of those!)
0helm
Optimism/pessimism seems to operate on a pretty linear scale. I was very optimistic about my own future until I hit my early 20s; now, after a few bouts of depression, I regularly underperform. (To generalize from one example, I know I have a hard time believing some people can be depressed and productive at the same time.) What I can say with reasonable certainty is that liberals and conservatives build up different associations, retain different facts, etc., etc., which would make a temporary switch more difficult.
SK200

I think this disagreement comes down to the definition of "bias", which Wikipedia defines as "a tendency or preference towards a particular perspective, ideology or result, when the tendency interferes with the ability to be impartial, unprejudiced, or objective." If a bias helps you make fewer errors, I would argue it's not a bias.

Maybe it is clearer if we speak of behaviors rather than biases. A given behavior (e.g. tendency to perceive what you were expecting to perceive) may make you more biased in certain contexts, and more rationa...

SK200

How do people here consume Less Wrong? I just started reading and am looking for a good way to stay on top of posts and comments. Do you periodically check the website? Do you use an RSS feed? (which?) Or something else?

3AdeleneDawner
When I'm actively following the site (visiting 3+ times a day), I primarily follow the new comments page. I only read top posts when I see that there's an interesting discussion going on about one of them, or if the post's title seems particularly interesting. (I do wind up reading a large portion of the top posts sooner or later, though.) I have the 'recent posts' RSS feed in my reader for when I'm not actively following the site, but I only click through if something seems very interesting.
3Alicorn
I use RSS for top level posts, and have an easily accessible bookmark to the comments page which I check more frequently than I should.
1LucasSloan
I read new posts as soon as I see them. I look at the comments through the recent comments bar, but that requires having the LW tab open more or less constantly. I also reread posts to get any comments I miss and to get a better sense of how the discussions are proceeding.
SK280

Imagine an experiment where we randomize subjects into two groups. All subjects are given a 20-question quiz that asks them to provide a confidence interval on the temperatures in various cities around the world on various dates in the past year. However, the cities and dates for group 1 are chosen at random, whereas the cities and dates for group 2 are chosen because they were record highs or lows.

This will result in two radically different estimates of overconfidence. The fact that the result of a calibration test depends heavily on the questions being a...
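A toy simulation makes the size of the effect easy to see. The sketch below assumes a subject whose 90% intervals really do cover ordinary temperatures 90% of the time, and arbitrarily assumes that a record-setting temperature falls outside such an interval 80% of the time; every number here is an illustrative assumption, not taken from any study:

```python
# Toy simulation of the two-group calibration quiz described above.
# The subject is identically calibrated in both groups; only the question
# selection differs. All probabilities are illustrative assumptions.
import random

random.seed(0)

def coverage(n_questions, p_extreme, p_miss_given_extreme=0.8):
    """Fraction of the subject's 90% intervals that contain the true value.

    p_extreme: chance a question was chosen because it was a record high/low.
    p_miss_given_extreme: assumed chance that a record value falls outside
    an interval that is well calibrated for ordinary questions.
    """
    hits = 0
    for _ in range(n_questions):
        if random.random() < p_extreme:
            hits += random.random() > p_miss_given_extreme  # extremes are mostly missed
        else:
            hits += random.random() < 0.9                   # well calibrated otherwise
    return hits / n_questions

print("Group 1 (random cities/dates):    %.2f" % coverage(100_000, p_extreme=0.0))
print("Group 2 (record highs/lows only): %.2f" % coverage(100_000, p_extreme=1.0))
```

Group 1 comes out looking well calibrated (about 0.90 coverage) while Group 2 looks wildly overconfident (about 0.20), even though nothing about the subject's calibration changed.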

3pengvado
I think the two of you are looking at different parts of the process. "Amount of trickiness" is a random variable that is rolled once per quiz. Averaging over a sufficiently large number of quizzes will eliminate any error it causes, which makes it a contribution to variance, not systematic bias. Otoh, "estimate of the average trickiness of quizzes" is a single question that people can be wrong about. No amount of averaging will reduce the influence of that question on the results, so if your reason for caring about calibration isn't to get that particular question right, it does cause a systematic bias when applying the results to every other situation.
SK240

I have seen a problem with selection bias in calibration tests, where trick questions are overrepresented. For example, in this PDF article, the authors ask subjects to provide a 90% confidence interval estimating the number of employees IBM has. They find that fewer than 90% of subjects select a suitable range, which they conclude results from overconfidence. However, IBM has almost 400,000 employees, which is atypically high (more than 4x Microsoft). The results of this study have just as much to do with the question asked as with the overconfidence of t...

2Blueberry
That really shouldn't matter. Your calibration should include the chances of the question being a "trick question". If fewer than 90% of subjects give confidence intervals containing the actual number of employees, they're being overconfident by underestimating the probability that the question has an unexpected answer.
SK2953

Another reason converts are more zealous than people who grew up with a religion is that conversion is a voluntary act, whereas being born into a religious family is not. Converting to a religion late in life is a radical move, one that generally requires a certain amount of zeal and motivation to begin with, so converts are pre-selected to be zealous.

SK2150

Regarding the "Repent" example: as conformists, human beings are more likely to make particular decisions (like wear a "Repent" sign) if they believe others would do the same. So instead of framing this study as showing that "sign-wearing volunteers overestimate the probability others would volunteer", one could flip the implied causality and say "people who think others would volunteer are more likely to volunteer themselves", a much more banal claim. One could test the effect by re-running the experiment on self-id... (read more)

SK280

I wonder how long-lasting this "quota" effect is. The study only looked at the immediate effects of moral behavior, not the more important long-term effects.

To make an analogy with physical exercise, maybe flexing your moral muscles exhausts your ability to be moral for the rest of the day, but when you wake up tomorrow your moral strength will be not only restored but actually strengthened. Most forms of exertion I can think of (e.g. learning, writing, working) work like this, so I wouldn't be surprised if the same held for doing good deeds.