Discussion of a book by "Dow 36,000" Glassman. I'm wondering whether there are pundits who are so often wrong that their predictions are reliable indicators that something else (ideally the opposite) will happen.

A bit of silliness, courtesy of an old Paul Krugman blog post:

[W]ith apologies to Brad DeLong, when reading WSJ editorials you need to bear two things in mind:

  1. The WSJ editorial page is wrong about everything.
  2. If you think the WSJ editorial page is right about something, see rule #1.

After all, here’s what you would have believed if you listened to that page over the years: Clinton’s tax hike will destroy the economy, you really should check out those people suggesting that Clinton was a drug smuggler, Dow 36000, the Bush tax cuts will bring surging prosperity, Saddam is backing Al Qaeda and has WMD, there isn’t any housing bubble, US households have a high savings rate if you measure it right. I’m sure I missed another couple of dozen high points.

More seriously, I've heard that the Wall Street Journal editorial page acts like our clever arguer: the writers are given a conclusion (usually something along the lines of "cut taxes!") by their bosses and are ordered to write an editorial in support of that conclusion. Saying "This policy should be implemented because it would make my bosses wealthier (at the expense of others)" isn't very persuasive, so they need to get clever about it...

Even if they are acting as clever arguers for their bosses, that shouldn't make them actively anti-correlated with correct claims. That's not in their interest, and it would be extremely difficult to achieve without a lot of brainpower and dedication to just that goal.

It should if we assume the interests of their bosses are not the same as the interests of the readers. Assume that, say, cutting the taxes of the rich will lose me five utilons but give their bosses that many. If they're trying to persuade me to support tax cuts for the rich, then predicting a loss of utilons for me will make it less likely that I support the cuts. Saying the cuts will give me more utilons, however, will make it more likely that I support them.

For them to be actively anti-correlated with correctness, then, we only have to assume that the bosses' interests are actively anti-correlated with those of the readers.
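To make this concrete, here is a minimal sketch in Python (all numbers are invented for illustration; nothing here is measured about the WSJ). Suppose the page endorses whatever benefits its bosses and always frames the endorsement as "good for readers." The truth of that framing then simply tracks the assumed correlation rho between the two groups' interests:

```python
import random

def claim_accuracy(rho, trials=100_000):
    """P(policy actually helps readers | editorial endorses it), where
    boss and reader payoffs are drawn with correlation ~rho."""
    hits = endorsed = 0
    for _ in range(trials):
        reader = random.gauss(0, 1)
        noise = random.gauss(0, 1)
        boss = rho * reader + (1 - rho ** 2) ** 0.5 * noise
        if boss > 0:               # policy helps the bosses -> endorse it,
            endorsed += 1          # claiming it helps the readers too
            hits += reader > 0     # ...was that claim actually true?
    return hits / endorsed

for rho in (-0.8, 0.0, 0.8):
    print(f"rho = {rho:+.1f}: claim is true {claim_accuracy(rho):.0%} of the time")
```

With rho = -0.8 the editorial's claims come out true only about a fifth of the time, which is exactly the anti-correlation being argued about.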

But the claim that the bosses' interests are actively anti-correlated with the readers' is itself an incredibly strong claim. In the vast majority of contexts it won't be true. Who benefits if new technologies are discovered? Who suffers if there's a nuclear war, or if we reach peak oil faster than generally anticipated?

It is conceivable that within the narrow confines of what the WSJ editorials generally discuss there's an anti-correlation. But even that set of narrow topics is pretty wide, wide enough that this seems unlikely. It seems more likely to me that there are specific ideologies which, in their narrow realm, are anti-correlated with the truth about controversial questions. But to construct explicit, unambiguous examples of that even in economics, I need to do something like select mercantilism as the relevant economic theory.

I agree it's a strong claim, but I can see a mechanism that makes it a little more plausible. Where the owners of the WSJ have the same interests as its readers, the WSJ need not write about the subject, because the readers will already do what is in their mutual interest. It is only when the two sets of interests are opposed that the WSJ has to work to persuade readers to act against their own interests and for Murdoch's.

Most interesting claims are narrow; the narrower a claim, the more likely it is to be wrong.

If you have an expert who seems systematically wrong, "opposite" predictions will be broad claims that are often right. But that's not very useful - it's easy to make a series of broad claims, most of which are right.
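A toy illustration of this point, with invented base rates: reverse a pundit who only makes narrow, low-prior claims and you get a "predictor" that is right almost all the time, but no more often than anyone simply betting the base rate:

```python
import random

random.seed(0)
# invented base rates: each narrow claim has a 2-10% chance of being true
priors = [random.uniform(0.02, 0.10) for _ in range(1000)]
truths = [random.random() < p for p in priors]

pundit_right = sum(truths) / len(truths)   # the pundit asserts every narrow claim
reverser_right = 1 - pundit_right          # you assert every negation

print(f"pundit right:   {pundit_right:.0%}")    # ~6%: looks 'reliably wrong'
print(f"reverser right: {reverser_right:.0%}")  # ~94%: but anyone denying every
                                                # narrow claim does just as well
```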

This seems like a very unlikely sort of phenomenon; reversed stupidity != intelligence, etc. Why would you expect such people to exist?

It's common in certain types of polemic. People hold (or claim to hold) beliefs to signal group affiliation, and the more outlandishly improbable the beliefs become, the more effective they are as a signal.

It becomes a competition: Whoever professes beliefs which most strain credibility is the most loyal.

I think that most people who tell pollsters they believe conspiracy theories wouldn't bet on them.

Data on that question would be an interesting thing to gather, though I might guess they would take attempts to measure their belief as somehow a manifestation of the conspiracy. (Everything is evidence for the conspiracy.)

The != operator suggests bidirectionality, but it's really unidirectional. Intelligence can be reversed stupidity if it wants to be.

Possibly just an aesthetic preference. You probably have a point.

I think such people might exist when the possibilities for prediction are relatively constrained, but even then, some fraction of their consistent wrongness would be a matter of luck, and couldn't be used for prediction.

In fact, when the possibilities for prediction are relatively constrained but there are a lot of people making predictions, and the system is complicated enough that you can't expect most people to be mostly right, we'd expect some people to be consistently wrong by chance alone.
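A quick sketch of that, with an arbitrary forecaster count and call count: give many forecasters a few constrained binary calls at coin-flip accuracy, and a handful will miss every single call by luck alone:

```python
import random

random.seed(1)
forecasters, calls = 10_000, 10
all_wrong = sum(
    all(random.random() < 0.5 for _ in range(calls))   # each call: 50% wrong
    for _ in range(forecasters)
)
# expected count: 10_000 * 0.5**10, i.e. about 10 perfectly wrong forecasters
print(f"{all_wrong} of {forecasters} forecasters were wrong on all {calls} calls")
```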

I've read that actively managed mutual funds are, on average, slightly worse than random chance would predict.

This could result from adding management fees to "as bad as random" performance.

No. It was talking just about their stock choices. It specifically commented that this was worse than a monkey throwing darts.

That seems plausible. Their own behavior, bulk, and biased incentives could well be enough to put them very slightly behind.

This and transaction costs.
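A back-of-the-envelope version of this exchange, with purely illustrative numbers rather than anything from the article being remembered: even if the picks themselves are only as good as random, market impact and trading costs drag realized results slightly below a costless dart-throwing benchmark:

```python
dart_benchmark  = 0.070   # return of costless random (dart-throwing) picks
selection_skill = 0.000   # stock selection itself no better than random
market_impact   = -0.003  # big funds move prices against their own trades
turnover_cost   = -0.005  # commissions and spreads from active trading

net = dart_benchmark + selection_skill + market_impact + turnover_cost
print(f"fund picks net: {net:.1%} vs dart benchmark: {dart_benchmark:.1%}")
```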

I believe certain investors have used pundit sentiment as a counter-indicator. Jim Cramer comes to mind.