elharo comments on Rationality Quotes March 2014 - Less Wrong

Post author: malcolmocean 01 March 2014 03:34PM


Comment author: elharo 23 March 2014 10:58:04AM 0 points

in general, when experts are dealing with some big unfathomable future, and it’s a complex system, I tend to discount that. The complexity makes it almost impossible to predict.

Also, if they are using a model, I pretty much discount everything I hear. But if they are just looking at data like a scientist and saying, “When this happens, that happens,” then I’m going to put more stock in it.

--Scott Adams, Interview with Julia Galef, February 10, 2014

Comment author: satt 27 March 2014 03:32:25AM 6 points

But you can’t be an effective fox just by letting the data speak for itself — because it never does. You use data to inform your analysis, you let it tell you that your pet hypothesis is wrong, but data are never a substitute for hard thinking. If you think the data are speaking for themselves, what you’re really doing is implicit theorizing, which is a really bad idea (because you can’t test your assumptions if you don’t even know what you’re assuming.)

— Paul Krugman, "Sergeant Friday Was Not A Fox"

Comment author: Manfred 26 March 2014 04:06:27AM 1 point

"Just looking at the data like a scientist" does not give you magic scientist powers. Models of the world are what allow you to predict it, without need for magic scientist vision.

Comment author: elharo 26 March 2014 11:15:19AM 5 points

Adams doesn't elaborate on this point, but I read him as saying that if you've actually measured things and collected data that bears on your point, then your model is more likely to be correct.

For example, suppose a model says that raising the minimum wage reduces employment. That's a pretty common model in economics, and it can be backed up with a lot of math. However, I would not find that alone convincing. On the other hand, if an economist goes out into the world and looks at what actually happened when the minimum wage was raised, that would be more convincing. If they can figure out a way to do an experiment in which, for example, 5 nearby towns raise their minimum wage, 5 keep it the same, and another 5 lower it, that would be even more convincing.
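The 15-town comparison boils down to a difference-in-means analysis. A minimal sketch of that arithmetic, where every employment figure and group assignment is invented purely for illustration:

```python
# Toy difference-in-means comparison for the hypothetical 15-town
# minimum-wage experiment described above. All figures are made up.

def mean(xs):
    return sum(xs) / len(xs)

# Percentage change in employment after the policy change,
# one number per town (hypothetical data).
raised  = [-0.8, 0.3, -1.2, 0.1, -0.5]  # 5 towns that raised the wage
same    = [ 0.2, -0.1, 0.4, 0.0, -0.3]  # 5 towns that kept it unchanged
lowered = [ 0.5, 0.9, -0.2, 0.7, 0.3]   # 5 towns that lowered it

# The estimated effect of each policy is that group's mean outcome
# minus the mean outcome of the unchanged (control) group.
effect_of_raising = mean(raised) - mean(same)
effect_of_lowering = mean(lowered) - mean(same)

print(f"raising:  {effect_of_raising:+.2f} pct. points vs. control")
print(f"lowering: {effect_of_lowering:+.2f} pct. points vs. control")
```

With only five towns per group you would also want a significance test before concluding anything, but the point stands: the estimate comes from measured outcomes, not from the model's internal math.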

Another example: consider a model that says

  • heart disease kills people
  • heart disease is correlated with high cholesterol
  • eggs contain lots of cholesterol

Those three statements are reasonably well established and backed up by data. However, if you throw in a model that says dietary cholesterol causes in-body cholesterol, and in-body cholesterol causes heart disease, and therefore eating eggs reduces life expectancy, you've jumped way beyond what the data supports. On the other hand, if you compare levels of all-cause mortality between people who eat eggs and people who don't, or, better yet, run a multiyear controlled experiment in which the only dietary variation between groups is that some people eat eggs and others don't, the answers you get are far more likely to be correct.

Here's another one: you have lots of detailed calculations that say if you smash two protons together at 0.999999c relative velocity, and you do it a few million times, you'll see certain particles show up in the debris with very precise probabilities. But when you run the experiment, you discover that the fractions of different particles don't quite match what you expected, because there's an additional resonance you didn't know about and didn't include in the model.

In other words, empirical data beats mere models. Models can be self-consistent and plausible without fully reflecting the real world. Models that go beyond what the data says run the risk of assuming causal connections that don't exist (dietary cholesterol to in-body cholesterol) or of missing factors outside the model that are more important (maybe eggs do increase the risk of heart disease but reduce the risk of cancer).

Of course, all these experiments are really hard to do, and take years of time and millions, even billions, of dollars, so we often muddle along with seriously flawed models instead. However, we need to remember that models are just models, not data, and be reasonably skeptical of their recommendations. In particular, if we're about to do something really expensive and difficult, like changing a nation's dietary preferences, based on nothing more than a model, maybe we should step back and spend the money and the time needed to collect real data before we go full speed ahead.

Comment author: Manfred 26 March 2014 11:08:51PM 1 point

Fair enough - political conditioning has caused me to assume that any non-specialist who says "don't trust models, just 'look at the data'" is the victim of some sort of anti-epistemology.

In context, it's less likely that that's the case, but I still think this quote is painting with much too wide a brush.

Comment author: TheAncientGeek 27 March 2014 08:18:20AM 0 points

Prediction is going beyond the data, so a model that never goes beyond the data isn't going to be much use.

Climate change models incorporate data, so they are not purely theoretical like the economic model you mentioned.

Comment author: MugaSofer 27 March 2014 02:48:49PM 0 points

I ... think he's talking about basic correlation, statistical analysis, that sort of thing?

(I enjoy Scott's writing, but I didn't upvote the grandparent.)

Comment author: EHeller 26 March 2014 05:16:00AM -2 points

I'm fairly certain that's actually horrible advice. It boils down to "substitute your own judgment for that of professionals on exactly those problems that are hardest."

Comment author: shminux 26 March 2014 07:00:58AM 1 point

More like "discount status on problems where expertise is a poor predictor of accuracy".

Comment author: fezziwig 26 March 2014 07:50:24PM 2 points

I think this steelman is not quite true to the spirit of the original. The contrast he draws between "using a model" and "looking at data like a scientist" is especially strange. One wonders what he thinks of meteorology.

Comment author: [deleted] 30 March 2014 08:13:41AM -1 points

Also, if they are using a model, I pretty much discount everything I hear. But if they are just looking at data like a scientist and saying, “When this happens, that happens,” then I’m going to put more stock in it.

What? Scientists do use models. Assuming charitably that he's not mistaken or bullshitting about what scientists do, by “model” he must mean something different -- what?