Zvi comments on Quantified Health Prize results announced - Less Wrong

Post author: Zvi 19 February 2012 08:10AM




Comment author: Zvi 19 February 2012 02:48:57PM 8 points

A lot of people were very put off by the state of knowledge in this area; it turned out we'd chosen an area where it's very difficult to do good work. Kevin's entry illustrates this more than anything - he started out thinking he knew things about how to supplement, and then decided he knew far less than he thought.

Comment author: MichaelVassar 20 February 2012 03:59:21PM 8 points

Making sense of knowledge in a bad state, though, is precisely the sort of thing that should test the rationality skills we try to cultivate here.

Comment author: Dr_Manhattan 20 February 2012 06:19:03PM 10 points

I think rationality makes a big difference when an area is confused, not when it's a complex area with little data (like weather prediction without lots of sensors and supercomputers). My impression from the discussion is that supplementation is closer to the latter, and not much is achieved by pure paper research.

Comment author: NancyLebovitz 20 February 2012 07:06:34PM 0 points

Mathsemantics is the only book I know of on the subject. Any other suggestions?

Actually, How to Lie with Statistics (1954) might also count, but has there been any more recent work on the subject?

Comment author: MichaelVassar 20 February 2012 08:50:36PM 2 points

There's lots about lying with statistics, especially in the context of medicine — anything by Ioannidis, for one. Evaluating arguments in general should be part of our skill set, though, and that's obviously relevant here.

Comment author: Eliezer_Yudkowsky 20 February 2012 10:16:49PM 6 points

This confirms a prediction that I'm pretty sure I went around saying in advance (the state of knowledge in nutrition is fucked up beyond most easy epistemic wins).

Comment author: MichaelVassar 21 February 2012 06:46:35PM 6 points

Wow, I'd call that confirmation bias. We got some significant solidification of existing info, a bit of new info, and spent $7500. Multiply these results by 1000 and what do you get? I'd guess a fairly thorough knowledge base.