
drethelin comments on Stupid Questions May 2015 - Less Wrong Discussion

10 Post author: Gondolinian 01 May 2015 05:28PM


Comment author: drethelin 01 May 2015 08:05:43PM 3 points [-]

Is there anything a non-famous, non-billionaire person can do to meaningfully impact medical research? It seems like the barriers to innovation are insurmountable for everyone with the will to try, and the very few organizations and people who might be able to surmount them aren't dedicated to it.

Comment author: James_Miller 03 May 2015 04:05:16PM 3 points [-]

Yes, depending on how you define "meaningful". If you can speed up a cure for cancer by one day, or increase the chance of a cure next year by one in a million, then the expected value of that contribution, in terms of human welfare, is huge compared to what most people manage to accomplish in their lives.
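The expected-value reasoning above can be sketched with a couple of lines of arithmetic. All of the numbers here are illustrative assumptions (the comment gives no figures beyond "1 in a million"); the round 10 million annual cancer deaths is an assumed order-of-magnitude input, not a sourced statistic.

```python
# Illustrative expected-value sketch. Both inputs are assumptions:
# a round figure for annual cancer deaths worldwide, and the comment's
# hypothetical 1-in-a-million chance of a cure arriving one year sooner.
annual_cancer_deaths = 10_000_000
p_cure_one_year_sooner = 1e-6

# If the cure arrives a year early, roughly one year's deaths are averted.
expected_lives_saved = annual_cancer_deaths * p_cure_one_year_sooner
print(expected_lives_saved)  # ~10 expected lives saved
```

Even under these toy assumptions, a one-in-a-million nudge has an expected value on the order of ten lives, which is the comparison the comment is making.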

Comment author: IlyaShpitser 01 May 2015 08:38:41PM 7 points [-]

Causal inference research.

Comment author: Vaniver 03 May 2015 06:07:42PM 4 points [-]

Is the limiting factor here heads or funding?

Comment author: mkf 03 May 2015 04:46:33PM 2 points [-]

Could you elaborate?

Comment author: IlyaShpitser 03 May 2015 06:05:30PM *  4 points [-]

Sure. I am not an MD, but this is my view as an outsider: the medical profession is quite conservative, and people who publish medical papers have an incentive to publish "fast and loose" rather than carefully (because, hey, you can just write another paper later if a better method comes along!)


Because medicine deals with people, you often can't randomly assign treatment in your studies (for ethical reasons), so much of your evidence comes in the form of observational data, which carries confounding of various kinds. Causal inference can help people make use of observational data properly. This is important -- most data we have is observational, and if you are not careful, you will get garbage out of it. For example, a fairly recent paper by Robins et al. essentially validated their way of adjusting for confounding in observational data by reproducing a result found in an RCT.
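The confounding problem described above can be made concrete with a small simulation. This is a hypothetical sketch, not anything from the comment: severity confounds both who gets treated and who dies, so the naive treated-vs-untreated comparison gets the sign of the effect wrong, while stratifying on the confounder (backdoor adjustment) recovers the true effect. All probabilities are made-up illustrative values.

```python
import random

random.seed(0)

# Simulated observational data: sicker patients (severe=True) are more
# likely to receive the drug AND more likely to die, so the naive
# comparison makes a genuinely helpful drug look harmful.
n = 100_000
data = []
for _ in range(n):
    severe = random.random() < 0.5
    treated = random.random() < (0.8 if severe else 0.2)  # doctors treat the sick
    # true effect: the drug reduces the death probability by 0.10
    p_death = (0.5 if severe else 0.1) - (0.1 if treated else 0.0)
    died = random.random() < p_death
    data.append((severe, treated, died))

def death_rate(rows):
    return sum(d for _, _, d in rows) / len(rows)

treated_rows = [r for r in data if r[1]]
control_rows = [r for r in data if not r[1]]
naive = death_rate(treated_rows) - death_rate(control_rows)

# Backdoor adjustment: compare within each severity stratum, then
# average the stratum-specific effects over the severity distribution.
adjusted = 0.0
for s in (True, False):
    stratum = [r for r in data if r[0] == s]
    t = [r for r in stratum if r[1]]
    c = [r for r in stratum if not r[1]]
    adjusted += (len(stratum) / n) * (death_rate(t) - death_rate(c))

print(f"naive effect:    {naive:+.3f}")     # positive: drug looks harmful
print(f"adjusted effect: {adjusted:+.3f}")  # near the true -0.10
```

With these parameters the naive contrast comes out around +0.14 (the drug "kills"), while the stratified estimate lands near the true -0.10, which is the kind of garbage-in/garbage-out gap the comment is warning about.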

There is room both for people to go to graduate school and work on new methods for properly dealing with observational data for drawing causal conclusions, and for popularizers.

Popularizing this stuff is a lot of what Pearl does, and it is also part of the purpose of Hernán and Robins's new book:

https://www.facebook.com/causalinference


Full disclosure: this is obviously my area, so of course I am going to say that. Don't take my word for it :).

Comment author: Kazuo_Thow 03 May 2015 09:31:58PM 2 points [-]

Here on Less Wrong there is a significant number of mathematically inclined software engineers who know some probability theory (meaning they've read or worked through at least one of Jaynes and Pearl) but may not have gone to graduate school. How could someone with this background contribute to making causal inference more accessible to researchers? Are there any tools that are particularly under-developed or missing?

Comment author: IlyaShpitser 04 May 2015 12:55:47PM *  3 points [-]

I am not sure what the most impactful thing to do is at each education level. Let me think about it.


My intuition is that the best thing for "raising the sanity waterline" is what the LW community would do with any other bias: preach association-is-not-causation to the masses who would otherwise read bad scientific reporting and conclude garbage about, e.g., nutrition. Scientists will generally not outright lie, but they are incentivized to overstate a bit, and reporters are incentivized to overstate a bit more. In general, we trust scientific output too much; so much of it is contingent on modeling assumptions.

Explaining good, clear examples of gotchas in observational data helps: e.g., doctors give sicker people a pill, so it can look like the pill is making people sick. It's the causality version of "you test positive for a rare cancer, so by Bayes' theorem you likely have a false positive." Unlike Bayes' theorem, this is the kind of thing people grasp immediately once you point it out, because our native causal processing is good, while our native probability processing is terrible. Association/causation is just another bias to be aware of; it just happens to come up a lot when we read the scientific literature.
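The rare-cancer gotcha mentioned above is a straightforward Bayes' theorem computation. The prevalence, sensitivity, and specificity below are assumed illustrative values for a textbook-style example, not figures from the comment.

```python
# Base-rate example (all numbers are illustrative assumptions):
# a rare disease with prevalence 1 in 10,000, tested with
# 99% sensitivity and a 1% false-positive rate.
prevalence = 1 / 10_000
sensitivity = 0.99
false_positive_rate = 0.01

# Bayes' theorem: P(disease | positive) = P(pos | disease) P(disease) / P(pos)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"{p_disease_given_positive:.3%}")  # under 1%: a positive is probably false
```

Despite the test being "99% accurate," a positive result still indicates less than a 1% chance of disease, because true cases are so rare that false positives dominate.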


If you are looking for some specific stuff to do as a programmer, email me :). There is plenty to do.

Comment author: adamzerner 03 May 2015 03:39:12AM 4 points [-]

Become a famous billionaire?

Comment author: DanArmak 04 May 2015 06:02:44PM *  1 point [-]

Or just a billionaire, or (in a pinch) just very famous.

Comment author: ChristianKl 02 May 2015 12:05:49AM 0 points [-]

"It seems like the barriers to innovation are insurmountable to everyone with the will to try"

Why?

When it comes to running studies, you could run a study on the effect of taking Vitamin D in the morning vs. in the evening. Various QS (Quantified Self) people have found an effect for themselves, so the effect should be there, but as far as I know there is no study establishing it. That doesn't seem difficult for someone without resources but with a decent amount of time on their hands.
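A minimal sketch of how such a study's data might be analyzed, assuming dosing time is randomized day by day and some daily outcome (here a hypothetical 1-10 sleep-quality score) is recorded: compare the group means and check significance with a permutation test. The scores below are simulated placeholders, not real measurements.

```python
import random
import statistics

random.seed(1)

# Hypothetical daily outcome scores, split by randomized dosing time.
morning = [7, 8, 6, 7, 8, 7, 9, 8, 7, 8]  # morning-dose days
evening = [6, 5, 6, 7, 5, 6, 6, 5, 7, 6]  # evening-dose days

observed = statistics.mean(morning) - statistics.mean(evening)

# Permutation test: shuffle the pooled scores and see how often a
# random split produces a mean difference as extreme as the observed one.
pooled = morning + evening
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:10]) - statistics.mean(pooled[10:])
    if abs(diff) >= abs(observed):
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed:.2f}, p = {p_value:.4f}")
```

A permutation test is a reasonable default here because it makes no distributional assumptions, which suits the small samples a self-experiment produces; a real study would also want pre-registration and enough days per condition to bound the false-positive risk.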

As far as diagnostic tools go, we now have a lot of smart scales and blood pressure measurement devices, but as far as I know nobody has produced a similarly smart peak flow meter (for FEV1). Such a device would allow a lot of people to gather more data, and more and better data means more opportunities for scientific insight.

On a more theoretical level, I think there could be advances in organizing the large pile of biological and medical knowledge we have. I don't think textbooks and journal articles are good media for transferring knowledge between scientists. Protein databases like UniProt contain a lot of information in a way that's easy to query. Finding ways to organize less structured biological insights, and the evidence for them, is an area with high potential impact.