Lumifer comments on The Power of Noise - LessWrong

Post author: jsteinhardt 16 June 2014 05:26PM


Comment author: PhilGoetz 29 October 2015 06:38:57PM

JWW suggests that an AI could partition trial subjects into control and experimental groups such that the expected number of events in both was equal, and presumably also such that cases involving assumptions were distributed equally, to minimize the impact of those assumptions. For instance, an AI doing a study of responses to an artificial sweetener could do some calculations to estimate the impact of each gene on sugar metabolism, then partition subjects so as to balance their allele frequencies for those genes.
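
A minimal sketch of what that balancing could look like, assuming subjects are represented by 0/1 allele indicators for the genes in question. Everything here (the rerandomization approach, the names, the imbalance measure) is my own illustration, not something from JWW's comment:

```python
import numpy as np

def balanced_partition(alleles, n_tries=10_000, seed=0):
    """Assign subjects to experimental/control groups so as to minimize
    allele-frequency imbalance across the candidate genes.

    alleles: (n_subjects, n_genes) array of 0/1 allele indicators.
    Returns a boolean mask: True = experimental group.
    """
    rng = np.random.default_rng(seed)
    n = alleles.shape[0]
    best_mask, best_imbalance = None, np.inf
    for _ in range(n_tries):
        # Draw a random half/half split.
        perm = rng.permutation(n)
        mask = np.zeros(n, dtype=bool)
        mask[perm[: n // 2]] = True
        # Imbalance: summed absolute difference in allele frequency
        # between the two groups, over all genes.
        diff = np.abs(alleles[mask].mean(axis=0) - alleles[~mask].mean(axis=0))
        if diff.sum() < best_imbalance:
            best_mask, best_imbalance = mask, diff.sum()
    return best_mask
```

Weighting each gene by its estimated impact on sugar metabolism, rather than treating all genes equally, would get closer to what JWW describes.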

(A more extreme interpretation would be that the AI is partitioning subjects and performing the experiment not in a way designed to test a single hypothesis, but to maximize total information extracted from the experiment. This would be optimal, but a radical departure from how we do science. Actually, now that I think of it, I wrote a grant proposal suggesting this 7 years ago. My idea was that molecular biology must now be done by interposing a layer of abstraction, via computational intelligence, between the scientist and the data, so that the scientist frames hypotheses not about individual genes or proteins, but about causes, effects, or systems. It was not well received.)
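
For concreteness, "maximize total information extracted" is usually formalized as maximizing expected information gain over candidate designs. Here is a toy sketch of that calculation, with a discrete parameter and made-up likelihood tables; this is the textbook Bayesian-design computation, not anything specific from the thread:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def expected_information_gain(prior, likelihood):
    """EIG = H(prior) - E_y[H(posterior | y)].

    prior: (n_theta,) probabilities over parameter values.
    likelihood: (n_theta, n_outcomes) rows p(y | theta) for one design.
    Assumes every outcome has nonzero marginal probability under the prior.
    """
    p_y = prior @ likelihood              # marginal probability of each outcome
    joint = prior[:, None] * likelihood   # p(theta, y)
    posteriors = joint / p_y              # each column is p(theta | y)
    expected_posterior_entropy = sum(
        p_y[y] * entropy(posteriors[:, y]) for y in range(len(p_y)))
    return entropy(prior) - expected_posterior_entropy

# Toy usage: two candidate designs probing a binary parameter;
# pick whichever is expected to shrink our uncertainty the most.
prior = np.array([0.5, 0.5])
design_a = np.array([[0.9, 0.1],   # p(y | theta=0)
                     [0.2, 0.8]])  # p(y | theta=1)
design_b = np.array([[0.6, 0.4],
                     [0.5, 0.5]])
best = max([design_a, design_b],
           key=lambda L: expected_information_gain(prior, L))
```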

There's another comment somewhere countering this idea by noting that it almost requires omniscience: the method one uses to balance out one bias may introduce another.

Comment author: Lumifer 29 October 2015 06:53:38PM

"performing the experiment not in a way designed to test a single hypothesis, but to maximize total information extracted from the experiment"

Designing experiments to get more information than just evidence for a single hypothesis is old hat.