
buybuydandavis comments on A probability question - Less Wrong Discussion

Post author: PhilGoetz, 19 October 2012 10:34PM




Comment author: buybuydandavis, 20 October 2012 02:11:11AM, 0 points

In mixture-of-experts problems, the experts are not independent; that's the whole problem. They are all trying to correlate with some underlying reality, and thereby become correlated with each other.

But you also say "dozens of different things". Are they trying to estimate the same thing, different things, or different things that should all correlate with the same thing?

See my longer comment above for more details, but it sounds like you don't want to evaluate over the whole data set; you just want to make some assumption about the statistics of your classifiers and combine them via maximum entropy given those statistics.
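To make that concrete, here is a minimal sketch of one way to "assume some statistics and combine". It assumes each expert's error is zero-mean with a known covariance matrix (the Gaussian, which is the maximum-entropy distribution for a fixed covariance), and takes the minimum-variance linear combination. The function name and the example covariance values are my own illustration, not anything from the thread; the point is just that correlated experts get discounted instead of being double-counted.

```python
import numpy as np

def combine_correlated_experts(estimates, cov):
    """Minimum-variance linear combination of correlated expert estimates.

    Assumes errors are zero-mean with known covariance `cov`.  The
    weights are proportional to the row sums of the inverse covariance,
    so a cluster of correlated experts shares one expert's worth of
    weight rather than counting repeatedly.
    """
    estimates = np.asarray(estimates, dtype=float)
    precision = np.linalg.inv(np.asarray(cov, dtype=float))
    ones = np.ones(len(estimates))
    total_precision = ones @ precision @ ones
    weights = precision @ ones / total_precision
    combined = weights @ estimates
    combined_var = 1.0 / total_precision
    return combined, combined_var, weights

# Independent experts with equal variance: reduces to the simple mean.
est, var, w = combine_correlated_experts([1.0, 2.0, 3.0], np.eye(3))

# Two highly correlated experts plus one independent expert: the
# correlated pair splits its weight, and the independent expert
# gets more weight than either member of the pair.
cov = np.array([[1.0, 0.9, 0.0],
                [0.9, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
est2, var2, w2 = combine_correlated_experts([1.0, 2.0, 3.0], cov)
```

Note how this captures the comment's point: because the first two experts track each other, the combined estimate treats them as roughly one source of evidence, and the combined variance is worse than it would be for three truly independent experts.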