This seems like a dead thread, but I'll chance it anyway.

Eliezer, there's something off about your calculation of the expected score:

The expected score should go up the more certain I am of something, right?

But in fact the expected score is highest when I'm most uncertain: if I believe with equal probability that snow is white and that it is not white, the expected score is 0.5·log2(0.5) + 0.5·log2(0.5) = 0.5·(-1) + 0.5·(-1) = -1. This is the highest possible expected score.
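To spell out what I'm computing, here is a minimal sketch assuming the base-2 logarithmic score from the post (the name `expected_score` is just my own label for it):

```python
import math

def expected_score(p_white: float) -> float:
    # Base-2 log score: log2 of the probability assigned to the actual outcome,
    # averaged over the two outcomes using those same probabilities.
    p_not_white = 1.0 - p_white
    return p_white * math.log2(p_white) + p_not_white * math.log2(p_not_white)

# Assigning 0.5 to "snow is white" and 0.5 to "snow is not white":
print(expected_score(0.5))  # 0.5*(-1) + 0.5*(-1) = -1.0
```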

In any other case, the expected score will be lower, as you calculate for the 70/30 case.

It seems like what you should be trying to do is minimize your expected score but maximize your actual score. That seems weird.