Very interesting article! It was very surprising to me that the information entropy can be interpreted as the minimum average number of yes/no questions needed to fully determine the state of a system, especially since the value of the entropy depends on the units (i.e., which logarithm base you use). Is the use of base two related to the fact that the answers can only be 'yes' or 'no' (binary)?
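To make the unit point concrete, here is a quick sketch of my own (not from the article) computing a fair coin's entropy in two different bases; in base 2 it comes out to exactly one yes/no question:

```python
import math

def entropy(probs, base=2):
    # Shannon entropy; the base of the logarithm sets the unit.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

coin = [1/2, 1/2]
print(entropy(coin, base=2))       # 1.0 bit: one yes/no question
print(entropy(coin, base=math.e))  # ~0.693 nats (= ln 2): same uncertainty, different unit
```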

Another question. Suppose you have a system X with 4 possible states {X1,X2,X3,X4} with probabilities {3/8,2/8,2/8,1/8} respectively. The information entropy is about 1.91 bits. But I couldn't find any series of yes/no questions that would take 1.91 questions on average; an average of 2 questions is the best I could do. That is the same as for the uniform distribution {1/4,1/4,1/4,1/4}, even though in the present case we start out with more information about the outcome. Am I not looking hard enough?
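To sanity-check my arithmetic, here is a rough Python sketch of mine (not from the article) that computes the entropy and then runs Huffman's algorithm, which as far as I understand builds the optimal yes/no question tree, to get the best achievable average number of questions:

```python
import heapq
import math

probs = [3/8, 2/8, 2/8, 1/8]

# Entropy in bits: the theoretical lower bound on the average number of questions.
H = -sum(p * math.log2(p) for p in probs)

# Huffman's algorithm: repeatedly merge the two least likely groups.
# Each merge pushes every outcome in the merged group one question deeper,
# so summing the merged probabilities gives the average number of questions.
heap = list(probs)
heapq.heapify(heap)
avg_questions = 0.0
while len(heap) > 1:
    merged = heapq.heappop(heap) + heapq.heappop(heap)
    avg_questions += merged
    heapq.heappush(heap, merged)

print(f"entropy:            {H:.3f} bits")  # ~1.906
print(f"best avg questions: {avg_questions}")  # 2.0
```

This seems to confirm what I found by hand: 2 questions on average is the best possible here, even though the entropy is only about 1.906 bits.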

I'm really trying to get a grasp of the concept of information entropy, so I hope someone can answer my questions.

Great blog in any case. I'm glad I found it!