Navid Hakimi


> If we chose some other, more informative distribution, then we’d have [equation]. The average heat emitted given this is then [equation], which straightforwardly represents how much useful information the system stores, in the manner usually meant in information theory.

What a great and insightful article. Thank you!

I always had a question about what the heat loss (entropy increase) would be in a situation where you have some prior knowledge and just update the distribution, as in a Bayesian setting. These lines clarify the question, I think: the heat loss can be less than one bit when you are updating a prior belief.
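To make that concrete, here is a minimal Python sketch (my own illustration, not from the article) of the Landauer bound of k_B T ln 2 per bit of entropy: with an informative prior over the bit's state, the minimum average heat drops below one bit's worth. The example distributions are made up.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

def entropy_bits(p):
    """Shannon entropy of a distribution, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def landauer_heat(p):
    """Landauer lower bound on the average heat (J) to erase a
    variable distributed as p: k_B * T * ln(2) * H(p)."""
    return k_B * T * math.log(2) * entropy_bits(p)

uniform = [0.5, 0.5]    # no prior knowledge: exactly one bit
informed = [0.9, 0.1]   # informative prior: H(p) ~ 0.47 bits

print(entropy_bits(uniform), landauer_heat(uniform))
print(entropy_bits(informed), landauer_heat(informed))
```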

Can one add that if you are switching the wells (i.e., updating a very bad prior belief), you are spending more energy? Instead of the entropy, some kind of information distance, such as the KL divergence, would determine the cost in that case.
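If that intuition is right (this is my own sketch, and the k_B T ln 2 · D_KL(p‖q) form for the extra dissipation is an assumption on my part, not something stated in the article), the mismatch cost of operating with a wrong reference distribution q while the bit is really distributed as p might look like:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

def kl_bits(p, q):
    """KL divergence D(p || q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

true_dist = [0.9, 0.1]  # how the bit is actually distributed
bad_prior = [0.1, 0.9]  # "switched wells": a very wrong prior

# Hypothetical extra heat on top of the entropy term:
# k_B * T * ln(2) * D_KL(true_dist || bad_prior)
extra_heat = k_B * T * math.log(2) * kl_bits(true_dist, bad_prior)
print(kl_bits(true_dist, bad_prior), extra_heat)  # ~2.54 extra bits' worth
```

A nearly opposite prior makes the KL term large, which matches the intuition that undoing a very bad prior belief costs the most.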