alex_zag_al comments on Putting in the Numbers - Less Wrong

8 Post author: Manfred 30 January 2014 06:41AM




Comment author: Manfred 31 January 2014 05:04:31AM 0 points

Well, you can still define information entropy for probability density functions - though I suppose if we ignore Jaynes we can probably get paradoxes if we try. In fact, I'm pretty sure just integrating -p*log(p) is right. There's also a problem if you want to have a maxent prior over the integers or over the real numbers; that takes us into the realm of improper priors.

I don't know as much as I should about this topic, so you may have to illustrate using an example before I figure out what you mean.

Comment author: alex_zag_al 02 February 2014 04:25:48AM 0 points

According to Jaynes, it's actually not - I don't have the page number on me, unfortunately. But the way he does it is by discretizing the space of possibilities and taking the limit as the number of discrete possibilities n goes to infinity. It's not the limit of the entropy H, since that goes to infinity; it's the limit of H - log(n). That limit works out to -∫p(x) log(p(x)/m(x)) dx, where m(x) is the limiting density of the discretization points, so it turns out to be a little different from just integrating -p*log(p).
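Jaynes's construction is easy to check numerically. Below is a small sketch (my own illustration, not from the thread): discretize a standard Gaussian on a uniform grid over [-L, L], compute the discrete entropy H, and watch H - log(n) converge. With a uniform grid the limiting density of points is m(x) = 1/(2L), so the limit is the differential entropy of the Gaussian minus log(2L) - not the bare -∫p*log(p). The choices L = 8 and the grid sizes are arbitrary.

```python
import numpy as np

L = 8.0  # truncation range; the Gaussian tail beyond |x| > 8 is negligible
h_true = 0.5 * np.log(2 * np.pi * np.e)  # differential entropy of N(0, 1), in nats

for n in [100, 1000, 10000]:
    x = np.linspace(-L, L, n)        # uniform discretization of [-L, L]
    w = np.exp(-x**2 / 2)            # unnormalized Gaussian weights
    p = w / w.sum()                  # discrete probabilities p_i
    H = -np.sum(p * np.log(p))       # discrete entropy H (grows like log n)
    # Jaynes's limit: H - log(n) -> h_true - log(2L), since m(x) = 1/(2L)
    print(n, H - np.log(n), h_true - np.log(2 * L))
```

As n grows, H itself diverges, but H - log(n) settles down to h_true - log(2L); the log(2L) correction is exactly the -∫p log m term from the limiting density of points.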