Manfred comments on Logical uncertainty, kind of. A proposal, at least. - Less Wrong
I'll be specific. Shannon entropy represents ignorance. The bigger it is, the more ignorant you are.
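A minimal sketch of that claim: Shannon entropy H = -Σ p·log₂(p) is maximal for a uniform distribution (total ignorance about the outcome) and shrinks toward zero as the distribution concentrates on one outcome.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 4 outcomes: maximal ignorance, 2 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Near-certain outcome: entropy close to 0
print(shannon_entropy([0.99, 0.01]))
```

So "bigger entropy" literally means more bits you'd need, on average, to learn the answer.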
Well, that doesn't seem very useful, per Wei Dai's example...
Why not look at the methods people actually use to approximate and estimate mathematical quantities when they can't quite compute something? Applied mathematics is an enormous field.
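One standard example of the kind of method meant here (my illustration, not anything from the thread): Newton's method iteratively refines a guess at a root when no closed form is available, which is exactly "guessing math you can't quite compute".

```python
def newton(f, df, x0, tol=1e-10, max_iter=100):
    """Newton's method: repeat x <- x - f(x)/f'(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Approximate sqrt(2) as the positive root of x^2 - 2, starting from x0 = 1
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)
```

Each iterate is a provisional best guess that gets revised as cheap computation comes in, which is the flavor of reasoning-under-logical-uncertainty the comment is pointing at.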