private_messaging comments on Logical uncertainty, kind of. A proposal, at least. - Less Wrong

8 Post author: Manfred 13 January 2013 09:26AM




Comment author: private_messaging 17 January 2013 12:09:10PM 1 point

I thought of it some more... Don't think of probability as a way to represent ignorance. Probability represents a kind of knowledge. For example, when you say "the probability that the coin fell heads is 1/2", that represents a sort of model of what bounces do to the coin's orientation. When you multiply probabilities for a pair of coin throws, that represents the independence of the throws. The 1/2 for Q(i) is not representative of that, and even if it were representative of something (statistics on the Q), the product of these numbers is not a probability but a lower bound on the probability under an assumption of independence; if the events are not independent, the probability is still 1/2.
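A toy numerical illustration of that last point (my own example, not from the thread): multiplying per-event numbers gives the joint probability only when the events are independent. For two perfectly correlated "throws" of one coin, the probability of two heads stays 1/2, not 1/4.

```python
from itertools import product

# Independent fair throws: enumerate the four equally likely outcomes.
outcomes = list(product("HT", repeat=2))
p_both_heads_indep = sum(1 for o in outcomes if o == ("H", "H")) / len(outcomes)

# Perfectly correlated "throws" (one coin reported twice): two outcomes.
correlated = [("H", "H"), ("T", "T")]
p_both_heads_corr = sum(1 for o in correlated if o == ("H", "H")) / len(correlated)

print(p_both_heads_indep)  # 0.25 -- the product 1/2 * 1/2
print(p_both_heads_corr)   # 0.5  -- here the product 1/4 is only a lower bound
```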

Probabilistic methods certainly are very useful in time-bounded calculations - they are used very heavily in computer graphics and in simulations of all kinds - but that is usually through the use of a random number source, or a sufficiently good random number generator, as in probabilistic primality tests such as Miller-Rabin. The time-bounded bot, far from evaluating and updating "probabilities", would have to do symbolic mathematics, searching for an imperfect solution, and make use of random number sources wherever appropriate, striving for bounds on the probability of being wrong. (E.g., when I implemented Miller-Rabin, I just made it run enough rounds that it wouldn't fail before the heat death of the universe, assuming my PRNG wouldn't somehow return a huge string of bad witnesses, which would be very interesting if it happened.)
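To make the error-bound point concrete, here is a minimal Miller-Rabin sketch in Python (not the commenter's implementation; the function name, round count, and small-prime shortcut are my own choices). A composite n survives a single round with a random witness with probability at most 1/4, so `rounds` independent witnesses bound the error by 4**-rounds:

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test.

    With rounds=40 the chance of declaring a composite prime is at
    most 4**-40, assuming the random witnesses behave like fair draws.
    """
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite: certain, not probable
    return True  # n is prime with error probability at most 4**-rounds
```

Note the asymmetry: a "composite" answer is a proof, while a "prime" answer only carries a probability bound, which is exactly the "bounds on the probability of being wrong" the comment describes.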

Comment author: Manfred 17 January 2013 02:53:08PM 1 point

I thought of it some more... Don't think of probability as a way to represent ignorance. Probability represents a kind of knowledge.

I'll be specific. Shannon entropy represents ignorance. The bigger it is, the more ignorant you are.
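As an illustration of that claim (my own numbers, not from the thread): Shannon entropy H = -Σ p·log2(p) is largest for the uniform distribution and zero when the outcome is certain.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0   -- fair coin: maximal ignorance
print(shannon_entropy([0.9, 0.1]))  # ~0.47 -- biased coin: less ignorance
print(shannon_entropy([1.0]))       # 0.0   -- certain outcome: no ignorance
```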

Comment author: private_messaging 17 January 2013 04:15:05PM 0 points

Well, that doesn't seem very useful, per Wei Dai's example...

Why not look at the methods people actually employ to approximate and guess at math when they can't quite compute something? Applied mathematics is an enormous field.