Sewing-Machine comments on The prior of a hypothesis does not depend on its complexity - Less Wrong

Post author: cousin_it 26 August 2010 01:20PM




Comment author: [deleted] 26 August 2010 04:11:08PM *  0 points

To be fair, would you agree that this is a big weakness of probability theory? It can be really difficult to sort out some of those Boolean combinations.

Edit: Why the downvote?

Comment author: cousin_it 26 August 2010 04:24:24PM *  0 points

I'm pretty sure it doesn't break probability theory per se, only limits its applicability. For example, Wei Dai wants UDT to use a "mathematical intuition module" that assigns degrees of belief to math statements, while I'm skeptical of the idea.

Comment author: [deleted] 26 August 2010 04:42:27PM *  0 points

I argued here that this means an AI can't compute its own prior. That argument covers all mathematical statements. If we restrict attention to Boolean combinations of events on a finite sample space, then I think a similar argument shows that an AI can't compute its own prior in polynomial time unless P = NP.

Edit: Wait, that last bit might be boneheaded. For given inputs we can evaluate Boolean expressions efficiently; I think we just can't efficiently determine whether they're tautologies.
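The distinction in that edit can be sketched in a few lines of Python. The formula `f` below is an arbitrary illustrative example, not anything from the discussion: evaluating it at one assignment is cheap, while the naive tautology check enumerates all 2^n assignments (and tautology-checking is coNP-complete, so no known algorithm does essentially better in general).

```python
from itertools import product

# Hypothetical Boolean combination of three events:
# f(x) = (x0 OR x1) AND (NOT x0 OR x2)
def f(x):
    return (x[0] or x[1]) and (not x[0] or x[2])

# Evaluating f at a given point: time linear in the formula size.
value_at_point = f((True, False, True))  # True

# Deciding whether f holds for ALL inputs: brute force over 2^n assignments.
def is_tautology(formula, n):
    return all(formula(x) for x in product([False, True], repeat=n))

print(value_at_point)        # evaluation is easy
print(is_tautology(f, 3))    # False: f(False, False, False) fails
```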

Comment author: cousin_it 26 August 2010 05:02:44PM 3 points

That depends on what statements are "expressible" by the AI. If it can use quantification ("this Boolean formula is true for some/all inputs"), computing the prior becomes NP-hard. An even trickier case: suppose you program the AI with ZFC and ask it for its prior on the continuum hypothesis. On the other hand, you could use clever programming to avoid evaluating prior values that are difficult or impossible to evaluate, as Monte Carlo AIXI does.
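The quantification point can be made concrete with a short sketch (the formula `g` is an invented example). Asking whether a formula is true "for some inputs" is exactly Boolean satisfiability, the canonical NP-complete problem, so a prior that must assign probabilities to such quantified statements inherits at least that hardness:

```python
from itertools import product

# Hypothetical formula: g(x) = (x0 OR x1) AND (NOT x1 OR x2) AND NOT x0
def g(x):
    return (x[0] or x[1]) and (not x[1] or x[2]) and not x[0]

# "True for some inputs" = SAT; brute force takes up to 2^n evaluations.
def exists_satisfying(formula, n):
    return any(formula(x) for x in product([False, True], repeat=n))

print(exists_satisfying(g, 3))  # True: x = (False, True, True) works
```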

Comment author: [deleted] 26 August 2010 05:07:38PM 0 points

Far out.