cousin_it comments on A note on the description complexity of physical theories - Less Wrong

19 Post author: cousin_it 09 November 2010 04:25PM


Comment author: cousin_it 11 November 2010 04:09:51AM *  2 points [-]

There's a useful heuristic to solve tricky questions about "truths" and "beliefs": reduce them to questions about decisions and utilities. For example, the Sleeping Beauty problem is very puzzling if you insist on thinking in terms of subjective probabilities, but becomes trivial once you introduce any payoff structure. Maybe we could apply this heuristic here? Believing in one formulation of a theory over a different equivalent formulation isn't likely to win a Bayesian reasoner many dollars, no matter what observations come in.
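A minimal sketch of that reduction, using a per-awakening bet as one hypothetical payoff structure: score Beauty once per awakening and the "thirder" betting odds fall out of the arithmetic; score her once per experiment and the coin is simply fair.

```python
from fractions import Fraction

def per_awakening_betting_odds():
    # Heads (prob 1/2): one awakening. Tails (prob 1/2): two awakenings.
    # If Beauty is paid per awakening, the fair price of a bet on tails is
    # the expected number of tails-awakenings over expected total awakenings.
    p = Fraction(1, 2)
    expected_tails_awakenings = p * 2
    expected_awakenings = p * 1 + p * 2
    return expected_tails_awakenings / expected_awakenings

def per_experiment_betting_odds():
    # Paid once per experiment regardless of awakenings: the coin is fair.
    return Fraction(1, 2)

print(per_awakening_betting_odds())   # 2/3
print(per_experiment_betting_odds())  # 1/2
```

Once a payoff structure is fixed, "what should Beauty's credence be?" dissolves into "which bets maximize her payout?", and each structure gives a unique answer.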

Comment author: Perplexed 11 November 2010 04:56:03AM 1 point [-]

Believing in one formulation of a theory over a different equivalent formulation isn't likely to win a Bayesian reasoner many dollars, no matter what observations come in.

Actually, it might help a reasoner saddled with bounded rationality. One theory might require less computation to get from theory to prediction, or it might require less memory to store. Having a fast, easy-to-use theory can be like money in the bank to someone who needs lots and lots of predictions.

It might be interesting to look at that idea someone here was talking about, which merged Zadeh's fuzzy logic with Bayesianism. Instead of simple Bayesian probabilities which can be updated instantaneously, we may need to think of fuzzy probabilities which grow sharper as we devote cognitive resources to refining them. But with a good, simple theory we can get a sharper picture quicker.

Comment author: cousin_it 11 November 2010 05:03:36AM *  0 points [-]

I don't understand your point about bounded rationality. If you know theory X is equivalent to theory Y, you can believe in X more, but use Y for calculations.

Comment author: Jack 11 November 2010 05:18:34AM 0 points [-]

That's the definition of a free-floating belief, isn't it? If you only have so many computational resources, even storing theory X in your memory is a waste of space.

Comment author: shokwave 11 November 2010 06:33:55AM *  0 points [-]

I think cousin_it's point was that if you have a preference both for quickly solving problems and for knowing the true nature of things, then it might be positive utility, even under bounded rationality, to keep both theory X and theory Y: theory X tells you the true nature of things, while theory Y is a hack-job approximation that nevertheless gives you the answer you need much faster (in computer terms, say, a full simulation of the actual event vs. a Monte Carlo run with the probabilities just plugged in).

edit: the assumption is that we have at least mild preferences for both, and that the bounds on our rationality are sufficiently high that this is the preferred option for most of science.

Comment author: Jack 11 November 2010 06:58:42AM *  0 points [-]

It's one thing if you want to calculate with a simpler theory because you don't need perfect accuracy. Newton is good enough for a large fraction of physics calculations, so even though it is strictly wrong, I imagine most reasoners would need to keep it handy because it is simpler. But if you have two empirically equivalent and complete theories X and Y, and X is computationally simpler, so you rely on X for calculating predictions, it seems to me you believe X. What would saying "No, actually I believe in Y, not X" even mean in this context? The statement is unconnected to anticipated experience and to any conceivable payoff structure.

Better yet, taboo "belief". Say you are an agent with a program that allows you to calculate, based on your observations, what your observations will be in the future contingent on various actions. You have another program that ranks those futures according to a utility function. What would it mean to add "belief" to this picture?
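That tabooed agent can be sketched directly (the names and the toy world are hypothetical illustrations, not anything from the thread): the agent is just a prediction program plus a utility program, and "belief" never appears as a separate ingredient.

```python
def choose_action(actions, predict, utility):
    # The agent's entire epistemic state lives inside `predict`;
    # there is no separate "belief" slot to fill in.
    return max(actions, key=lambda a: utility(predict(a)))

# Toy world: each action deterministically leads to one future observation.
predict = {"carry umbrella": "dry", "leave it": "soaked"}.get
utility = {"dry": 1, "soaked": -1}.get

best = choose_action(["carry umbrella", "leave it"], predict, utility)
print(best)  # carry umbrella
```

Anything we might want to call a belief is already encoded in how `predict` maps actions to futures; adding a separate "belief" variable would change nothing the agent does.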

Comment author: cousin_it 11 November 2010 01:44:59PM *  1 point [-]

Your first paragraph looks misguided to me: does it imply we should "believe" matrix multiplication is defined by the naive algorithm for small n, and the Strassen and Coppersmith-Winograd algorithms for larger values of n? Your second paragraph, on the other hand, makes exactly the point I was trying to make in the original post: we can assign degrees of belief to equivalence classes of theories that give the same observable predictions.
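The matrix-multiplication analogy can be made concrete. A sketch (the cutoff and size threshold are illustrative only; real crossover points are far larger): the naive algorithm is the definition one might "believe", while Strassen's algorithm is an equivalent formulation one computes with when the size makes it worthwhile, and the dispatch never changes the answer.

```python
def naive_mul(A, B):
    # Matrix product straight from the definition: C[i][j] = sum_k A[i][k]*B[k][j].
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def _add(A, B): return [[x + y for x, y in zip(r, s)] for r, s in zip(A, B)]
def _sub(A, B): return [[x - y for x, y in zip(r, s)] for r, s in zip(A, B)]

def strassen(A, B, cutoff=2):
    # Strassen's algorithm for n-by-n matrices, n a power of two:
    # 7 recursive multiplications instead of 8.
    n = len(A)
    if n <= cutoff:
        return naive_mul(A, B)
    h = n // 2
    A11 = [r[:h] for r in A[:h]]; A12 = [r[h:] for r in A[:h]]
    A21 = [r[:h] for r in A[h:]]; A22 = [r[h:] for r in A[h:]]
    B11 = [r[:h] for r in B[:h]]; B12 = [r[h:] for r in B[:h]]
    B21 = [r[:h] for r in B[h:]]; B22 = [r[h:] for r in B[h:]]
    M1 = strassen(_add(A11, A22), _add(B11, B22), cutoff)
    M2 = strassen(_add(A21, A22), B11, cutoff)
    M3 = strassen(A11, _sub(B12, B22), cutoff)
    M4 = strassen(A22, _sub(B21, B11), cutoff)
    M5 = strassen(_add(A11, A12), B22, cutoff)
    M6 = strassen(_sub(A21, A11), _add(B11, B12), cutoff)
    M7 = strassen(_sub(A12, A22), _add(B21, B22), cutoff)
    C11 = _add(_sub(_add(M1, M4), M5), M7)
    C12 = _add(M3, M5)
    C21 = _add(M2, M4)
    C22 = _add(_sub(_add(M1, M3), M2), M6)
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])

def mat_mul(A, B):
    # Dispatch on size: the outputs agree exactly, so which algorithm we
    # "believe in" never shows up in any observable prediction.
    n = len(A)
    if n >= 4 and n & (n - 1) == 0:  # power of two, big enough to bother
        return strassen(A, B)
    return naive_mul(A, B)
```

For any input, `mat_mul` and `naive_mul` return identical matrices; only the cost of getting there differs — which is exactly why the choice carries no degrees of belief.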

Comment author: ata 11 November 2010 04:25:51AM *  1 point [-]

For example, the Sleeping Beauty problem is very puzzling if you insist on thinking in terms of subjective probabilities, but becomes trivial once you introduce any payoff structure.

Heh, I was just working on a post on that point.

Believing in one formulation of a theory over a different equivalent formulation isn't likely to win a Bayesian reasoner many dollars, no matter what observations come in. Therefore the reasoner should assign degrees of belief to equivalence classes of theories rather than individual theories.

I agree that that is true about equivalent formulations, literally isomorphic theories (as in this comment), but is that really the case about MWI vs. Copenhagen? Collapse is claimed as something that's actually happening out there in reality, not just as another way of looking at the same thing. Doesn't it have to be evaluated as a hypothesis on its own, such that the conjunction (MWI & Collapse) is necessarily less probable than just MWI?

Comment author: Jack 11 November 2010 04:32:45AM *  0 points [-]

Except the whole quantum suicide thing does create payoff structures. In determining whether or not to play a game of Quantum Russian Roulette, you take your estimated winnings if MWI and quantum immortality are true and your estimated winnings if they are false, and weigh them according to the probability you assign each theory.

(ETA: But this seems to be a quirky feature of QM interpretation, not a feature of empirically equivalent theories generally.)

(ETA 2: And it is a quirky feature of QM interpretation because MWI + quantum immortality is empirically equivalent to single-world theories in a really quirky way.)
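Jack's weighing can be written out explicitly (every number here is a hypothetical stand-in, not anything from the thread): estimate the payoff under each theory, then average by the credence assigned to each.

```python
def expected_value(credence_qi, payoff_if_qi, payoff_if_not):
    # Weigh each theory's estimated winnings by the probability you assign it.
    return credence_qi * payoff_if_qi + (1 - credence_qi) * payoff_if_not

def should_play(credence_qi, prize=1_000_000, death_utility=-50_000_000,
                p_survive=5 / 6):
    # Under MWI + quantum immortality you only experience surviving
    # branches, so (subjectively) you always collect the prize.
    payoff_if_qi = prize
    # Under a single-world theory, the fatal chamber fires 1 time in 6.
    payoff_if_not = p_survive * prize + (1 - p_survive) * death_utility
    return expected_value(credence_qi, payoff_if_qi, payoff_if_not) > 0

print(should_play(0.99))  # True
print(should_play(0.10))  # False
```

This is the sense in which the interpretations stop being "empirically equivalent": with these stand-in numbers, the decision flips with the credence, so the payoff structure really does pay to distinguish them.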

Comment author: cousin_it 11 November 2010 04:52:13AM *  1 point [-]

IMO quantum suicide/immortality is so mysterious that it can't support any definite conclusions about the topic we're discussing. I'm beginning to view it as a sort of thread-killer, like "consciousness". See a comment that mentions QI, collapse the whole thread because you know it's not gonna make you happier.

Comment author: Jack 11 November 2010 05:04:47AM 0 points [-]

I agree that neither we nor anyone else does a good job discussing it. It seems like a pretty important issue, though.