gwern comments on Estimating the kolmogorov complexity of the known laws of physics? - Less Wrong

Post author: Strilanc 08 July 2013 04:30AM


Comment author: Baughn 08 July 2013 12:23:09PM 0 points

It's the program length that matters, not its time or memory performance

That's a common assumption, but does this really make sense?

It seems to me that if you have a program doing two things - simulating two people, say - and it has twice the overhead for the second one, then the first should in some sense exist twice as much.

The same argument could be applied to universes.
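The distinction being debated - whether a hypothesis's weight should depend only on program length, or also on its runtime - can be sketched numerically. Everything below (the toy weights and the two hypothetical programs) is illustrative, not from the thread; the `speed_prior` formula is a rough simplification of Schmidhuber's Speed Prior, not its exact definition.

```python
def length_prior(length_bits: int) -> float:
    """Universal-prior-style weight: 2^-length, ignoring runtime entirely."""
    return 2.0 ** -length_bits

def speed_prior(length_bits: int, steps: int) -> float:
    """Speed-prior-style weight: also discounts slow programs.
    Roughly 2^-length * 1/steps, i.e. 2^-(length + log2(steps))."""
    return 2.0 ** -length_bits / steps

# Two hypothetical programs producing the same output:
# A is short but slow, B is longer but fast.
a = {"length": 10, "steps": 10**6}
b = {"length": 20, "steps": 10**2}

# Under the length-only prior, the shorter program A dominates B.
assert length_prior(a["length"]) > length_prior(b["length"])

# Under the speed-style prior, B's speed outweighs A's brevity here.
assert speed_prior(b["length"], b["steps"]) > speed_prior(a["length"], a["steps"])
```

The point of Baughn's objection is visible in the second comparison: once runtime carries weight, a simulation with more overhead contributes less measure, even if its description is shorter.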

Comment author: gwern 08 July 2013 03:54:54PM 0 points
Comment author: Baughn 09 July 2013 01:14:09AM 0 points

"Use of the Speed Prior has the disadvantage of leading to less optimal predictions"

Unless I'm misunderstanding something, doesn't this imply we've already figured out that that's not the true prior? Which would be very interesting indeed.

Comment author: Eugine_Nier 09 July 2013 04:02:29AM 1 point

Would someone mind explaining what "the true prior" is? Given that probability is in the mind, I don't see how the concept makes sense.

Comment author: Baughn 09 July 2013 09:33:16AM 0 points

I was going for "matches what the universe is actually doing", whether that means setting it equal to the apparent laws of physics or to something like a dovetailer.

Sure, there's no way of being certain we've figured out the correct rule; that doesn't mean there isn't one.

Comment author: Eugine_Nier 10 July 2013 02:23:54AM 0 points

In other words, it's the prior that assigns probability 1 to the actual universe and 0 to everything else.

Comment author: Baughn 10 July 2013 10:58:15AM 0 points

Sure. A little hard to determine, fair enough.

Comment author: JoshuaZ 09 July 2013 04:31:13AM 0 points

I'm confused also. I think they may mean something like "empirically not the optimal prior we can use with a small amount of computation", but that doesn't seem consistent with how it is being used.

Comment author: Eugine_Nier 09 July 2013 04:50:59AM 0 points

"empirically not the optimal prior we can use with a small amount of computation"

I'm not even sure that makes sense, since if this is based on empirical observations, presumably there was some prior prior that was updated based on those observations.

Comment author: JoshuaZ 09 July 2013 04:55:45AM 0 points

Well, they could be using a set of distinct priors (say 5 or 6 of them) and then noting over time which required less major updating in general, but I don't think that's what is going on either. We may need to just wait for Baughn to clarify what they meant.

Comment author: gwern 09 July 2013 02:01:00AM 0 points

has the disadvantage of leading to less optimal predictions

As far as I know, any computable prior will have that disadvantage relative to full uncomputable Solomonoff induction.