infotropism comments on The ideas you're not ready to post - Less Wrong

24 Post author: JulianMorrison 19 April 2009 09:23PM


Comments (253)


Comment author: infotropism 19 April 2009 11:44:54PM *  0 points [-]

I have an idea I need to build up about simplicity: how to build your mind and beliefs up incrementally, layer by layer; how perfection is achieved not when there is nothing left to add, but when there is nothing left to remove; how simple-minded people are sometimes the ones to declare simple, true ideas that others lost sight of, people who are too clever and sophisticated, whose knowledge is like a house of cards, or a bag of knots; genius, learning, growing up, creativity correlated with age, zen. But I really need to do a lot more searching before I can put something together.

Edit: and if I post that here, it's because if someone else wants to dig into that idea and work on it with me, it would be a pleasure.

Comment author: ciphergoth 20 April 2009 07:38:05AM 0 points [-]

Do you understand Solomonoff's Universal Prior?

Comment author: infotropism 20 April 2009 10:02:07AM *  0 points [-]

Not the mathematical proof.

But the idea that if you don't yet have data bound to observation, then you assign the prior probability of a hypothesis by looking at its complexity.

Complexity, defined by looking for the smallest compressed bitstring program that, when run on a universal Turing machine, generates the hypothesis as its output (and that is the reason it's intractable unless you have infinite computational resources, yes?).

The longer the bitstring, the less likely the hypothesis (and this has to do with the idea that you can make more permutations on larger bitstrings: a one-bit string can be in 2 states, a two-bit string in 2^2 = 4 states, a three-bit string in 2^3 = 8 states, and so on).

Then you somehow sum the probabilities over all (Turing machine, program) pairs into one overall probability?

(I'd love to understand that formally)
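The weighting described above can be sketched in a few lines of Python. This is only a toy illustration under loud assumptions: `toy_interpreter` is a made-up stand-in for a universal Turing machine (it just collapses runs of bits), and truncating enumeration at `max_len` sidesteps the incomputability of the real prior. The shape of the idea is the sum of 2^-length over all programs producing a given output:

```python
from collections import defaultdict
from itertools import product

def toy_interpreter(program):
    # Hypothetical stand-in for a universal Turing machine:
    # deterministically maps a program (a tuple of bits) to an
    # "output" by collapsing consecutive repeated bits.
    out = []
    for b in program:
        if not out or out[-1] != b:
            out.append(b)
    return tuple(out)

def solomonoff_weights(max_len):
    # For each output x, sum 2^(-len(p)) over every program p
    # with toy_interpreter(p) == x. Truncated at max_len, so
    # this only approximates the (uncomputable) full sum.
    weights = defaultdict(float)
    for n in range(1, max_len + 1):
        for program in product((0, 1), repeat=n):
            weights[toy_interpreter(program)] += 2.0 ** (-n)
    return dict(weights)

w = solomonoff_weights(6)
# Outputs reachable by short programs accumulate more weight:
assert w[(0,)] > w[(0, 1, 0, 1)]
```

Note this is a weighted sum, not an average: every program contributes its 2^-length mass, which is why simpler (more compressible) outputs end up more probable.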

Comment author: PhilGoetz 20 April 2009 01:57:19AM *  0 points [-]

I'm skeptical of the concept as presented here. Anything with the phrase "how perfection is achieved" sets up a strong prior in my mind saying it is completely off-base.

More generally, in evolution and ecosystems I see that simplicity is good temporarily, as long as you retain the ability to experiment with complexity. Bacteria rapidly simplify themselves to adapt to current conditions, but they also experiment a lot and rapidly acquire complexity when environmental conditions change. When conditions stabilize, they then gradually throw off the acquired complexity until they reach another temporary simple state.

Comment author: infotropism 20 April 2009 02:23:09AM *  0 points [-]

So maybe, to rephrase the idea, we want to strive to achieve something as close as we can to perfection, or optimality?

If we do, we may then start laying the groundwork, as well as collecting practical advice and general methods on how to do that. Not a step-by-step absolute guide to perfection, but rather the first draft of an idea that would help in aiming toward optimality.

Edit: also, that's a Saint-Exupéry quote that illustrates the idea; I wouldn't mean it that literally, not as more than a general guideline.

Comment author: JulianMorrison 20 April 2009 02:39:34AM 0 points [-]

The Occam ideal is "simplest fully explanatory theory". The reality is that there never has been one. They're either broken in "the sixth decimal place", like Newtonian physics, or they're missing bits, like quantum gravity, or they're short of evidence, like string theory.