
passive_fist comments on [LINK] Will Eating Nuts Save Your Life? - Less Wrong Discussion

7 Post author: Vaniver 30 November 2013 03:13AM




Comment author: passive_fist 30 November 2013 07:23:58AM *  3 points [-]

Sure. In a lot of cases where there seems to be a cyclic dependency, it's actually acyclic when unrolled over time: X1 causes Y1, which causes Z1, which causes X2 (the subscript denotes the timestep).

The simplest model of this type is one where X1 directly causes X2, which directly causes X3, and so on. This is called a Markov chain, and it is a type of Bayesian network.
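As a toy illustration of the unrolling idea (the variable names and the helper function here are made up for this sketch, not anything standard), a three-variable cycle X → Y → Z → X becomes a plain acyclic chain once each variable is indexed by timestep:

```python
# Sketch: unroll a cyclic dependency X -> Y -> Z -> X into the acyclic
# chain X1 -> Y1 -> Z1 -> X2 -> Y2 -> Z2 (subscripts are timesteps).

def unroll(cycle, timesteps):
    """Return the edge list of the DAG obtained by unrolling a cycle
    of variables over the given number of timesteps."""
    nodes = [f"{v}{t}" for t in range(1, timesteps + 1) for v in cycle]
    return list(zip(nodes, nodes[1:]))

edges = unroll(["X", "Y", "Z"], 2)
# [('X1', 'Y1'), ('Y1', 'Z1'), ('Z1', 'X2'), ('X2', 'Y2'), ('Y2', 'Z2')]
```

Note that the edge ('Z1', 'X2') is exactly where the "loop" used to close; with time indices it points forward instead of back, so no cycle remains.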

Comment author: Kawoomba 30 November 2013 07:32:39AM 0 points [-]

it's actually an acyclic dependency when unrolled over time

Yes; by introducing time steps and indexing variables by "at time tn", you can "always" do away with loops, transforming them into spirals descending through time. But I wonder whether people should mind that the endless spiral still resembles a loop: "X at time t1" and "X at time t2" may technically be different nodes, but only in a trivial and uninteresting sense.

You can work DAG magic on such a "linearized" causal structure; I just wonder whether it's actually cutting reality at its joints, and whether there are (commonly used) alternatives.

Comment author: passive_fist 30 November 2013 07:44:44AM 4 points [-]

You should definitely look at the book "Probabilistic Graphical Models" by Daphne Koller; these concepts and more are explained there in depth, with numerous examples. I consider it something of a spiritual successor to Jaynes's work.

Having a DAG is important because it makes inference much easier. There are efficient, exact algorithms for inference over DAGs, whereas inference over general graphs can be extremely hard (often taking exponential time). Once you unroll a loop in this fashion, you eventually reach a point where you can assume that X1 has very little influence over, say, X100, due to mixing (http://en.wikipedia.org/wiki/Markov_chain#Ergodicity). Thus the network can be 'grounded' at X1, breaking the cycle.
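A quick sketch of that mixing effect (the two-state transition matrix below is an arbitrary example, not taken from anywhere in particular): start the chain in two completely different states, iterate, and watch the two distributions become indistinguishable, which is exactly why X1's influence on X100 is negligible.

```python
# Mixing demo: after many steps, the distribution of X_t barely depends
# on where the chain started, so the network can be "grounded" at X1.

T = [[0.9, 0.1],   # arbitrary 2-state transition matrix:
     [0.3, 0.7]]   # T[i][j] = P(next state = j | current state = i)

def step(p, T):
    """One step of the chain: push a distribution p through T."""
    return [sum(p[i] * T[i][j] for i in range(len(p)))
            for j in range(len(T[0]))]

p_a = [1.0, 0.0]   # chain started with certainty in state 0
p_b = [0.0, 1.0]   # chain started with certainty in state 1
for _ in range(100):
    p_a, p_b = step(p_a, T), step(p_b, T)

# Both p_a and p_b are now essentially the stationary distribution
# (0.75, 0.25): the starting state has been "forgotten".
```

The rate of forgetting is governed by the second eigenvalue of T (here 0.6), so by step 100 the difference between the two runs is astronomically small.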

If you can convert a cyclic dependency to a hundred acyclic dependencies, it makes sense to do so, and not just because of computational concerns.

Often real-world events really do unfold over time, and we need some way of modelling time dependencies. Hidden Markov models and the Viterbi algorithm are a good example of this in action.
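For illustration, here is a minimal Viterbi sketch over a toy two-state HMM; the states, observations, and probabilities are the standard textbook weather example, made up for demonstration rather than drawn from this discussion.

```python
# Viterbi: find the most likely hidden-state sequence given observations.

def viterbi(obs, states, start_p, trans_p, emit_p):
    # V[t][s] = (best probability of any state path ending in s at time t,
    #            the state at time t-1 on that best path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        V.append({
            s: max((V[-1][prev][0] * trans_p[prev][s] * emit_p[s][o], prev)
                   for prev in states)
            for s in states
        })
    # Backtrack from the most probable final state.
    prob, state = max((V[-1][s][0], s) for s in states)
    path = [state]
    for row in reversed(V[1:]):
        state = row[state][1]
        path.append(state)
    return prob, path[::-1]

states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}

prob, path = viterbi(['walk', 'shop', 'clean'], states,
                     start_p, trans_p, emit_p)
# path == ['Sunny', 'Rainy', 'Rainy']: the most likely weather sequence
# behind the observed activities.
```

The dynamic program only works because the unrolled model is a DAG: each timestep depends only on the previous one, so the maximization decomposes step by step.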

Comment author: Kawoomba 02 December 2013 10:06:15AM 1 point [-]

You should definitely look at the book "Probabilistic Graphical Models" by Daphne Koller

I will (... put it on my "do() at some indeterminable time in the near to far future"-list. Sigh.). Thanks.

Comment author: khafra 16 December 2013 04:45:38PM 1 point [-]

If you can wait, she may teach it again.