
passive_fist comments on [LINK] Will Eating Nuts Save Your Life?

Post author: Vaniver 30 November 2013 03:13AM


Comment author: passive_fist 30 November 2013 07:44:44AM 4 points

You should definitely look at the book "Probabilistic Graphical Models" by Daphne Koller; these concepts and more are explained there in depth, with numerous examples. I kind of consider it the spiritual successor to Jaynes's work.

Having a DAG is important because it makes inference much easier. There are very efficient, exact algorithms for inference over DAGs, whereas inference over general graphs can be extremely hard (often taking exponential time). Once you unroll a loop in this fashion, you eventually reach a point where you can assume that X1 has very little influence over, say, X100, due to mixing: http://en.wikipedia.org/wiki/Markov_chain#Ergodicity Thus the network can be 'grounded' at X1, breaking the cycle.
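A minimal sketch of the mixing point, assuming numpy and a made-up two-state transition matrix T (the numbers and the 99-step horizon are arbitrary, chosen only for illustration): two maximally different beliefs about X1 are propagated forward, and by X100 they are numerically indistinguishable.

    import numpy as np

    # Hypothetical 2-state Markov chain: rows are the current state,
    # columns are the next state.
    T = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

    # Two opposite certainties about X1.
    p_a = np.array([1.0, 0.0])
    p_b = np.array([0.0, 1.0])

    # Propagate each belief forward 99 steps to get the marginal of X100.
    for _ in range(99):
        p_a = p_a @ T
        p_b = p_b @ T

    print(p_a)  # both converge to the stationary distribution (2/3, 1/3)...
    print(p_b)  # ...so X1 carries essentially no information about X100

The second eigenvalue of T is 0.7, so the influence of X1 decays like 0.7^n; after 99 steps it is below machine precision, which is exactly the license for grounding the network there.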

If you can convert a cyclic dependency into a hundred acyclic dependencies, it makes sense to do so, and not just because of computational concerns.

Often real-world processes really do unroll over time, and we need some way of modelling time dependencies. Hidden Markov models and the Viterbi algorithm are a good example of this in action.
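For concreteness, here is a rough Viterbi sketch over a toy weather HMM. The states, observations, and all probabilities are made up for illustration (they follow the standard rainy/sunny textbook example); the viterbi function is just the usual dynamic-programming recurrence in log space, not any particular library's API.

    import numpy as np

    def viterbi(obs, start_p, trans_p, emit_p):
        """Most likely hidden-state path for an observation sequence."""
        n_states = len(start_p)
        # delta[t, s]: log-prob of the best path ending in state s at time t
        delta = np.full((len(obs), n_states), -np.inf)
        back = np.zeros((len(obs), n_states), dtype=int)
        delta[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
        for t in range(1, len(obs)):
            for s in range(n_states):
                scores = delta[t - 1] + np.log(trans_p[:, s])
                back[t, s] = np.argmax(scores)
                delta[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
        # Trace the best path backwards from the most probable final state.
        path = [int(np.argmax(delta[-1]))]
        for t in range(len(obs) - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    # Hypothetical HMM: hidden states {0: rainy, 1: sunny},
    # observations {0: walk, 1: shop, 2: clean}.
    start = np.array([0.6, 0.4])
    trans = np.array([[0.7, 0.3],
                      [0.4, 0.6]])
    emit = np.array([[0.1, 0.4, 0.5],
                     [0.6, 0.3, 0.1]])
    print(viterbi([0, 1, 2], start, trans, emit))  # -> [1, 0, 0]

The time-unrolled structure is what makes this tractable: because each hidden state depends only on the previous one, the best path can be found in time linear in the sequence length rather than exponential.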

Comment author: Kawoomba 02 December 2013 10:06:15AM 1 point

You should definitely look at the book "Probabilistic Graphical Models" by Daphne Koller

I will (... put it on my "do() at some indeterminable time in the near to far future"-list. Sigh.). Thanks.

Comment author: khafra 16 December 2013 04:45:38PM 1 point

If you can wait, she may teach it again.