MrMind comments on Open Thread, Jun. 29 - Jul. 5, 2015 - Less Wrong Discussion

5 Post author: Gondolinian 29 June 2015 12:14AM


Comment author: MrMind 02 July 2015 08:49:06AM *  0 points

As pointed out elsewhere, typically people use "frequentist" to mean "non-Bayesian," which is not particularly effective as a classification.

Reducing a frequentist model to a Bayesian one, though, is not a pointless exercise: it elucidates the hidden assumptions, and it makes you better aware of the model's field of applicability.
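(The canonical example of this kind of reduction, not from the thread itself: ridge regression, usually presented as penalized least squares, is exactly the posterior mode under a Gaussian likelihood and a zero-mean Gaussian prior on the weights; the penalty strength is the "hidden assumption" about the prior/noise variance ratio. A minimal numpy sketch, with made-up data:)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=50)

lam = 0.7  # ridge penalty; equals sigma^2 / tau^2 in the Bayesian reading

# Frequentist route: minimize ||y - Xw||^2 + lam * ||w||^2 by gradient descent.
w = np.zeros(3)
for _ in range(5000):
    grad = -2 * X.T @ (y - X @ w) + 2 * lam * w
    w -= 1e-3 * grad

# Bayesian route: posterior mode under N(Xw, sigma^2 I) likelihood and
# N(0, tau^2 I) prior, available in closed form.
w_map = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

print(np.allclose(w, w_map, atol=1e-6))
```

(The two estimates coincide, which is the point: the regularizer was a prior all along.)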

Did you google Bayesian Machine Learning, or search for it on Amazon?

Only after I had already bought the book :/ Bishop, though, seems very interesting, thanks!

The more meta point here is to not let a worldview shut you out from potentially useful resources.

Thankfully, I'm learning ML for my own education; it's not something I need to practice right now.

Comment author: Vaniver 02 July 2015 01:50:48PM 1 point

Bishop though seems a lot interesting, thanks!

You're welcome! I should point out that the other words I was considering using to describe Bishop are "classic" and "venerable"--it's not out of date (most actively used ML methods are surprisingly old), but you may want to read it in parallel with Barber. (In general, if you've never read textbooks in parallel before, I recommend it as a lesson in textbook design / pedagogy.)

Comment author: IlyaShpitser 02 July 2015 02:42:44PM 2 points

I'm using Bishop in my class this Fall; it's very popular, for good reason.