Riothamus comments on Open thread, Jul. 18 - Jul. 24, 2016 - Less Wrong

3 Post author: MrMind 18 July 2016 07:17AM


Comment author: Riothamus 21 July 2016 08:44:23PM 2 points [-]

Is there a procedure in Bayesian inference for determining how much future information will invalidate your current model?

Say I have some kind of time-series data, and I make an inference from it up to the current time. If future data is costly to obtain, is there a way to determine when the cost of the growing error exceeds the cost of getting the new data and updating my inference?

Comment author: Lumifer 22 July 2016 03:03:49PM 2 points [-]

Generally speaking, for this you need a meta-model, that is, a model of how your model will change (e.g. become outdated) with the arrival of new information. Plus, if you want to compare costs, you need a loss function which will tell you how costly the errors of your model are.
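To make this concrete, here is a minimal sketch of the idea under a simple set of assumptions (all of them mine, not from the comment): the tracked quantity drifts as a random walk with per-step variance q, observations carry noise variance r, and the loss is proportional to posterior variance (squared-error loss). The meta-model is then just "posterior variance grows by q per step without new data," and you buy new data once the accumulated extra loss exceeds its cost. The function names and parameters are hypothetical.

```python
import math

# Hypothetical meta-model: the quantity we estimate drifts as a random
# walk (variance q per step) and is observed with noise variance r.
# While no new data arrives, our posterior variance grows by q per step;
# that growth is what makes the model "become outdated."

def posterior_variance(p, q, steps):
    """Variance of the estimate after `steps` steps with no new data."""
    return p + q * steps

def steady_state_variance(q, r):
    """Kalman steady-state posterior variance right after an update.

    Solves p = (p + q) * r / (p + q + r) for p >= 0, i.e. the fixed
    point of predict-then-update in the scalar Kalman filter.
    """
    return (-q + math.sqrt(q * q + 4.0 * q * r)) / 2.0

def steps_until_update(q, r, loss_per_variance, data_cost):
    """Number of steps to wait before paying for new data.

    With squared-error loss, the expected loss per step is
    loss_per_variance * variance. We accumulate the *extra* loss
    relative to the freshly updated baseline and stop once it
    exceeds the cost of acquiring new data.
    """
    p0 = steady_state_variance(q, r)
    extra_loss = 0.0
    k = 0
    while extra_loss < data_cost:
        k += 1
        extra_loss += loss_per_variance * (posterior_variance(p0, q, k) - p0)
    return k

# Example: slow drift, relatively expensive data -> update rarely.
print(steps_until_update(q=0.01, r=1.0, loss_per_variance=1.0, data_cost=5.0))
```

The same structure carries over to richer models: the meta-model supplies "expected loss as a function of time since the last update," and the decision rule compares that curve against the data cost. In nontrivial settings the loss curve usually has to be estimated by simulation rather than in closed form.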

Comment author: MrMind 22 July 2016 01:14:04PM 0 points [-]

Unfortunately, to pull this off you need to look closely at both your model and your model of the error; there's no general method AFAIK.