dimensionx comments on Open thread, Jul. 04 - Jul. 10, 2016 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Yes, it's true: it is very difficult to build a forecast in a non-automated way. Apparently you need to start with the right framing structure and grow it gradually, for example as an inventions tree or matrix. Then add probabilities and time, and keep increasing the amount of data. But what do we do when the forecast crumbles in the future? Theoretically, even with enough data, we can run into a large number of bifurcation points that simply create too many parallel futures. What if it does not matter whether any single forecast is correct? What matters is their near-infinite number, from which you can choose the right future with the help of special algorithms, or at least be prepared for multiple outcomes.
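The tree-of-futures idea above can be sketched in a few lines. This is a toy illustration under my own assumptions (the `Branch` class and the example outcomes are hypothetical, not an actual forecasting system): each node carries a conditional probability, and multiplying down a path gives the probability of that final future.

```python
from dataclasses import dataclass, field

@dataclass
class Branch:
    """Hypothetical node in a forecast tree: an outcome with a probability."""
    outcome: str
    prob: float            # probability of this outcome given its parent
    children: list = field(default_factory=list)

def leaf_probabilities(node, acc=1.0):
    """Multiply probabilities down each path to get each final future's chance."""
    p = acc * node.prob
    if not node.children:
        return {node.outcome: p}
    out = {}
    for child in node.children:
        out.update(leaf_probabilities(child, p))
    return out

# a made-up two-level forecast
root = Branch("now", 1.0, [
    Branch("invention adopted", 0.6, [
        Branch("rapid growth", 0.5),
        Branch("slow growth", 0.5),
    ]),
    Branch("invention ignored", 0.4),
])

print(leaf_probabilities(root))
```

The leaf probabilities always sum to 1, which is exactly why adding more bifurcation points thins each individual future out.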
The right algorithm doesn't give you good results if the data you have isn't good enough.
What do you mean?
The amount of entropy corresponding to real-world information in the predictions is at best equal to that of the starting data; more likely, the predictions contain less information, since a deterministic algorithm cannot create information that wasn't already in its inputs.
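The point above can be checked numerically. In this sketch (the toy data and the `x % 2` "prediction" rule are my own illustrative assumptions), a deterministic function collapses distinct inputs together, so the empirical Shannon entropy of its output can never exceed that of its input:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# toy "starting data": four equally frequent outcomes -> 2 bits
data = [0, 1, 2, 3, 0, 1, 2, 3]

# a deterministic "prediction" merges outcomes, losing one bit
prediction = [x % 2 for x in data]

print(entropy(data))        # 2.0
print(entropy(prediction))  # 1.0
```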
Another possibility is that after n years the algorithm smooths out the probabilities of all the possible futures so that they are equally likely...
The problem is not only computational: unless there are some strong pruning heuristics, the value of predicting the far future decays rapidly, since the probability mass (which is conserved) becomes diluted between more and more branches.