Locke comments on Harry Potter and the Methods of Rationality discussion thread, part 16, chapter 85 - Less Wrong Discussion
Why should the time of a momentous decision be so relevant to seers? Even if the consequences of the decision have a big impact on the future, that future already was the future. It's not as though there is a default future before you make your decision and a different future afterwards; your decision itself would already be part of the future as seen from any earlier point in time. From a many-worlds perspective you might have several different possible futures, so your overall prospect of the future might change significantly after an important branching, but Harry's decision doesn't seem particularly influenced by recent random chance; it seems unlikely that, from the perspective of six hours ago, most future Harrys would have made a completely different decision.
Eliezer seems to be taking a page from Alicorn's book. In Luminosity Alice is plagued by differing visions as Bella constantly changes her mind about her future, and then the actual future snaps into place when a final choice is made.
That's how it is in the canon Twilight (Eclipse).
Try not to take this as me being a big snobby snob, but did you actually read them?
Secondary source: I have seen the first 3 films, and Alice explicitly (and repeatedly, I think) states that "a decision has been made" when she has a vision. That decision needn't be made by Bella specifically though.
Weirdly enough, I have read both the canon and Alicorn's fanfic.
And I already remarked in the Luminosity thread that that makes no sense. It makes even less sense in a universe with time turners.
Essentially? It has to happen at some point along the timeline, and whatever engine runs magic finds it simplest to deliver visions simultaneously with the decisions that cause them. (Or at least, with the decisions that contribute to them in some major way.)
Or, in other words, enforced narrative causality.
Take the present state of the universe and use an imperfect tool to extrapolate likely future outcomes. Changing your mind shifts the present state, so the extrapolation predicts a certain future outcome more strongly.
The only weird thing is that you can actually fool people by pretending. The prediction mechanism has to have some very specific flaws for that to work.
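The model described above can be sketched as a toy simulation. This is purely illustrative; the state dictionary, the `intent` field, and the threshold are all made-up assumptions, not anything specified in the fic or the thread:

```python
import random

def extrapolate(present_state, noise=0.3, trials=1000, seed=0):
    """Toy 'imperfect seer': sample noisy projections of the present
    state and report how often the projected outcome crosses a
    threshold. The noise is what makes the tool imperfect."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Each trial projects the decider's current intent forward
        # with some random error.
        projected = present_state["intent"] + rng.gauss(0, noise)
        if projected > 0.5:
            hits += 1
    return hits / trials

# Before the decision: intent is weak, so visions are ambiguous.
undecided = {"intent": 0.4}
# After changing your mind: the present state itself has shifted,
# and the same imperfect tool now predicts the outcome more often.
decided = {"intent": 0.9}

p_before = extrapolate(undecided)
p_after = extrapolate(decided)
```

On this sketch, "fooling the seer by pretending" would mean manipulating the `intent` value the tool reads without actually committing to the decision, which is why the commenter notes the mechanism would need very specific flaws for that to work.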