A related fact: suppose you have a simple random walk (let's say integer-valued for simplicity; this all works with Brownian motion too) conditioned to reach (say) 100 before reaching 0. Then (at least before it has reached 100), from state n it has a (n+1)/(2n) chance of moving up to n+1, instead of the 1/2 chance for the unconditioned walk. The proof is another helping of Bayes' Rule: the unconditioned walk hits 100 before 0 with probability n/100 from state n, so P(up | hits 100 first) = (1/2) · ((n+1)/100) / (n/100) = (n+1)/(2n).
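If you want to sanity-check this numerically, here's a minimal sketch in plain Python (standard library only; the function name, starting state, and trial count are just illustrative). It rejection-samples unconditioned walks, keeps only the ones that hit 100 before 0, and compares the empirical frequency of the first step being up against (n+1)/(2n).

```python
import random

def conditioned_first_step_up(n, top=100, trials=100_000):
    """Estimate P(first step is up | walk started at n hits `top` before 0)
    by rejection sampling: run unconditioned walks, keep only those
    that hit `top` before reaching 0."""
    ups = 0
    accepted = 0
    for _ in range(trials):
        pos = n
        first_step_up = None
        while 0 < pos < top:
            step = 1 if random.random() < 0.5 else -1
            if first_step_up is None:
                first_step_up = (step == 1)
            pos += step
        if pos == top:              # condition: hit `top` before 0
            accepted += 1
            ups += first_step_up
    return ups / accepted

n = 5
print("empirical:", conditioned_first_step_up(n))
print("predicted (n+1)/(2n):", (n + 1) / (2 * n))
```

Note that the predicted probability (n+1)/(2n) doesn't depend on the target level at all, which falls straight out of the Bayes computation above (the 1/100 factors cancel).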
This model applies pretty directly if you think of a probability as a martingale in [0,1], and the conditioning as being secretly told the truth. So in this example you can explicitly quantify the drift toward the truth.
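To make that explicit: from state n the conditioned walk's expected increment is (+1) · (n+1)/(2n) + (-1) · (n-1)/(2n) = 1/n, so the drift toward the truth is 1/n per step. It's strongest when the walk (your credence, rescaled) is far from the target and fades as it approaches certainty.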
This was fun!