This is a thread for rationality-related or LW-related jokes and humor. Please post jokes (new or old) in the comments.
------------------------------------
Q: Why are Chromebooks good Bayesians?
A: Because they frequently update!
------------------------------------
A super-intelligent AI walks out of a box...
------------------------------------
Q: Why did the psychopathic utilitarian push a fat man in front of a trolley?
A: Just for fun.
------------------------------------
Which means that P(heads on toss after next|heads on next toss) != P(heads on toss after next|tails on next toss). Independence of A and B means that P(A|B) = P(A).
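A quick Monte Carlo sketch of this point, under an assumed toy prior: the coin is either fair (p = 0.5) or heads-biased (p = 0.9), 50/50. Both values are made up for illustration.

```python
import random

# Toy setup (assumed, not from the thread): the coin's bias p is unknown
# and modeled with a two-point prior, p = 0.5 or p = 0.9, each with
# probability 1/2.
random.seed(0)
trials = 200_000
heads_after = {True: 0, False: 0}  # heads on "toss after next",
count = {True: 0, False: 0}        # bucketed by the "next toss" outcome

for _ in range(trials):
    p = random.choice([0.5, 0.9])      # draw the unknown bias once per coin
    next_toss = random.random() < p    # "next toss" (True = heads)
    after_next = random.random() < p   # "toss after next", same coin
    count[next_toss] += 1
    heads_after[next_toss] += after_next

print("P(heads after next | heads next) ~", heads_after[True] / count[True])
print("P(heads after next | tails next) ~", heads_after[False] / count[False])
# Roughly 0.76 vs 0.57 here, against a marginal P(heads) of 0.7:
# observing the next toss shifts your posterior over p, which shifts
# your forecast for the toss after it.
```

With the bias uncertain, the tosses are exchangeable but not independent.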
As long as you're using the same coin, P(heads on toss after next|heads on next toss) == P(heads on toss after next|tails on next toss).
You're confusing the probability of a coin toss outcome with your knowledge about it.
Consider an RNG which generates independent samples from a normal distribution centered on some value mu that is unknown to you. As you see more samples you get a better idea of what mu is, and your expectations about what numbers you are going to see next change. But these samples do not become dependent just because your knowledge of mu changes.
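A minimal sketch of that RNG example, assuming sigma is known and putting a conjugate normal prior on mu (all numbers hypothetical):

```python
import random

# Assumed setup: samples are i.i.d. N(mu, sigma^2) with sigma known;
# only mu is unknown to the observer, who starts from a broad
# conjugate prior N(0, 10^2).
random.seed(0)
true_mu, sigma = 3.0, 1.0          # the generator's mu, hidden from the observer
post_mean, post_var = 0.0, 100.0   # prior mean and variance (hypothetical)

for n in range(1, 11):
    x = random.gauss(true_mu, sigma)   # each sample is independent given mu
    # Standard Gaussian conjugate update for a single observation:
    new_var = 1.0 / (1.0 / post_var + 1.0 / sigma**2)
    post_mean = new_var * (post_mean / post_var + x / sigma**2)
    post_var = new_var
    print(f"after {n:2d} samples: mu ~ N({post_mean:.3f}, {post_var:.4f})")
```

The printed posterior narrows toward the true mu, so the observer's predictive distribution keeps changing; the generating process itself, given mu, produces independent draws throughout.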