There is also something like an anti-cascade: if everybody believes that something is (a) false and (b) in bad taste to discuss, this creates a social dynamic in which some ideas get discussed less, and evidence collected in such fields is less known or is disregarded because it was collected by "these stupid people".
Examples: quantum immortality, the doomsday argument, parapsychology, UFOs, and (until recently) AGI in the wider IT community.
Note that not all presumably false things are also associated with "bad taste".
As a result, cascades eventually create something like group-bonding forces, and belief in X makes people sort into different "reality bubbles".
Why stop at 2? Belief-space is large, and many issues admit more than one (+/-) bit of information to cascade.
This is a question in the info-cascade question series. There is a prize pool of up to $800 for answers to these questions. See the link above for full background on the problem (including a bibliography) as well as examples of responses we’d be especially excited to see.
___
How common, and how large, are info-cascades in communities that seek to make intellectual progress, such as academia? The distribution of cascade sizes is presumably very heavy-tailed, since we are dealing with a network phenomenon. But what are its actual values? How can we estimate them?
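One way to build intuition for why the size distribution should be heavy-tailed (and what an empirical estimate would even look like) is a toy simulation. The sketch below is not an answer to the question, just an illustration under arbitrary assumptions: a made-up scale-free "academic community", an independent-cascade adoption rule, and a hand-picked adoption probability `p_adopt`.

```python
import random
import networkx as nx

def cascade_size(graph, p_adopt, seed_node):
    """Run a simple independent-cascade model from one seed node.

    Each newly convinced node gets one chance to convince each
    neighbour with probability p_adopt. Returns how many nodes
    end up adopting the claim.
    """
    adopted = {seed_node}
    frontier = [seed_node]
    while frontier:
        node = frontier.pop()
        for neighbour in graph.neighbors(node):
            if neighbour not in adopted and random.random() < p_adopt:
                adopted.add(neighbour)
                frontier.append(neighbour)
    return len(adopted)

# Hypothetical "community": a scale-free, collaboration-like network.
g = nx.barabasi_albert_graph(n=2000, m=3, seed=0)
sizes = sorted(
    cascade_size(g, p_adopt=0.2, seed_node=random.randrange(2000))
    for _ in range(500)
)

# Crude check for heavy tails: the mean and maximum sit far above the median.
print("median:", sizes[len(sizes) // 2],
      "mean:", sum(sizes) / len(sizes),
      "max:", sizes[-1])
```

Estimating the real-world analogue of `p_adopt` and the network structure from data is, of course, the hard part of the question.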
A good starting point for thinking about this might be the paper “How citation distortions create unfounded authority: analysis of a citation network” (Greenberg, 2009), which uses social network theory and graph theory to trace how an at best very uncertain claim in biomedicine cascaded into established knowledge. We won’t attempt to summarise the paper here.
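To make the flavour of that kind of analysis concrete, here is a minimal sketch, not a reconstruction of Greenberg's actual method, using a hypothetical miniature citation graph with made-up paper names. It counts how citation paths from later papers trace back to supportive versus critical primary evidence, which is one way a thin base of primary data can come to look like overwhelming support.

```python
import networkx as nx

# Hypothetical citation network: each edge points from a citing paper
# to the paper it cites. Primary papers are labelled by whether their
# data actually support the claim.
citations = [
    ("review_A", "primary_supportive_1"),
    ("review_A", "primary_supportive_2"),
    ("review_B", "review_A"),
    ("paper_C", "review_B"),
    ("paper_C", "review_A"),
    ("paper_D", "primary_critical_1"),
]
g = nx.DiGraph(citations)

primary_papers = {
    "primary_supportive_1": "supportive",
    "primary_supportive_2": "supportive",
    "primary_critical_1": "critical",
}

# For each non-primary paper, count citation paths that eventually reach
# supportive vs. critical primary data. A claim looks "well supported"
# when many paths end at supportive primaries, even if the number of
# distinct primary studies is small.
for paper in g.nodes:
    if paper in primary_papers:
        continue
    tallies = {"supportive": 0, "critical": 0}
    for primary, label in primary_papers.items():
        if nx.has_path(g, paper, primary):
            tallies[label] += len(list(nx.all_simple_paths(g, paper, primary)))
    print(paper, tallies)
```

In this toy graph, `paper_C` accumulates four supportive citation paths that all bottom out in the same two primary studies, which is the sort of amplification the paper documents at much larger scale.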