Here's another, roughly isomorphic statement:
What is Gravity besides some form of superintelligence, or at least the remnants of superintelligence? The strongest evidence is that engineers and even physicists don't really have to understand how gravity actually works in order to use it. There is information entering the system from somewhere, and it's enough information to accurately detect when an object is unsupported or structurally unstable. And the chaotic side-effects tend to be improbably harmful. It's like an almost-Friendly, or perhaps a broken previously-Friendly, AI. Possibly the result of some ancient Singularity that is no longer explicitly remembered.
It's refreshing to see the non-anastrophic arrangement in the title.
What LessWrong would call the "system" of rationality is the rigorous mathematical application of Bayes' Theorem. The "one thousand tips" you speak of are what we get when we apply that system to itself to work out, ahead of time, what it would conclude under common conditions, since carrying a calculator around and applying the theorem explicitly in everyday life is rather impractical.
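For concreteness, here is a minimal sketch of the kind of calculation the "system" prescribes and the cached tips approximate; the scenario, the numbers, and the `posterior` helper are purely illustrative:

```python
# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E).
# Everything below is a toy illustration, not a real-world estimate.

def posterior(prior, likelihood, false_positive_rate):
    """Probability of a hypothesis given one positive piece of evidence."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: a claim with a 1% prior, evidence that appears
# 90% of the time when the claim is true and 10% of the time when it isn't.
print(posterior(prior=0.01, likelihood=0.9, false_positive_rate=0.1))
# ~0.083 -- the point of the cached tips is to land roughly here without doing the arithmetic.
```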
When I come across a pseudoscience I haven't seen before, I usually go to Google first to check where it stands with respect to reality.
Then I go to its RationalWiki article for entertainment. This is essential if I don't want to spend the rest of the day fuming at how many people "actually believe in that stuff".
Never mind, I see your point, although I still disagree with your conclusion on the grounds of narrative plausibility and good writing.