Scott Aaronson announced Worldview Manager, "a program that attempts to help users uncover hidden inconsistencies in their personal beliefs".
You can experiment with it here. The initial topics are Complexity Theory, Strong AI, Axiom of Choice, Quantum Computing, Libertarianism, Quantum Mechanics.
I think you're re-inventing the wheel here.
"This is towards the goal of creating 'rationality augmentation' software. In the short term, my suspicion is that such software would look like a group of existing tools glued together with human practices."
Look at current work in AI, automated reasoning systems, and automated theorem proving.
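At its core, a tool like Worldview Manager checks whether a set of stated beliefs admits any consistent interpretation. Here is a minimal sketch of that idea, assuming beliefs can be encoded as propositional constraints; the belief names and the brute-force satisfiability check are illustrative, not how Worldview Manager is actually implemented.

```python
from itertools import product

def consistent(beliefs, variables):
    """Return True if some truth assignment satisfies every belief.

    `beliefs` is a list of predicates over an assignment dict.
    Brute-force over all 2^n assignments -- fine for a handful of variables.
    """
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(b(assignment) for b in beliefs):
            return True
    return False

# Hypothetical beliefs about Strong AI, encoded as constraints:
beliefs = [
    # "If the brain is computable, then strong AI is possible."
    lambda a: (not a["brain_computable"]) or a["strong_ai_possible"],
    # The user asserts the brain is computable...
    lambda a: a["brain_computable"],
    # ...but also denies strong AI -- jointly unsatisfiable.
    lambda a: not a["strong_ai_possible"],
]
print(consistent(beliefs, ["brain_computable", "strong_ai_possible"]))  # -> False
```

A real system would replace the brute-force loop with a SAT solver or an automated theorem prover, which is exactly where the existing work mentioned above comes in.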