If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday, and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "
The solution offered at the beginning is basically: "Don't try to let your reasoning be based on underlying foundations in the first place."
That leaves open the question of how to reason. GS is an answer to that question.
"On the one hand, on the other hand, on the third hand" reasoning, as advocated in Superforecasting, where there doesn't have to be a shared foundation for all three hands, is another. That's what Tetlock calls "foxy" thinking, and he argues that it makes better predictions than hedgehog thinking, where everything is based on one model with one foundation. But Superforecasting provides a bunch of heuristics, not a deep ontological foundation.
I also have other frameworks that point in the same direction but that are even harder to describe and likely not accessible by simply reading a book.
No. The problem exists if you take certain assumptions for granted. I haven't claimed that you don't have the problem if you make those assumptions and follow certain heuristics.
This leaves open the question of how to reason differently. GS is an answer to how to reason differently; it's complex, and demonstrating that it's an internally consistent approach takes time, which is done in Science and Sanity over many pages.
No, I do see that the problem exists if you follow certain heuristics.
What that seems to amount to is "conduct all your reasoning inside a black box". That creates some problems, such as the problem of being able to verify your reasoning.