Necessary, But Not Sufficient
There seems to be something odd about how people reason in relation to themselves, compared to the way they examine problems in other domains.
In mechanical domains, we have little problem with the idea that things can be "necessary, but not sufficient". For example, if your car fails to start, you likely know that several things are necessary for the car to start, but not sufficient for it to do so. It has to have fuel, ignition, compression, and oxygen... each of which in turn has further necessary conditions, such as an operating fuel pump, electricity for the spark plugs, electricity for the starter, and so on.
And usually, we don't go around claiming that "fuel" is a magic bullet for fixing the problem of car-not-startia, or argue that if we increase the amount of electricity in the system, the car will necessarily run faster or better.
For some reason, however, we don't seem to apply this sort of necessary-but-not-sufficient thinking to systems above a certain level of complexity... such as ourselves.
When I wrote my previous post about the akrasia hypothesis, I mentioned that something was bothering me about the way people seemed to be reasoning about akrasia and other complex problems. And recently, with taw's post about blood sugar and akrasia, I've realized that the specific thing bothering me is the absence of causal-chain reasoning there.