Follow-up to Toward a New Technical Explanation of Technical Explanation.
Confusion is in the map, not the territory. The same goes for predictability, surprise, mysteriousness, weirdness, and so on. Yet sometimes we say "no, actually, that's quite surprising" after initially not being surprised -- and it means something. Or sometimes we say "ah, right, it is obvious" after being confused and needing to think for a while -- and that means something too.
Though they may sound like map/territory errors, statements like these can refer to the amount of surprise or confusion (etc.) which a specific gears-level model has. A step in a proof may be "obvious" because it is an elementary application of a common method, even if it takes a fallible human time to see it. A physicist may one day come to realize that a commonplace event is "surprising" after not being surprised by it dozens of times, because they've suddenly compared it to a model from physics and found the event difficult to account for.
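To make the model-relative reading concrete, here is a minimal sketch. It assumes we quantify surprise as surprisal, -log P(event), under a given model; the magnet scenario and both probability assignments are invented for illustration. The point is just that the same event can be barely surprising under a vague everyday model and quite surprising under a more gears-level one.

```python
import math

def surprisal(model, event):
    """Surprise of an event under a given model, in bits: -log2 P(event)."""
    return -math.log2(model[event])

# Hypothetical scenario: does a dropped magnet stick to a "metal" surface?
# A vague everyday model barely distinguishes the outcomes, so nothing is surprising.
everyday_model = {"sticks": 0.5, "falls": 0.5}

# A more gears-level model (knowing the surface is aluminum, which is not
# ferromagnetic) concentrates probability, so the same event becomes surprising.
physics_model = {"sticks": 0.01, "falls": 0.99}

event = "sticks"
print(surprisal(everyday_model, event))  # 1.0 bit   -- mildly informative
print(surprisal(physics_model, event))   # ~6.6 bits -- "no, actually, that's quite surprising"
```

Saying the event "is surprising" then reads as a claim about its surprisal under the better, more gears-level model, not a claim that surprise lives in the territory.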
Our brain has a big, messy model of reality, but it is often trying to simulate smaller, cleaner, more justified, more objective, and hopefully better models. When we make statements like these, we are speaking from those models.