Yet here you seem to entertain the idea that it's sometimes impossible to explain what you mean
I said impossible in an hour, not impossible in general. It simply might take a few years. There's a scene at the end of Neuromancer where one protagonist asks the AI why another acted the way they did. The first answer is: it's unexplainable. Then the answer becomes: it's not really unexplainable, but it would take 37 years to explain. (My memory of the exact number might not be accurate.)
On the other hand, teaching new phenomenological primitives is extremely hard. It takes more than an hour to teach a child that objects don't fall because they are heavy but because of gravity. Yes, you might get some token agreement, but when you ask questions the person still thinks that a heavy object ought to fall faster than a light one, because they haven't really understood the concept on a deep level. In physics education these basic intuitions are called phenomenological primitives.
This entails that it is possible to simply explain what you mean, even across very large inferential gaps.
You can't explain to a blind man what red looks like. Some discussions are about qualia.
but when you ask questions the person still thinks that a heavy object ought to fall faster than a light one, because they haven't really understood the concept on a deep level.
No, they think that a heavy object ought to fall faster than a light one because that's how it actually works for most familiar objects falling through air, where air resistance matters.
If you've just been telling without demonstrating, this is pure reliance on authority.
Another month has passed and here is a new rationality quotes thread. The usual rules are: