- Abstractions are important.
- In order to think in terms of "trees" and "birds" instead of raw retinal activation rates, human minds need to form abstractions.
- In order to keep Nature red in tooth and claw, organisms must first evolve abstractions like "tooth" and "claw".
- Neural networks can learn human abstractions.
- Figuring out abstraction is possibly "the main foundational piece" of successful AI alignment.
- Abstractions are functions from a high-dimensional space to a low-dimensional space. They have more possible inputs than possible outputs, so abstractions have to shed some information (see the sketch after this list).
- Abstractions filter out useless information, while keeping useful information.
- In order for there to be such a thing as "useful information", there must be some goal(s) being pursued.
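As a concrete sketch of the last three points (the function and images are my own invention, purely illustrative): an abstraction collapses many distinct inputs into one output, and whether it kept the *right* information depends on what you wanted the output for.

```python
# A minimal sketch (names hypothetical): an abstraction as a lossy map
# from a high-dimensional input to a low-dimensional summary.
import numpy as np

def summarize_image(pixels: np.ndarray) -> float:
    """Map a 64x64 grayscale image (4096 dims) to one number: mean brightness.

    Many distinct inputs collapse to the same output, so information
    is necessarily shed; whether the *right* information is kept
    depends on what the summary will be used for.
    """
    return float(pixels.mean())

image_b = np.ones((64, 64)) * 0.5     # uniformly gray
image_c = np.zeros((64, 64))
image_c[:32, :] = 1.0                 # half bright, half dark

# b and c are very different images but share a summary: the
# abstraction cannot distinguish them.
print(summarize_image(image_b), summarize_image(image_c))  # 0.5 0.5
```

Images b and c collapse to the same summary, which is fine if the goal is setting camera exposure and useless if the goal is finding edges.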
You might argue that "abstraction" only means preserving the information needed to predict faraway observations of a given system. However, coarse modeling of distant objects is itself often a convergent subgoal for the kinds of organisms Nature selects for.
The scout does not tell the general about the bluejays he saw; he reports the number of bombers in the enemy's hangar. Condensation of information always selects for goal-relevant information: any condensation implies that what was omitted is less goal-relevant than what was reported. There is no abstraction without a goal.
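A hypothetical sketch of the scout example (the sighting log and helper are invented for illustration): the same raw observations condense into different reports under different goals, and each report discards exactly what the other keeps.

```python
# Hypothetical sketch: one observation log, two goals, two condensations.
from collections import Counter

observations = [
    "bomber", "bluejay", "bomber", "fuel truck",
    "bluejay", "bomber", "sparrow",
]

def report(sightings: list[str], goal_relevant: set[str]) -> dict[str, int]:
    """Condense a sighting log to counts of goal-relevant items only."""
    return dict(Counter(s for s in sightings if s in goal_relevant))

# A general's goal selects military objects; a birdwatcher's goal
# selects birds. Neither report is "the" abstraction of the log.
print(report(observations, {"bomber", "fuel truck"}))  # {'bomber': 3, 'fuel truck': 1}
print(report(observations, {"bluejay", "sparrow"}))    # {'bluejay': 2, 'sparrow': 1}
```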
Abstraction is a compression algorithm for a computationally bounded agent. I don't see how it is related to a "goal", except insofar as a goal is just another abstraction, and all of these abstractions have to work together for the agent to maintain a reasonably faithful internal map of the world.
I think it's not a coincidence that the high-order bits are the ones preserved by more physical processes. If you take two photos of the same thing, the high-order bits are more likely to match than the low-order bits. The same goes if you take a photo of a picture on a screen or printed out, or if you dye two pieces of fabric in the same vat.
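A quick way to check the bit-level version of this claim (the noise model and its scale are my assumptions, not anything from the discussion): take two independent noisy measurements of the same 8-bit value and see which bits still agree.

```python
# Sketch: which bits of an 8-bit value survive independent small noise?
import random

random.seed(0)
trials = 10_000
agree = [0] * 8  # agreement count per bit; index 0 = lowest-order bit

for _ in range(trials):
    true_value = random.randrange(256)
    # Two independent noisy measurements ("two photos of the same thing").
    a = min(255, max(0, true_value + random.randint(-8, 8)))
    b = min(255, max(0, true_value + random.randint(-8, 8)))
    for bit in range(8):
        agree[bit] += ((a >> bit) & 1) == ((b >> bit) & 1)

for bit in range(8):
    print(f"bit {bit}: {agree[bit] / trials:.2f} agreement")
# Low-order bits agree at roughly chance (~0.5); high-order bits agree
# almost always, because small noise rarely flips them.
```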
I'm not saying you couldn't get an agent that cared about the low-order bits and not the high-order bits, and if you did have such an agent maybe it would find abstractions that we wouldn't. But I don't think I'm being parochial when I say that would be a really weird agent.