- Abstractions are important.
- In order to think in terms of "trees" and "birds" instead of raw retinal activation rates, human minds need to form abstractions.
- In order to keep Nature red in tooth and claw, organisms must first evolve abstractions like "tooth" and "claw".
- Neural networks can learn human abstractions.
- Figuring out abstraction is possibly "the main foundational piece" of successful AI alignment.
- Abstractions are functions that map a high-dimensional space to a low-dimensional space. They have more possible inputs than possible outputs, so abstractions have to shed some information.
- Abstractions filter out useless information, while keeping useful information.
- In order for there to be such a thing as "useful information", there must be some goal(s) being pursued.
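The "lossy map" framing above can be sketched concretely. This is a minimal, made-up illustration (the function and thresholds are invented for this example, not anyone's proposed formalism): an abstraction takes a high-dimensional input and returns a low-dimensional summary, and because many inputs collapse to one output, information is necessarily shed.

```python
# Abstraction as a lossy map: many sensor readings (high-dimensional input)
# are compressed to a single label (low-dimensional output).
# Thresholds here are arbitrary, chosen only for illustration.

def abstract_temperature(readings: list[float]) -> str:
    """Map a list of readings to one of three coarse labels."""
    mean = sum(readings) / len(readings)
    if mean < 10.0:
        return "cold"
    elif mean < 25.0:
        return "mild"
    return "hot"

# Two distinct high-dimensional inputs map to the same abstraction,
# so the identity of the individual readings is discarded:
a = abstract_temperature([14.0, 16.0, 18.0])  # mean 16.0
b = abstract_temperature([12.0, 20.0, 16.0])  # mean 16.0
assert a == b == "mild"
```

Note that the map is non-injective by construction: it has far more possible inputs than outputs, which is exactly the sense in which it "sheds information".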
You might argue that "abstraction" only means preserving information used to predict faraway observations from a given system. However, coarse modeling of distant objects is often a convergent subgoal of the kind of organisms that Nature selects for.
The scout does not tell the general about the bluejays he saw. He reports the number of bombers in the enemy's hangar. Condensation of information always selects for goal-relevant information. Any condensation of information implies that the omitted information is less goal-relevant than the reported information; there is no abstraction without a goal.
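The scout example can be made mechanical. In this hypothetical sketch (all names and data invented), the same set of observations gets condensed differently under different goals; each goal defines a relevance predicate, and whatever the filtered report omits is, by construction, less goal-relevant than what it keeps.

```python
# Goal-dependent condensation: which observations survive depends on the goal.

observations = [
    {"what": "bluejays", "count": 12},
    {"what": "bombers", "count": 3},
    {"what": "fuel trucks", "count": 5},
]

# Each goal supplies its own relevance predicate.
goals = {
    "military_intel": lambda obs: obs["what"] in {"bombers", "fuel trucks"},
    "birdwatching": lambda obs: obs["what"] == "bluejays",
}

def condense(goal: str) -> list[dict]:
    """Report only the observations relevant to the given goal."""
    return [obs for obs in observations if goals[goal](obs)]

# The general's scout and a birdwatcher condense the same scene differently:
assert condense("military_intel") == [
    {"what": "bombers", "count": 3},
    {"what": "fuel trucks", "count": 5},
]
assert condense("birdwatching") == [{"what": "bluejays", "count": 12}]
```

Without some `goals` entry, there is no principled way to decide what `condense` should drop; that is the sense in which there is no abstraction without a goal.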
My point was less about the weight given to information, and more about the conditions that make it relevant. Yes, this treats relevant/not-relevant as a binary, but it is an abstraction tied to action. For example:
"orders are about a focus* (while someone scouting may respond to changing conditions)". Arguably, scouting is open-ended: the scout knows what might be important, at least when they see it. How this is done in practice might be worth looking into.
*I'm making this example up. The point is that actions can also throw information out.