Darn it, I wanted to use this term to distinguish "not-explicitly-consequentialistically optimizing for Y still optimizes for X when X is being varied and is causally relevant to Y" from "having an explicit model of X being relevant to Y, and therefore explicitly forming goals about X and searching for strategies that affect X." (E.g., natural selection does implicit consequentialism; humans do explicit consequentialism.) I'm not sure I can think of an equally good replacement term for the thing I wanted to say. Would "proxy consequentialism" work for the thing you wanted to say?