In local parlance, "terminal" values are a decision maker's ultimate values, the things they consider ends in themselves.
An idealized, fully rational decision maker should never want to change their terminal values.
For example, if a being has "wanting to be a music star" as a terminal value, then it should adopt "wanting to make music" as an instrumental value.
For humans, how these values feel psychologically is a different question from whether they are terminal or not.
See here for more information.
Thanks. Looks like I was using the word as I intended to.
My point is that humans (who are imperfect decision makers and not in full control of their motivational systems) may actually benefit from changing their terminal goals, even though a perfectly rational agent with a consistent utility function would never want to.
Humans are not always consistent, and making yourself consistent can involve dropping or acquiring terminal goals. (Consider a converted slaveowner acquiring a terminal goal of improving quality of life for all humans.)
My original point stems...