If some entity reliably pushes reality toward some state, across many contexts and not just by accident, then you could say it prefers that state. Preferences are roughly equivalent to goals and values.
Preference orderings that obey certain rationality axioms, such as completeness and transitivity (plus continuity and independence in the von Neumann–Morgenstern setting), can be represented by a utility function.
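As a toy illustration, with hypothetical states and made-up utility values, an ordering over a finite set of outcomes that satisfies completeness and transitivity can be encoded as a single numeric utility function: one state is weakly preferred to another exactly when its utility is at least as large.

```python
# Hypothetical world-states with made-up utility values.
# Representation: a is weakly preferred to b iff u(a) >= u(b).
utility = {"apple": 3, "banana": 2, "cherry": 1}

def prefers(a, b):
    """Weak preference: a is at least as good as b."""
    return utility[a] >= utility[b]

states = list(utility)

# Completeness: any two states are comparable.
assert all(prefers(a, b) or prefers(b, a)
           for a in states for b in states)

# Transitivity: if a >= b and b >= c, then a >= c.
assert all(prefers(a, c)
           for a in states for b in states for c in states
           if prefers(a, b) and prefers(b, c))
```

Because the representation is a real-valued function, the rationality axioms fall out automatically from the ordering of the numbers; the assertions above simply check this.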