KnaveOfAllTrades comments on Confused as to usefulness of 'consciousness' as a concept - LessWrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Every living thing "wants" not to be killed, even plants; that preference is expressed in their death-avoiding behavior. How does this help you assign quantitative moral value to killing some but not others?
You write that consciousness is "that thing that allows something to want other things", but how do you define or measure the presence of "wanting" except behavioristically?
I agree with the thrust of your first paragraph. But the second one (and to some extent the first) seems to rely on a revealed-preferences framework that I'm not sure fully captures wanting. E.g., can that framework handle akrasia, irrationality, etc.?
The word "wanting", like "consciousness", seems to me not to quite carve reality at its joints. Goal-directed behavior (or its absence) is a much clearer concept, but even then humans rarely have clear goals. As you point out, akrasia and irrationality are common.
So I would rather not use "wanting" if I can avoid it, unless the meaning is clear. For example, saying "I want ice cream now" is a statement about my thoughts and desires right now, and it gives some information about my likely actions; it leaves little room for misunderstanding.
This looks like a precision vs. accuracy/relevance tradeoff. For example, some goals that are never explicitly formulated may influence actions only in limited contexts, perhaps only hypothetical ones (such as those posited to elicit idealized values). Such goals are normatively important (they contribute to idealized values), even though formulating or observing them is difficult.