Another meaning could be: I want to raise the salience of the issue ‘Red vs Not Red’, I want to convey that ‘Red vs Not Red’ is an underrated axis. I think this is also an example of level 4?
Multiply all of the above by all the possible definitions of "like" and "red" and any context-relevant counterfactuals.
For example:
Also, too, maybe the speaker actually said "I, like, read", meaning that they viewed written material in a casual way and derived meaning from it, and it was misheard.
Very true. In this particular set of examples, I was holding the specific word meanings as pretty fixed, but in common usage, discrepancies here are a really big deal.
There are really so many things that a simple phrase could mean.
There are more shades around 1 + 2:
I think that when people on the autistic spectrum say "I like red", they mean option 1 (but normies often interpret it as something else, which sometimes gets the speaker in trouble); and when they hear someone else say "I like red", they assume that the person meant option 1 (often, that is not the case). Learning that the other options exist is an important part of developing the "theory of normie mind".
How much this is literally true, I don't know, it was meant as a "ha ha only serious" joke.
Ah, got it.
I'd flag that I don't think non-autists literally think this way. It's not like they consider all 13 options and select #7 or something. My impression is that ~90% of the work happens intuitively or subconsciously. Often a person would agree with an explicit statement of their intended meaning after having it explained to them, but they wouldn't naturally articulate it themselves.
To be more clear, this isn't exactly how non-autists think, it's more how nerds who are trying to understand non-autists think, think.
I think some of the time they'd agree with the clarified meaning... but also "often" they would treat it as an adversarial clarification and perhaps threateningly insinuate that you should stop adding clarity near their game.
(For reference: I'm not a nerd, I'm a language geek, and I think the main barrier to making really plausible and "human feeling" chatbots is (in some sense) figuring out how to make them capable enough of manipulative insinuation (and defense from such attacks) that their powers start to feel like maybe they NEED to be Friendly for the machinery to feel safe to release into the wild?)
[1]: Here, mean means "attempt to convey"
It seems like you're not using this definition consistently. For example, in (3), the speaker doesn't care whether any information at all is conveyed, since the purpose is reinforcing their own belief. Several of the others appear to be beliefs that would motivate someone to say "I like red" rather than what they intend to convey to the listener (in (7), what is intended to be conveyed is "I'm committed to support you.")
You're right, I wrote this bit early on, then didn't refactor.
I just changed it a bit. I'm sure with more thought we could have a better definition here.
5, 6, 7, 9, 10, 11, and 12 are all variations on the same theme of "I want to be associated with a particular subset of humans." That is Simulacra level 3 behavior. And I don't think they really count as separate meanings.
8 (where the mouth noises "I like red" are just a thing our tribe does, like "gesundheit") is a separate "meaning" from that (and is kind of a wrap-around Level 1 simulacra: you are accurately stating that you are a member of the tribe, and it is common knowledge that the mouth noise "I like red" carries no information relating to the speaker's opinions about "red").
5, 6, 7, 9, 10, 11, and 12 are all variations on the same theme of "I want to be associated with a particular subset of humans." That is Simulacra level 3 behavior. And I don't think they really count as separate meanings.
I agree that those have that basic thing in common. Whether they "count as separate meanings" mostly depends on how big you decide a "meaning" is; this seems a lot like a semantic question to me. I could easily imagine some circumstances where caring about the differences between some of these might be useful.
and is kind of a wrap-around Level 1 simulacra: you are accurately stating that you are a member of the tribe, and it is common knowledge that the mouth noise "I like red" carries no information relating to the speaker's opinions about "red"
I think we mostly agree, though I'd clarify:
Atheists say "God bless you" to other Atheists and nobody bats an eye or questions their disbelief. People say "f u" all the time without any expectation of a difficult anatomical act. Some phrases are just arbitrary mouth noises that signal membership in "the tribe of people who use that phrase".
A speaker says, "I like red". Here are a few of the many things they could mean[1] by that.
A listener might pick up that intended meaning, or they might learn different things than the speaker intended. These could include:
B1. I have background knowledge that people who like red are overwhelmingly represented by tribe Y, which has characteristics M and K, so I can infer that the speaker has characteristics M and K.
B2. The speaker thinks they like red. However, I can be confident that they have a really confused or overgeneralized definition of red, and haven’t thought about it very hard.
B3. Null. The costs of interpreting and remembering this phrase exceed the expected benefits, so it's being ignored.
B4. Null. I've already been able to predict with 100% accuracy that the speaker would say "I like red", so this gives me zero new information.
B5. I was expecting the speaker to say something more relevant in their own father’s eulogy. I’m going to infer that they are pretty naive about social environments.
Additional suggestions of meanings are appreciated. (Leave as comments)
With more reflection, one could likely come up with a possible context such that "I like red" could mean any other statement, though most of the options would be trivial. Perhaps this process of trying to list interesting potential meanings and interpretations is futile, because the space is too large and ill-defined. However, my hunch is that there are some interesting subgroups, most of which are represented above in some form.
Reflections
I think these sorts of meanings are very common. There are three obvious reasons for such meanings:
Implications
In the post on Implicature Conflation, abramdemski suggests that people "simply stop equivocating." If that suggestion is taken at face value, I'm not sure I totally agree. However, I do think that communication styles that come with vagueness and multiple levels of meta carry significant costs. I imagine that we generally want to encourage more plain communication, but I expect that making progress on this will take extended effort over long periods of time.
I would like to see a better categorization/ontology of these sorts of communication. The Simulacrum Levels seem too simple (only 4 linear levels) and opaque.
Related discussion
Simulacrum Levels
Much of this is related to the concept of Simulacrum Levels. I think (1) is at Simulacrum level 1, (2) is at Simulacrum level 2, (5, 7, and 10) are at Simulacrum level 3, and maybe (8, 11, 12) are at Simulacrum level 4. I'm really unsure about this, though; I find the levels confusing.
Conversational Implicature
As abramdemski wrote,
Arguably, almost all of these examples are about information that's not explicit, so they might count as conversational implicature.
Ask vs. guess culture
Relevant post here. Guess culture requires more interpretation of things; it has more implicature.
Language Games
I believe some of the work on implicature was inspired by Wittgenstein's work around language games. My impression is that the newer work is better developed (though not as developed as I'd like), but it might be useful to point out the history.
[1]: Here, mean roughly means "attempt to convey" or "reason for saying"