Not all concepts are useful, of course; 'phlogiston' added nothing to the concept of 'fire' except for intuitions about fluids, which mostly proved to make poor predictions.
That's just wrong.
Phlogiston allowed a lot of predictions to be made that couldn't have been made before. One of them is that if you burn mercury you get mercury ashes. If you then reheat the ashes with charcoal you get your mercury back.
Phlogiston paid a lot of rent and was adopted because it paid rent.
Another counter-example for consent: being on a crowded subway with no room to not touch people (if there's someone next to you who is uncomfortable with the lack of space). I like your definition, though, and want to try to make a better one (and I acknowledge this is not the point of this post). My stab at a refinement of "consent" is "respect for another's choices", where "disrespect" is "deliberately(?) doing something to undermine". I think this has room for things like preconsent (you can choose to do something you disprefer) and crowded subways. It allows for pulling people out of the way of traffic (either they would choose to have you save their life, or you are knowingly being paternalistic and putting their life above their consent and choices).
I like your definition, though, and want to try to make a better one (and I acknowledge this is not the point of this post).
I think that's a perfectly valid thing to do in the comments here! However, I think your attempt,
My stab at a refinement of "consent" is "respect for another's choices", where "disrespect" is "deliberately(?) doing something to undermine"
is far too vague to be a useful concept.
In most realistic cases, I can give a definite answer to whether A touched B in a way B clearly did not want to be touched. In the case of my honesty definition, it does involve intent and so I can only infer statistically when someone else is being dishonest vs mistaken, but for myself I usually have an answer about whether saying X to person C would be honest or not.
I don't think I could do the same for your definition; "am I respecting their choices" is a tough query to bottom out in basic facts.
Regarding Chris, I think he is more right than you understand. There indeed are no planets. There are only some people, who, given some pictures, say "it's a planet!". And Chris himself has a brain that is able to collect all the "it's a planet" sounds and cluster them into a single bucket. And it is sometimes useful for Chris to predict whether, for a given picture, Betsy or Adam would say "planet".
Regarding "planet", it is of course and arbitrary definition. The measurements of curvature and density and orbit of an object are real things. But the definition of "planet" is only useful to the extent that it is a quick way to say "object with density X, curvature Y and orbit Z". "Usefulness" is a very low bar. There is nothing particularly natural about the choice of X Y Z, and those choices aren't equally useful for everyone. Even if your choice of X Y Z has a high score in some clustering metric, it is still arbitrary.
There is also an issue with providing explicit definitions, when language doesn't quite work like that. That is, "dishonesty" isn't "intention to miscalibrate"; it is "the actions that would make Alice say 'Bob is dishonest'". It's good to try to describe what those actions are, but it's important not to confuse that description with the meaning of the word.
Also, I don't think either of your definitions is "fuzzy". For "honesty", I either intend to miscalibrate the recipient or I don't (of course, intention isn't quite binary, but I don't think that's a problem either). You are defining the word in terms of my internal brain state, so the main reason it feels fuzzy to you is because you can't read my mind. Likewise, if there is a card hidden in a closed box, the "redness" of that card is not fuzzy, you simply don't know whether the card is or isn't red.
Regarding Chris, I think he is more right than you understand. There indeed are no planets.
Even if humans are using arbitrary definitions, it doesn't follow that all statements in terms of those definitions are false.
it doesn't follow that all statements in terms of those definitions are false.
Indeed. I don't think anyone said anything about any statements being false though?
Also, I'm aware that Chris is an intentional strawman, so I took a more charitable view of him to compensate. The point is that Betsy's perspective is not without issues, as I assume the OP believes.
Lossiness is itself an optimized-for quantity, and it varies in importance across domains with differing payoff structures. Clashes are often the result of two locally valid choices of lossiness function conflicting when attempts are made to propagate them more globally.
Better definitions lose less of the things that I think are important and more of the things I think are unimportant. People who have faced a different payoff structure will have strenuous objections. The law of large numbers says that you will be able to find people who have faced a completely perverse data set in terms of edge cases, and who thus have a radically different payoff structure. If there are such people at both ends of a particular distribution, then you get that effect no matter which end you optimize for.
Monocultures make this worse because, in effect, they prevent people from taking their ball and going home, i.e. deciding to use alternative functions for assigning meaning.
Summary: Certain basic concepts are still very useful, even if they have fuzzy or contested boundaries, or break down in edge cases. This is basically just working out a few examples of The Cluster Structure of Thingspace. Two important examples are honesty and consent.
Adam: I can't believe that scientists 'decided' Pluto wasn't a planet. The absolute gall!
Betsy: Oh hey Adam, the demotion of Pluto is a pretty neat story! The astronomers originally didn't have to think about the definition of a planet, because planets were so clearly different from the other bodies they could see in the Solar System, like asteroids and comets and moons; but then they discovered Eris and many more objects like it, and they had to figure out an actual definition...
Adam: But Pluto is obviously a planet! I learned it as part of my acronym: My Very Excellent Mother Just Served Us Nine Pizzas! A few new asteroids can't change that!
Betsy: Okay, but Eris is more massive than Pluto. Sure, it's farther out, but so are some gas giant exoplanets we've discovered around other stars. The astronomers needed to agree on a definition that wasn't incredibly convoluted, so they decided that planets needed to be massive enough that gravity makes them approximately round, and massive enough to fling smaller asteroids out of their orbit.
Adam: You make it sound like astronomers can just decide what counts as a planet or not. That's ridiculous!
Chris: You both are so naive. There's no such thing as a planet, there's only collections of atoms.
Betsy: Um, Chris, that's really not helpful here. See, there are several things that the eight planets have in common with each other that nothing else orbiting the Sun is even close to; for instance, on the "clearing its own orbit" front, each of the eight planets accounts for more than 99.98% of the total mass in its orbit, while no other object accounts for more than one-third. (Pluto accounts for only 7 percent of the total mass in its orbit.)
Chris: But what if we discovered another object the size of Pluto that only accounted for 99.9% of the mass in its orbit? 99%? 90%? It's completely arbitrary where you draw the line!
Adam: (mumbling) Pizzas! You can't just serve us nine nothings!
Betsy: Yes, Chris, there may or may not be a gray area when it comes to exoplanets, since these things happen on a continuum and we don't know how common various kinds of orbiting objects are yet. But for present purposes, the clustering is pretty decisive.
Chris: Aha! You admit it, that there's no matter of fact about what constitutes a planet. Why, I'm as much of a planet as Mars is!
It shouldn't be surprising that Betsy speaks for me here. The existence of gray areas and fuzzy boundaries and edge cases doesn't prevent "planet" from being a very useful concept, especially for astronomers trying to detect them around other stars.
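To make "the clustering is pretty decisive" concrete, here is a minimal sketch. The numbers are only rough illustrations taken from the figures Betsy cites above (the last entry is a placeholder standing in for "no other object clears more than about one-third"), not a real dataset. The point is that any reasonable cutoff falls in the same enormous gap, so the category doesn't depend on exactly where you draw the line.

```python
# Illustrative sketch, not real astronomical data: rough "share of the mass in
# its orbit" figures based on the numbers quoted in the dialogue above.
orbital_dominance = {
    "Mercury": 0.9999, "Venus": 0.9999, "Earth": 0.9999, "Mars": 0.9999,
    "Jupiter": 0.9999, "Saturn": 0.9999, "Uranus": 0.9999, "Neptune": 0.9999,
    "Pluto": 0.07,                      # ~7% of the mass in its orbit
    "largest other non-planet": 0.34,   # placeholder: "no more than one-third"
}

def is_planet(share, cutoff=0.9):
    """Classify by orbital dominance; the cutoff is the 'arbitrary' part."""
    return share > cutoff

# Any cutoff between roughly 0.35 and 0.999 yields the identical classification,
# which is what it means for the clustering to be decisive even though the
# threshold itself is a human choice.
for cutoff in (0.5, 0.9, 0.99):
    planets = sorted(n for n, s in orbital_dominance.items() if is_planet(s, cutoff))
    print(cutoff, planets)
```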
Adam's intuitions are pretty naive, and I'd just point him to 37 Ways That Words Can Be Wrong. It's Chris's intuitions that I want to discuss. The name for those intuitions, in several philosophical debates, is eliminativism: the assertion that, since a certain concept is not ontologically fundamental, the concept is an illusion.
Yes, nothing is fundamental in this world except its most basic physics; all the concepts in our daily lives are trying to draw a boundary that includes a lot of examples with some properties in common, not perfect mathematical definitions.
Not all concepts are useful, of course; 'phlogiston' added nothing to the concept of 'fire' except for intuitions about fluids, which mostly proved to make poor predictions. But the concepts that come up often in our daily lives are often there because they pay rent in some sense. Sometimes they lump together disparate things and need to be split more finely to be more useful; sometimes they encode assumptions that aren't true, and only pay rent about human reactions (e.g. the concept of 'sin' as distinct from 'shame', 'harm', etc); but rarely are they entirely without content when it comes to clustering reality.
Eliezer spent some long sequences trying to articulate the core of certain concepts, most prominently choice and morality. It's worth noting that neither was a perfectly precise definition; we can come up with edge cases where those definitions get weird, and particularly in the latter case, we get legitimate gray areas where we don't know how best to extend our reasoning. But that doesn't mean that there's no moral distinction between the observable universe being turned into paperclips and the kind of future I'd prefer.
So, just to provide a handy resource, I thought I'd discuss two concepts that I've seen some particularly frustrating eliminativist intuitions on: honesty and consent.
In the case of honesty, I think there's an intuitive definition that does a terrible job at useful clustering, so let's get that out of the way first:
Bad Definition Of Honesty: Saying only things that you personally can affirm as true.
This definition is both too strict and too lax for the purposes we want to use it for. Too strict, because we don't usually take jokes and absurdities (when clearly denoted as such by tone or context) to be dishonest; too lax, because any clever person can easily make a less clever person believe falsities without saying anything technically untrue.
So, in order to think more carefully about honesty, what do we most want to use the concept for? In what ways do I care if a person is 'honest'?
Most obviously, I want to know whether I should count their statements as good evidence. People tend to have relatively stable habits when it comes to the degree to which you can trust their utterances, so that's a pretty useful concept to have for a person or an action. So with that in mind, here's my preferred definition of honesty (actually, it's easier to state as a negative):
Better Definition Of Dishonesty: Communicating in a way intended to miscalibrate the recipient.
I very precisely said 'miscalibrate'. It's not dishonest to refuse to give someone information; it is dishonest to make them believe false information.
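(A rough way to picture "calibration": among the claims a listener ends up assigning probability p, about a fraction p should turn out true. The toy check below is purely illustrative, with hypothetical numbers, of what measuring a listener's calibration could look like.)

```python
from collections import defaultdict

def calibration_table(judgments):
    """judgments: (assigned_probability, actually_true) pairs for one listener.
    Returns, per probability bucket, the fraction of claims that were true."""
    buckets = defaultdict(list)
    for prob, truth in judgments:
        buckets[round(prob, 1)].append(truth)
    return {p: sum(ts) / len(ts) for p, ts in sorted(buckets.items())}

# Hypothetical listener beliefs after a conversation (made-up data):
judgments = [(0.9, True), (0.9, True), (0.9, False), (0.2, False), (0.2, True)]
print(calibration_table(judgments))
# Roughly {0.2: 0.5, 0.9: 0.67}: the further these observed frequencies drift
# from the assigned probabilities, the worse calibrated the listener is.
# Dishonest communication, on this definition, is communication intended to
# widen that gap.
```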
Some nice examples of this definition in practice:
There are legitimate gray areas when it comes to honesty. If you fear you'll be grossly misunderstood whatever you say, it's very hard to find an honest course. And if you're speaking or writing to a mass audience, it's nigh impossible to cover all their misunderstandings at once without making yourself unintelligible with caveats.
But "oh, it would be so terrible if this person knew this fact, but it would hurt them if I refused to answer their question, so let me just misdirect them a bit"? That's not a gray area. That's dishonesty in exactly the sense people care most about.
For consent, I don't have as tidy a definition in general, but here's a limited one for a subset of it:
One Definition of Basic Bodily Consent: Don't touch a person in a way they clearly prefer not to be touched.
Consent in general concerns all kinds of human preferences, and since not all of these can be met simultaneously, there are some pretty complex tradeoffs to manage. Basic bodily consent avoids some of that complexity, by virtue of the fact that it's pretty rare for different people's desires not to be touched to conflict with each other. And it's more or less become the official norm of upper-class adult Western society (which is not to say that it's always observed there, just that it's well within the Overton window to call it monstrous when people violate it; this was not nearly as much the case two generations ago).
And it serves a pretty obvious purpose. We're primates with fragile physical bodies, and an evolutionary history of violence. Unwanted touching is a pattern-match for a sudden assault; it often raises our fear and anger in strong and predictable ways. And the new norm isn't that novel; the effective norm that Basic Bodily Consent replaced was "don't touch an equal-or-higher-status person in a way they clearly prefer not to be touched". We just added some egalitarianism to that.
One obvious complication to basic bodily consent is preconsent: if you join the army, you had better be aware that you're going to lose your bodily autonomy in some ways that would be very objectionable in civilian life. You might then very much not prefer to experience what you're experiencing, but you did sign up for it.
Some people will raise BDSM as another complication, but if you've met people who are very deep in BDSM culture, it's amazingly clear how much they think (in everyday life, not just the bedroom) about the nuances of bodily consent, how to handle discrepancies between preconsent and feelings in the moment, and more. (If anyone in the comments wants to suggest a good reference on advanced consent, that might be helpful.)
And as before, something being a violation of basic bodily consent doesn't always make it wrong. I will yank a toddler out of the way of a speeding car in the safest way I can, not the gentlest. (But also as before, be careful about rationalizing paternalistic reasons for violating consent! You are running on corrupted hardware.)
Legitimate gray areas include things like very weak preferences and guessing about unstated and unconscious preferences.
"This person told me they don't want to be hugged, but it'll be better for them if I expand their comfort zone, so hugs away" is not a gray area. It's a clear violation.
One last thing, while I'm here: my general assertion is "it all adds up to normality", or more specifically, "most of the common concepts that describe human interactions are useful, and need editing rather than discarding". My best example of this principle came when I realized at age 23 that my religion was almost surely untrue. I thought at the time that my morals were all based on the religion, so I felt like I was without ethical guidance. In order to avoid doing things I might regret, though, I resolved to abide by each of my basic moral principles until I felt I'd thought them through well enough to change or abandon them. Then I started reading some moral philosophy books.
Some of those precepts were indeed mirages (my religion had a silly thing against the kinds of hedonism that hurt no one), but most of the basic moral principles, including honesty and altruism, turned out to be based on things I still found myself caring about. And I'm glad that my past self didn't foolishly binge on violating those norms before he figured out how they actually fit into a non-religious worldview.
So if you're annoyed by the naivety of the discourse on some topic, I suggest that it's better to look into how the concept is being used and what it's useful for, and perhaps argue for a reshaping, rather than immediately abandoning it wholesale. You are not actually as much a planet as Mars is, after all.