by [anonymous]

Brienne on Facebook:

"There seems to be a whole class of words that have some sort of anti-reasoning field. Everything's going fine, and then one of these wibbly wobbly distortion devices pops up and wrecks your whole field for a decade or three.

Properties of garblejargon:

(1) Early on, you gain status by using it.
(2) Later, you lose status by not using it.
(3) It reliably causes the user to confuse the map with the territory.
(4) It feels really satisfying for most users.
(5) When people attempt to explain the same thing without using the garblejargon, they either become overtly incoherent, or give accounts that directly contradict most similar attempts by others.

Other examples of garblejargon: emergence, truth-maker, universals (maybe)."

(Slightly edited for Less Wrong.)

I'd note that 'accounts that directly contradict most similar attempts by others' might often agree with the original attempts in obvious literal denotation while differing significantly in subtle connotation, or in the cognitive processes they cue. The difference in sense can then amount to a huge difference between the ways of thinking promoted by the attempted explanation and those promoted by the original usage.

What are other examples of garblejargon?

Comments (10):

I think most words do have some meaning for some people.

Distinguishing emergent phenomena from phenomena that are the result of outside input is useful. Saying life is an emergent phenomenon means that you believe there's no intelligent designer. That's something worth saying.

On the other hand, it frequently happens that people use a term without fully understanding it, or simply because they want to signal tribal affiliation. "Rational" is probably the word most often misused in that way on Less Wrong.

The problem isn't that the word "rational" has no meaning. It's that people often use it without meaning anything in particular.

I agree that poor definitions can cause problems, but they don't seem to affect healthy fields too much. It seems to me that there might be some other reason for unhealthiness, like not insisting on a good numerical fit between models and data, or something else.

"Entropy" fulfils all the criteria. Despite this some people manage to use it effectively.

EDIT: The word "some" was supposed to be in there!

#5? "Entropy always increases over time"=>"the disorder in a system always increases over time," or "the number of piecewise arrangements that you effectively can't tell the difference between always increases over time, in a closed system."

"Disorder" isn't very clear or helpful, really. Still, a physicist or chemist is likely to be able to give an account of entropy that is coherent and not controversial among scientists. On the other hand, most attempts to explain entropy by non-scientists would probably satisfy the fifth criterion. But that there is any class of people who seem to be able to use the word in a meaningful way seems to distinguish "entropy" from "emergence."

."Disorder" isn't very clear or helpful, really

Not as a definition. But many explanations which use "entropy" could also use "disorder" without becoming overtly incoherent or contradicting the accounts given by most others, which was the requirement of #5. Of those explanations which use "entropy" in a more technical sense, many could go with my second example, and the rest could use something more specific, like an information-theoretic expression or a physical prediction.

But many explanations which use "entropy" could also use "disorder" without becoming overtly incoherent or contradicting the accounts given by most others, which was the requirement of #5.

That works for physical entropy. For the sense of entropy used in information theory, a better substitution would be uncertainty.
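To make the "uncertainty" reading concrete, here is a minimal sketch in Python (mine, not from the thread; the example distributions are made up) of the Shannon entropy H = -Σ p·log2(p), which is zero when the outcome is certain and maximal when all outcomes are equally likely:

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)),
        # with 0 * log(0) taken as 0 by skipping zero-probability outcomes.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: no uncertainty
    print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits: some uncertainty
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal uncertainty

The uniform case gives log2(4) = 2 bits, the most "uncertainty" four outcomes can carry.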

"The number of microstates for a given macrostate tends to increase over time"

Or are "microstate" and "macrostate" also garblejargon?

[This comment is no longer endorsed by its author]
V_V:

"The number of microstates for a given macrostate tends to increase over time"

That's not true. The number of microstates per macrostate is fixed.

You are right - my mistake.

An increase in entropy is a movement from a macrostate with a smaller number of microstates to a macrostate with a larger number of microstates.
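As a toy illustration of that corrected picture (a sketch of my own, not from the thread): take N fair coins and let the macrostate be the number of heads k. Each macrostate then comprises C(N, k) microstates, a count that is fixed for each k, exactly as noted above; an increase in entropy is just a move toward the high-multiplicity macrostates near k = N/2:

    import math

    N = 20  # number of coins; macrostate = number of heads, k

    for k in (0, 1, 5, 10):
        omega = math.comb(N, k)    # microstates in macrostate k: fixed for each k
        entropy = math.log(omega)  # Boltzmann entropy with k_B = 1
        print(f"k={k:2d}  microstates={omega:6d}  S={entropy:5.2f}")

    # k= 0  microstates=     1  S= 0.00
    # k= 1  microstates=    20  S= 3.00
    # k= 5  microstates= 15504  S= 9.65
    # k=10  microstates=184756  S=12.13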