"Sorry if it offends you, I just don't think in general that you should apply this stuff to society. Like... no."
I don't understand what "this stuff" refers to in this sentence, and it is far from clear to me that your interpretation of what your friend said is correct.
I also don't think it's a good idea to take an axiomatic approach to something like social justice. This approach: ...
Reminds me of a comment by pjeby (holy cow, 100 upvotes!) in an old thread:
One of the things that I've noticed about this is that most people do not expect to understand things. For most people, the universe is a mysterious place filled with random events beyond their ability to comprehend or control. Think "guessing the teacher's password", not just in school or about knowledge, but about everything.
Such people have no problem with the idea of magic, because everything is magic to them, even science.
...
Belief structures do not necessarily have to be internally logically consistent. However, consistent systems are better, for the following reason: belief systems are used for deriving actions to take.
I have a working hypothesis that most evil (from otherwise well-intentioned people) comes from forcing a very complex, context-dependent moral system into one that is "consistent" (i.e., defined by necessarily oversimplified rules that are global rather than context-dependent) and then committing to that system even in doubtful cases, since it seems better that it be consistent.
(There's no problem with looking for consistent rules or wanting consistent rules, the problem is settling on a system too early and applying or acting on insufficient, inadequate rules.)
Eliezer has written that religion can be an 'off-switch' for intuitively knowing what is moral ... religion is the common example of any ideology that a person can allow to trump their intuition in deciding how to act. My pet example is, while I generally approve of the values of the religion I was brought up with, you can always find specific contexts (it's not too difficult, actually) where their decided rul...
A comment from another perspective. To be blunt, I don't think you understand why you got upset. (I'm not trying to single you out here; I also frequently don't understand why I am upset.) Your analysis of the situation focuses too much on the semantic content of the conversation and ignores a whole host of other potentially relevant factors, e.g. your blood sugar, your friend's body language, your friend's tone of voice, what other things happened that day that might have upset you, etc.
My current understanding of the way emotions work is something like this: first you feel an emotion, then your brain guesses a reason why you feel that emotion. Your brain is not necessarily right when it does this. This is why people watch horror movies on dates (first your date feels an intense feeling caused by the horror movie, then hopefully your date misinterprets it as nervousness caused by attraction instead of fear). Introspection is unreliable.
When you introspected for a reason why you were upset, you settled on "I was upset because my friend was being so irrational" too quickly. This is an explanation that indicates you weren't trying very hard to explicitly model what was going on in your friend's head. Remember, your friend is not an evil mutant. The things they say make sense to them.
But the point is that it, to me, is much more interesting/useful/not tedious to consider this idea that challenges rationality very fundamentally
This is what I mean when I say I don't think you've correctly understood your friend's point of view. Here is a steelmanning of what I imagine your friend's point of view to be that has nothing to do with challenging rationality:
"Different domain experts use different kinds of frameworks for understanding their domains. Taking the outside view, someone who claims that a framework used in domain X is more appropriate for use in domain Y than what Y-experts themselves use is probably wrong, especially if X and Y are very different, and it would take a substantial amount of evidence to convince me otherwise. In the particular case that X = mathematics and Y = social justice, it seems like applying the methods of X to Y risks drastically oversimplifying the phenomena in Y."
My friend compared me to white supremacist philosophes from the early days of the Enlightenment. And when I said that I did not share their ideas, my friend said that it was not because of my ideas, but because I was trying to apply rationality to society.
Yo...
"Sorry if it offends you, I just don't think in general that you should apply this stuff to society. Like... no."
Let me translate: "You should do what I say because I said so." This is an attempt to overpower you and is quite common. Anyone who insists that you accept their belief without logical justification is simply demanding that you do what they say because they say so. My response, to people who can be reasoned with, is often just to point this out and point out that it is extremely offensive. If they cannot be reasoned with then you just have to play the political game humans have been playing for ages.
A more charitable translation would be "I strongly disagree with you and have not yet been able to formulate a coherent explanation for my objection, so I'll start off simply stating my disagreement." Helping them state their argument would be a much more constructive response than confronting them for not giving an argument initially.
Let me offer a different translation: "You are proposing something that is profoundly inhuman to my sensibilities and is likely to have bad outcomes."
Rukifellth below gives what I think is a much more likely reason for the reaction presented.
Tetlock's foxes vs. hedgehogs (people without strong ideologies are somewhat better predictors than those who have strong ideologies, though still not very good predictors) suggests that a hunt for consistency in something as complex as politics leads to an excessively high risk of ignoring evidence.
Hedgehogs might have premises about how to learn more than about specific outcomes.
I suspect that what frustrated you was failing to notice your own confusion. You clearly had a case of lost purposes: "applying a math thing to social justice" is instrumental, not terminal. You discovered a belief, "applying math is always a good thing", which is not obviously connected to your terminal goal "social justice is a good thing".
You are rationalizing your belief about applying math in your point 2:
An inconsistent belief system will generate actions that are oriented towards non-constant goals, and interfere destructively with each other, and not make much progress. A consistent belief system will generate many actions oriented towards the same goal, and so will make much progress.
How do you know that? Seems like an argument you have invented on the spot to justify your entrenched position. Your point 3 confirms it:
No matter how offended you are about something, thinking about it will still resolve the issue.
In other words, you resolved your cognitive dissonance by believing the argument you invented, without any updating.
If you feel like thinking about the issue some more, consider connecting your floating belief "math is good" ...
An inconsistent belief system will generate actions that are oriented towards non-constant goals, and interfere destructively with each other, and not make much progress. A consistent belief system will generate many actions oriented towards the same goal, and so will make much progress.
One way to model willpower is that it is a muscle that uses up brain energy to accomplish things. This is a common model but it is not my current working hypothesis for how things "really universally work in human brains". Rather, I see a need for "that which people vaguely gesture towards with the word willpower" as a sign that a person's total cognitive makeup contains inconsistent elements that are destructively interfering with each other. In other words, the argument against logically coherent beliefs is sort of an argument in favor of akrasia.
Some people seem to have a standard response to this idea that is consonant with the slogan "that which can be destroyed by the truth should be" and this is generally not my preferred response except as a fallback in cases of a poverty of alternative options. The problem I have with "destroy my akrasia with the tr...
I sympathize, but I downvoted this post.
This is a personal story and a generalization from one person's experience. I think that as a category, that's not enough for a post on its own. It might be fine as a comment in an open thread or other less prominently placed content.
And that did it. For the rest of the day, I wreaked physical havoc, and emotionally alienated everyone I interacted with. I even seriously contemplated suicide. I wasn't angry at my friend in particular for having said that. For the first time, I was angry at an idea: that belief systems about certain things should not be internally consistent, should not follow logical rules.
This emotional reaction seems abnormal. Seriously, somebody says something confusing and you contemplate suicide?
What are you, a Straw Vulcan computer that can be disabled with a Logic Bomb?
Unless you are making this up, I suggest you consider seeking professional help.
It was extremely difficult to construct an argument against, because all of my arguments had logically consistent bases, and were thus invalid in its face.
Actually, it's rather easy: just tell them that ex falso quodlibet -- from a contradiction, anything follows.
And that did it. For the rest of the day, I wreaked physical havoc, and emotionally alienated everyone I interacted with. I even seriously contemplated suicide.
You never get offended, but this little thing brought you to the verge of suicide!? Did you recently become a rationalist? I am not sure how to read the situation.
"Sorry if it offends you, I just don't think in general that you should apply this stuff to society. Like... no."
I'm making a wild guess, but possibly it's the bold part that offended you... Because this is usually what irritates me (Yay for lovely generalization from one example... but see also Emile's comment quoting pjeby's comment).
Similar offenders are:
In general, what irritates me is the refusal to really discuss the subject, and the quick dismissal. If arguments are soldiers, this is like building a foxhole and declaring you won't move from there at any cost.
"Sorry if it offends you, I just don't think in general that you should apply this stuff to society. Like... no."
I felt offended reading this, even though I was expecting something along these lines and was determined not to be offended. I've come to interpret this feeling, on a 5-second level, as "Uh oh, someone's attacking my group." I'm sure I'd be a little flustered if someone said that to me in conversation. But after some time to think about it, I think my response would be "Why shouldn't math be applied to social justice?...
Almost no one these days regards axiom compiling as a way of describing emotional phenomena, such as altruism. The idea of describing such warm reasons in terms of axioms was so unintuitive that it caused your friend to think that you were looking for some other reason for social justice, other than a basic appeal to better human nature. He may have been disgusted at what he thought was an implicit disregard for the more altruistic reasons for social justice, as if they weren't themselves sufficient reason to do good things.
1) I think your reaction to this situation owed more to your psychological peculiarities as a person (whichever they are) than to a characteristic that all people who identify as rationalists share. There's no reason to expect that people with the same beliefs as yours will always keep their cool (at least not the first time) when talking to someone with an obviously incompatible belief system.
2)
...It was extremely difficult to construct an argument against, because all of my arguments had logically consistent bases, and were thus invalid in its face.
I usually turn to the Principle of Explosion to explain why one should have core axioms in their ethics (specifically, non-contradictory axioms). If some principle you use in deciding what is or is not ethical creates a contradiction, you can justify any action on the basis of that contradiction. If the axioms aren't explicit, the chance of a hidden contradiction is higher. The idea that every action could be ethically justified is something that very few people will accept, so explaining this usually helps.
I try to understand that thinking this way is odd...
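To make the Principle of Explosion above concrete, here is a minimal formal sketch in Lean (P and Q are placeholder propositions; the example is mine, not from the thread):

```lean
-- Principle of explosion (ex falso quodlibet): from a contradiction
-- P ∧ ¬P, any proposition Q whatsoever can be derived.
example (P Q : Prop) (h : P ∧ ¬P) : Q :=
  absurd h.left h.right
```

Read ethically: if two of your rules ever yield both P and ¬P in the same case, this schema lets you "justify" any action Q, which is exactly why a hidden contradiction is so dangerous.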
From your strong reaction I would guess that your friend's reaction somehow ruined the model of the world you had, in a way that was connected with your life goals. Therefore, for some time, your life goals seemed unattainable and life as a whole meaningless. But gradually you found a way to connect your life goals with the new model.
Seems to me that your conclusion #2 is too abstract ("far") for a shock that I think had personal ("near") aspects. You write impersonal abstractions -- "do people really desire progress? can actions actu...
I would add to this that if the domain of discourse is one where we start out with a set of intuitive rules, as is the case for many of the kinds of real-world situations that "social justice" theories try to make statements about, there are two basic ways to arrive at a logically consistent belief structure: we can start from broad general axioms and reason forward to more specific rules (as you did with your friend), or we can start from our intuitions about specific cases and reason backward to general principles.
IME, when I try to reason-for...
It seems possible that when your friend said, in effect, that there can never be any axioms for social justice, what they really meant was simply, "I don't know the axioms either." That would indeed be a map/territory confusion on their part, but it's a pretty common and understandable one. The statement, "Flying machines are impossible" is not equivalent to "I don't know how to build a flying machine," but in the short term they are making the same prediction: no one is flying anywhere today.
Actually, and I don't know if you'...
Perhaps I'm mistaken about this, but isn't a far stronger argument in favor of a consistent belief system the fact that with inconsistent axioms you can derive any result you want? In an inconsistent belief system you can rationalize away any act you intend to take, and in fact this has often been seen throughout history.
I really liked this post, and I think a lot of people aren't giving you enough credit. I've felt similarly before -- not to the point of suicide, and I think you might want to find someone you can confide those anxieties in -- but about being angered at someone's dismissal of rationalist methodology. Because ultimately, it's the methodology which makes someone a rationalist, not necessarily a set of beliefs. The categorizing of emotions as in opposition to logic, for example, is a feature I've been frustrated with for quite some time, because emotions ...
They seemed to be having trouble understanding what I was saying, and it was hard to get an opinion out of them. They also got angry at me for dismissing Tumblr as a legitimate source of social justice.
I am confused why your friend thought good social justice arguments do not use logic to defend their claims. Good arguments of any kind use logic to defend their claims. Ergo, all the good social justice arguments are using logic to defend their claims. Why did you not say this to your friend?
EDIT: Also confused about your focus on axioms. Axioms, though essential, are the least interesting part of any logical argument. If you do not accept the same axioms as your debate partner, the argument is over. Axioms are by definition not mathematically demonstrable. In your post, you stated that axioms could be derived from other fundamental axioms, which is incorrect. Could you clarify your thinking on this?
Did I miss any sort of deeper reasons I could be using for this?
"That one guy I know and the stuff they say" is usually not a great proxy for some system of belief in general; therefore, to the extent you care about the same stuff they care about when they say "social justice", a knee-jerk rejection of thinking about that stuff systematically or in terms of axioms and soforth should probably come off to you as self-defeating or shortsighted.
2 sounds wrong to me - like you're trying to explain why having a consistent internal belief structure is important to someone who already believes that.
The things which would occur to me are:
Your friend's refusal to axiomatize the theory of social justice doesn't necessarily imply that he believes social justice can be governed by incoherence (theory here is used in its model-theoretic meaning: a set of true propositions). Under an (admittedly a little stretched) charitable reading, it may just mean that your friend believes it's incompressible: the complexity of the axioms is as great as the complexity of the facts the axioms would want to explain.
It's just like the set of all arithmetical truths: you cannot axiomatize it (by Gödel's first incompleteness theorem, no consistent, recursively enumerable set of axioms captures all of them), but it's for sure not inconsistent.
Therefore, assuming the first few statements, having an internally consistent belief system is desirable!
I think that's a straw man. Nobody denies that it's advantageous to have a consistent belief system. People rather argue that consistency isn't the only criterion on which to judge belief systems.
It's pretty easy to make up a belief system that's internally consistent but that leads to predictions about reality that are wrong.
A good example would be the problem of hidden Markov models. There are different algorithms to generate a path. The Viterbi al...
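For readers who don't know the reference, here is a minimal illustrative sketch of the Viterbi algorithm in Python, with toy made-up parameters (the classic rainy/sunny weather example; nothing here comes from the thread). Viterbi returns the single most likely hidden-state path given the observations:

```python
# A minimal Viterbi decoder for a toy HMM. All parameters are made-up
# illustrative values (the classic rainy/sunny weather example).
import math

states = ["Rainy", "Sunny"]
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}
emit_p = {
    "Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
    "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def viterbi(obs):
    # V[t][s]: log-probability of the best state path ending in s at time t
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]  # back[t][s]: predecessor of s on that best path
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    # Walk backward from the best final state to recover the whole path.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        last = back[t][last]
        path.append(last)
    return path[::-1]

print(viterbi(["walk", "shop", "clean"]))  # ['Sunny', 'Rainy', 'Rainy']
```

The moral fits the comment above: the model is perfectly internally consistent, and Viterbi will always confidently return a "most likely" path; but if the model's parameters don't match reality, that consistent answer is simply wrong.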
I don't have anything to add, other than to say that I've had similar frustrations with people. It was mainly in my heyday of debating theists on the Internet. I quite often would encounter the same exact dismissal of logic when presenting a logical argument against the existence of god; literally, they would say something like "you can't use that logic stuff on god" (check out stuff like the presuppositional argument for the existence of god if you want to suffer a similar apoplexy). Eventually, I just started to find it comical.
They seemed to be having trouble understanding what I was saying, and it was hard to get an opinion out of them. They also got angry at me for dismissing Tumblr as a legitimate source of social justice.
Also, a general comment. Suppose you think that the optimal algorithm for solving a problem is X. It does not follow that making your algorithm look more like X will make it a better algorithm. X may have many essential parts, and making your algorithm look more like X by imitating some but not all of its essential parts may make it much worse than it was initially. In fact, a reasonably efficient algorithm which is reasonably good at solving the problem may look nothing like X.
This is to say that at the end of the day, the main way you should be criticizing your friend's approach to social justice is based on its results, not based on aesthetic opinions you have about its structure.
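To make the algorithm analogy concrete, here is a toy Python sketch (an invented example, not anything from the thread): binary search plays the role of the "optimal algorithm X" for sorted arrays, but imitating its surface features without its essential precondition (sortedness) does worse than a plain linear scan.

```python
# Toy illustration: partially imitating an optimal algorithm can hurt.
# Binary search ("X") is optimal for sorted arrays, but sortedness is one
# of its essential parts. Imitating only its surface features (probing
# midpoints, discarding half the array) on unsorted data is worse than a
# plain linear scan.

def linear_search(xs, target):
    # The unglamorous algorithm that "looks nothing like X" but works.
    for i, x in enumerate(xs):
        if x == target:
            return i
    return -1

def naive_midpoint_probe(xs, target):
    # Looks like binary search, but on unsorted data the decision to
    # discard half the array is unjustified, so it can lose the target.
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1  # wrong move when xs is unsorted
        else:
            hi = mid - 1
    return -1

data = [7, 2, 9, 4, 1]
print(linear_search(data, 1))         # 4: the plain scan finds it
print(naive_midpoint_probe(data, 1))  # -1: the imitation of "X" fails
```

Here the humble linear scan is the "reasonably efficient algorithm that looks nothing like X", and the midpoint probe is the partial imitation of X that ends up worse than what it replaced.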
In order to make a rationalist extremely aggravated, you can tell them that you don't think that belief structures should be internally logically consistent.
There are ways to argue for that, too. Both the aggravated rationalist and your friend have inconsistent belief systems; as you say, the difference is just that the aggravated rationalist would like for that to change, while your friend is fine with it.
The point is this: you can value keeping the "is"-state, and not want to change the as-is for some optimized but partly different / differe...
Usually, I don't get offended at things that people say to me, because I can see at what points in their argument we differ, and what sort of counterargument I could make to that. I can't get mad at people for having beliefs I think are wrong, since I myself regularly have beliefs that I later realize were wrong. I can't get mad at the idea, either, since either it's a thing that's right, or wrong, and if it's wrong, I have the power to say why. And if it turns out I'm wrong, so be it, I'll adopt new, right beliefs. And so I never got offended about anything.
Until one day.
One day, I encountered a belief that should have been easy to refute. Or, rather, easy to dissect, to see whether there was anything wrong with it, and if there was, formulate a counterargument. But for seemingly no reason at all, it frustrated me to great, great lengths. My experience was as follows:
I was asking the opinion of a socially progressive friend on what they feel are the founding axioms of social justice, because I was having trouble thinking of them on my own. (They can be derived from any set of fundamental axioms that govern morality, but I wanted something that you could specifically use to describe who is being oppressed, and why.) They seemed to be having trouble understanding what I was saying, and it was hard to get an opinion out of them. They also got angry at me for dismissing Tumblr as a legitimate source of social justice. But eventually we got to the heart of the matter, and I discovered a basic disconnect between us: they asked, "Wait, you're seriously applying a math thing to social justice?" And I pondered that for a moment and explained that it isn't restricted to math at all, and an axiom in this context can be any belief that you base other beliefs on. However, then the true problem came to light (after a comparison of me to misguided 18th-century philosophes): "Sorry if it offends you, I just don't think in general that you should apply this stuff to society. Like... no."
And that did it. For the rest of the day, I wreaked physical havoc, and emotionally alienated everyone I interacted with. I even seriously contemplated suicide. I wasn't angry at my friend in particular for having said that. For the first time, I was angry at an idea: that belief systems about certain things should not be internally consistent, should not follow logical rules. It was extremely difficult to construct an argument against, because all of my arguments had logically consistent bases, and were thus invalid in its face.
I'm glad that I encountered that belief, though, since, like all beliefs, I was able to resolve it in the end and make peace with it. I came to the following conclusions: