Many people, anyway.


There is a common mistake in modeling humans: thinking that they have actual goals, and that they deduce their actions from those goals. First there is a goal, then there is an action born from that goal. This is wrong.

More accurately, humans have a series of adaptations they execute. A series of behaviors which, under certain circumstances, kinda-sorta-maybe aim at the same-ish thing (like inclusive genetic fitness), but which will be executed regardless of whether or not they actually hit that thing.

Actions are not deduced from goals. The closest thing we have to "goals" are inferred from a messy conglomerate of actions, and are only an approximation of the reality that is there: just a group of behaviors.

-

I've come to see beliefs in very much the same way. Maybe some of us have real beliefs, real models. Some of us may in fact choose our statements about the world by deducing them from a foundational set of beliefs.

The mistake is repeated when we model most humans as having actual beliefs (nerds might be an exception): supposing that their statements about reality, their propositions about the world, or their answers to questions are deduced from some foundational belief. First there is a belief, then there is a report on that belief, provided whenever anyone inquires about the belief they're carrying around. This is wrong.

More accurately, humans have a set of social moves/responses that they execute. Some of those moves APPEAR (to a naive nerd such as me) to be statements about how and what reality is. Each of these "statements" was probably vetted and accepted individually, without any consideration for the utterly foreign notion that the moves should be consistent or avoid contradiction. That notion sounds as tiresome to them as suggesting that their body language or dance moves should be "consistent," for to them, body language, dance moves, and "statements about reality" all belong to the same group of social moves, and asking whether a social move is "consistent" is like asking whether a posture/gesture is consistent, or whether a color is tasty.

And missing the point like a nerd and taking things "literally" is exactly the kind of thing that reveals low social acuity.

Statements about individual beliefs are not deduced from a model of the world, just as actions are not deduced from goals. You can choose to interpret "I think we should help poor people" as a statement about the morality of helping poor people, if you want to miss the whole point, of course. We can suppose that "XYZ would be a good president" is a report on their model of someone's ability to fulfill a set of criteria. And if we interpret all their statements as though they were actual, REAL beliefs, we might be able to piece them together into something sort of like a model of the world.

All of which is pointless, missing the point, and counter-productive. Their statements don't add up to a model like ours might, any more than our behaviors really add up to a goal. The "model" that comes out of aggregating their socially learned behaviors will likely be inconsistent, but if you think that'll matter to them, you've fundamentally misunderstood what they're doing. You're trying to find their beliefs, but they don't HAVE any. There IS nothing more. It's just a set of cached responses. (Though you might find, if you interpret their propositions about reality as signals about tribal affiliation and personality traits, that they're quite consistent).

"What do you think about X" is re-interpreted and answered as though you had said "What do good, high-status groups (that you can plausibly be a part of) think about X?"

"I disagree" doesn't mean they think your model is wrong; they probably don't realize you have a model. Just as you interpret their social moves as propositional statements and misunderstand, so they interpret your propositional statements as social moves and misunderstand. If you ask how their model differs from yours, it'll be interpreted as a generic challenge to their tribe/status, and they'll respond like they do to such challenges. You might be confused by their hostility, or by how they change the subject. You think you're talking about X and they've switched to Y. While they'll think you've challenged them, and respond with a similar challenge, the "content" of the sentences need not be considered; the important thing is to parry the social attack and maybe counter-attack. Both perspectives make internal sense.

As far as they're concerned, the entire meaning of your statement was basically equivalent to a snarl, so they snarled back. Beliefs As Body Language.

Despite the obvious exceptions and caveats, this has been extremely useful for me in understanding less nerdy people. I try not to take what to them are just the verbal equivalent of gestures/postures or dance moves, and interpret them as propositional statements about the nature of reality (even though they REALLY sound like they're making propositional statements about the nature of reality), because that misunderstands what they're actually trying to communicate. The content of their sentences is not the point. There is no content. (None about reality, that is. All content is social). They do not HAVE beliefs. There's nothing to report on.


I think this is an interesting and useful view, if applied judiciously. In particular, it will always tend to be most relevant for crony beliefs--beliefs that affect the belief-holder's life mainly through other people's opinions of them, like much of politics and some of religion. When it comes to close-up stuff that can cause benefit or harm directly, you will find that most people really do have a model of the world. When you ask someone whether so-and-so would make a good president, the answer is often a signal about their cultural affiliations. Ask them which is the fastest way to get to where they work, and the answer reflects what they've learned about rush-hour traffic patterns. Ask people if they believe in God, and the answer is a signal. Ask them if they believe pre-marital sex is ever acceptable, and the answer you get is a lot more practical.

It's also worth unpacking the us-vs-them terminology you employ here. Many of us may tend to be more literal than the average person (especially those who fall on the spectrum) but in my experience we are still prone to this same behavior. In most cases, there's nothing wrong with that. Understanding the difference can help us avoid trying to cooperatively world-model with people who are just expressing social beliefs, and can also help us recognize world-modeling when we see it, so that we can reduce our tendency to make snap judgements about people on the basis of the beliefs they express.

When it comes to close-up stuff that can cause benefit or harm directly, you will find that most people really do have a model of the world.

This. Although even there people sometimes develop an absence of model. But often they don't.

in my experience we are still prone to this same behavior.

I like this approach in general. Complaining about humans doing stupid things is like complaining about water being wet. But it is potentially useful to look at some obviously stupid behavior and ask: "Am I doing this too? Maybe on a smaller scale, or in a different area, but essentially the same mistake?"

Quite right. They don't think those are beliefs, but those parts of their minds do work like a model.

They don't consciously model, but all mammals model subconsciously, right?

If I was going to clarify, I might say something like "This applies to abstract beliefs that they can't actually observe themselves being wrong about on a regular basis, like most of the ones they have about politics, religion, psychology, parenting strategies, etc."

This is rude and unnecessary. It comes across as "we real people have real beliefs and goals; those fake people do not have real beliefs and goals."

The truth is that beliefs and goals are vague generalizations that apply imperfectly to human beings, including to you. It is true that they apply better in some instances and to some people, but perfectly to none.

This is rude and unnecessary.

I understand this quotation as:

"What do good, high-status groups (that you can plausibly be a part of) think about Bound_up's post ?"

Those good, high-status groups abhor Bound_up's post. They think it's entirely useless and even "narcissistic and hypocritical." Sure they do.

But some low-life nerds like myself find it true. Just as was to be expected.

The problem isn't rudeness, the problem is that this approach classifies the great majority of humanity as monkeys incapable of making any noises other than squeaks of affirmation towards the ingroup and screams of outrage towards the outgroup. At the very least, that's... wasteful.

The truth is often painful. But this "squeaks of affirmation" is the way a group thinks about a problem.

A group might change its opinion about something by expelling some members and accepting some fresh meat from another group.

Or just by reversing its course on something. The fashion changes from time to time. I don't wear a Matrix-style outfit anymore. Now it seems to me I never did. The whole group has changed its wardrobe and forgotten how elegant we were back then. Or just silly.

Yeah, this is painful, but what can you do?

this "squeaks of affirmation" is the way a group thinks about a problem

No, this is the way a group maintains its current attitude towards a problem.

Combining the two words -- "group" and "thinking" -- is a problem in itself.

"squeaks of affirmation" is the way a group remembers how something is until this group changes its mind about that. By some member shufflings or by adopting the new truth by the majority of its members. Or at least by its Politburo. Purges are necessary sometimes, though.

I understand this quotation as:

"What do good, high-status groups (that you can plausibly be a part of) think about Bound_up's post ?"

Unless you mean some online groups (e.g. a subset of this forum), you misunderstand, because I am not a member of any groups in real life, whether low or high status. I live alone and very frequently do not see anyone at all in a particular 24-hour period, including at work.

Everybody is a member of various groups. For example, I consider myself a member of the Aristotelians, who prefer to speculate about the solution of a problem rather than to conduct an experiment. Galileo is one of us because he logically proved how the Apollo 15 feather-hammer experiment would pan out. But those pesky experimentalists see Galileo as one of them too, since he conducted several crucial experiments as well. I have never met Galileo, they have never met Galileo; still, we chart our groups this way.

This rather bizarre example illustrates two such perceived groups. There are at least a billion such divisions (imaginary or not) out there. And some people consider themselves members of some. Rightly or wrongly, it doesn't matter.

And then they judge what some high-status member of their group would say about a particular Quantum Mechanics conundrum. Then they side with him about that.

Almost nobody actually ponders what the hell is really going on with Schrodinger's poor cat. Almost nobody.

Siding with some prominent member of your (perceived) tribe is a proxy for thinking about it. Even if you don't see this high-status person named Heisenberg a lot, you side with him.

Most problems are not that deep. Like whether or not Antarctica is currently melting. People still don't have their own opinions about this, but just side either with Al Gore or with me. Well, they side with me only incidentally; they don't know that I exist. They know that Lord Monckton exists, and maybe they side with him. So they think Antarctica is melting very slowly, if at all.

If I tell you Antarctica is increasing its snow cover, you may be nerd enough to either believe me after some calculations ... or be nerd enough to prove me wrong. Doesn't matter which.

But most likely you will go either to Al Gore's or to Lord Monckton's side. Even though you don't meet with those two very frequently.

And then they judge what some high-status member of their group would say about a particular Quantum Mechanics conundrum. Then they side with him about that. Almost nobody actually ponders what the hell is really going on with Schrodinger's poor cat. Almost nobody.

I find it harder to reason about the question "what would high status people in group X say about Schrodinger's cat?" than about the question "based on what I understand about QM, what would happen to Schrodinger's cat?". I admit that I suck at modelling other people, but how many people are actually good at it?

Not to say that belief signalling doesn't happen. After all in many cases you just know what the high status people say since they, well, said it.

I admit that I suck at modeling other people, but how many people are actually good at it?

Many, many times more people are good at judging other people than at pondering QM (or any other) conundrums. Even if they are not especially good psychologists, they suck at QM even more.

Sure, everyone has certain groups that they imagine themselves as members of. But if they don't actually interact with those people, this is more a question of an imaginary tribe and imaginary status, not a real tribe or real status.

Which tribe do you consider "real"? Those for which you have a physical paper to prove your membership are only a few of them. Others are pretty undefined, but who cares?

I am not talking about pieces of paper. I am talking about people you see and talk to face to face, as commonly happened and still happens in real tribal environments.

I'd go further, and say it's grossly narcissistic and hypocritical. The framing of nerds vs. non-nerds is itself an example of the described mode of communication.

I read both this comment and the parent comment to be taking the OP in bad faith. Bound_up has taken the time to share their thinking with us and, while it may be that there is an offensive interpretation of the post, it violates the discourse norms I'd at least like to see here to outright dismiss something as "bad". Some of the other comments under the parent comment make this a bit clearer, but even the most generous interpretations I can find of many of these comments lack much more content than "shut up OP".

This is one of those comments that presents itself as contradicting the post, but actually doesn't.

If you walk away using a properly watered-down version of this over-the-top description with its "obvious exceptions and caveats," then it will have exactly achieved its purpose.

I hope you won't be equally rude (but memorable) if you discuss this with people who are liable to interpret it only as a status move, and not as an attempt to describe pieces of reality. If you are discussing it with people who instinctively form conscious models of the world and interpret propositions as propositions and not as social maneuvers, then you might find that an over-the-top description will make the central idea clearer and more memorable.

You'll risk people not properly watering down the idea, of course, but if you trust your audience to water it down, you can enjoy the benefits of exaggeration. Nerdy or not, they are humans, after all, and exaggeration has its uses.

people who are liable to interpret it only as a status move ... people who instinctively form conscious models of the world and interpret propositions as propositions and not as social maneuvers

The thing is, most people do both depending on the topic and the context. Exactly the same person who will be unthinkingly tribal with respect to, say, politics, will show amazing abilities to model and reason about the world when the subject switches to his hobby (say, sailing or gardening or BBQ).

The distinction you're pointing at is not a distinction between people, it's mostly a distinction between subjects (see e.g. "politics is the mind-killer").

Maybe the best strategy for you is to keep feeling a bit superior to "non-nerds", but learn enough manners and forethought to ensure that your conversations with them are pleasant. Reading Jane Austen might help.


[General pushback in the opposite direction and providing an alternate view.]

Counterclaim 1: It's less of a status / posture thing. Most people just aren't thinking most of the time, but totally have the ability to form beliefs if pressed. Thinking of them as lazy evaluators might make more sense. "Smart" people are just those who have more metacognitive activity and are thus asking themselves questions and answering them with less need for external prompts from the environment.

Counterclaim 2: Yes, I think this model might be useful in providing better explanations for why "normal" people do things. But I also think that it can limit the way that "smart" people interact with "normal" people.

Models I myself tend to use try to focus on answering the question, "How can I take actions that improve this person's worldview / life trajectory?" which might involve using the concept that they don't have well-formed beliefs to inform how I move forward, but it certainly doesn't just end with noting that they're being mindless and writing them off as hopeless.

I guess I'm just worried that these sorts of models become an excuse for "smart" people to not even try when it comes to communicating "complex" ideas to "normal" people. I think there's something good that happens on both sides when your focus is on bridging inferential gaps and less on just modeling the other party as some sort of mindless adaptation executor.

I mean, that's part of why we end up with ontologies that refer to objects that exist only phenomenologically, right? Because it turns out that we get all sorts of cool additional functions when we start looking a little deeper.

I agree with your model, but without the nerd-exception.

The lack of nerd focus on epistemology and meta-ethics implies that nerds don't have beliefs either.

They do have pressures to appear rational. Either external (peer pressure) or internal (intelligence/rationality being part of the core identity because of reasons).

The same model you mention has been useful for me in understanding why nerdy people don't actually care about the epistemic soundness of their arguments, and only about sounding rational. It made me understand why many were angered when I pointed out the lack of sound definitions of the words used or the use of countless fallacies: it's perceived as an attack against their rationality.

I agree with your model, but without the nerd-exception.

This exception might sound not very elegant, but it's crucial. Either you model the world and the people inside it, and you have beliefs, or you just try to fit in. Most people do both. But a minority do modeling when things are complicated. Which is almost always. This minority you can call nerds or geeks or professors or whatever. You have to steal the name from somewhere. Or even invent a new one some day.

Those models nerds make may often be wrong, but they are an attempt to really understand things and not just to fit in.

It's possible that LW people are the "nerds" I mean here, and normal nerds don't have beliefs either, as you say... It's hard for me to distinguish between how much I owe to LW and how much is instinctive.

But, since well before LW, I was always explicitly willing to sacrifice any belief, like my God belief, if there was no reason to hold it. There's that, at least; I think there are meaningful instinctive differences.

Indeed, we were talking about rationalists (not only LW, but SlateStarCodex too for instance).

I think there are meaningful instinctive differences too, but that's not the point, is it? If it were, then we could assume that people hold beliefs too. Sometimes they change their beliefs too, because of reasons (or the lack thereof).

I agree with you that other categories than beliefs and goals are often useful for understanding people, especially if you want to understand their experiences, but I worry that you may be advising throwing the baby out with the bathwater.

Likely none of us literally have metaphysically basic beliefs, preferences, or goals. But, that being said, they seem a robust ontology for understanding human behavior and that's why some of the greatest eliminative thinkers in the world--economists--use them to build their models. They even seem to work for modeling the behavior of animals and inorganic, reflective processes that we cannot examine the experiences of.

Beliefs, goals, preferences, etc. have their place in ontology; I think that place is just often not, as you notice, in understanding experiences of those who do not organize their own thoughts in those terms.

I think we're basically on the same page.

As you say, this applies to those who don't have beliefs, or don't organize their thoughts in those terms, not to all humans. We nerds do deal with propositional statements, after all; I do deduce my answers to people's questions from my model of the world, and they will find all of my answers to be consistent much more often than would occur only by chance.

I think most of us work the same way, and enjoy the kinship of similar communication instincts. But I've so often misunderstood others, and think others might benefit from this insight as I did.

This has been said before, but you are very clear and concise here; others before you were not as much.

Especially this "nerds as an exception" explains a lot. Everything becomes quite clear to me with this "exception clause".

It is possible that on some occasions I function as a non-nerd, but mostly not. I do have a world model in my head and I do talk according to it. Which is quite a freakish behavior. Even more so, if your worldview isn't a very boring one.

Quite enlightening post, thank you.

There is a common mistake in modeling humans, to think that they are simple. Assuming that "human chose a goal X" implies "human will take actions that optimally reach X" would be silly. Likewise assuming that humans can accurately observe their own internal state is silly. Humans have a series of flaws and limitations that obscure the simple abstractions of goal and belief. However, saying that goals and beliefs do not exist is a bit much. They are still useful in many cases and for many people.

By the way, it sounds a little like you're referring to some particular set of beliefs. I think naming them explicitly would add clarity.

"What do you think about X" is re-interpreted and answered as though you had said "What do good, high-status groups (that you can plausibly be a part of) think about X?"

I don't think that's a good model. From my own work with changing beliefs I have the impression that many people have a hard time changing foundational beliefs that they learned before the age of 6 even when all the good high-status people with whom they want to associate don't value that belief.

What if everybody in their social circle (family, church members, etc) leaves? It's my impression most such beliefs are social in that they depend on the group for maintenance.

If your whole family leaves a church, and so do all your church friends and the pastor, there's a very good chance you'll leave, too. If 20 people from a similar church leave, it has little effect on you, the difference being your social relationship with the first group.

Most people change religions when they identify themselves with a new peer group and start to conform to their new peer group's norms, it seems to me.

So, yes, people have trouble changing beliefs they get before 6, but I think that's mediated by the social effects in question. Control for those, and those beliefs can be changed quite easily.

In addition to what I wrote about beliefs it's worth noting that the centrality of social judgements is what happens in Kegan's stage 3. Social judgements matter a lot for how people choose which beliefs to hold at that stage but the stage isn't universal.

It's possible to change religions without changing much about the fundamental beliefs. A Christian who exchanges the Bible as the holy book for the Koran or even for our sequences as the holy book still has the same belief structure.

I was personally surprised by what happened when I eliminated a deep "Don't talk to strangers" belief. No one whose opinion I value would endorse that belief, but it was still in the back of my head and had an effect that resulted in me interacting less openly with strangers.

Other beliefs such as that a person is worthless or unlovable are also hard to change even if the person spends time with people who believe that they have inherent worth and are lovable.

I agree that "values" are useful way describe human behaviour, but to think that they actually exist inside human brain is sort of mind projection fallacy.

Most people will come to a near halt if we directly ask them about their final goals: they don't know.

Psychologists deduce "human values" from a) human actions and b) a person's claims about their own preferences. Actions and claims are often misaligned.

It all surely makes value alignment with AI more difficult: if humans don't have exact values, what should the AI be aligned with?

This is a remarkably good post with which I couldn't have more fully resonated. Thank you!

I think it is important to note that there are probably some ways in which this is adaptive. We nerds probably spend far too much time thinking and trying to be consistent when it offers us very little benefit. It's also better socially to be more flexible - people don't like people who follow the rules too strictly, as they are more likely to dob them in. It also makes it much easier to appear sincere, but also to come up with an excuse for avoiding your prior commitments.

I believe that your distinctions, instead of clarifying the concept of belief, have the opposite effect. Belief as a concept can signify:

  1. An embodied but not articulated attitude. In this sense a belief is only known when it is acted out.
  2. An articulated statement meant to describe the nature of a thing or to propose a certain course of action.

Although as a rationalist you tend to articulate your held beliefs and justify them within the rationalist methodology, that does not qualify you, as far as I can see, to proclaim yourself the 'one true believer'. In addition, an utterance of articulated belief does not constitute a guarantee that you will act accordingly. Only real action can prove whether you truly believe what you say. For this reason we can even conceive of cases where a non-articulated but acted-upon belief may be more 'real' than an articulated but not acted-upon one.

Apart from this matter of consistency and sincerity, we can examine the epistemological status of a belief derived by the rationalist methodology, in contrast to a belief arrived at by means of personal experience, cultural transmission, indoctrination, etc. I have to admit that I am sceptical of the statement that you have a comprehensive model of the world based on Bayesian rationality and that you use it in everyday life on all occasions. I would suggest an experiment: try to articulate your full belief system by creating an actual graph. If your whole being is rationally articulated you should be able to achieve it, and then observe whether the graph is consistent with your behaviour. I, for one, have tried and realised that the majority of my beliefs (in the broad sense of the term) are embodied and/or unconscious, with the conscious part (the intellect) constantly observing, analysing, articulating and feeding back.

But how do you know it applies to some people and not others? Post-hoc?


You're talking about stated beliefs, right? Because when it comes to revealed beliefs (those that affect utility maximization), "nerds" don't seem to have the upper hand. This reminds me of the fact that people who don't believe in evolution have more children :-)

[This comment is no longer endorsed by its author]

I enjoy your writing style and want to see more of it. That said, I'm not sure the "us vs them" way of thinking is healthy in the long run.

Try talking to a "non-nerdy" person about something they know well - ask a doctor about medicine, ask a driver about driving, etc - and you'll discover that they do have beliefs, quite strongly held ones, and will express them even at the cost of social points. Whereas if you ask them about something they don't know as well, like economics, a remarkable thing will happen: they will reply without trying to sound smart! I think that's admirable behavior and we would all do well to imitate it. If you aren't a specialist in some area, thinking too much about it is most likely a mistake, no matter how "nerdy" you are. And even the "nerdiest" of us can only be specialists in a handful of areas.

[This comment is no longer endorsed by its author]