If you really think Jordan Peterson is worth inducting into the rationalist hall of fame, you might as well give up the entire rationalist project altogether. The problem is not merely that Peterson is religious and a social conservative, but that he is a full-blown mystic and a crackpot, and his pursuit of "metaphorical truth" necessarily entails bad methodology and a lack of rigor that leads to outright falsehoods.
Take, for example, his stated belief that the ancient Egyptians and Chinese depicted the double helix structure of DNA in their art. (In another lecture he makes the same claim in the context of Hindu art.)
Or his statement suggesting that belief in God is necessary for mathematical proof.
Or his "consciousness creates reality" quantum mysticism.
Or his use of Jung, including Jung's crackpot paranormal concept of "synchronicity".
As a trained PhD psychologist, Peterson almost certainly knows he's teaching things that are unsupported, but keeps doing it anyway. Indeed, when someone confronted him about his DNA pseudo-archaeology, he started backpedaling about how strongly he believed it, though he also went on to speculate about wheth...
Death of the Author, but iirc Scott mentioned the point of the Kabbalah in Unsong is the exact opposite: you can connect anything to anything if you try hard enough, so the fact that you can is meaningless.
Of course, this shows the exact problem with using fiction as evidence.
What is your subjective probability that the most prolific mathematician of all time did half of his most productive work after going blind in both eyes?
That's surprising but not that surprising: Milton wrote much of his best poetry while blind, and Beethoven was famously deaf. Conversely, I cannot think of a single unambiguous example of a mythological motif encoding a non-obvious scientific truth (such as that nothing can go faster than light, or that all species evolved from a single-celled organism, or that the stars are trillions and trillions of miles away), so I think this is very very unlikely.
I have a generally positive opinion of Peterson, but I wouldn't be adding anything to the conversation by talking about why; you already covered his good points. Instead I'll talk about why I'm leery of dubbing him a Rationalist hero.
Peterson's entire first podcast with Sam Harris was an argument over Peterson's (ab)use of the word "truth". I'm not sure if anyone walked away from that experience entirely sure what Peterson means when he says "truth".
One can assume that he means something like "metaphorical truth", that some stories contain a kind of truth that is more like "usefulness as a map" than "accurate reflection of objective reality". Sam Harris' rejoinder was along the lines that using stories as ways of discovering meaning is all well and good, but believing those stories are true leads to horrifying failure modes.
For example, if you believe some particular bit of metaphysical narrative is true, you feel compelled to act on contingent details of the story that are unrelated to the intended moral. Insert your own favorite minor bit of religious dogma that led to hundreds of years of death and...
I think it's pretty risky to play Rationalist taboo with what other people are saying. It's supposed to be a technique for clarifying an argument by removing a word from the discussion, preventing it from being solely an argument about definitions. I would like it if Peterson would taboo the word "truth", yeah.
I also don't think that dereferencing the pointer actually helps. I object to how he uses "truth", and I also object to the idea that Harry Potter is (dereferenced pointer)->[more psychologically useful to believe and to use as a map than discoveries about reality arrived at via empiricism]. It's uh ... it's just not. Very much not. Dangerous to believe that it is, even. Equally if not more dangerous to believe that Christianity is [more psychologically useful to believe and to use as a map than discoveries about reality arrived at via empiricism]. I might sign on to something like, certain stories from Christianity are [a productive narrative lens to try on in an effort to understand general principles of psychology, maybe, sometimes].
The claim is that if believing a story predictably makes your life better, then you should ove...
But the thing we're interested in is instrumental rationality, not epistemic rationality.
Ironically, this sentence is epistemically true but instrumentally very dangerous.
See, to accurately assess which parts of epistemic rationality one should sacrifice for instrumental improvements requires a whole lot of epistemic rationality. And once you've made that sacrifice and lost some epistemic rationality, your capacity to make such trade-offs wisely in the future is severely impaired. But if you just focus on epistemic rationality, you can get quite a lot of winning as a side effect.
To bring it back to our example: it's very dangerous to convince yourself that Jesus died for your sins just because you notice Christians have more friends. To do so safely, you need to understand why believing in Jesus correlates with having friends. If you have a strong enough understanding of friendship and social structures for that, you can easily make friends and build a community without Jesus.
But if you install Jesus on your system you're now left vulnerable to a lot of instrumentally bad things, with no guarantee that you'll actually get the friends and community you wanted.
Putting on the Jordan Peterson mask adds two crucial elements that rationalists often struggle with: motivation and meaning.
Holy shit, yes, thank you, this is exactly what has been motivating all of my contributions to LW 2.0. What is even the point of strengthening your epistemics if you aren't going to then use those strong epistemics to actually do something?
I first read the Sequences 6 years ago, and since then what little world-saving-relevant effort I've put in has been entirely other people asking me to join in on their projects. The time I spent doing that (at SPARC, at CFAR workshops, at MIRI workshops) was great; an embarrassing amount of the time I spent not doing that (really, truly embarrassing; amounts of time only a math grad student could afford to spend) was wasted on various forms of escapism (random internet browsing, TV, video games, anime, manga), because I was sad and lonely and put a lot of effort into avoiding having to deal with that (including avoiding debugging it at CFAR workshops because it was too painful to think about). At almost no point did I have the motivation to start a project on my own, and I didn't.
I've been working inte...
Rationalists who are epistemically strong are very lucky: you can use that strength in a place where it will actually help you, like investigating mysticism, by defending you from making the common epistemic mistakes there.
This is an interesting idea, but how does someone tell whether they're strong enough to avoid making the common epistemic mistakes when investigating mysticism? For example, if I practice meditation I might eventually start experiencing what Buddhists call vipassana ("insight into the true nature of reality"). I don't know if I'd be able to avoid treating those experiences as some sort of direct metaphysical knowledge as most people apparently do, as opposed to just qualia generated by my brain while it's operating differently from normal (e.g., while in a state of transient hypofrontality).
There are probably a number of distinct epistemic risks surrounding mysticism. Bad social dynamics in the face of asymmetric information might be another one. (Access to mystical experiences is hard to verify by third parties but tempting to claim as a marker of social status.) I don't know how someone or some community could be confident that they wouldn't fall prey to one of these risks.
Good question. You can test your ability to avoid mistaking strong emotions for strong beliefs in general. For example, when you get very angry at someone, do you reflexively believe that they're a terrible person? When you get very sad, do you reflexively believe that everything is terrible? When you fall in love with someone, do you reflexively believe that they have only good qualities and no bad qualities? Etc.
I know I keep saying this, but it keeps being true: for me a lot of my ability to do this, and/or my trust in my ability to do this, came from circling, and specifically repeatedly practicing the skill of distinguishing my strong emotional reactions to what was happening in a circle from my best hypotheses about what was happening.
I don't know how someone or some community could be confident that they wouldn't fall prey to one of these risks.
I can't tell people what level of confidence they should want before trying this sort of thing, but I decided based on my instincts that the risk for me personally was low enough relative to the possible benefits that I was going to go for it, and things have been fine as far as my outside view is concerned so fa...
You can test your ability to avoid mistaking strong emotions for strong beliefs in general.
How much of this ability is needed in order to avoid taking strong mystical experiences at face value?
I can’t tell people what level of confidence they should want before trying this sort of thing, but I decided based on my instincts that the risk for me personally was low enough relative to the possible benefits that I was going to go for it,
In the comment I was replying to, you were saying that some rationalists are being too risk-averse. It seems like you're now backing off a bit and just talking about yourself?
and things have been fine so far, e.g. my belief in physics as standardly understood has not decreased at all.
I'm worried that the epistemic risks get stronger the further you go down this path. Have you had any mystical experiences similar to vipassana yet? If not, your continuing belief in physics as standardly understood does not seem to address my worry.
...Also, to some extent I feel like this argument proves too much. There are epistemic risks associated with e.g. watching well-made TV shows or movies, or reading persuasive writing, and rationalists take on these epistemic risks...
I've only taken a few steps down the path that Qiaochu is following, but I have a few thoughts regarding epistemic risk-management:
What would it look like, if we noticed that a particular subgroup was beginning to lose its mind? I think it might look like a few unusually-rude people calling into question the alleged experiences of that particular subgroup and asking pointed questions about exactly what had happened to them and exactly why they thought themselves better off for it; and like the members of that particular subgroup responding with a combination of indignation and obfuscation: "we've definitely been changed for the better, but of course we can't expect you to understand what it's like if it hasn't happened to you, so why do you keep pushing us for details you know we won't be able to give?" / "I find it very discouraging to get this sort of response, and if it keeps happening I'm going to leave"; and like some of the more community-minded folks objecting to the rudeness of the questioners, observing acerbically that it always seems to be the same people asking those rude questions and wondering whether the emperor really has any clothes, and maybe even threatening to hand out bans.
All of which sounds kinda familiar.
I don't actually think that ...
gjm, point well taken. I wonder if it would be easier for people inside or outside Berkeley to spot if anyone there is seriously going off the rails and say something about it.
Anyway, I do want to elaborate a little bit on my "Efficient Frontier" idea. If anyone can build a map of which "mystical experiences" are safe/dangerous/worthwhile/useless and for whom, it should be people like us. I think it's a worthwhile project and it has to be done communally, given how different each person's experience may be and how hard it is to generalize.
The main example here is Sam Harris, a hardcore epistemic rationalist who has also spent a lot of time exploring "altered states of consciousness". He wrote a book about meditation, endorses psychedelics with caveats, is extremely hostile to any and all religions, and probably thinks that Peterson is kinda crazy after arguing with him for four hours. Those are good data points, but we need 20 more Sam Harrises. I'm hoping that LW can be the platform for them.
Perhaps we need to establish some norms for talking about "mystical experiences", fake frameworks, altered consciousness etc. so that people feel safe both talking and listening.
I think that perhaps what bothers a lot of rationalists about your (or Valentine's) assertions is down to three factors:
You don't tend to make specific claims or predictions. I think you would come off better (certainly to me, and I suspect to others) if you were to preregister hypotheses more, like you did in the above comment. I believe that you could and should be more specific, perhaps stating that over a six-month period you expect to work n more hours without burning out, or that a consensus of reports from outsiders about your mental well-being will show a marked positive change during a particular time period that the evaluators did not know was special.
I have several different responses to this which I guess I'll also number.
I suspect that a lot of fear of epistemic contamination comes from the emphasis on personal experience. Personal (meatspace) experiences, especially in groups, can trigger floods of emotions and feelings of insights without those first being fed through rational processing.
I recognize the concern here, but you can just have the System 1 experience and then do the System 2 processing afterwards (which could be seconds afterwards). It's really not that hard. I believe that most rationalists can handle it, and I certainly believe that I can handle it. I'm also willing to respect the boundaries of people who don't think they can handle it. What I don't want is for those people to typical mind themselves into assuming that because they can't handle it, no one else can either, and so the only people willing to try must be being epistemically reckless.
Therefore it seems reasonable to be suspicious of anyone who claims to teach through personal experience.
There are plenty of completely mundane skills that can basically only be taught in this way. Imagine trying to teach someone how to play basketball using only text, etc. There's no substitute for personal ex...
I'm glad you got over the initial triggered-ness. I did wonder about being even more explicit that I don't in fact think you guys are losing your minds, but worried about the "lady doth protest too much" effect.
I wasn't (in case it isn't obvious) by any means referring specifically to you, and in particular the "if it keeps happening I'm going to leave" wasn't intended to be anything like a quotation from you or any specific other person. It was intended to reflect the fact that a number of people (I think at least three) of what I called the Berkeley School have made comments along those general lines -- though I think all have taken the line you do here, that the problem is a norm of uncharitable pushback rather than being personally offended. I confess that the uncharitably-pushing-back part of my brain automatically translates that to "I am personally offended but don't want to admit it", in the same way as it's proverbially always correct to translate "it's not about the money" to "it's about the money" :-).
(For the avoidance of doubt, I don't in fact think that auto-transl...
I am going to make one more response (namely this one) and then stop, because the experience of talking to you is painful and unpleasant and I'd rather do something else.
And I am saying that your implication—that this is the best solution, or maybe even the only solution—is erroneous.
I don't think I've said anything like that here. I've said something like that elsewhere, but I certainly don't mean anything like "mysticism is the only solution to the problem of feeling unmotivated" since that's easy to disprove with plenty of counterexamples. My position is more like:
"There's a cluster of things which look vaguely like mysticism which I think is important for getting in touch with large and neglected parts of human value, as well as for the epistemic problem of how to deal with metacognitive blind spots. People who say vaguely mystical things are currently the experts on doing this although this need not be the case in principle, and I suspect whatever's of value that the mystics know could in principle be separated from the mysticism and distilled out in a form most rationalists would be happy with, but as far as I know that work mostly hasn't been done yet. Feeling more motivated is a side effect of getting in touch with these large parts of human value, although that can be done in many other ways."
Jordan Peterson certainly has a strong and appealing idea about what went wrong. But I think Eric Hoffer (a similar character half a century ago) pretty much already answered that question. And when I try to find examples that put their views into contrast, Hoffer wins.
For example, in this video Peterson gives one of his most powerful phrases: "Don't use language instrumentally!" The idea is that, if you allow yourself to twist your words away from what's perfectly truthful, you gradually begin to think like that too, leading straight to the horrors of fascism and communism. It all sounds very convincing.
But then I remember every product manager I've had as a programmer. They were all happy, well-adjusted people who had this incredible skill at using language instrumentally - convincing various decision makers to go along, sometimes not in perfectly honest ways. They all seemed to have learned that skill with their mother's milk, and it hasn't hurt them at all!
Hoffer wouldn't be surprised by that. His only message is that you should have self-esteem, and not join any mass movements to compensate for lack of self-esteem. If you can manage that, it's okay to be a liar or cy...
I've had trouble making up my mind about Jordan Peterson, and this post was enormously helpful in clarifying my thinking about him. Also:
A new expansion just came out for the Civilization 6 video game, and instead of playing it I’m nine hours into writing this post and barely halfway done. I hope I’m not the only one getting some meaning out of this thing.
This resulted in me updating heavily on the amount of effort involved in writing great content.
This post took about 13 hours, and I didn't even edit the first draft much. Just imagine how long great content would take!
On the other hand, from a couple of conversations I've had with Scott he seems to write much faster and with almost no editing needed. Something like this might take him 3-4 hours in a single sitting. I've only been writing seriously for a couple of years - maybe writers get faster with time, and maybe Scott is just in a different class in terms of talent.
Commenting note: this post is subject to LesserWrong frontpage moderation rules, but I want to offer my own guidelines, in line with Putanumonit policy.
I'm all for Crocker's Rules: if you want to call me a moron, please don't waste space sugarcoating it. Jordan Peterson is probably also beyond caring if you insult him. However, this doesn't extend to anyone else mentioned in the post (like Scott A and like Scott A), or to any other commenters.
With that said - don't be a fool. Make some effort not to confuse my own opinions with ...
Meta-comment for authors: take some time after each post to update on how actually contrarian your positions are. As far as I can tell the response to Jordan Peterson on LessWrong has been uniformly positive.
I sense that there are a lot of reasonable people with good ideas like yourself who feel reluctant to share "controversial" views (on e.g. fuzzy System 1 stuff) because they feel like embattled contrarians. Of course, this is probably correct in whatever other social sphere you get your training data from. However, the whole "please be r...
I've gotten a much more negative reception to fuzzy System 1 stuff at IRL LW meetups than online; that could be what's going on there.
And it's possible for negative reception to be more psychologically impactful and less visible to outsiders than positive reception. This seems especially likely for culture war-adjacent topics like Jordan Peterson. Even if the reception is broadly positive, there might still be a few people who have very negative reactions.
(This is why I'm reluctant to participate in the public-facing community nowadays -- there were a few people in the rationalist community who had very negative reactions to things I said, and did things like track me down on Facebook and leave me profanity-laden messages, or try to hound me out of all the circles they had access to. With a year or two of hindsight, I can see that those people were a small minority and this wasn't a generally negative reaction. But it sure felt like one at the time.)
I just want to make it clear that sending other users on the page insulting or threatening messages is not cool, and that if anyone else ever experiences that, please reach out to me and I will be able to give you a bit of perspective and potentially take action against the person sending the messages (if they’ve done that repeatedly).
All of those frameworks are fake in the sense that introvert isn’t a basic physical entity the same way an up quark is.
The reductive materialism implicit in this is as fake as introverts, possibly even more fake: unless you have a particle accelerator on hand, "everything is made of quarks" translates 100% to hypotheticals rather than anything you can actually do or see in the world; and in the presence of a particle accelerator, that 100% is reduced by epsilon.
I think that Peterson overgeneralizes about gay men (and what about lesbians?), and he’s wrong about the impact of gay marriage on society on the object level. I’m also quite a fan of promiscuity, and I think it’s stupid to oppose a policy just because “neo-Marxists” support it.
Any political issue can be analyzed on the policy level or on the coalition level. Gay marriage seems like an example of an issue that has less to do with policy and more to do with coalitions. If gay marriage was about policy, people would not draw a meaningful distinction betwee...
Something's gone screwy with the formatting somewhere around the "Untitled" link, as a result of which the entire end of the post and all the comments are in italics. Jacob, perhaps you can fix whatever's broken in the post? LW2 mods, perhaps you can fix whatever's wrong in the rendering code that enables formatting screwups' effects to persist like that?
Metaphysical truth here describes self-fulfilling truths, as described by Abram Demski, whose existence is guaranteed by e.g. Löb's theorem. In other words, metaphysical truth is truth, and rationalists should be aware of it.
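(For readers who haven't seen it, here is a minimal statement of the theorem being invoked; the notation is my gloss, not the commenter's. Writing $\Box P$ for "$P$ is provable in Peano Arithmetic", Löb's theorem says:

$$\text{if } \mathsf{PA} \vdash (\Box P \rightarrow P) \text{, then } \mathsf{PA} \vdash P.$$

Informally: whenever the system proves "if $P$ is provable, then $P$ is true", it already proves $P$ outright. The analogy to Demski's self-fulfilling truths is that a proposition whose truth follows from its being accepted ends up plainly true, which is the sense in which such truths are "guaranteed".)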
[Note: somewhat taking you up on the Crocker's rules]
Peterson's truth-seeking and data-processing juice is in the super-heavyweight class, comparable to Eliezer etc. Please don't make the mistake of lightly saying he's "wrong on many things".
At the level of analysis in your post and the linked Medium article, I don't think you can safely say Peterson is "technically wrong" about anything; it's overwhelmingly more likely you just didn't understand what he means. [it's possible to make more case-specific arguments here but I think the outside view meta-rationality should be enough...]
If you want me to accept JBP as an authority on technical truth (like Eliezer or Scott are), then I would like to actually see some case-specific arguments. Since I found the case-specific arguments to go against Peterson on the issues where I disagree, I'm not really going to change my mind on the basis of just your own authority backing Peterson's authority.
For example: the main proof Peterson cites to show he was right about C-16 being the end of free speech is the Lindsay Shepherd fiasco. Except her case wasn't even in the relevant jurisdiction, which the university itself admitted! The Shepherd case was about C-16, but no one thinks (anymore) that she was in any way in violation of C-16 or could be punished under it. I'll admit JBP was right when Shepherd is dragged to jail by the secret police.
Where I think Peterson goes wrong most often is when he overgeneralizes from the small and biased sample of his own experience. Eating nothing but chicken and greens helped cure Peterson's own rare autoimmune disease, so now everyone should stop eating carbs forever. He almost never qualifies his opinions or the advice he gives, or specifies that it only applie...
I'm worried we may be falling into an argument about definitions, which seems to happen a lot around JBP. Let me try to sharpen some distinctions.
In your quote, Chapman disagrees with Eliezer about his general approach, or perhaps about what Eliezer finds meaningful, but not about matters of fact. I disagree with JBP about matters of fact.
My best guess at what "truth-seeking juice" means comes in two parts: a desire to find the truth, and a methodology for doing so. All three of Eliezer/Scott/JBP have the first part down, but their methodologies are very different. Eliezer's strength is overcoming bias, and Scott's strength is integrating scientific evidence, and I believe they're very good at it because I've seen them do it a lot and be wrong about facts very very rarely. In this post I actually disagree with Eliezer about a matter of fact (how many people before modernity were Biblical literalists), and I do so with some trepidation.
JBP's methodology is optimized for finding his own truth, the metaphorical kind. Just as Scott has a track record of being right in science debates, JBP has a track record of all his ideas fitting into a coherent and inspirational worldview: his big picture. When I say he's wrong I don't mean his big picture is bad. I mean he's wrong about facts, and that the Peterson mask is dangerous when one needs to get the facts right.
I notice that my default sense is that Jacob is making a reasonable ask here, but also that Squirrel seems to be trying to do something similar to what I just felt compelled to do on a different thread so I feel obliged to lean into it a bit.
I'm not sure...
a) how to handle this sort of disagreeing on vantage points, where it's hard to disentangle 'person has an important frame that you're not seeing that is worth at least having the ability to step inside' vs 'person is just wrong' and 'person is trying to help you step inside a frame' vs 'person is making an opaque-and-wrong appeal to authority' (or various shades of similar issues).
or, on the meta level:
b) what reasonable norms/expectations on LessWrong for handling that sort of thing are. Err on one side and a lot of people miss important things, err on another side and people waste a lot of time on views that maybe have interesting frames but... just aren't very good. (I like that Jacob set pretty good discussion norms on this thread but this is a thing I'm thinking a lot about right now in the general case).
As of now I have not read anything about Peterson besid...
Perhaps you can explain what Peterson really means when he says that he really believes that the double helix structure of DNA is being depicted in ancient Egyptian and Chinese art.
What does he really mean when he says, "Proof itself, of any sort, is impossible, without an axiom (as Godel proved). Thus faith in God is a prerequisite for all proof"?
Why does he seem to believe in Jung's paranormal concept of "synchronicity"?
Why does he think quantum mechanics means consciousness creates reality, and confuse the Copenhagen interpretation with Wheeler's participatory anthropic principle?
Peterson gets many things wrong: not just technically wrong, but deeply wrong, wrong on the level of "ancient aliens built the pyramids". He's far too willing to indulge in mysticism, and has a fundamental lack of skepticism or anything approaching appropriate rigor when it comes to certain pet ideas.
He isn't an intellectual super-heavyweight; he's Deepak Chopra for people who know how to code. We can do better.
This is a cross-post from Putanumonit.com
It seems that most people haven’t had much trouble making up their minds about Jordan Peterson.
The psycho-philosophizing YouTube prophet rose to prominence for refusing to acquiesce to Bill C-16, a Canadian law mandating the use of preferred pronouns for transgender people. The sort of liberal who thinks that this law is a great idea slapped the alt-right transphobe label on Peterson and has been tweeting about how no one should listen to Peterson about anything. The sort of conservative who thinks that C-16 is the end of Western Civilization hailed Peterson as a heroic anti-PC crusader and has been breathlessly retweeting everything he says, with the implied #BooOutgroup.
As the sort of rationalist who googles laws before reacting to them, I assured myself that Peterson got the legal facts wrong: no one is actually getting dragged to jail for refusing to say zir. I’m going to use people’s preferred pronouns regardless, but I’m happy I get to keep doing it in the name of libertarianism and not-being-a-dick, rather than because of state coercion.
With that, I decided to ignore Peterson and go back to my media diet of rationalist blogs, Sam Harris, and EconTalk.
But Jordan Peterson turned out to be a very difficult man to ignore. He showed up on Sam Harris’ podcast, and on EconTalk, and on Joe Rogan and Art of Manliness and James Altucher. He wrote 12 Rules for Life: An Antidote to Chaos, a self-help listicle book inspired by Jesus, Nietzsche, Jung, and Dostoyevsky. [Let's see if I can tie all 12 rules to this essay] And he got rationalists talking about him, which I’ve done for several hours now. As a community, we haven’t quite figured out what to make of him.
Peterson is a social conservative, a Christian who reads truth in the Bible and claims that atheists don’t exist, and a man who sees existence at every level as a conflict between good and evil. The majority of the rationalist community (present company included) are socially liberal and trans-friendly, confident about our atheism, and mistake theorists who see bad equilibria more often than intentional malevolence.
But the most salient aspect of Peterson isn’t his conservatism, or his Christianity, or Manicheanism. It’s his commitment, above all else, to seek the truth and to speak it. [Rule 8: Tell the truth – or, at least, don’t lie] Rationalists can forgive a lot of an honest man, and Peterson shoots straighter than a laser gun.
Peterson loves to talk about heroic narratives, and his own life in the past few months reads like a movie plot, albeit more Kung Fu Panda than Passion of the Christ. Peterson spent decades assembling a worldview that integrates everything from neurology to Deuteronomy, one that's complex, self-consistent, and close enough to the truth to withstand collision with reality. It's also light-years and meta-levels away from the sort of simplistic frameworks offered by the mass media on either the right or the left.
When the C-16 controversy broke, said media assumed that Peterson would meekly play out the role of outgroup strawman, and were utterly steamrolled. A lot of the discussion about the linked interview has to do with rhetoric and argument, but to me, it showcased something else. A coherent worldview like that is a powerful and beautiful weapon in the hands of the person who is committed to it.
But it wasn’t the charismatic performances that convinced me of Peterson’s honesty, it’s clips like this one, where he was asked about gay marriage.
Most people are for or against gay marriage based on their object level feeling about gays, and their tribal affiliation. The blue tribe supports gay marriage, opposes first-cousin marriage, and thinks that the government should force a cake shop to bake a gay wedding cake because homophobia is bad. The red tribe merely flips the sign on all of those.
Some people go a meta-level up: I support gay marriage, support cousin marriage, and support bakers getting to decide themselves which cakes they bake for reasons of personal freedom [Rule 11: don’t bother children when they are skateboarding], and the ready availability of both genetic testing clinics and gay-friendly bakeries.
But to Peterson, everything is a super-meta-level tradeoff that has the power to send all of Western Civilization down the path to heaven or hell:
Few people besides Peterson himself can even fully understand his argument, let alone endorse it. And yet he can’t help himself from actually trying to figure out what his worldview says about gay marriage, and from saying it with no reservations.
I think that Peterson overgeneralizes about gay men (and what about lesbians?), and he’s wrong about the impact of gay marriage on society on the object level. I’m also quite a fan of promiscuity, and I think it’s stupid to oppose a policy just because “neo-Marxists” support it.
But I don’t doubt Peterson’s integrity, which means that I could learn something from him. [Rule 9: assume that the person you are listening to might know something you don’t].
So, what can Jordan Peterson teach rationalists?
In 12 Rules, Peterson claims that eating a large, low-carb breakfast helps overcome depression and anxiety. Is this claim true?
There’s a technical sort of truth, and here “technical” is itself a synonym for “true”, that’s discoverable using the following hierarchy of methods: opinion -> observation -> case report -> experiment -> RCT -> meta-analysis -> Scott Alexander “much more than you wanted to know” article. If you ask Scott whether a low-carb breakfast reduces anxiety he’ll probably say that there isn’t a significant effect, and that’s the technical truth of the matter.
So why does Peterson believe the opposite? He's statistically literate… for a psychologist. He references a couple of studies about the connection between insulin and stress, although I'd wager he wouldn't lose much sleep if one of them failed to replicate. It probably also helps that Gary Taubes is really playing the part of the anti-establishment truth-crusader. Ultimately, Peterson is answering a different question: if a patient comes to your psychiatry clinic complaining about mild anxiety, should you tell them to eat bacon and eggs for breakfast?
My rationalist steelman of Peterson would say something like this: maybe the patient has leaky gut syndrome that contributes to their anxiety, and reducing gluten intake would help. If not, maybe the link between insulin and cortisol will turn out to be real and meaningful. If not, maybe having a morning routine that requires a bit of effort (it’s harder to make eggs than eat a chocolate bar, but not too hard) will bring some needed structure to the patient’s life. If not, maybe getting any advice whatsoever from a serious looking psychologist would make the patient feel that they are being listened to, and that will placebo their anxiety by itself. And if not, then no harm was done and you can try something else.
But, Peterson would add, you can’t tell the patient all of that. You won’t help them by explaining leaky guts and p-values and placebo effects. They need to believe that their lives have fallen into chaos, and making breakfast is akin to slaying the dragon-goddess Tiamat and laying the foundation for stable order that creates heaven on Earth. This is metaphorical truth.
If you’re a rationalist, you probably prefer your truths not to be so… metaphorical. But it’s a silly sort of rationalist who gets sidetracked by arguments about definitions. If you don’t like using the same word to mean different things [Rule 10: be precise in your speech], you can say “useful” or “adaptive” or “meaningful” instead of “true”. It’s important to use words well, but it’s also important to eat a good breakfast. Probably. [Rule 2: treat yourself like you would someone you are responsible for helping]
One of the most underrated recent ideas in rationality is the idea of fake frameworks. I understand it thus: if you want to understand how lasers work, you should really use quantum physics as your framework. But if you want to understand how a cocktail party works, looking at quarks won’t get you far. You can use the Hansonian framework of signaling, or the sociological framework of class and status, or the psychometric framework of introverts and extroverts, etc.
All of those frameworks are fake in the sense that introvert isn't a basic physical entity the same way an up quark is. Those frameworks are layers of interpretation that you impose on what you directly experience, which is human-shaped figures walking around, making noises with their mouths and sipping gin & tonics. You can't avoid imposing interpretations, so you should gather a diverse toolbox of frameworks and use them consciously even when you know they're not 100% true.
Here’s a visual example:
Q: Which map is more true to the territory?
A: Neither. But if your goal is to meet Einstein on his way to work you use the one on the right, and if your goal is to count the trees on the golf course you use the one on the left.
By the way, there's a decent chance that "fake frameworks" is what the post-rationalists have been trying to explain to me all along, except they were kind of rude about it. If it's true that they had the same message, it took Valentine to get it through my skull because he's an excellent teacher, and also someone I personally like. Liking shouldn't matter to rationalists, but somehow it always seems to matter to humans. [Rule 5: do not let your children do anything that makes you dislike them]
That’s what Jordan Peterson is: a fake framework. He’s a mask you can put on, a mask that changes how you see the world and how you see yourself in the mirror. Putting on the Jordan Peterson mask adds two crucial elements that rationalists often struggle with: motivation and meaning.
The Secular Solstice is a celebration designed by rationalists to sing songs together and talk about meaning. [Rule 3: make friends with people who want the best for you] The first time I attended, the core theme was the story of Stonehenge. Once upon a time, humans lived in terror of the shortening of the days each autumn. But we built Stonehenge to mark the winter solstice and predict when spring would come – a first step towards conquering the cold and dark.
But how did Stonehenge get built?
First, the tribe had a Scott Alexander. Neolithic Scott listened to the shamans speak of the Sun God, and demanded to see their p-values. He patiently counted the days between the solstices of each year and drew arrows pointing in the exact direction the sun rose each day.
Finally, Scott spoke up:
And the tribe told him that it’s all much more than they wanted to know about the sun.
But Scott only gets us halfway to Stonehenge. The monument itself was built over several centuries, using 25-ton rocks that were brought to the site from 140 miles away. The people who hauled the first rock had to realize (unless subject to extreme planning fallacy) that not a single person they know, nor their children or grandchildren, would see the monument completed. Yet these people hauled the rocks anyway, and that required neolithic Peterson to inspire them.
Peterson is very popular with the sort of young people who have been told all their lives to be happy and proud of just who they are. But when you’re 19, short on money, shorter on status, and you start to realize you won’t be a billionaire rock star, you don’t see a lot to be satisfied with. Lacking anything to be proud of individually, they are tempted to substitute their self for a broader group identity. What the identity groups mostly do is complain that the world is unfair to them; this keeps the movement going but doesn’t do much to alleviate the frustration of its members.
And then Peterson tells them to lift the heaviest rock they can and carry it. Will it ease their suffering? No. Everyone is suffering, but at least they can carve meaning out of that. And if enough people listen to that message century after century, we get Stonehenge. [Rule 7: pursue what is meaningful, not what is expedient]
A new expansion just came out for the Civilization 6 video game, and instead of playing it I’m nine hours into writing this post and barely halfway done. I hope I’m not the only one getting some meaning out of this thing.
It’s not easy to tell a story that inspires a whole tribe to move 25-ton rocks. Peterson noticed that the Bible is one story that has been doing that for a good while. Eliezer noticed it too, and he was not happy about it, so he wrote his own tribe-inspiring work of fiction. I’ve read both, cover to cover. And although I found HPMoR more fun to read, I end up quoting from the Old Testament a lot more often when I have a point to make.
“Back in the old days, saying that the local religion is a work of fiction would have gotten you burned at the stake“, Eliezer replies. Well, today quoting research on psychology gets you fired from Google, and quoting research on climate change gets you fired from the EPA. Eppur si muove ("and yet it moves").
Jews wrote down commentaries suggesting that the story of Jonah is metaphorical a millennium before Galileo was born, and yet they considered themselves the People of the Book. The Peterson mask reminds us that people don’t have to take a story literally to take it seriously.
Peterson loves to tell the story of Cain and Abel. Humans discovered sacrifice: you can give away something today to get something better tomorrow. “Tomorrow” needs a face, so we call it “God” and act out a literal sacrifice to God to hammer the point home for the kids.
But sometimes, the sacrifice doesn’t work out. You give, but you don’t get, and you are driven to resentment and rage against the system. That’s what Cain does, and the story tells us that it’s the wrong move – you should ponder instead how to make a better sacrifice next time.
When I was younger, I went to the gym twice a week for a whole year. After a year I didn’t look any sexier, I didn’t get much stronger, and I was sore a lot. So I said fuck it and stopped. Now I started going to the gym twice a week again, but I also started reading about food and exercise to finally get my sacrifice to do something. I still don’t look like someone who goes to the gym twice a week, but I can bench 20 pounds more than I could last year and I rarely get sore or injured working out. [Rule 4: compare yourself with who you were yesterday, not with who someone else is today]
Knowing that the story of Cain and Abel is made up hasn’t prevented it from inspiring me to exercise smarter.
There’s a problem: many stories that sound inspirational are full of shit. After listening to a few hours of Peterson talking about archetypes and dragons and Jesus, I wasn’t convinced that he’s not full of it either. You should only wear a mask if it leaves you wiser when you take it off and go back to facing your mundane problems.
What convinced me about Peterson is this snippet from his conversation with James Altucher (24 minutes in):
Aside from the Jung quote, that’s the most Putanumonit piece of life advice I have ever heard on a podcast, complete with unnecessary arithmetic. If Peterson can put on a Putanumonit hat and come up with something that makes deep sense to me, perhaps I could do the same with a Peterson mask.
The rationalist project is about finding true answers to difficult questions. We have a formula that does that, and we've tracked many ways in which our minds can veer off the right path. But there are profound ways in which a person can be unready to seek the truth, ways that are hard to measure in a behavioral econ lab and assign a catchy moniker to.
I have written a lot about romance and dating in the last two years, including some mildly controversial takes. I could not have written any of them before I met my wife. Not because I didn't know the facts or the game theory, but because I wasn't emotionally ready. When I read the private drafts about women that I wrote years ago, they are colored by frustration, guilt, exuberance or fear, all depending on the outcome of the last date I'd been on. Those emotions aren't exactly conducive to clarity of thought.
I think this was also the reason why Scott Aaronson wrote The Comment that led to Untitled only when he was married and had a child. Then, he could weather the resulting storm without backing down from his truth. It is hard to see something true about relationships when your own aren’t in order, let alone write something true. [Rule 6: set your house in perfect order before you criticize the world]
The flipside is: when you wear the Peterson mask, you are compelled to spread the word when you’ve found a path that leads somewhere true. There is no higher calling in Peterson’s worldview. The Kolmogorov Option becomes the Kolmogorov Mandate (and the Scott Alexander mask mostly agrees).
Let’s go back to the beginning: Peterson made noise by refusing to comply with a law that doesn’t actually do what he claims. How is that contributing to the truth?
For starters, I would have bet that Peterson was going to lose his job when the letters calling for his dismissal started rolling in, letters signed by hundreds of Peterson’s own colleagues at the University of Toronto. I would have bet wrong: the only thing that happened is that Peterson now makes several times his academic salary from his Patreon account (if you want me to start saying crazy things you can try clicking here).
This is critical: it created common public knowledge that if free speech is ever actually threatened by the government to the extent that Peterson claims, the support for free speech will be overwhelming even at universities. Speaking an unpopular truth is a coordination problem: you have to know that others will stand with you if you stand up first. [Rule 1: stand up straight with your shoulders back]
Now, more people know that there's appetite in the West for people who stand up for truth. This isn't a partisan thing; I hope that Peterson inspires people with inconvenient leftist opinions to speak up in red tribe-dominated spaces (e.g. the NFL protests).
Peterson was technically wrong, as he is on many things. But he sees the pursuit of truth as a heroic quest and he’s willing to toss some rocks around, and I think this helps the cause of truth even if one gets some technical details wrong.
Being wrong about the details is not good, but I think that rationalists are pretty decent at getting technicalities right. By using the Peterson Mask judiciously, we can achieve even more than that.
[Rule 12: pet a cat when you encounter one on the street], but don't touch the hedgehog; they don't like it.