There's a kind of game here on Less Wrong.
It's the kind of game that's a little rude to point out. Part of how it works is by not being named.
Or rather, attempts to name it get dissected so everyone can agree to continue ignoring the fact that it's a game.
So I'm going to do the rude thing. But I mean to do so gently. It's not my intention to end the game. I really do respect the right of folk to keep playing it if they want.
Instead I want to offer an exit to those who would really, really like one.
I know I really super would have liked that back in 2015 & 2016. That was the peak of my hell in rationalist circles.
I'm watching the game intensify this year. Folk have been talking about this a lot. How there's a ton more talk of AI here, and a stronger tone of doom.
I bet this is just too intense for some folk. It was for me when I was playing. I just didn't know how to stop. I kind of had to break down in order to stop. All the way to a brush with severe depression and suicide.
And it also ate parts of my life I dearly, dearly wish I could get back.
So, in case this is audible and precious to some of you, I'd like to point a way to ease.
The Apocalypse Game
The upshot is this:
You have to live in a kind of mental illusion to be in terror of the end of the world.
Illusions don't look on the inside like illusions. They look like how things really are.
Part of how this one does the "daughter's arm" thing (like the anosognosia patient who, unable to recognize her own paralyzed arm, confabulates that it must be her daughter's) is by redirecting attention to facts and arguments.
- "Here's why the argument about AI makes sense."
- "Do you have some alternative view of what will happen? How do you address XYZ?"
- "What makes it an 'illusion'? I challenge that framing because it dismisses our ability to analyze and understand yada yada."
None of this is relevant.
I'm pointing at something that comes before these thoughts. The thing that fuels the fixation on the worldview.
I also bet this is the thing that occasionally drives some people in this space psychotic, depressed, or into burnout.
The basic engine is:
- There's a kind of underlying body-level pain. I would tag this as "emotional pain" but it's important to understand that I really am pointing at physical sensations.
- The pain is kind of stored and ignored. Often it arose from a very young age but was too overwhelming, so child-you found methods of distraction.
- This is the basic core of addiction. Addictions are when there's an intolerable sensation but you find a way to bear its presence without addressing its cause. The more that distraction becomes a habit, the more that's the thing you automatically turn to when the sensation arises. This dynamic becomes desperate and life-destroying to the extent that it triggers a Red Queen race.
- A major unifying flavor of the LW attractor is intense thought as an addictive distraction. And the underlying flavor of pain that fuels this addiction is usually some variation of fear.
- In not-so-coincidental analogy to uFAI, these distracting thoughts can come to form autonomous programs that memetically evolve to have something like survival and reproductive instincts — especially in the space between people as they share and discuss these thoughts with each other.
- The rationalist memeplex focuses on AI Ragnarok in part because it's a way for the intense thought to pull fuel from the underlying fear.
In this case, the search for truth isn't in service to seeing reality clearly. The logic of economic races to the bottom, orthogonality, etc. might very well be perfectly correct.
But these thoughts are also (and in some cases, mostly) in service to the doomsday meme's survival.
Now, I know that thinking of memes as living beings is something of an ontological leap in these parts. It's totally compatible with the LW memeplex, but it seems to be too woo-adjacent and triggers an unhelpful allergic response.
So I suggested a reframe at the beginning, which I'll reiterate here:
Your body's fight-or-flight system is being used as a power source to run a game called "OMG AI risk is real!!!"
And part of how that game works is by shoving you into a frame where it seems absolutely fucking real. That this is the truth. This is how reality just is.
And this can be fun!
And who knows, maybe you can play this game and "win". Maybe you'll have some kind of real positive impact that matters outside of the game.
But… well, for what it's worth, as someone who turned off the game and has reworked his body's use of power quite a lot, it's pretty obvious to me that this isn't how it works. If playing this game has any real effect on the true world situation, it's to make the thing you're fearing worse.
(…which is exactly what's incentivized by the game's design, if you'll notice.)
I want to emphasize — again — that I am not saying that AI risk isn't real.
I'm saying that really, truly orienting to that issue isn't what LW is actually about.
That's not the game being played here. Not collectively.
But the game that is being played here absolutely must seem on the inside like that is what you're doing.
Ramping Up Intensity
When Eliezer rang the doom bell, my immediate thought was:
"Ah, look! The gamesmaster has upped the intensity. Like preparing for a climax!"
I mean this with respect and admiration. It's very skillful. Eliezer has incredible mastery in how he weaves terror and insight together.
And I don't mean this at all to dismiss what he's saying. I do disagree with him about overall strategy, but it's a sincere disagreement, not an "Oh look, what a fool" kind of thing.
What I mean is, it's a masterful move of making the game even more awesome.
(…although I doubt he consciously intended it that way!)
I remember when I was in the thick of this AI apocalypse story, everything felt so… epic. Even questions of how CFAR dealt with garbage at its workshops seemed directly related to whether humanity would survive the coming decades. The whole experience was often thrilling.
And on the flipside, sometimes I'd collapse. Despair. "It's too much" or "Am I even relevant?" or "I think maybe we're just doomed."
These are the two sort-of-built-in physiological responses to fight-or-flight energy: activation, or collapse.
(There's a third, which is a kind of self-holding. But it has to be built. Infants aren't born with it. I'll point in that direction a bit later.)
In the spirit of feeling rationally, I'd like to point out something about this use of fight-or-flight energy:
If your body's emergency mobilization systems are running in response to an issue, but your survival doesn't actually depend on actions on a timescale of minutes, then you are not perceiving reality accurately.
Which is to say: If you're freaked out but rushing around won't solve the problem, then you're living in a mental hallucination. And it's that hallucination that's scaring your body.
Again, this isn't to say that your thoughts are incorrectly perceiving a future problem.
But if it raises your blood pressure or quickens your breath, then you haven't integrated what you're seeing with the reality of your physical environment. Where you physically are now. Sitting here (or whatever) reading this text.
So… folk who are wringing their hands and feeling stressed about the looming end of the world via AI?
Y'all are hallucinating.
If you don't know what to do, and you're using anxiety to power your minds to figure out what to do…
…well, that's the game.
The real thing doesn't work that way.
But hey, this sure is thrilling, isn't it?
As long as you don't get stuck in that awful collapse space, or go psychotic, and join the fallen.
But the risk of that is part of the fun, isn't it?
(Interlude)
A brief interlude before I name the exit.
I want to emphasize again that I'm not trying to argue anyone out of doing this intense thing.
The issue is that this game is way, way out of range for lots of people. But some of those people keep playing it because they don't know how to stop.
And they often don't even know that there's something on this level to stop.
You're welcome to object to my framing, insist I'm missing some key point, etc.
Frankly I don't care.
I'm not writing this to engage with the whole space in some kind of debate about AI strategy or landscape or whatever.
I'm trying to offer a path to relief to those who need it.
That no, this doesn't have to be the end of the world.
And no, you don't have to grapple with AI to sort out this awful dread.
That's not where the problem really is.
I'm not interested in debating that. Not here right now.
I'm just pointing out something for those who can, and want to, hear it.
Land on Earth and Get Sober
So, if you're done cooking your nervous system and want out…
…but this AI thing gosh darn sure does look too real to ignore…
…what do you do?
My basic advice here is to land on Earth and get sober.
The thing driving this is a pain. You feel that pain when you look out at the threat and doom of AI, but you cover it up with thoughts. You pretend it's about this external thing.
I promise, it isn't.
I know. I really do understand. It really truly looks like it's about the external thing.
But… well, you know how when something awful happens and gets broadcast (like the recent shooting), some people look at it with a sense of "Oh, that's really sad" and are clearly impacted, while others utterly flip their shit?
Obviously the difference there isn't in the event, or in how they heard about it. Maybe sometimes, but not mostly.
The difference is in how the event lands for the listener. What they make it mean. What bits of hidden pain are ready to be activated.
You cannot orient in a reasonable way to something that activates and overwhelms you this way. Not without tremendous grounding work.
So rather than believing the distracting thoughts that you can somehow alleviate your terror and dread with external action…
…you've got to stop avoiding the internal sensation.
When I talked earlier about addiction, I didn't mean that just as an analogy. There's a serious withdrawal experience that happens here. Withdrawal from an addiction is basically a heightening of the intolerable sensation (along with having to fight mechanical habits of seeking relief via the addictive "substance").
So in this case, I'm talking about all this strategizing, and mental fixation, and trying to model the AI situation.
I'm not saying it's bad to do these things.
I'm saying that if you're doing them as a distraction from inner pain, you're basically drunk.
You have to be willing to face the awful experience of feeling, in your body, in an inescapable way, that you are terrified.
I sort of want to underline that "in your body" part a bazillion times. This is a spot I keep seeing rationalists miss — because the preferred recreational drug here is disembodiment via intense thinking. You've got to be willing to come back, again and again, to just feeling your body without story. Notice how you're looking at a screen, and can feel your feet if you try, and are breathing. Again and again.
It's also really, really important that you do this kindly. It's not a matter of forcing yourself to feel what's present all at once. You might not even be able to find the true underlying fear! Part of the effect of this particular "drug" is letting the mind lead. Making decisions based on mental computations. And kind of like minds can get entrained to porn, minds entrained to distraction via apocalypse fixation will often hide their power source from their host.
(In case that was too opaque for you just yet, I basically just said "Your thoughts will do what they can to distract you from your true underlying fear." People often suddenly go blank inside when they look inward this way.)
So instead of trying to force it all at once, it's a matter of titrating your exposure. Noticing that AI thoughts are coming up again, and pausing, and feeling what's going on in your body. Taking a breath for a few seconds. And then carrying on with whatever.
This is slow work. Unfortunately your "drug" supply is internal, so getting sober is quite a trick.
But this really is the exit. As your mind clears up… well, it's very much like coming out of the fog of a bender and realizing that no, really, those "great ideas" you had just… weren't great. And now you're paying the price on your body (and maybe your credit card too!).
There are tons of resources for this kind of direction. It gets semi-independently reinvented a lot, so there are lots of different names and frameworks for this. One example that I expect to be helpful for at least some LWers who want to land on Earth & get sober is Irene Lyon, who approaches this through a "trauma processing" framework. She offers plenty of free material on YouTube. Her angle is in the same vein as Gabor Maté and Peter Levine.
But hey, if you can feel the thread of truth in what I'm saying and want to pursue this direction, but you find you can't engage with Irene Lyon's approach, feel free to reach out to me. I might be able to find a different angle for you. I want anyone who wants freedom to find it.
But… but Val… what about the real AI problem?!
Okay, sure. I'll say a few words here.
…although I want to point out something: The need to have this answered is coming from the addiction to the game. It's not coming from the sobriety of your deepest clarity.
That's actually a complete answer, but I know it doesn't sound like one, so I'll say a little more.
Yes, there's a real thing.
And yes, there's something to do about it.
But you're almost certainly not in a position to see the real thing clearly or to know what to do about it.
And in fact, attempts to figure the real thing out and take action from this drunk gamer position will make things worse.
(I hesitate to use the word "worse" here. That's not how I see it. But I think that's how it translates to the in-game frame.)
This is what Buddhists should have meant (and maybe did/do?) when they talk about "karma". How deeply entangled in this game is your nervous system? Well, when you let that drive how you interact with others, their bodies get alarmed in similar ways, and they get more entangled too.
Memetic evolution drives how that entangling process happens on large scales. When that becomes a defining force, you end up with self-generating pockets of Hell on Earth.
This recent thing with FTX is totally an example. Totally. Threads of karma/trauma/whatever getting deeply entangled and knotted up and tight enough that large-scale flows of collective behavior create an intensely awful situation.
You do not solve this by trying harder. Tugging the threads harder.
In fact, that's how you make it worse.
This is what I meant when I said that actually dealing with AI isn't the true game in LW-type spaces, even though it sure seems like it on the inside.
It's actually helpful to the game for the situation to constantly seem barely maybe solvable but to have major setbacks.
And this really can arise from having a sincere desire to deal with the real problem!
But that sincere desire, when channeled into the Matrix of the game, doesn't have any power to do the real thing. There's no leverage.
The real thing isn't thrilling this way. It's not epic.
At least, not any more epic than holding someone you love, or taking a stroll through a park.
To oversimplify a bit: You cannot meaningfully help with the real thing until you're sober.
Now, if you want to get sober and then you roll up your sleeves and help…
…well, fuck yeah! Please. Your service would be a blessing to all of us. Truly. We need you.
But it's gotta come from a different place. Tortured mortals need not apply.
And frankly, the reason AI in particular looks like such a threat is because you're fucking smart. You're projecting your inner hell onto the external world. Your brilliant mind can create internal structures that might damn well take over and literally kill you if you don't take responsibility for this process. You're looking at your own internal AI risk.
I hesitate to point that out because I imagine it creating even more body alarm.
But it's the truth. Most people wringing their hands about AI seem to let their minds possess them more and more, and pour more & more energy into their minds, in a kind of runaway process that's stunningly analogous to uFAI.
The difference is, you don't have to make the entire world change in order to address this one.
You can take coherent internal action.
You can land on Earth and get sober.
That's the internal antidote.
It's what offers relief — eventually.
And from my vantage point, it's what leads to real hope for the world.
I think that's a great option. I'd question a "master rationalist's" skills if they couldn't avoid such adversarial actors, or notice them if they slip through the cracks.
I like your preference. I'll say some things, but I want to start by emphasizing that I don't think you're making a wrong or bad choice.
I want to talk about what I think the Art could be, kind of for aesthetic reasons. This isn't to assert anything about what you or any given individual should or shouldn't be doing in any kind of moral sense.
So with that said, here are three points:
(1) I think there's a strong analogy here to studying combat and war. Yes, if you can be in a pacifist cluster and just exclude folk who are really into applied competitive strategy, then you have something kind of like a cooperate/cooperate equilibrium. But if that's the whole basis of your culture, it's extremely vulnerable, the way CooperateBot is vulnerable in the Prisoner's Dilemma. You need military strength, the way a walled garden needs walls. Otherwise folk who have military strength can just come take your resources, even if you try to exclude them at first.
At the risk of using an unfair example, I think what happened with FTX last year maybe illustrates the point.
Clearer examples in my mind are Ziz and Brent. The point isn't "These people are bad!" Rather, these people were psychologically extremely potent, and lots of folk in the community could neither (a) adequately navigate their impact (myself included!) nor (b) rally ejection/exclusion power until well after they'd already had their impact.
Maybe, you might hope, you can make the ejection/exclusion sensitivity refined enough to work earlier. But if you don't do that by studying the Dark Arts, and becoming intimately familiar with them, then what you get is a kind of naïve allergic response that Dark Artists can weaponize.
Again, I don't mean that you in particular or even rationalists in general need to address this. There's nothing wrong with a hobby. I'm saying that as an Art, it seems like rationality is seriously vulnerable if it doesn't include masterful familiarity with the Dark Arts. Kind of like, there's nothing wrong with practicing aikido as a sport, but you're not gonna get the results you hope for if you train in aikido for self-defense. That art is inadequate for that purpose and needs exposure to realistic combat to matter that way.
(2) …and I think that if the Art of Rationality were to include intimate familiarity with the Dark Arts, it would work way way better.
Things like the planning fallacy or confirmation bias are valuable to track. I could stand to improve my repertoire here for sure.
But the most potent forms of distorted thinking aren't about sorting out the logic. I think they look more like reaching deep down and finding ways to become immune to things like frame control.
Frame control is an amazing example in my mind precisely because of the hydra-like nature of the beast. How do you defend against frame control without breaking basic things about culture and communication and trust? How do you make it so your cultural and individual defenses don't themselves become the manual that frame controllers use to get their desired effects?
And this barely begins to touch on the kind of impact that I'd want to call "spiritual". By which I don't mean anything supernatural; I'm talking about the deep psychological stuff that (say) conversing with someone deep in a psilocybin trip can do to the tripper. That's not just frame control. That's something way deeper, like editing someone's basic personality operating system code. And sometimes it reaches deeper even than that. And it turns out, you don't need psychedelics to reach that deep; those chemical tools just open a door that you can open other ways, voluntarily or otherwise, sometimes just by having a conversation.
The standard rationalist defense I've noticed against this amounts to mental cramping. Demand everything go through cognition, and anything that seems to try to route around cognition gets a freakout/shutdown/"shame it into oblivion" kind of response. The stuff that disables this immune response is really epistemically strange — things like prefacing with "Here's a fake framework, it's all baloney, don't believe anything I'm saying." Or doing a bunch of embodied stuff to act low-status and unsure. A Dark Artist who wanted to deeply mess with this community wouldn't have to work very hard to do some serious damage before getting detected, best as I can tell (and as community history maybe illustrates).
If this community wanted to develop the Art to actually be skillful in these areas… well, it's hard to predict exactly what that'd create, but I'm pretty sure it'd be glorious. If I think of the Sequences as retooling skeptical materialism, I think we'd maybe see something like a retooling of the best of Buddhist psychotechnology. I think folk here might tend to underestimate how potent that could really be.
(…and I also think that it's maybe utterly critical for sorting out AI alignment. But while I think that's a very important point, it's not needed for my main message for this exchange.)
(3) It also seems relevant to me that "Dark Arts" is maybe something of a fake category. I'm not sure it even forms a coherent cluster.
Like, is being charismatic a Dark Art? It certainly can be! It can act as a temptation. It seems to be possible to cultivate charisma. But the issue isn't that charisma is a Dark Art. It's that charisma is mostly symmetric. So if someone has a few slightly anti-epistemic social strategies in them, and they're charismatic, this can have a net Dark effect that's even strategic. But this is a totally normal level of epistemic noise!
Or how about something simpler, like someone using confirmation bias in a way that benefits their beliefs? Astrology is mostly this. Is astrology a Dark Art? Is talking about astrology a Dark Art? It seems mostly just epistemically hazardous… but where's the line between that and Dark Arts?
How about more innocent things, like when someone is trying to understand systemic racism? Is confirmation bias a helpful pattern-recognizer there, or a Dark Art? Maybe it's potentially in service to Dark Arts, but a necessary risk for learning the patterns?
I think Vervaeke makes this point really well. The very things that allow us to notice relevance are precisely the things that allow us to be fooled. Rationality (and he explicitly cites this — even the Keith Stanovich stuff) is a literally incomputable practice of navigating both Type I and Type II errors in this balancing act between relevance realization and being fooled.
When I think of central examples of Dark Arts, I think mostly of agents who exploit this ambiguity in order to extract value from others.
…which brings me back to point (1), about this being more a matter of skill in war. The relevant issue isn't that there are "Dark Arts". It's that there are unaligned agents who are trying to strategically fool you. The skill isn't to detect a Dark toolset; it's to detect intelligent intent to deceive and extract value.
All of which is to say: