I would of course have a different response to someone who asked the incredibly different question, "Any learnable tricks for not feeling like crap while the world ends?"
(This could be seen as the theme of a couple of other brief talks at the Solstice. I don't have a 30-second answer that doesn't rely on context, and I don't consider myself much of an expert on that question, compared to the part of the problem constraint that is maintaining epistemic health while you do whatever else. That said, becoming less completely unwilling to spend small or even medium amounts of money made a difference to my life. So did beginning a romantic relationship in the frame of mind that we might all be dead soon, and that I therefore ought to do more fun things and worry less about preserving the relationship — which led to a much stronger relationship than the wrong things I do by default would have produced.)
It's fancy and indirect, compared to getting out of bed.
They didn't need to deal with social media informing them that they need to be traumatized now, and form a conditional prediction of extreme and self-destructive behavior later.
That does sound similar to me! But I haven't gotten a lot of mileage out of TAPs (trigger-action plans), and if you're referring to some specific advanced version of them, maybe I'm off. But the basic concept — mentally rehearsing the trigger, the intended action, and (in some variations) the later sequence of events leading up to an outcome you feel is good — sure sounds to me like trying to load a plan into a predictorlike thing that has been repurposed to output plan images.
This is just straight-up planning and doesn't require doing weird gymnastics to deal with a biological brain's broken type system.
Nope. Breaks the firewall. Exactly as insane.
Beliefs are for being true. Use them for nothing else.
If you need a good thing to happen, use a plan for that.
Oh, absolutely not. Our incredibly badly designed bodies do insane shit like repurposing superoxide as a metabolic signaling molecule. Our incredibly badly designed brains have some subprocesses that take a bit of predictive machinery lying around and repurpose it to send a control signal, which is even crazier than the superoxide thing, which is pretty crazy. Prediction and planning remain incredibly distinct as structures of cognitive work, and the people who try to deeply tie them together by writing wacky equations that sum them both together, plus an entropy term thrown in, are nuts. It's like the town that put up a sign with its elevation, population, and year founded, plus the total of those numbers. But one reason the malarky rings true to the knowlessones is that the incredibly badly designed human brain actually is grabbing some bits of predictive machinery and repurposing them for control signals, just as the human metabolism has decided to treat insanely reactive molecular byproducts as control signals. The other reason, of course, is the general class of malarky that consists of telling a susceptible person that two different things are the same.
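(For readers wondering which equation is being mocked: my guess — an assumption, not stated in the original — is the standard variational free energy from the "active inference" literature, which does literally take the form of one expected-energy term minus one entropy term summed into a single number:

```latex
% Variational free energy F, for observations o and hidden states s,
% with approximate posterior q(s): an "energy" term minus an entropy term.
F \;=\; \mathbb{E}_{q(s)}\!\left[-\ln p(o, s)\right] \;-\; H\!\left[q(s)\right]
```

On the active-inference account, minimizing this one quantity is supposed to cover both perception and action — which is exactly the "elevation plus population plus year founded" move being ridiculed.)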
The technique is older than the "active inference" malarky, but the way I wrote about it is influenced by my annoyance with "active inference" malarky.
This is about "insane" in the sense of people ceasing to meet even their own low bars for sanity.
...amazing.