I like this post. Sneaking "scary" ideas into fiction, where they can be faced in a context that feels safer - that makes a lot of sense to me. And while I think you're right that it's tricky to consciously use the technique on yourself, I've certainly had it happen that way for me accidentally. (Though I think it's worth mentioning that the moment of realization - the moment it hit me that the logical or moral conclusion I had accepted in a fictional context was also valid/applicable in real life - was still sometimes painful or at least jarring.)
You asked about other ways to "reduce the perceived hedonic costs of truthseeking". I have an example of my own that might be relevant, especially to the word "perceived". Have you ever seen that trick where someone pulls a tablecloth off a table quickly and smoothly enough that all the plates and glasses and things stay right where they were?
I was speaking to a friend-of-a-friend to whom I had just been introduced - call her Jenny. In casual conversation, Jenny brought up her belief in crystal healing and asked me directly what I thought of it. Our mutual friend winced in horror, because she knew how I felt about woo and anticipated a scathing response, or at least a condescending lecture about evidence-based medicine.
I'm not completely tactless, and Jenny was nice. I didn't want to ruin her evening over some stupid crystals. I had an idea. I said, as near as I can recall, this:
"Oh, yes, I think crystal healing is amazing! Gosh, when you think that just by looking at a little piece of quartz or hematite or topaz and thinking about things like mental clarity or relaxation, we have the power to lower our anxiety levels, lessen our feelings of fatigue, even reduce our own blood pressure - I mean it's such a beautiful example of the power of the human mind, isn't it?"
And more in the same vein. Basically I gushed for five minutes about how cool the placebo effect is (without once using the term "placebo effect") and how cool the natural world is, and how cool it is that we're constantly learning more about things that used to be mysteries, and so on.
My friend was relieved, and Jenny was nodding - a little hesitantly, as if slightly bewildered by something she couldn't quite put her finger on - but she was listening, she wasn't upset or defensive or annoyed, and the party proceeded without awkwardness or rancor.
I didn't tell any lies. Crystal healing does work, in the sense that it's better than nothing. Of course almost anything that doesn't do active harm or negate the effects of real treatments works better than nothing - that's the beauty of the placebo. Doesn't really matter if it's administered via sugar pill or amethyst or homeopathic milkshake, if the belief is there (and I've seen some intriguing evidence to suggest that even true belief isn't necessary, by the way - you might only need hope).
See what I mean about the tablecloth trick? I was able to introduce Jenny to a less-wrong way of thinking about crystals without the hedonic cost of totally dismantling her beliefs. Now, I don't think I convinced her that crystals aren't filled with mysterious healing energy, and we never got near the fact that real medicine should work better than a placebo, but it still felt like a win - because I slipped a line of retreat into her head without setting off her intruder-alert. I gave her the plans for a model where her beloved crystals are cool and interesting and not-useless and not-lame that doesn't rely on them being magic. I showed her that you could take away the tablecloth and leave her good china in place.
It's a small example but I think there's an argument for minimizing perceived hedonic cost by demonstrating to someone that the absence of one cherished belief does not necessarily mean that every cherished belief or value that apparently rests upon it must come crashing down. Relinquishing belief in the magic of crystals doesn't mean Jenny has to throw out her collection of pretty rocks. Relinquishing belief in God doesn't mean a life without joy or meaning or domestic felicity and I think that's the kind of thing a lot of people are really afraid of losing, not the abstract idea of God's existence itself. They need to know there's a table under there.
(Upvoted.) Just wanted to say, "Welcome to LessWrong."
Related: Leave a Line of Retreat
When I was smaller, I was sitting at home watching The Mummy, with my mother, ironically enough. There's a character by the name of Bernard Burns, and you only need to know two things about him. The first thing you need to know is that the titular antagonist steals his eyes and tongue because, hey, eyes and tongues spoil after a while, you know, and it's been three thousand years.
The second thing is that Bernard Burns was the spitting image of my father. I was terrified! I imagined my father, lost and alone, certain that he would die, unable to see, unable even to properly scream!
After this frightening ordeal, I had the conversation in which it is revealed that fiction is not reality, that actions in movies don't really have consequences, that apparent consequences are merely imagined and portrayed.
Of course I knew this on some level. I think the difference between the way children and adults experience fiction is a matter of degree, not kind. And when you're an adult, suppressing those automatic responses to fiction has itself become so automatic that you experience fiction as a thing compartmentalized. You always know that the description of consequences in the fiction will not by magic have fire breathed into them, that Imhotep cannot gently step out of the frame and really remove your real father's real eyes.
So even though we often use fiction to engage, to make things feel more real, I think that once we grow up, fiction also gives us the chance to entertain formidable ideas at a comfortable distance.
A great user once said, "Vague anxieties are powerful anxieties." Related to this is the simple rationality technique of Leaving a Line of Retreat: before evaluating the plausibility of a highly cherished or deeply frightening belief, one visualizes the consequences of the highly cherished belief being false, or of the deeply frightening belief being true. We hope that it will thereby become just a little easier to evaluate the plausibility of that belief, for if we are wrong, at least we know what we would do about it. Sometimes - perhaps often - what you'd really do about it isn't as bad as your intuitions would have you think.
If I had to put my finger on the source of that technique's power, I would name its ability to reduce the perceived hedonic costs of truthseeking. It's hard to estimate the plausibility of a charged idea because you expect your undesired outcome to feel very bad, and you naturally avoid that feeling. The trick is in realizing that, in any given situation, you have almost certainly overestimated how bad it would really feel.
But Sun Tzu didn't just plan his own retreats; he also planned his enemies' retreats. What if your interlocutor has not practiced the rationality technique of Leaving a Line of Retreat? Well, Sun Tzu might say, "Leave one for them."
As I noted at the beginning, adults automatically compartmentalize fiction away from reality. It is simply easier for me to watch The Mummy now than it was when I was eight. The formidable idea of my father having his eyes and tongue removed is easier to hold at a distance.
Thus, I hypothesize, truth in fiction is hedonically cheap to seek.
When you recite the Litany of Gendlin, you do so because it makes seemingly bad things seem less bad. I propose that the idea generalizes: when you're experiencing fiction, everything seems less bad than its conceivably real counterpart because it's stuck inside the book, and any ideas within will then seem less formidable. The idea is that you can use fiction as an implicit line of retreat, that you can use it to make anything seem less bad by making it make-believe, and thus, safe. The key, though, is that not everything inside of fiction is stuck inside of fiction forever. Sometimes conclusions that are valid in fiction also turn out to be valid in reality.
This is hard to use on yourself, because you can't turn a genuinely scary idea into fiction, or shoehorn it into existing fiction, and then expect it to feel far away. You'll know where the fiction came from. But I think it works well on others.
I don't think I can really get the point across in the way that I'd like without an example. This proposed technique was an accidental discovery, like popsicles or the Slinky:
A history student friend of mine was playing Fallout: New Vegas, and he wanted to talk to me about which ending he should choose. The conversation seemed mostly optimized for entertaining one another, and, hoping not to disappoint, I tried to intertwine my fictional ramblings with bona fide insights. The student was considering giving power to a democratic government, but he didn't feel very good about it, mostly because this fictional democracy was meant to represent anything that anyone has ever said is wrong with at least one democracy, plausible or not.
"The question you have to ask yourself," I proposed to the student, "is 'Do I value democracy because it is a good system, or do I value democracy per se?' A lot of people will admit that they value democracy per se. But that seems wrong to me. That means that if someone showed you a better system that you could verify was better, you would say 'This is good governance, but the purpose of government is not good governance, the purpose of government is democracy.' I do, however, understand democracy as a 'current best bet' or local maximum."
I have in fact gotten wide-eyed stares for saying things like that, even granting the closing ethical injunction about democracy as a local maximum. I find that unusual, because not equating democracy with good governance seems like one of the first steps you would take toward thinking about politics clearly. If you lived further in the past, when the fashionable political system was not democracy but monarchy, and you, like many others, considered democracy preferable to monarchy, then upon a future human revealing to you the notion of a modern democracy, you would find yourself saying, regrettably, "This is good governance, but the purpose of government is not good governance, the purpose of government is monarchy."
But because we were arguing about fictional governments, our autocracies, or monarchies, or whatever non-democratic governments heretofore unseen, could not by magic have fire breathed into them. For me to entertain the idea of a non-democratic government in reality would have elicited incredulous stares. For me to entertain the idea in fiction is good conversation.
The student is one of two people with whom I've had this precise conversation, and I do mean in the particular sense of "Which Fallout ending do I pick?" I snuck this opinion into both, and both came back weeks later to tell me that they spent a lot of time thinking about that particular part of the conversation, and that the opinion I shared seemed deep.
Also, one of them told me that they had recently received some incredulous stares.
So I think this works, at least sometimes. It looks like you can sneak scary ideas into fiction, and make them seem just non-scary enough for someone to arrive at an accurate belief about that scary idea.
I do wonder, though, if you could generalize this even more. How else could you reduce the perceived hedonic costs of truthseeking?