Buddhism advocates decompiling desire in order to avoid suffering, which is tantamount to nihilism.
"Desire is the cause of suffering" is a bad translation of "tanha is the cause of dukkha."
Briefly, the brahmaviharas are the qualities said to remain in a fully awakened Buddha. Those are loving-kindness, compassion, appreciative joy, and equanimity. Equanimity is often mistaken for its near enemy, apathy. All of them could easily be defined as having a quality of desire for well-being, which is incompatible with the version of the idea that Buddhists lobotomize themselves with altered states or something.
Versions of nihilism were also very popular at the time of the Buddha, and he repudiates them as wrong view.
Hm, ok, thanks. I don't think I fully understand+believe your claims. For one thing, I would guess that many people do think and act, under the title "Buddhism", as if they believe that desire is the cause of suffering.
If I instead said "Clinging/Striving is the cause of [painful wheel-spinning in pursuit of something missing]", is that any closer? (This doesn't really fit what I'm seeing in the Wiki pages.) I would also say that decompiling clinging/striving in order to avoid [painful wheel-spinning in pursuit of something missing] is tantamount to nihilism. (But maybe to learn what you're offering I'd have to do more than just glance at the Wiki pages.)
Wisdom is well-calibrated intelligence: enough not to get exploited, but not so much that it provokes hatred.
Laterally through the chronophone
In "Archimedes's Chronophone", Yudkowsky asks: What would you say to Archimedes——what important message would you want to send back in time, to set the world on a hopeworthy course from then on——if you're barred from saying anything that's too anachronistic? That is: What would you say, if the message Archimedes receives is not literally what you said, but rather is whatever would be the output of the timeless principles that you used to generate your message, as applied in Archimedes's mind in his context? He then explains that the question points at advice we can give to ourselves for original thinking. More generally, he writes:
Lateral anachronism
This question doesn't only address what to say to Archimedes through the chronophone, or what to say to ourselves. It also addresses what advice we can give to our contemporaries, when our contemporaries are separated from us by a chasm that's like the chasm that separates us from Archimedes.
This sort of "lateral anachronism" shows up across major differences in mindset, such as between people living in different cultures, countries, or ideologies. (People going along parallel but separate timecourses, you could say.) Someone's context——their education, communities, language, and so on——will determine what {concepts, ways of thinking, ways of being, coordination points, values, possibilities} they'll understand and give weight to. If someone comes from a world different enough from your world, and they try to communicate something important to you, you're prone to, one way or another, not really take on board what they wanted to communicate to you. You'll misunderstand, overtranslate, dismiss, ignore, round off, pigeonhole, be defensive about, or fearfully avoid what they're saying.
Lateral anachronism also shows up in situations of conflict. Every motion the other person makes——every statement, every argument, every proposed conversational procedure, every negotiation, every plea, every supposed common ground——may be a lie, a ploy to mislead you about their beliefs or intentions, trolling bait, a performance to rally their troops or to garner third-party support or maintain their egoic delusion, an exploitation of your good will, a distraction from their hidden malevolent activity, interference with your line of thinking, or an attempt to propagandistically disrupt your own internal political will and motivation. Conflict is a hell of a drug. Any action can be rationalized as deeply nefarious with a bit of effort, and taking that interpretive stance towards another person is perhaps nearly a hardwired instinctive pattern that can trigger and self-sustainingly stay triggered.
Examples of lateral anachronism
You have a detailed argument for why cryonics is high expected value and I should sign up? That just tells me to use weird status moves to push people into ignoring AGI risk and being excited about the upside, because that's me using my accustomed [way to apply social pressure] to get people to buy into my preferred [coordination-point to make my sector of society behave optimistically, regardless of whether or not the "belief" involved actually makes sense].
You demand that people making factual claims relevant to public policy must put explicit probabilities on observable correlates of their statements? That just tells me to demand that people making policy claims must have a PhD and run a major AI lab, because that's [the externally verifiable standard that I'm already prepared to meet and that my ideological opponents are not already prepared to meet]. You double down, saying that explicit probabilities and updates can be mathematically shown to enable truth tracking in the long run? I double down, saying that the community of certified researchers is the only group qualified to discern useful and prudent research from useless and imprudent research, because that's [the underlying judgement criterion that I already like and expect to put me in power].
But it's actually true that you really are trying to be truth-seeking, not power-seeking, you object? And, you go on, you're truth-seeking about truth-seeking, and you'd truth-seekingly update about what you advocate as truth-seeking if there were truth-seekingly-relevant evidence that shows in what direction you truth-seekingly-should update your beliefs about what cognitive policies are truth-seeking? So when my chronophone interprets your underlying cognitive policy as being about power-seeking, the chronophone is being unfair and denying your existence as a truth-seeker, and objectively I should be translating what you say as being truth-seeking? Well, just as you have denied the existence of me, a power-seeker, by asserting, at this meta-level of our discourse, that what's "actually true" or "objective" should be our politically shared criterion, rather than what is the power-seekingly expedient thing to do, so too will I deny the existence of you, a truth-seeker, and discount your claims as just more power-seeking moves intended to gain the upper hand in who gets to adjudicate which cognitive policies will be followed. Because apparently that's what we're doing, denying other people's ways of being. Because that's my [locally-reflectively-stable object-and-meta-level cognitive policy for relating to other people].
Answer?: Rationality
The question is:
Which also answers the question:
Rationality is one class of answers. But that answer doesn't cross all chasms——sometimes it doesn't easily translate to other mindsets, it isn't picked up, it can't be argued into someone who doesn't accept the sort of justifications that come naturally to someone in a Rational mindset. A Rationalist might argue that, by its nature, Rationality can be eventually argued into more or less any person who hasn't completely closed themselves off from noticing new ways of thinking, because Rationality is soooo gewd in such great generality. But even if that's true, the "eventually" might be only after a long time or under specific circumstances. So Rationality doesn't exhaust the practical need for chasm-crossing policies.
What are some answers that more easily cross some chasms that Rationality doesn't easily cross?
Rationality can rightly claim to contain all other virtues——the nameless virtue, the virtue of the void, regenerates all other virtues, even including virtues that describe a self-consciously bounded-compute, boundedly-plastic agent. But we're created already in motion, so that there are fairly low-compute, low-exploration default fallback cognitive policies that are familiar and often okay. These fallbacks often contain core parts of our values, and often don't come already explicit, already in a form that's ready to cleanly update on evidence. There are Chesterton's fences, and crossing these fences sometimes leads into valleys of bad rationality before leading back out into winning.
We are cognitive misers. By strong default, we don't do anything that's resource-intensive (i.e., using up energy, computation, attention, brainware, or plasticity), such as exploring, computing implications, updating our beliefs, or refactoring our goals. Hopping a fence and crossing a valley take resources. The winning Rationality, or wherever else you find yourself, is one destination after some choice of fences hopped and valleys crossed, among many possible destinations (even if they're all on their way to a single more distant destination). Behind, under, or before your current destination, there are fallbacks, original intuitions, primordial instincts.
So: Even if someone doesn't go in for this whole "rationality" business, there are still shards of winning-behavior and shards of value in them. Those shards of winning and shards of value are enough to be powerful, and will have their own local reflective stability and local coherence. They may be enough to embed cognitive policies that are transmissible across chasms of lateral anachronism, more fluently than some more complete Rationality, and that would robustly lead to goodness.
Answer?: Wisdom
One conjectural answer to the dilemma: Wisdom. It would be wise to step back from the Precipice. It would be wise to step out of races and conflicts. It would be wise to back off from posturing. It would be wise to deeply consider the risk of changing everything. It would be wise to relinquish a grasping, clawing need to be the One who brings fire down from heaven. It would be wise to put the sweep of humanity and humaneness above personal ambition and interested looks.
Wisdom is appealing to some, maybe even sometimes where Rationality isn't. Rationality is demanding, hygienic, IQ-dependent, compute-hungry. Wisdom is esteemed. Wisdom makes you someone that others can look to. Wisdom makes life rich and creates the context needed for flourishing. Wisdom dances around suffering. Wisdom maintains balance across the narrow bridge.
How could wisdom be spoken through a chronophone? First we have to see what wisdom is.
What wisdom is
A very partial list:
This list is a blind man touching the elephant's ear. What else would you say about wisdom? Especially things that aren't just Good, Rational, Desirable, Competent.
What wisdom is not
Delimitations
Some things that are nearby wisdom but aren't wisdom:
Where wisdom fails
Perversions of wisdom
Hypothesis
What, if anything, is wisdom? Is it different from sanity?
A hypothesis: Wisdom is getting the first-order bits right.
This definition breaks down when what's needed is calculation, energy, new ideas. (But it would be wise to invest in calculating, directing energy, and creating new ideas, and unwise not to do so.) So the definition should be refined:
So for example, wisdom might get the wrong answer, but wisdom will pause and take its time to think, because pausing and taking the time to think is a familiar way of being. Wisdom doesn't avoid fear, or pretend that there's nothing to fear, but wisdom will track that acute fear compromises judgement——fear is a major fact about a mental state.
The "familiar internal language of living" means the mental elements that we're intimately familiar with, because they are us. It doesn't mean mental elements that we have words for. For example, wisdom will notice when thoughts have been [repeating themselves without going any of the branching paths that would build up a better understanding] and back off from doing that, even if there's not a short word for that. It's something that can be noticed, in the course of familiarly reflecting on familiar mental events, and is sometimes a first-order bit.
Whereas rationality (and sanity?) goes along with competence, it's not so strange to be wise and incompetent. That means you have the important things pointed in the right direction, even if you can't go very far in a direction on your own. We could say:
Questions
A wise person is a person who does things in a way that is not horribly wrong, along all salient dimensions, including reflectively. Is this something that supports an induction? Is someone who gets most first-order bits right likely to also get almost all first-order bits right, and to go on correcting the very wrong first-order bits? Is wisdom a form of correlated coverage? Or no?
Is it something that can be {taught, induced, evoked}? Is it something that spreads? Is there a basin of attraction around it? Do people find wisdom appealing? Should they? Can wisdom be truthfully made more appealing?
Who has been wise? Who has been unwise? When has wisdom mattered? When has wisdom not mattered?
Would a wise person destroy the world with their own hands?