Degrees of Radical Honesty
The Black Belt Bayesian writes:
Promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires, because they’re Nazis or whatever.
Eliezer adds:
If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.
These are both radically high standards of honesty. Thus, it is easy to miss the fact that they are radically different standards of honesty. Let us look at a boundary case.
Thomblake puts the matter vividly:
Suppose that Anne Frank is hiding in the attic, and the Nazis come asking if she's there. Harry doesn't want to tell them, but Stan insists he mustn't deceive the Nazis, regardless of his commitment to save Anne's life.
So, let us say that you are living in Nazi Germany, during WWII, and you have a Jewish family hiding upstairs. There are a couple of brownshirts with rifles knocking on your door. What do you do?
You don't need Kant
Related to: Comments on Degrees of Radical Honesty, OB: Belief in Belief, Cached Thoughts.
"Nothing worse could happen to these labours than that anyone should make the unexpected discovery that there neither is, nor can be, any a priori knowledge at all.... This would be the same thing as if one sought to prove by reason that there is no reason" (Critique of Practical Reason, Introduction).
You don't need Kant to demonstrate the value of honesty. In fact, summoning his revenant can be a dangerous thing to do. You end up in the somewhat undesirable situation of having almost the right conclusion, but having it for the wrong reasons. Reasons you weren't even aware of, because they were all collapsed into the belief, "I believe in person X".
One of the annoying things about philosophy is that the dead simply don't die. Once a philosopher or philosophical doctrine gains some celebrity in the community, it's very difficult to convince anyone afterward that said philosopher or doctrine was flawed. In other words, the philosophical community tends to have problems with relinquishment. As a result, there are still many philosophers who spend their careers studying, for example, Plato, apparently not with the intent of determining what parts of what Plato wrote are correct or still applicable, but rather with the intent of defending Plato from criticism. To prove Plato was right.
Since the community doesn't value relinquishment, the cost of writing a flawed criticism is very low. Therefore, journals are glutted with so-called "negative results": "Kant was wrong", "Hegel was wrong", etc. No one seriously believes otherwise, but writing positive philosophical results is hard, and not writing at all isn't a viable career option for a professional philosopher.
To its credit, MBlume refrains from bringing up Kant in his article on radical honesty, where he cites other, more feasible variants of radical honesty. However, in the comments, Kant rears his ugly head.
Most Rationalists Are Elsewhere
Most healthy intellectual blogs/forums participate in conversations among larger communities of blogs and forums. Rather than just "preaching to a choir" of readers, such blogs often quote and respond to posts on other blogs. Such responses sometimes support, and sometimes criticize, but either way can contribute to a healthy conversation.
If folks at Less Wrong saw themselves as a part of a larger community of rationalists, they would realize that most rationalist authors and readers are not at Less Wrong. To participate in a healthy conversation among the wider community of rationalists, they would often respond to posts at other sites, and expect other sites to respond often to them. In contrast, an insular group defined by something other than its rationality would be internally focused, rarely participating in such larger conversations.
Today at Overcoming Bias I respond to a post by Eliezer here at Less Wrong. Though I post occasionally here at Less Wrong, I will continue to post primarily at Overcoming Bias. I consider myself part of a larger rationalist community, and will continue to riff off relevant posts here and elsewhere. I hope you will continue to see me as a part of your relevant world.
I worry a little that Less Wrong karma score incentives may encourage an inward focus, since karma is so far only scored for internal site activity.
Don't Revere The Bearer Of Good Info
Follow-up to: Every Cause Wants To Be A Cult, Cultish Countercultishness
One of the classic demonstrations of the Fundamental Attribution Error is the 'quiz study' of Ross, Amabile, and Steinmetz (1977). In the study, subjects were randomly assigned to either ask or answer questions in quiz show style, and were observed by other subjects who were asked to rate them for competence/knowledge. Even knowing that the assignments were random did not prevent the raters from rating the questioners higher than the answerers. Of course, when we rate individuals highly the affect heuristic comes into play, and if we're not careful that can lead to a super-happy death spiral of reverence. Students can revere teachers or science popularizers (even devotion to Richard Dawkins can get a bit extreme at his busy web forum) simply because the former only interact with the latter in domains where the students know less. This is certainly a problem with blogging, where the blogger chooses to post in domains of expertise.
Specifically, Eliezer's writing at Overcoming Bias has provided nice introductions to many standard concepts and arguments from philosophy, economics, and psychology: the philosophical compatibilist account of free will, utility functions, standard biases, and much more. These are great concepts, and many commenters report that they have been greatly influenced by their introductions to them at Overcoming Bias, but the psychological default will be to overrate the messenger. This danger is particularly great given his engaging writing style, and it grows when a point already extant in the literature, whether relayed or reinvented, isn't noted as such. To address a few cases of the latter: Gary Drescher covered much of the content of Eliezer's Overcoming Bias posts (mostly very well), from timeless physics to Newcomb's problems to quantum mechanics, in a book back in May 2006, while Eliezer's irrealist meta-ethics would be very familiar to modern philosophers like Don Loeb or Josh Greene, and isn't so far from the 18th century philosopher David Hume.
If you're feeling a tendency to cultish hero-worship, reading such independent prior analyses is a noncultish way to diffuse it, and the history of science suggests that this procedure will be applicable to almost anyone you're tempted to revere. Wallace invented the idea of evolution through natural selection independently of Darwin, and Leibniz and Newton independently developed calculus. With respect to our other host, Hans Moravec came up with the probabilistic Simulation Argument long before Nick Bostrom became known for reinventing it (possibly with forgotten influence from reading the book, or its influence on interlocutors). When we post here we can make an effort to find and explicitly acknowledge such influences or independent discoveries, to recognize the contributions of Rational We, as well as Me.
Cached Selves
by Anna Salamon and Steve Rayhawk (joint authorship)
Related to: Beware identity
A few days ago, Yvain introduced us to priming, the effect where, in Yvain’s words, "any random thing that happens to you can hijack your judgment and personality for the next few minutes."
Today, I’d like to discuss a related effect from the social psychology and marketing literatures: “commitment and consistency effects”, whereby any random thing you say or do in the absence of obvious outside pressure can hijack your self-concept for the medium- to long-term future.
To sum up the principle briefly: your brain builds you up a self-image. You are the kind of person who says, and does... whatever it is your brain remembers you saying and doing. So if you say you believe X... especially if no one’s holding a gun to your head, and it looks superficially as though you endorsed X “by choice”... you’re liable to “go on” believing X afterwards. Even if you said X because you were lying, or because a salesperson tricked you into it, or because your neurons and the wind just happened to push in that direction at that moment.
For example, if I hang out with a bunch of Green Sky-ers, and I make small remarks that accord with the Green Sky position so that they’ll like me, I’m liable to end up a Green Sky-er myself. If my friends ask me what I think of their poetry, or their rationality, or of how they look in that dress, and I choose my words slightly on the positive side, I’m liable to end up with a falsely positive view of my friends. If I get promoted, and I start telling my employees that of course rule-following is for the best (because I want them to follow my rules), I’m liable to start believing in rule-following in general.
All familiar phenomena, right? You probably already discount other people’s views of their friends, and you probably already know that other people mostly stay stuck in their own bad initial ideas. But if you’re like me, you might not have looked carefully into the mechanisms behind these phenomena. And so you might not realize how much arbitrary influence commitment and consistency is having on your own beliefs, or how you can reduce that influence. (Commitment and consistency isn’t the only mechanism behind the above phenomena; but it is a mechanism, and it’s one that’s more likely to persist even after you decide to value truth.)
Rationalist Poetry Fans, Unite!
Related to: Little Johnny Bayesian, Savanna Poets
There are certain stereotypes about what rationalists can talk about versus what's really beyond the pale. So far, Less Wrong has pretty consistently exploded those stereotypes. In the past three weeks, we've discussed everything from Atlantis to chaos magick to "9-11 Truth". But I don't think anything surprised me quite as much as learning that there are a couple of rationalists here with a genuine interest in poetry.
Poetry has not been very friendly to the rational worldview over the past few centuries. What with all the 19th century's talk of unweaving rainbows and the 20th century's talk of quadrupeds swooning into billiard balls, it's tempting to think it reflects some natural order of things, some eternal conflict between Art and Science.
But for most of human history, science and art were considered natural allies. Lucretius' De Rerum Natura, an argument for atheism and atomic theory famous for being the ancient Roman equivalent of The God Delusion, was written in poetry. All through the Middle Ages, artists worked to a philosophy of trying to depict and celebrate natural truth. And the eighteenth century saw a golden age of what was sometimes called "rationalist poetry", a versified celebration of Enlightenment principles.
When William Wordsworth launched his poetic jihad against rationalism, he called his declaration of war The Tables Turned. On a mundane level, the title referred to an argument he was having with his friend, but on a grander scale he was consciously inverting the previous order of Reason as the virtue of poetry. Thus:
Enough of Science and of Art;
Close up these barren leaves;
Come forth, and bring with you a heart
That watches and receives.
Over the next few years, he and fellow jihadis John Keats and Percy Bysshe Shelley were wildly successful in completely changing the poetic ideal. I can't begrudge them their little movement; their poetry ranks among the greatest art ever produced by humankind. But it bears repeating that there was a strong rationalist tradition in poetry before, during, and after the Romantic Era. In its honor, I thought I would share some of my favorite rationalist poems. I make no claims that this is exhaustive, representative, or anything else besides my personal choices.