This post is unusually white. The two arguments -- all shades of gray being seen as the same shade and science being a demonstrably better "religion" -- have seriously expanded my mind. Thank you!
That which I cannot eliminate may be well worth reducing.
I wish this basically obvious point was more widely appreciated. I've participated in dozens of conversations which go like this:
Me: "Government is based on the principle of coercive violence. Coercive violence is bad. Therefore government is bad." Person: "Yeah, but we can't get rid of government, because we need it for roads, police, etc." Me: " $%&*@#!! Of course we can't get rid of it entirely, but that doesn't mean it isn't worth reducing!"
Great post. I encourage you to expand on the idea of the Quantitative Way as applied to areas such as self improvement and everyday life.
Seriously, what some people call "government" is the ground upon which civilization, and ultimately all rationality, rests.
I was nodding along until: "The ground upon which all rationality rests".
You seem to have fallen into the same trap of self-defeating hyperbole that the quoted straw-libertarian has fallen into. It is enough to make your point that government, and the implied threat of violence, is not all bad and is even useful. Don't try to make ridiculous claims about "all rationality". Apart from being a distasteful abuse of 'rational' as an applause light, it is also false. With actual rational agents, all sorts of alternative arrangements not fitting the label "government" would be just as good---it is the particular quirks of humans that make government more practical for us right now.
All who love this post, do you love it because it told you something you didn't know before, or because you think it would be great to show others who you don't think understand this point? I worry when our readers' favorite posts are based on how much they agree with the post, instead of how much they learned from it.
It's possible both are true: that the reader understood the point already, but learned a better way to articulate it in an effort to advance another conversation.
For me, the main point is that incremental advancement towards perfection means expending resources and creating other consequences. The questions ultimately have to be 'how much is it worth to move closer to perfection? What other consequences probably will happen?' These questions obviously depend on your context. It appears that some kinds of perfectionism, in the view of psychologists, the relevant experts on the matter, have negative effects on the holder of perfectionistic standards, and that costs have to be considered when moving in...
Robin, I think people tend to be enthusiastic when an idea they've known on a more or less intuitive level for a long time is laid out eloquently, and in a way they could see relaying to their particular audience. It's a form of relief, maybe.
So it's not so much "I like it because I agree with it," it's more "I like it because I knew it before but I could never explain it that well."
/unscientific guessing
Robin,
I'm with LG; the answer to your question is 'neither'. I also enjoy posts which reinforce my way of thinking, but a straight account of what I already think myself wouldn't draw praise. Crystallization of a hitherto-unclear concept can be invaluable - I quote:
"What I got out of the paragraph was something which seems so obvious in retrospect that I could have conceivably picked it up in a hundred places; but something about that one paragraph made it click for me."
Mike, any action or updating of beliefs will have a net effect on 'whiteness' ...
Then why do they say "Science is based on faith too!" in that angry-triumphal tone, rather than as a compliment?
When used appropriately, the "science is based on faith too" point is meant to cast doubt upon specific non-falsifiable conclusions that scientists take for granted: for instance, that the only things that exist are matter (rather than, say, an additional immaterial spirit) or that evolution happens by itself (rather than, say, being directed by an intelligent designer). Scientific evidence doesn't distinguish between these h...
Utilitarian, you said:
non-falsifiable conclusions that scientists take for granted: for instance, that the only things that exist are matter (rather than, say, an additional immaterial spirit) or that evolution happens by itself (rather than, say, being directed by an intelligent designer).
How much time did you spend trying to come up with predictions from these hypotheses before declaring them unfalsifiable?
How much time did you spend trying to come up with predictions from these hypotheses before declaring them unfalsifiable?
Not much; it's possible that these hypotheses are falsifiable (in the sense of having a likelihood ratio < 1 compared against the other corresponding hypothesis). I was assuming this wasn't true given only the evidence currently available, but I'd be glad to hear if you think otherwise.
It's easy to think of potential observations that would very strongly favor dualism or intelligent design, and the absence of those observations counts as falsifying evidence.
I think it's worth keeping the distinction between falsification (a likelihood ratio of 0) and disconfirmation (a likelihood ratio < 1). Usually when people say "unfalsifiable" they really mean "undisconfirmable" or "unstronglydisconfirmable".
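To spell the distinction out, here is a minimal sketch in the odds form of Bayes' theorem (my own framing of the point above, with $H_1$, $H_2$, and $E$ as generic placeholders): the posterior odds between two hypotheses are the prior odds multiplied by the likelihood ratio,

\[
\frac{P(H_1 \mid E)}{P(H_2 \mid E)} \;=\; \frac{P(H_1)}{P(H_2)} \times \frac{P(E \mid H_1)}{P(E \mid H_2)}.
\]

If the likelihood ratio is exactly 0 (the evidence is impossible under $H_1$), the posterior odds collapse to 0 and $H_1$ is falsified; if the ratio is merely less than 1, the odds shrink but survive, which is disconfirmation.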
Dan Burfoot, permit me to join in those conversations:
Me: "No, coercive violence is merely a shade of gray. Another harm of the status quo, like sick children, may be a darker shade of gray, in which case I'm willing to become a little darker so I can gain more lightness overall. For example, I don't think there's much opposition to using coercive violence to protect the life of infants (criminalizing infanticide, taxation to support wards of state, etc.). Of course, opinions on the relative light/darkness of coercive violence vs. other 'bad' differ, and therein lies the popular contention between 'big govt' vs. 'small govt,' not whether government based on coercive violence, or that coercive violence is bad."
This post reminds me of Isaac Asimov's The Relativity of Wrong, which is excellent (see its Wikipedia page).
LG, doesn't that mean you like the post specifically because it appeals to confirmation bias, one of the known biases we should be seeking to overcome?
In other words, "numbers matter". But I suppose mentioning numbers eliminates most of your audience.
Thanks, Eliezer, for an excellent article. Some of my favorite quotables:
the Quantitative Way
Everything is shades of gray, but there are shades of gray so light as to be very nearly white, and shades of gray so dark as to be very nearly black.
If science is a religion, it is the religion that heals the sick and reveals the secrets of the stars.
"Everyone is imperfect" is an excellent example of replacing a two-color view with a one-color view.
Then there's the fallacy of shades of gray: that every space can be reasonably modeled as 1-dimensional.
This was a very useful post and one I will be adding to my daily dossier, I know. I agree this is a good "start post" because it is lucid, clear, and useful. There's little I feel I can add at the moment, as doing so would simply be glorifying the post itself rather than using the knowledge gained, so thank you for the post.
I'm glad this post is here! Today, I came across this lovely little statement on Xanga: "Richard Dawkins admitted recently that he can't be sure that God does not exist. He is generally considered the World's most famous Atheist. So this question is for Atheists. Can you be sure that God does not exist?"
It made me cranky right away (I promise, I was more patient many many instances of this sentiment ago), and my first response was to link here in a comment. Well, I'm glad this post is here to link to. Grr.
Surprised no-one's yet noted that the proper name for this is the continuum fallacy or sorites fallacy.
I don't follow the relevance of the article, as it seems quite obvious. The real problem with black and white in the world of rationality is the assumption that there is a universal answer to all questions. The idea of "grey" helps highlight that many questions have no one correct universal answer. What I don't understand about rationalists (LW rationalists) is that they live in a world in which everything is either right or wrong. This simplifies a world that is not so simple. What am I missing?
Science is not based on faith, nor on anything else. Scientific knowledge is created by conjecture and criticism. See Chapter I of "Realism and the Aim of Science" by Karl Popper.
I came across a good example of this. I recently graduated from a coding bootcamp and am looking for jobs. I applied to a selective company and was declined. They said, "unfortunately we won't be able to move forward with your candidacy at this time". They didn't say anything about the actual reason why I was rejected.
(paraphrased conversation with my friend)
My favorite part of this post, directly after reading it, was the highlighting of the apparent contradiction between the faithist's pride in their own faith and the condemnation implicit in their accusation that science relies on faith.
But I noticed I didn't feel I totally understood the dynamics in play in such a mind, and decided to think about it over pasta.
My tentative conclusion:
This is not, I think, a case of bare-faced irrationality per se, as per "What would you do with immortality" when conjoined with "I have an immortal soul."
The condemnation in t...
When I first read this post back in ~2011 or so, I remember remembering a specific scene in a book I had read that talked about this error and even gave it the same name. I intended to find the quote and post it here, but never bothered. Anyway, seeing this post on the front page again prompted me to finally pull out the book and look up the quote (mostly for the purpose of testing my memory of the scene to see if it actually matched what was written).
So, from Star Wars X-Wing: Isard's Revenge, by Michael A. Stackpole (page 149 of the paperback edition):
...T
I think one important problem, elided here, is that when problems are highly multidimensional then shades of grey will be harder to distinguish. At the extremes, yes, we can say that Gandhi and Stalin are imperfect in quantitatively different amounts. But most of the important life decisions we make can be evaluated on so many different dimensions of value that discriminating and integrating across them feels intractable. Even 3 or 4 dimensions makes the problem so effortful (and perhaps impossible if the dimensions are not commensurable) that falling back to intuition becomes the only pragmatic solution.
A related pattern I noticed recently:
Alice asked for a one-variable model with limited but positive predictive power, and Bob replied with a zero-variable model with no predictive power whatsoever.
Necro but maybe I can add something to the debate....
A problem I see is that there are common cases where it is rational to be irrational, for example if being rational causes you emotional distress due to circumstances beyond your control.
And this is a big problem if one's will to be "rational" is at root based on an emotional will to be "less wrong" for the purpose of improving internal feelings of one's own value.
Because if that is the naked honest goal, then that rationalism is Hedonism by yet another name.
But realizing that might be destabilizing to the ra...
“I try to mostly just straightforwardly apply economic theory, adding little personal or cultural judgment.”
Another problem with this is "economic theory" is not monolithic. There are different schools of thought within economics, and applying economic theory No. 1 from X school might imply completely different things than applying it from Y school. Economics is a fractured, competitive field of concepts to say the least. Go listen to an argument between Neoclassical economists and Post-Keynesian economists and see what they agree on.
I don't know if the Sophisticate's mistake has an official name, but I call it the Fallacy of Gray. We saw it manifested in yesterday's post—the one who believed that odds of two to the power of seven hundred and fifty million to one, against, meant "there was still a chance". All probabilities, to him, were simply "uncertain" and that meant he was licensed to ignore them if he pleased.
"The Moon is made of green cheese" and "the Sun is made of mostly hydrogen and helium" are both uncertainties, but they are not the same uncertainty.
Everything is shades of gray, but there are shades of gray so light as to be very nearly white, and shades of gray so dark as to be very nearly black. Or even if not, we can still compare shades, and say "it is darker" or "it is lighter".
Years ago, one of the strange little formative moments in my career as a rationalist was reading this paragraph from The Player of Games by Iain M. Banks, especially the sentence in bold:
Now, don't write angry comments saying that, if societies impose fewer of their values, then each succeeding generation has more work to start over from scratch. That's not what I got out of the paragraph.
What I got out of the paragraph was something which seems so obvious in retrospect that I could have conceivably picked it up in a hundred places; but something about that one paragraph made it click for me.
It was the whole notion of the Quantitative Way applied to life-problems like moral judgments and the quest for personal self-improvement. That, even if you couldn't switch something from on to off, you could still tend to increase it or decrease it.
Is this too obvious to be worth mentioning? I say it is not too obvious, for many bloggers have said of Overcoming Bias: "It is impossible, no one can completely eliminate bias." I don't care if the one is a professional economist, it is clear that they have not yet grokked the Quantitative Way as it applies to everyday life and matters like personal self-improvement. That which I cannot eliminate may be well worth reducing.
Or consider this exchange between Robin Hanson and Tyler Cowen. Robin Hanson said that he preferred to put at least 75% weight on the prescriptions of economic theory versus his intuitions: "I try to mostly just straightforwardly apply economic theory, adding little personal or cultural judgment". Tyler Cowen replied:
Yes, but you can try to minimize that effect, or you can do things that are bound to increase it. And if you try to minimize it, then in many cases I don't think it's unreasonable to call the output "straightforward"—even in economics.
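As a rough illustration of what a "75% weight" could mean quantitatively (my own hypothetical reading, not something Hanson spells out), one could take a convex combination of two point estimates:

\[
\hat{x} \;=\; 0.75\, x_{\text{theory}} + 0.25\, x_{\text{intuition}},
\]

where $x_{\text{theory}}$ is what economic theory prescribes and $x_{\text{intuition}}$ is one's gut answer. The exact mixing rule is invented for illustration; the point is only that the weighting is a matter of degree rather than all-or-nothing.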
"Everyone is imperfect." Mohandas Gandhi was imperfect and Joseph Stalin was imperfect, but they were not the same shade of imperfection. "Everyone is imperfect" is an excellent example of replacing a two-color view with a one-color view. If you say, "No one is perfect, but some people are less imperfect than others," you may not gain applause; but for those who strive to do better, you have held out hope. No one is perfectly imperfect, after all.
(Whenever someone says to me, "Perfectionism is bad for you," I reply: "I think it's okay to be imperfect, but not so imperfect that other people notice.")
Likewise the folly of those who say, "Every scientific paradigm imposes some of its assumptions on how it interprets experiments," and then act as if they'd proven science to be on the same level as witchdoctoring. Every worldview imposes some of its structure on its observations, but the point is that there are worldviews which try to minimize that imposition, and worldviews which glory in it. There is no white, but there are shades of gray that are far lighter than others, and it is folly to treat them as if they were all on the same level.
If the moon has orbited the Earth these past few billion years, if you have seen it in the sky these last years, and you expect to see it in its appointed place and phase tomorrow, then that is not a certainty. And if you expect an invisible dragon to heal your daughter of cancer, that too is not a certainty. But they are rather different degrees of uncertainty—this business of expecting things to happen yet again in the same way you have previously predicted to twelve decimal places, versus expecting something to happen that violates the order previously observed. Calling them both "faith" seems a little too un-narrow.
It's a most peculiar psychology—this business of "Science is based on faith too, so there!" Typically this is said by people who claim that faith is a good thing. Then why do they say "Science is based on faith too!" in that angry-triumphal tone, rather than as a compliment? And a rather dangerous compliment to give, one would think, from their perspective. If science is based on 'faith', then science is of the same kind as religion—directly comparable. If science is a religion, it is the religion that heals the sick and reveals the secrets of the stars. It would make sense to say, "The priests of science can blatantly, publicly, verifiably walk on the Moon as a faith-based miracle, and your priests' faith can't do the same." Are you sure you wish to go there, oh faithist? Perhaps, on further reflection, you would prefer to retract this whole business of "Science is a religion too!"
There's a strange dynamic here: You try to purify your shade of gray, and you get it to a point where it's pretty light-toned, and someone stands up and says in a deeply offended tone, "But it's not white! It's gray!" It's one thing when someone says, "This isn't as light as you think, because of specific problems X, Y, and Z." It's a different matter when someone says angrily "It's not white! It's gray!" without pointing out any specific dark spots.
In this case, I begin to suspect psychology that is more imperfect than usual—that someone may have made a devil's bargain with their own mistakes, and now refuses to hear of any possibility of improvement. When someone finds an excuse not to try to do better, they often refuse to concede that anyone else can try to do better, and every mode of improvement is thereafter their enemy, and every claim that it is possible to move forward is an offense against them. And so they say in one breath proudly, "I'm glad to be gray," and in the next breath angrily, "And you're gray too!"
If there is no black and white, there is yet lighter and darker, and not all grays are the same.
Addendum: G points us to Asimov's The Relativity of Wrong: "When people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together."