Optimism versus cryonics
Within the immortalist community, cryonics is the most pessimistic possible position. Consider the following superoptimistic alternative scenarios:
- Uploading will be possible before I die.
- Aging will be cured before I die.
- They will be able to reanimate a whole mouse before I die; then I'll sign up.
- I could get frozen in a freezer when I die, and they will eventually figure out how to reanimate me.
- I could pickle my brain when I die, and they will eventually figure out how to reanimate me.
- Friendly AI will cure aging and/or let me be uploaded before I die.
Cryonics -- perfusion and vitrification at LN2 temperatures under the best conditions possible -- is far less optimistic than any of these. Of all the possible scenarios where you end up immortal, cryonics is the least optimistic. Cryonics can work even if neither a singularity nor reversal tech arrives for thousands of years. It can work under the slowest technological growth imaginable. All it assumes is that the organization (or its descendants) can survive long enough, that technology doesn't go backwards (on average), and that cryopreservation of a technically sufficient quality can predate reanimation tech.
It doesn't even require the assumption that today's best possible vitrifications are good enough. It's entirely plausible that vitrifications won't start being good enough until 100 years from now, and that reversing them won't be figured out until 500 years after that. Perhaps today's population is doomed because of this. We don't know. But the fact that we don't know exactly what point is good enough is sufficient to make this a worthwhile endeavor at as early a point as possible. It doesn't require optimism -- it simply requires deliberate, rational action. The fact is that we are late to the party. In retrospect, we should have started preserving brains hundreds of years ago. Benjamin Franklin should have gone ahead and had himself immersed in alcohol.
There's a difference between having a fear and being immobilized by it. If you have a fear that cryonics won't work -- good for you! That's a perfectly rational fear. But if that fear immobilizes you and discourages you from taking action, you've lost the game. Worse than lost, you never played.
This is something of a response to Charles Platt's recent article on Cryoptimism: Part 1, Part 2.
A Rational Identity
How facts backfire (previous discussion) discusses the phenomenon where correcting people's mistaken beliefs about political issues doesn't actually make them change their minds. In fact, telling them the truth might even reinforce their opinions and entrench them even more firmly in their previous views. "The general idea is that it's absolutely threatening to admit you're wrong", says one of the researchers quoted in the article.
This should come as no surprise to the people here. But the interesting bit is that the article suggests a way to make people evaluate information in a less biased manner. They mention that one's willingness to accept contrary information is related to one's self-esteem: "Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you'll listen — and if you feel insecure or threatened, you won't."
I suspect that the beliefs that are the hardest to change, even if the person has generally good self-esteem, are those which are central to their identity. If someone's identity is built around capitalism being evil, or socialism being evil, then any arguments about the benefits of the opposite economic system are going to fall on deaf ears. Not only will that identity color their view of the world, but it's likely that they're deriving a large part of their self-esteem from it. Say something that challenges the assumptions built into their identity, and you're attacking their self-esteem.
Keith Stanovich tells us that simply being intelligent isn't enough to avoid bias. Intelligent people might be better at correcting for bias, but there's no strong correlation between intelligence and the disposition to actually correct for your own biases. Building on his theory, we can assume that threatening opinions will push even non-analytical people into thinking critically, but non-threatening ones won't. Stanovich believes that spreading awareness of biases might be enough to help a lot of people, and to some degree it might. But we also know about the tendency to use your awareness of bias only to attack arguments you don't like. In the same way that telling people facts about politics sometimes only polarizes opinions, telling people about biases might only polarize the debate, as everyone comes to think their opposition is hopelessly deluded and biased.
So we need to create a new thinking disposition, one aimed not just at actively attacking perceived threats, but at critically evaluating one's own opinions. That's hard. I've found for a number of years now that the main reason I actively re-evaluate my opinions and update them as necessary is that doing so is part of my identity. I pride myself on not clinging to ideology and on changing my beliefs when it feels like they should be changed. Admitting that somebody else is right and I am wrong does hurt, but it also feels good that I was able to do so despite the pain. And when everyone in a group seems to agree that something is self-evident, it frequently works as a warning sign that makes me question the group consensus. Part of the reason I do that is that I enjoy the feeling of knowing that I'm actively on guard against my mind simply adopting whatever belief happens to be fashionable in the group I'm in.
It seems to me that if we want to actually raise the sanity waterline and get people to evaluate things critically, rather than merely conform to a different group than the norm, a crucial part of that is getting people to adopt an identity of critical thinking. That way, the concept of identity ceases to be something that makes rational thinking harder and starts to actively aid it. I don't really know how one can effectively promote a new kind of identity, but we should probably take lessons from marketers and other people who appeal strongly to emotions. You don't usually pick your identity based on logical arguments. (On the upside, this provides a valuable hint to the question of how to raise rationalist children.)
Undiscriminating Skepticism
Tl;dr: Since it can be cheap and easy to attack everything your tribe doesn't believe, you shouldn't trust the rationality of just anyone who slams astrology and creationism; these beliefs aren't just false, they're also non-tribal among educated audiences. Test what happens when a "skeptic" argues for a non-tribal belief, or argues against a tribal belief, before you decide they're good general rationalists. This post is intended to be reasonably accessible to outside audiences.
I don't believe in UFOs. I don't believe in astrology. I don't believe in homeopathy. I don't believe in creationism. I don't believe there were explosives planted in the World Trade Center. I don't believe in haunted houses. I don't believe in perpetual motion machines. I believe that all these beliefs are not only wrong but visibly insane.
If you know nothing else about me but this, how much credit should you give me for general rationality?
Certainly anyone who was skillful at adding up evidence, considering alternative explanations, and assessing prior probabilities, would end up disbelieving in all of these.
But there would also be a simpler explanation for my views, a less rare factor that could explain it: I could just be anti-non-mainstream. I could be in the habit of hanging out in moderately educated circles, and know that astrology and homeopathy are not accepted beliefs of my tribe. Or just perceptually recognize them, on a wordless level, as "sounding weird". And I could mock anything that sounds weird and that my fellow tribesfolk don't believe, much as creationists who hang out with fellow creationists mock evolution for its ludicrous assertion that apes give birth to human beings.
You can get cheap credit for rationality by mocking wrong beliefs that everyone in your social circle already believes to be wrong. It wouldn't mean that I have any ability at all to notice a wrong belief that the people around me believe to be right, or vice versa -- to further discriminate truth from falsity, beyond the fact that my social circle doesn't already believe in something.
Back in the good old days, there was a simple test for this syndrome that would get quite a lot of mileage: You could just ask me what I thought about God.
The Value of Nature and Old Books
People have always had a religious or quasi-religious reverence for nature. In modern times, some people have started to see nature more as an enemy to be conquered than as a god to be worshiped. Such people point out that uncontrolled nature causes a tremendous amount of human suffering (to say nothing of all the misery that it causes other creatures), and that vast improvements to human welfare have largely been the result of us ceasing to love and fear nature and starting to control it.
There are several common responses to this. One response is that it is solipsistic for humans to measure the value of nature in terms of what is and is not good for us. This strikes me as right only insofar as such a measure ignores the welfare of non-human creatures who have enough going on in terms of consciousness and/or sentience to matter; I think the objection would be without merit if one were to broaden the scope of concern to something like all creatures, present and future, capable of having experiences (who else is there to care about?).
A second response is that seeing ourselves as highly effective lords over nature leads to dangerous overconfidence, which leads to costly mistakes in how we deal with nature. This is a very fair point, but what it really amounts to is a claim that we shouldn't underestimate the enemy, not that the enemy is really a friend. Anyway, the solution to that problem is to become better rationalists and get better at being skeptical regarding our powers, not to retreat into quasi-mystical Gaia worship.
A third response is that getting into a "conquer nature" frame of mind puts people into a "conquer everything" frame of mind and leads to aggression against other people. This might have merit historically, but that problem is also best confronted directly, in this case by more effectively promulgating liberal humanistic values.
"Open-Mindedness" - the video
An interesting little Flash-like video on "open-mindedness" by someone named QualiaSoup (hopefully ironically).
Does anyone know how much effort is required to produce this sort of video, perhaps from a script? We need at least another thousand of these.
What I Tell You Three Times Is True
"The human brain evidently operates on some variation of the famous principle enunciated in 'The Hunting of the Snark': 'What I tell you three times is true.'"
-- Norbert Wiener, from Cybernetics
Ask someone to name a high-profile rationalist, and you'll hear about Richard Dawkins or James Randi or maybe Peter Thiel. Not a lot of people would immediately name Scott Adams, creator of Dilbert. But as readers of his blog know, he's got a deep interest in rationality, and sometimes it shows up in his comics: for example, this one from last week. How many people can expose several million readers to the phrase "Boltzmann brain hypothesis" and have them enjoy it?
So I was very surprised to find Adams was a believer in and evangelist of something that sounded a lot like pseudoscience. "Affirmations" are positive statements made with the belief that saying the statement loud enough and long enough will help it come true. For example, you might say "I will become a syndicated cartoonist" fifteen times before bed every night, thinking that this will in fact make you a syndicated cartoonist. Adams partially credits his success as a cartoonist to doing exactly this.
He admits "it sounds as if I believe in some sort of voodoo or magic", and acknowledges that "skeptics have suggested, and reasonably so, that this is a classic case of selective memory" but still swears that it works. He also has "received thousands of e-mails from people recounting their own experiences with affirmations. Most people seem to be amazed at how well they worked."
None of this should be taken too seriously without a controlled scientific study investigating it, of course. But is it worth the effort of a study, or should it be filed under "so stupid that it's not worth anyone's time to investigate further"?
I think there's a good case to be made from within a rationalist/scientific worldview that affirmations may in fact be effective for certain goals. Not miraculously effective, but not totally useless either.
Where's Your Sense of Mystery?
Related to: Joy in the Merely Real, How An Algorithm Feels From Inside, "Science" As Curiosity-Stopper
Your friend tells you that a certain rock formation on Mars looks a lot like a pyramid, and that maybe it was built by aliens in the distant past. You scoff, and respond that a lot of geological processes can produce regular-looking rocks, and in all the other cases like this closer investigation has revealed the rocks to be completely natural. You think this whole conversation is silly and don't want to waste your time on such nonsense. Your friend scoffs and asks:
"Where's your sense of mystery?"
You respond, as you have been taught to do, that your sense of mystery is exactly where it should be, among all of the real non-flimflam mysteries of science. How exactly does photosynthesis happen, what is the relationship between gravity and quantum theory, what is the source of the perturbations in Neptune's orbit? These are the real mysteries, not some bunkum about aliens. And if we cannot learn to take joy in the merely real, our life will be empty indeed.
But do you really believe it?
I loved the Joy in the Merely Real sequence. But it spoke to me because it's one of the things I have the most trouble with. I am the kind of person who would have much more fun reading about the Martian pyramid than about photosynthesis.
And the one shortcoming of Joy in the Merely Real was that it was entirely normative, and not descriptive. It tells me I should reserve my sense of mystery for real science, but doesn't explain why it's so hard to do so, or why most people never even try.
So what is this sense of mystery thing anyway?
Storm by Tim Minchin
I'm sure many of you have already seen this performance. Tim Minchin's beat poem "Storm" is about the skeptical, secular understanding of the world, the stupidity of quackery and the supernatural, the weight of dishonesty, and joy in the merely real. Contains strong language.
The Skeptic's Trilemma
Followup to: Talking Snakes: A Cautionary Tale
Related to: Explain, Worship, Ignore
Skepticism is like sex and pizza: when it's good, it's very very good, and when it's bad, it's still pretty good.
It really is hard to dislike skeptics. Whether or not their rational justifications are perfect, they are doing society a service by raising the social cost of holding false beliefs. But there is a failure mode for skepticism. It's the same as the failure mode for so many other things: it becomes a blue vs. green style tribe, demands support of all 'friendly' arguments, enters an affective death spiral, and collapses into a cult.
What does it look like when skepticism becomes a cult? Skeptics become more interested in supporting their "team" and insulting the "enemy" than in finding the truth or convincing others. They begin to think "If assigning a .001% probability to Atlantis and not accepting its existence without extraordinarily compelling evidence is good, then assigning 0% probability to Atlantis and refusing to even consider any evidence for its existence must be great!" They begin to deny any evidence that seems pro-Atlantis, and cast aspersions on the character of anyone who produces it. They become anti-Atlantis fanatics.
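(A quick aside on why the gap between .001% and a flat 0% matters so much: a tiny but nonzero prior can still be overturned by strong enough evidence, while a prior of exactly zero can never be updated no matter what you observe. A minimal Bayesian sketch, using purely illustrative numbers rather than anyone's actual estimates:)
\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]
Taking, for the sake of the example, a prior \(P(H) = 10^{-5}\), \(P(E \mid H) = 0.99\), and \(P(E \mid \neg H) = 10^{-9}\):
\[
P(H \mid E) \;\approx\; \frac{0.99 \times 10^{-5}}{0.99 \times 10^{-5} + 10^{-9} \times (1 - 10^{-5})} \;\approx\; 0.9999
\]
With a prior of exactly zero, the same formula gives \(P(H \mid E) = 0\) for every \(E\) with \(P(E) > 0\): no evidence, however compelling, can ever move the belief.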
Wait a second. There is no lost continent of Atlantis. How do I know what a skeptic would do when confronted with evidence for it? For that matter, why do I care?