Of course I'm keeping my eye out for literature on improving empathy. All the reviews I found so far said that we're not sure how to do that yet, because the studies do not give strong and clear results. Most of the literature is about trying to train medical workers to have empathy.
Emotional awareness is a skill that can be cultivated, and doing so increases one's agreeableness. Watch a disagreeable person in action and it's pretty obvious that they're not really picking up how other people are reacting to their behavior. Note that it's much easier to see disagreeable behavior in others than in oneself. The challenge in becoming more agreeable lies partly in seeing yourself as others see you.
if you really want to know how valid a particular idea you've read is, there are quantitative ways to get closer to answering that question.
The ultimate in quantitative analysis is to have a system predict what your opinion should be on any arbitrary issue. The TakeOnIt website does this by applying a collaborative filtering algorithm to a database of expert opinions. To use it, you first enter opinions on issues that you understand and feel confident about. The algorithm can then calculate which experts you have the highest opinion correlation with. It then extrapolates what your opinion should be on issues you don't even know about, based on the assumption that your correlation with each expert should remain constant. I explained the concept in more detail a while ago on Less Wrong here, but have since actually implemented the feature. Here are TakeOnIt's predictions of Eliezer's opinions. The more people add expert opinions to the database, the more accurate the predictions become.
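To make the idea concrete, here is a minimal sketch of that kind of collaborative filtering. Everything here is an illustrative assumption (the issue names, the numeric opinion scale, the agreement score), not TakeOnIt's actual implementation:

```python
# Opinions coded on a numeric scale: -2 (strongly disagree) .. +2 (strongly agree).
# Both the experts and the issues below are made-up examples.
expert_opinions = {
    "Expert A": {"cryonics works": 2, "AGW is real": 2},
    "Expert B": {"cryonics works": -2, "AGW is real": 2},
}

def correlation(user, expert):
    """Agreement score over the issues both have rated, in [-1, 1]."""
    shared = [i for i in user if i in expert]
    if not shared:
        return 0.0
    agree = sum(user[i] * expert[i] for i in shared)
    # Normalise by the maximum possible agreement on the shared issues.
    max_agree = sum(abs(user[i]) * abs(expert[i]) for i in shared)
    return agree / max_agree if max_agree else 0.0

def predict(user, issue):
    """Weight each expert's stance on an unrated issue by how well
    that expert correlates with the user on the rated issues."""
    num = den = 0.0
    for expert in expert_opinions.values():
        if issue not in expert:
            continue
        w = correlation(user, expert)
        num += w * expert[issue]
        den += abs(w)
    return num / den if den else None

user = {"AGW is real": 2}
print(predict(user, "cryonics works"))
```

Here the user agrees equally with both experts on the one issue they rated, so the experts' opposing stances on cryonics cancel out; rating more issues would break the tie, which is why predictions improve as the opinion database grows.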
Note that the website currently requires you to publicly comment on an issue in order to get your opinion predictions. A few people have requested that you should be able to enter your opinion without having to comment. If enough people want this, I'll implement that feature.
one of my dreams is that one day we could develop tools that ... allowed you to estimate how contentious that claim was, how many sources were for and against it... and links to ... tell you about who holds what opinions, and allowed you to somewhat automate the process of reading and making sense of what other people wrote.
That's more or less the goal of TakeOnIt. I'd stress that the biggest challenge here is populating the database of expert opinions rather than building the tools.
An even more ambitious project: making a graph of which studies invalidate or cast doubt on which other studies, on a very big scale, so you could roughly pinpoint the most certain or established areas of science. This would require some kind of systematic method of deducing implication, though.
Each issue on TakeOnIt can be linked to any other issue by adding an "implication" between two issues. Green arrows link supporting positions; red arrows link contradictory positions. So for example, the issue of cryonics links to several other issues, such as the issue of whether information-theoretic death is the most real interpretation of death (which if true, supports the case for cryonics).
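A sketch of the underlying data structure, as I understand it from the description above. The function names and issue wordings are my own illustration, not TakeOnIt's code:

```python
# Implication links between issues: "supports" edges are the green
# arrows, "contradicts" edges are the red arrows.
implications = []  # (premise_issue, conclusion_issue, supports)

def add_implication(premise, conclusion, supports=True):
    implications.append((premise, conclusion, supports))

def supporting_issues(conclusion):
    """Issues whose truth would support the given conclusion."""
    return [p for p, c, s in implications if c == conclusion and s]

def contradicting_issues(conclusion):
    """Issues whose truth would undermine the given conclusion."""
    return [p for p, c, s in implications if c == conclusion and not s]

# The cryonics example from the text, plus a made-up red arrow.
add_implication("Information-theoretic death is the most real "
                "interpretation of death",
                "Cryonics is worthwhile", supports=True)
add_implication("Cellular damage from freezing is irreversible",
                "Cryonics is worthwhile", supports=False)

print(supporting_issues("Cryonics is worthwhile"))
```

Scaled up, the same structure is what the "graph of which studies cast doubt on which other studies" idea would need, with studies as nodes and red arrows as the doubt relation.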
When I wrote "What is Bunk?" I thought I had a pretty good idea of the distinction between science and pseudoscience, except for some edge cases. Astrology is pseudoscience, astronomy is science. At the time, I was trying to work out a rubric for the edge cases (things like macroeconomics).
Now, though, knowing a bit more about the natural sciences, it seems that even perfectly honest "science" is much shakier and likelier to be false than I supposed. There's apparently a high probability that the conclusions of a molecular biology paper will be false -- even if the journal is prestigious and the researchers are all at a world-class university. There's simply a lot of pressure to make results look more conclusive than they are.
In the field of machine learning, whose literature I sometimes read, there are foundational debates about the best methods. Ideas which very smart and highly credentialed people tout often turn out, years down the road, to be ineffective. Apparently smart and accomplished researchers will often claim that some other apparently smart and accomplished researcher is doing it all wrong.
If you don't actually know a field, you might think, "Oh. Tenured professor. Elite school. Dozens of publications and conferences. Huge erudition. That means I can probably believe his claims." Whereas actually, he's extremely fallible. Not just theoretically fallible, but actually has a serious probability of being dead wrong.
I guess the moral is "Don't trust anyone but a mathematician"?
I guess the moral is "Don't trust anyone but a mathematician"?
Safety in numbers? ;)
Perhaps it's useful to distinguish between the frontier of science vs. established science. One should expect the frontier to be rather shaky and full of disagreements, before the winning theories have had time to be thoroughly tested and become part of our scientific bedrock. There was a time after all when it was rational for a layperson to remain rather neutral with respect to Einstein's views on space and time. The heuristic of "is this science established / uncontroversial amongst experts?" is perhaps so boring we forget it, but it's one of the most useful ones we have.
To evaluate a contrarian claim, it helps to break down the contentious issue into its contentious sub-issues. For example, contrarians deny that global warming is caused primarily by humans, an issue which can be broken down into the following sub-issues:
Have solar cycles significantly affected earth's recent climate?
Does cosmic radiation significantly affect earth's climate?
Has earth's orbit significantly affected its recent climate?
Does atmospheric CO2 cause significant global warming?
Do negative feedback loops mostly cushion the effect of atmospheric CO2 increases?
Are recent climatic changes consistent with the AGW hypothesis?
Is it possible to accurately predict climate?
Have climate models made good predictions so far?
Are the causes of climate change well understood?
Has CO2 passively lagged temperature in past climates?
Are climate records (of temperature, CO2, etc.) reliable?
Is the Anthropogenic Global Warming hypothesis falsifiable?
Does unpredictable weather imply unpredictable climate?
It's much easier to assess the likelihood of a position once you've assessed the likelihood of each of its supporting positions. In this particular case, I found that the contrarians made a very weak case indeed.
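One naive way to make that decomposition quantitative, purely as my own illustration (TakeOnIt doesn't do this, and the scores and weights below are invented): rate how strongly each sub-issue came out for or against the main claim, weight by how much it bears on the claim, and combine.

```python
# Combine sub-issue assessments into an overall score for a claim.
# Scores: -1 (sub-issue cuts against the claim) .. +1 (supports it).
# Weights: how strongly each sub-issue bears on the main claim.
# All numbers below are made up for illustration.

def combined_score(assessments):
    """assessments: list of (score, weight) pairs; returns a
    weighted average in [-1, 1], or 0.0 if there are no weights."""
    total_weight = sum(w for _, w in assessments)
    if total_weight == 0:
        return 0.0
    return sum(s * w for s, w in assessments) / total_weight

# Hypothetical assessments for "global warming is caused primarily by humans":
sub_issues = [
    (+0.8, 3.0),  # "Does atmospheric CO2 cause significant warming?"
    (+0.6, 2.0),  # "Are recent changes consistent with the AGW hypothesis?"
    (-0.2, 1.0),  # "Have solar cycles significantly affected recent climate?"
]
print(round(combined_score(sub_issues), 2))
```

The point isn't the arithmetic, which is crude; it's that forcing yourself to score each sub-issue separately makes it obvious when a contrarian case rests on one or two weak links.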
Not only recognizing my mistakes, but actually speaking out loud about them frequently, has given me great strength in doing so on questions that really matter. If you have social status, it is worth spending some of it getting used to not only being wrong, but being socially recognized as wrong by your peers...
If you have social status, it is worth spending some of it getting used to not only being wrong, but being socially recognized as wrong by your peers...
Emperor Sigismund, when corrected on his Latin, famously replied:
I am king of the Romans and above grammar.
I know that most men — not only those considered clever, but even those who are very clever and capable of understanding most difficult scientific, mathematical, or philosophic, problems — can seldom discern even the simplest and most obvious truth if it be such as obliges them to admit the falsity of conclusions they have formed, perhaps with much difficulty — conclusions of which they are proud, which they have taught to others, and on which they have built their lives.
— Leo Tolstoy, 1896 (excerpt from "What Is Art?")
OK, now take the next step. Since most people who are choosing love, belonging, and esteem over accuracy are not aware they are giving up accuracy, then you have to wonder how you can tell when you are doing so. If you are tempted to think that you are an exception who is willing to choose accuracy instead, ask if this is just another kind of group you want to join, or another kind of esteem you hope to acquire. If so, when would this lead you to actually choose more accuracy, vs. just to tell yourself you so choose?
Illusory superiority seems to be the cognitive bias to overcome here.
While the income/happiness correlation does exist, it is an internal comparison rather than an external one. See for example this breakdown by country. The data suggest that people construct their notion of happiness partly by comparison with the material wealth of those around them. While this might not apply to very basic needs (i.e. people starving), it seems to have a substantial impact well before all the physiological needs are met. Incidentally, I'm not convinced that the top tier of Maslow's pyramid is anything other than a culturally mediated set of values rather than a set of intrinsic goods. There's also a fair bit of empirical criticism of the levels of the hierarchy as a whole. This is one good criticism that favors something approximating rationalism over Maslow's set. I've been told that Wahba and Bridwell's work in the 1970s also provides a lot of empirical evidence against Maslow, although I haven't read it myself.
Voted up.
if you have to choose between fitting in with your group etc and believing the truth, you should shun the truth.
I think many people develop a rough map of other people's beliefs, to the extent that they avoid saying things that would compromise fitting in the group they're in. Speaking of which:
irrationalists free-ride on the real-world achievements of rationalists
Trying to get to level 4, are we? (Clearly I'm not ;)) Conversely, you could argue that "irrationalists" are better at getting things done due to group leverage, and that rationalists free-ride on those achievements.
Blogroll / Side Bar Section for Links to Rationality Related Websites. I love Overcoming Bias, but it seems a bit biased that Overcoming Bias is the only other website linked from here.
Reply to this comment with a comment for each website nomination?
Hmm... maybe with this feature new links could be added by users (presuming a minimum karma criterion), and then other users could vote each link up or down, so that the ordering of the list stays organic.