Conservatives have the term "hate facts" for true statements that the left considers it hateful to believe. Calling something a "hate fact" yourself can be a good way to disarm potential critics who would otherwise make the mistake of assuming that what you're saying is wrong because it's distasteful.
"can be a good way to disarm potential critics"
I don't know why they would be disarmed by that. You're just inviting them to call you evil because you're so full of hate X-/
It goes well for me when I use the term because it starts a conversation about what the term means, and most people agree in the abstract that you shouldn't dislike someone for believing in facts.
The only downside is that it tends to be correlated with an identity that people reject offhand. I know lots of alt-right/paleo-con sites use "hate facts", and sometimes play fast and loose with the term.
PS: Huge fan of your interview series. I've listened to them all!
A lot of belief boils down to trust.
You can believe certain religious claims about miracles because you consider the people who told you them to be trustworthy authorities.
We usually believe that scientific papers are accurate because we trust the authors and the scientific community not to forge the results.
When Carson speaks about the pyramids storing grain, he can't defend that belief with an appeal to authority.
...his reverend or something like that told him (hypothetically). He believes it the same way Obama believes that a man can rise from the dead. He was told by people he trusted, but the opinion wasn't widespread. I doubt that he came up with the idea on his own. It seems to have some group of believers.
Trust in a single person and trust in an institution are two different qualities.
To stay with the analogy: even though individual scientists can fail us, we trust the scientific community as a group. A person who believes that global warming is a hoax because he found a scientist who told him so is doing it wrong. It's the same for a person who takes the beliefs of a random priest as the truth instead of those of the Christian community.
This seems to me like a counterpart of 'keep your identity small'. It's healthy to keep the identity inside your brain small, and it can be healthy to keep the identity you present to your audience small too.
When talking to the public, it would probably be best to split yourself into as many personas as you have different things to talk about. For example, you want to blog about five different topics? Create five different blogs, and use five different pseudonyms on them. Then people who disagree with one of your opinions won't automatically dismiss the remaining ones.
(The downside is that people who deeply agree with you on one of your opinions won't automatically trust you more about the remaining ones. Also, more blogs = less traffic per blog. And you might want a place to explore the interactions between your individual ideas.)
I actually do a similar thing with comments. If I have two remarks, I make two comments, so people can upvote at higher resolution :)
You're right in noticing that public belief signaling is somewhat in conflict with private truth-seeking. Like it or not, we evaluate people's competency based on their stated beliefs, which tend to cluster. Combine this with the fact that deviations from common clusters carry more information than membership in those clusters, and you get today's world.
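To make the "carry more information" point concrete, here's a minimal sketch with toy numbers of my own (not anything stated in the thread): the self-information of hearing a statement with probability p is -log2(p), so a cluster-deviating claim is simply worth more bits to an observer than a boilerplate one.

```python
import math

def surprisal_bits(p):
    """Self-information (in bits) of observing an event with probability p."""
    return -math.log2(p)

# Illustrative assumptions: how often each kind of statement is made
# within the speaker's social cluster.
p_common_belief = 0.90  # boilerplate affirmation of a mainstream view
p_weird_belief = 0.01   # unusual, cluster-deviating claim

print(f"common belief: {surprisal_bits(p_common_belief):.2f} bits")  # ~0.15 bits
print(f"weird belief:  {surprisal_bits(p_weird_belief):.2f} bits")   # ~6.64 bits
```

Hearing someone affirm the cluster default tells you almost nothing about them; hearing a deviation tells you a lot, which is roughly why deviations dominate how we judge people.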
I think you point out the reasons that it can be correct to judge people more harshly for weirder beliefs (as in less common, not as in less plausible). Someone claiming a common belief might be doing so just to pander to the masses, while someone claiming a weird belief probably actually believes it deeply.
I'm guilty of over-updating towards stupid/crazy whenever someone has a cranky belief. I was on board with the bullying of Ben Carson, but in hindsight the man is a neurosurgeon; I'm pretty sure he's smarter than me.
I think we should also look not at the belief itself, but at the way it is presented: for example, whether a person knows that his belief is atypical and that publicly claiming it could damage his reputation.
For example, if someone says, "I give a 1 percent probability to the very unusual idea that the pyramids were built for X, because I read Y," that is good signaling about his intelligence.
Another thing: if we search a person's entire internet history for the most stupid claim he ever made, we will be biased toward underestimating his intelligence.
tl;dr: Social pressure is a thing.
Holding non-mainstream views has always been dangerous (regardless of their truth value). Nowadays, perhaps less so than was typical historically.
I'm growing increasingly convinced that the unfortunate correlations between types of people and types of arguments lead to persistent biases in uncovering actual knowledge.
As an example, MR wrote this article* (which they just linked to again today) on Ben Carson in 2015/11. Cowen's argument is that, while perhaps implausible (though it may have tenuous support), Carson's belief that the pyramids were used for grain storage isn't any more unrealistic than other religious beliefs. If anything, that singular belief is relatively realistic compared to more widely accepted miracles in Christianity or similar religions.
So why does he get so much flak for it? Cowen argues that he shouldn't, that the criticism is unfounded and irrational/inconsistent. Is it? He obviously has a fair point. The problem is that even though the belief itself, analyzed on its own, isn't particularly ridiculous, we all share an expectation that the people who hold this type of belief (let's call them Class B religious beliefs) ARE particularly ridiculous.
This then creates a new equilibrium, where only those people who take their Class B religious beliefs *very* seriously will share them. As a result, when Carson says the pyramids stored grain, our impulse is "wacky!" But when Obama implies he believes Jesus rose from the dead, our impulse is "boilerplate -- he probably doesn't give it too much thought; it's a typical belief, which he might not even hold."
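A toy Bayes calculation shows the selection effect (all numbers are my own illustrative assumptions, not estimates from the thread): if deep believers usually state their Class B belief and casual believers almost never do, then hearing the belief stated sharply raises the odds that the speaker is a deep believer, even when deep believers are rare.

```python
# Toy Bayes sketch of the selection effect (illustrative numbers only).

p_deep = 0.05                # prior: fraction of believers who hold the belief deeply
p_state_given_deep = 0.90    # deep believers usually say it out loud
p_state_given_casual = 0.02  # casual believers rarely risk stating it

# P(states belief), by the law of total probability
p_state = p_state_given_deep * p_deep + p_state_given_casual * (1 - p_deep)

# P(deep believer | states belief), by Bayes' rule
p_deep_given_state = p_state_given_deep * p_deep / p_state

print(f"P(deep | stated) = {p_deep_given_state:.2f}")  # ~0.70, up from a 0.05 prior
```

So the observed pool of people stating Class B beliefs is dominated by the *very* serious believers, which is exactly the equilibrium described above.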
As a result we get this constant mismatch between the type of person who holds a belief and the truth value of the belief itself. I don't mean to only bring up controversial examples, but it's no surprise that this is where these examples thrive. HBD is another common one. While there is something there, which after a fair amount of reading I suspect is overlooked, the type of person who is really passionate about HBD is (more often than not, with exceptions) not the type of person you want over for dinner.
This can suck for people like us. On one hand we want to evaluate individual pieces of information, models, or arguments based on how they map to reality. On the other hand, if we advocate or argue for information that is correlated with an unsavory type of person, we are classified as that type of person. In this sense, for someone whose primary objectives are good social standing and no risk to their career, it would be irrational to publicly blog about controversial topics. It's funny: Scott Alexander was retweeted by Ann Coulter for his SSC post on Trump. He was thrilled, but imagine if he were an aspiring professor? I think he would probably still be fine, because his unique level of genius would still shine through, but lately professors I know who have non-mainstream political views have stopped sharing them publicly for fear of controversy.
This is a topic I think about a lot, and one I now notice becoming a bigger issue in the US. I wonder how to respond. The tension between rationally evaluating an idea and the social irrationality of sharing that analysis is growing.
*(http://marginalrevolution.com/marginalrevolution/2015/11/bully-for-ben-carson.html)