"I feel like I'm not the sort of person who's allowed to have opinions about the important issues like AI risk."
"What's the bad thing that might happen if you expressed your opinion?"
"It would be wrong in some way I hadn't foreseen, and people would think less of me."
"Do you think less of other people who have wrong opinions?"
"Not if they change their minds when confronted with the evidence."
"Would you do that?"
"Yeah."
"Do you think other people think less of those who do that?"
"No."
"Well, if it's alright for other people to make mistakes, what makes YOU so special?"
A lot of my otherwise very smart and thoughtful friends seem to have a mental block around thinking about certain topics, because they're the sort of topics Important People have Important Opinions about. There seem to be two very different reasons for this sort of block:
- Being wrong feels bad.
- They might lose the respect of others.
Be wrong
If you don't have an opinion, you can hold onto the fantasy that someday, once you figure the thing out, you'll end up having a right opinion. But if you put yourself out there with an opinion that's unmistakably your own, you don't have that excuse anymore.
This is related to the desire to pass tests. The smart kids go through school and are taught - explicitly or tacitly - that as long as they get good grades they're doing OK, and if they try at all they can get good grades. So when they bump up against a problem that might actually be hard, there's a strong impulse to look away, to redirect to something else. So they do.
You have to understand that this system is not real - it's just a game. In real life you have to be straight-up wrong sometimes. So you may as well get it over with.
If you expect to be wrong when you guess, then you're already wrong, and paying the price for it. As Eugene Gendlin said:
What is true is already so. Owning up to it doesn't make it worse. Not being open about it doesn't make it go away. And because it's true, it is what is there to be interacted with. Anything untrue isn't there to be lived. People can stand what is true, for they are already enduring it.
What you would be mistaken about, you're already mistaken about. Owning up to it doesn't make you any more mistaken. Not being open about it doesn't make it go away.
"You're already "wrong" in the sense that your anticipations aren't perfectly aligned with reality. You just haven't put yourself in a situation where you've openly tried to guess the teacher's password. But if you want more power over the world, you need to focus your uncertainty - and this only reliably makes you righter if you repeatedly test your beliefs. Which means sometimes being wrong, and noticing. (And then, of course, changing your mind.)
Being wrong is how you learn - by testing hypotheses.
In secret
Getting used to being wrong - forming the boldest hypotheses your current beliefs can truly justify, so that you can correct your model based on the data - is painful, and I don't have a good solution for getting over it except to tough it out. But there's a part of the problem we can separate out: the pain of being wrong publicly.
When I attended a Toastmasters club, one of the things I liked a lot about giving speeches there was that the stakes were low in terms of the content. If I was giving a presentation at work, I had to worry about my generic presentation skills, but also about whether the way I was presenting it was a good match for my audience, and also whether the idea I was pitching was a good strategic move for the company or my career, and also whether the information I was presenting was accurate. At Toastmasters, all the content-related stakes were gone. No one with the power to promote or fire me was present. Everyone was on my side, and the group was all about helping each other get better. So all I had to think about was the form of my speech.
Once I'd learned some general presentation skills at Toastmasters, it became easier to give talks where I did care about the content and there were real-world consequences to the quality of the talk. I'd gotten practice on the form of public speaking separately - so now I could relax about that, and just focus on getting the content right.
Similarly, expressing opinions publicly can be stressful because of the work of generating likely hypotheses, and revealing to yourself that you are farther behind in understanding things than you thought - but also because of the perceived social consequences of sounding stupid. You can at least isolate the last factor, by starting out thinking things through in secret. This works by separating epistemic uncertainty from social confidence. (This is closely related to the dichotomy between social and objective respect.)
Of course, as soon as you can stand to do this in public, that's better - you'll learn faster, you'll get help. But if you're not there yet, this is a step along the way. If the choice is between having private opinions and having none, have private opinions. (Also related: If we can't lie to others, we will lie to ourselves.)
Read and discuss a book on a topic you want to have opinions about, with one trusted friend. Start a secret blog - or just take notes. Practice having opinions at all - opinions you can be wrong about - before you worry about being accountable for your opinions. One step at a time.
Before you're publicly right, consider being secretly wrong. Better to be secretly wrong, than secretly not even wrong.
(Cross-posted at my personal blog.)
I think this clarifies an important area of disagreement:
I claim that there are lots of areas where people have implicit strong beliefs, and it's important to make those explicit to double-check. Credences are important for any remaining ambiguity, but for cognitive efficiency, you should partition off as much as you can as binary beliefs first, so you can do inference on them - and change your mind when your assumptions turn out to be obviously wrong. This might not be particularly salient to you because you're already very good at this in many domains.
This is what I was trying to do with my series of blog posts on GiveWell, for instance - partition off some parts of my beliefs as a disjunction I could be confident enough in to treat as a set of beliefs I could reason about logically. (For instance, Good Ventures has either increasing, diminishing, or constant returns to scale at its given endowment.) What remains is substantial uncertainty about which branch of the disjunction we're in, and that should be parsed as a credence - but scenario analysis requires crisp scenarios, or at least crisp axes to simulate variation along.
Another way of saying this is that from many epistemic starting points it's not even worth figuring out where you are in credence-space on the uncertain parts, because examining your comparatively certain premises will lead to corrections that fundamentally alter your credence-space.
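To make that concrete, here's a rough sketch in Python. The three-way disjunction mirrors the Good Ventures example above, but the credences and per-branch conclusions are made-up placeholders, not claims from the posts:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str        # one crisp branch of the disjunction
    credence: float  # remaining uncertainty about which branch holds
    conclusion: str  # what you'd conclude *if* this branch holds

# Treat the disjunction itself as settled (exactly one branch is true);
# the credences capture only which branch that is.
scenarios = [
    Scenario("increasing returns to scale", 0.2, "placeholder conclusion A"),
    Scenario("constant returns to scale", 0.3, "placeholder conclusion B"),
    Scenario("diminishing returns to scale", 0.5, "placeholder conclusion C"),
]

# The branches are meant to be exhaustive, so the credences should sum to 1.
assert abs(sum(s.credence for s in scenarios) - 1.0) < 1e-9

# Within each branch, reasoning is ordinary logic; only the branch weights are credences.
for s in scenarios:
    print(f"If {s.name} (credence {s.credence:.0%}): {s.conclusion}")
```

The payoff of the partition is that if one of the crisp premises turns out to be obviously wrong, you throw out the whole branch rather than nudging a credence.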
This was helpful to me, thanks.
I think I'd still endorse a bit more of a push towards thinking in credences (where you're at a threshold of that being a reasonable thing to do), but I'll consider further.