Hm, I would interpret Cowen's position somewhat differently.
I think his advice is roughly: if someone is saying something you generally agree with, but they hold other bad beliefs, and there is likely to be some social cost due to offensiveness (or predicted offensiveness, given that their position wasn't arrived at via a good process), then you should not align yourself with them.
You may be pro position X, but that doesn't mean you have to endorse every group that espouses position X -- you should pick your allies, and agreeing on X doesn't mean that it's worth it to associate yourself with them.
I don't think he's saying that you should not talk about X, though. Like, you can believe that Covid restrictions should be lifted, and argue for that position and endorse some of those who hold it, without endorsing the convoy. The convoy, on top of being expensive in reputational terms to ally with, has a cluster of salient beliefs that you probably don't endorse if you listen to advice from Tyler.
In other words, I don't read Tyler as saying that you should pick your beliefs based on the reputations of the people loudly advocating for something you agree with. I think Tyler would generally advocate believing things that are true as the most important element. But you should update less on poorly founded opinions than on well-founded ones (obviously), and you should try to associate with people who think things for good reasons, as opposed to those who agree with you for bad ones.
> I think his advice is roughly: if someone is saying something you generally agree with, but they hold other bad beliefs, and there is likely to be some social cost due to offensiveness (or predicted offensiveness, given that their position wasn't arrived at via a good process), then you should not align yourself with them.
I don't think Tyler argues here about social cost.
Tyler generally believes that it's good to defer to experts in their domains. He says things like: economists should listen more to philosophers when it comes to topics with philosophical implications.
Postscript to: Rock is Strong
Always choosing Rock, broadly construed, means you are freerolling on the efforts of others and/or lying. You are, in some important sense, defecting. The problem of an information cascade is real. You are not contributing new true information, instead reinforcing the information and heuristics already out there.
Often this is the most accurate option available to you, or at least the most efficient. Doing the work yourself is hard. Many can’t do better that way and even for those who can it is a lot of work. On average those who disagree are more wrong and end up doing worse. Other times, accuracy is not what is desired. None of that makes it easy to pick the correct rock, but often it is straightforward enough.
The defection that comes from not doing your part for the long-term epistemic commons is often made up for, not only by your lack of a practical option to do otherwise, but also in some cases by the gains from accuracy.
Rewarding those who worship the rock, like rewarding any form of defection, is also defecting. The Queen causes everyone to die in a volcanic eruption because she set up bad incentives, encouraging everyone to spend down the commons.
This can also be thought of partially as explore versus exploit. The majority of actions should be exploit, and most people should mostly be exploiting. There is more variance in ability to explore than exploit, so most people should explore even less than average and let others explore and report back. Exploration is a public good, so it needs to be rewarded beyond its private payoff or there won’t be enough of it. When competition is sufficiently intense or other factors reward exploitation too much, exploration can die out entirely.
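To make the explore/exploit framing concrete, here is a minimal sketch, assuming a standard epsilon-greedy multi-armed bandit in Python. The arm payoffs, the epsilon values, and the run_bandit helper are all hypothetical choices for illustration, not anything from the original post. The point is the failure mode named above: an agent forced into pure exploitation (epsilon = 0) can lock in on a mediocre option, while even a little exploration tends to find the best one.

```python
import random

# Toy epsilon-greedy bandit (illustrative sketch, not from the post).
# Each arm's true mean payoff is unknown; the agent keeps running-mean
# estimates and usually picks the best-looking arm (exploit), but with
# probability epsilon picks a random arm instead (explore).

def run_bandit(true_means, epsilon, steps=10_000, seed=0):
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms
    estimates = [0.0] * n_arms
    total_reward = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore: try something at random
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])  # exploit
        reward = rng.gauss(true_means[arm], 1.0)  # noisy payoff
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean
        total_reward += reward
    return total_reward / steps

arms = [0.3, 0.5, 0.8]  # the last arm is actually best
print("epsilon=0.0 (pure exploit):", run_bandit(arms, epsilon=0.0))
print("epsilon=0.1 (some explore):", run_bandit(arms, epsilon=0.1))
```

Run it and the pure-exploit agent will often settle permanently on whichever arm looked good early and report a lower average payoff; the exploring agent pays a small ongoing cost for its random pulls, which is exactly the private cost that needs to be rewarded for exploration not to die out.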
This is the context for a Marginal Revolution post that nagged at me enough to provide the initial motivation to write this whole thing. Quoted in full; this is not about the examples, it is about the method:
When I saw this, it set off loud alarm bells in my head. Something felt deeply wrong.
Essentially it was saying that if the Wrong Kind of People, who expressed Wrong Views that oppose Narrative, were advocating or backing something, you needed to run away even if they were right this time. Either the Covid restrictions need to be lifted or they don't. 'The people who advocate lifting restrictions hold other offensive views' doesn't even have an obvious sign, in terms of which direction it should move your estimate of what is true. On the one hand, you could argue those people generally have worse epistemics. On the other hand, there are a lot of offensive views that are true, so anyone who holds no offensive views also does not have great epistemics. If the objection is that such folks let others know they hold offensive views, that too has its impacts on epistemics.
And since the argument ‘people who are known to hold offensive views hold this view’ is used to argue this view too is offensive or wrong, there is an obvious bias in the discourse against such views. So even if you think that offensive-view-holders are more often wrong about other things, you need to ask if the marketplace of ideas is updating too much versus not enough on that information before you too can update. Right now, it seems like there is too strong a bias against offensive-view-holders, stronger than is justified when seriously examining their other views, and that this is causing many non-offensive views such folks often hold to be wrongly discounted, and sometimes causing them to become offensive views.
Thus it seems more like 'offensive-view-holders hold this view, therefore this view will likely become offensive even if it is true, and/or holding this view will cause others to lower your status because they think you hold offensive views,' and advice to therefore run the other way. With a side of 'if you hold this view it may cause you to adopt other offensive views because of the associations.'
Which is not bad advice on that level, if you care about such things, but as a Kantian imperative it gives far too much power to those who decide what is offensive and to how things look and sound, and too little power to truth.
One can think about Ron Paul in a similar way. It does seem true that many who supported Ron Paul are now in worse spaces. Even given that this was predictable, why should it bear on the quality of Ron Paul's ideas? His statements are either true or false, his proposals worthwhile or otherwise, and using 'who supports X' to judge whether you should support X seems like letting coalitional politics (simulacra level 3 considerations) outweigh physical world modeling (simulacra level 1 considerations). Which always gives me a giant pit of horror.
Yes, there is a reasonable counter that the OP here is simply saying to stick to the smarter versions of these ideas, but that effectively translates to suggesting in-practice opposition to the practical versions that are on offer. Ron Paul, for example.
And that specific advice is also worded in a burn-the-commons kind of way that raises my alarms: "For whatever set of views you think is justified, try to stick to the versions of those views held by well-educated, reasonable, analytically-inclined people."
This reads once more as 'do not attempt to think for yourself and decide what is true, instead rely on the opinions of others,' although at least there is room to evaluate those potential others a bit. Whereas if you are aware and smart enough to figure out who such people are, it seems like you are also aware and smart enough to be one of them.
The argument that one should choose views for their peer effects scares me. I agree that peer effects of this sort are real, but going down the road where one chooses to believe things for that reason seems terrifying with lots of obvious downsides. A person doing so too seriously, in a real sense, does not have a mind.
And all of this seems to entwine the questions of who should be supported or opposed or lowered or raised in status with the question of what beliefs one should hold generally, instead of keeping them distinct, perhaps on the belief that most people cannot keep those distinct and it is foolish to suggest that they try.
One could however flip this. The above is suggesting that the OP calls for the use of a rock, but its central warning is the opposite.
It is saying beware those who are worshiping rocks, for they worship too strongly.
It is especially saying that those who worship rocks that are already giving some crazy answers now are going to give increasingly worse answers over time. Do not hitch your wagon to them.
If you are a strong analytical thinker, what you very much are not is a rock. You are not following a simple heuristic, or at least you will know when to disregard it.
Ron Paul had a mix of good and bad ideas. A more thoughtful and analytical libertarian would also have had a mix of good and bad ideas. One difference would hopefully be a better mix, with more good ideas and fewer bad ones. The more salient difference here is the decision algorithm generating Ron Paul's ideas. Thus, even if he happens to have a lot of ideas you agree with, when you see his other ideas or he generates new ones, they're unlikely to be as good. The moment his 'FREEDOM!' rock goes wrong you're in a lot of trouble, and you know this because you already have examples, unless you disagree and think he was doing something better. That goes double for those whose rock said "RON PAUL!"
One could also say that at least Ron Paul has a rock, and thus has predictable thoughts that are roughly consistent and not corrupted by various political considerations as much as those of his rivals, whereas the alternatives are much worse than rocks. Or alternatively, that the other candidates all have rocks that say "DO WHAT GETS YOU ELECTED," and you'd prefer an actual analytical thinker but you'll take "FREEDOM!" over "DO WHAT GETS YOU ELECTED" any day.
Thus, one could evaluate the Convoy in a similar fashion, and assume that things are bound to go off the rails regardless of whether you agree with where they start out.
It is also a good example of how having a rock, in this case perhaps again one that says "FREEDOM!", provides a simple heuristic, but not a terribly useful one. Like any good rock it can be interpreted any number of ways and depends on the context. Somehow the output of this rock became blocking freedom of movement. Most heuristics that seem simple are not at core simple; you still have to massage the data into the correct format, and that is trickier than it may sound and often gets messed up.
It would certainly be a mistake to buy into a highly error-prone intellectual package once one notices that it is full of nonsense. One would not want to become a 'fellow traveler' and try to get into Aumann agreement with them or anything, or otherwise link your status wagon to it in the bigger picture, but neither of those need be the relevant standard.
No wagons need be hitched in order to evaluate what is true. Nor should it much matter that some other solution is first best. Not when evaluating claims and ideas. One needs to hold true to the Litany of Tarski, or else paper stops beating rock. Then rock will truly be strong.
Yet sometimes, at least within a given magisterium, one has to effectively write some name down on a rock, even if it does not fully substitute for your brain. If there is no simple heuristic that will work, making sure there is a non-rock at the end of the chain does seem wise.