All of plutonic_form's Comments + Replies

This might be cool but I don't see how it's beautiful.

A problem, in three parts:

  • I am aware that the reason I believe a good number of the things I believe is that I'm surrounded by very smart people who are full of good arguments for those things, rather than because I have done a lot of my own independent research. If I had different very smart friends who had a different set of good arguments, my beliefs would almost certainly look very different-- not because I consciously adopt the strategy "believe what my friends believe," but because my friends are smart and good at expressing themselves, and if they a
…

You could try to explicitly model how correlated your friends are. Do they all talk to each other and reach consensus? Then from 10 friends you're not getting 10 independent opinions; it could be more like 2 (or even fewer than 1, if they're not keeping track of who is confident based on what evidence, cf. https://en.wikipedia.org/wiki/Information_cascade ). Do most of them use mostly the same strategies for evaluating something (the same strategies to search for information, to find counterarguments, to think of ways they made unhelpful assumptions, and so on)? Then yo…
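One crude way to put a number on this (my sketch, not anything from the comment): under an equicorrelation assumption, n opinions with pairwise correlation ρ carry about n / (1 + (n − 1)ρ) independent opinions' worth of evidence. Note this formula bottoms out at 1 as ρ → 1; the "fewer than 1" case from information cascades needs a richer model than simple correlation.

```python
def effective_opinions(n: int, rho: float) -> float:
    """Effective number of independent opinions among n advisors whose
    errors share a common pairwise correlation rho (standard
    effective-sample-size formula under equicorrelation)."""
    if not 0.0 <= rho <= 1.0:
        raise ValueError("rho must be in [0, 1]")
    return n / (1 + (n - 1) * rho)

# 10 friends who mostly reach consensus (rho ~ 0.8) are worth about 1.2
# independent opinions; 10 genuinely independent friends are worth all 10.
consensus_bubble = effective_opinions(10, 0.8)
independent = effective_opinions(10, 0.0)
```

So the practical question to ask about your circle is less "how many smart people agree with me?" and more "how many independent routes to the conclusion do they represent?"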

Together with Bayes's formula (which in practice mostly means remaining aware of base rates when evidence comes to light), another key point about reasoning under uncertainty is to avoid it whenever possible. As with the long-term irrelevance of news, cognitive and methodological overhead makes uncertain knowledge less useful. There are exceptions: you do want to keep track of news about an uncertain prospect of a war breaking out in your country, but all else equal this is not the kind of thing that's worth worrying about too much. And certainty is not th…
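The base-rate point can be made concrete with the standard worked example (my illustration, with made-up numbers, not from the comment):

```python
def posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' formula: P(H|E) from the base rate P(H) and the two likelihoods."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# A test that is right 90% of the time, for a condition with a 1% base rate:
# a positive result still leaves under a 10% chance the condition is present,
# because false positives from the 99% swamp true positives from the 1%.
p = posterior(prior=0.01, p_e_given_h=0.9, p_e_given_not_h=0.09)
```

The surprising smallness of `p` is exactly the "remaining aware of base rates" move: the likelihood of the evidence matters only relative to how rare the hypothesis was to begin with.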

2Viliam
With limited intelligence and limited time, I will never be correct about everything. That sucks, but such is life. I can still keep trying to do better, despite knowing that the results will never be perfect. I try to listen to people who are not my friends (or to make/keep friends outside my usual bubbles), even if they are obviously wrong about some things... because they still might be right about some other things. I try to listen to it all, then filter it out somehow. But time is limited, so I do not do this too often. A typical outcome is that my opinions are similar to the opinions of my smart friends, but with way less certainty, plus an occasional fringe belief. And yes, even this is not perfect.
3Zack_M_Davis
This is indeed a conundrum! Ultimately, I think it is possible to do better, and that doing better sort of looks like biting the bullet on "discount arguments for being convincing and coming from people you trust", but that's a somewhat misleading paraphrase: more precisely than "discounting" evidence from the people you trust, you want to be "accounting for the possibility of correlated errors" in evidence from the people you trust.

In "Comment on 'Endogenous Epistemic Factionalization'", I investigated a toy model by James Owen Weatherall and Cailin O'Connor in which populations of agents that only update on evidence from agents with similar beliefs end up polarizing into factions, most of which are wrong about some things. In that model, if the agents update on everyone's reports (rather than only those from agents with already-similar beliefs, in proportion to that similarity), then they converge to the truth.

This would seem to recommend a moral of: don't trust your very smart friends just because they're your friends; instead, trust the aggregate of all the very smart people in the world (in proportion to how very smart they are). But this moral doesn't seem like particularly tractable advice. Sure, it would be better to read more widely from all the very smart authors in the world with different cultures and backgrounds and interests than my friends, but I don't have the spare time for that. In practice, I am going to end up paying more attention to my friends' arguments, because I spend more time talking to my friends than anyone else.

So, I'm stuck ... right? Not entirely. The glory of subjective probability is that when you don't know, you can just say so. To the extent that I think I would have had different beliefs if I had different but equally very smart friends, I should be including that in my model of the relationship between the world and my friends' beliefs. The extent to which I don't know how the argument would shake out if I could exha…
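A minimal sketch of that dynamic (my drastic simplification, not Weatherall and O'Connor's actual model or code): agents share binomial evidence about which of two options is better. In one run everyone updates on every report; in another, each agent ignores reports from agents whose credence differs by more than a hard cutoff, a crude stand-in for the paper's smooth similarity-weighted (Jeffrey-style) updating.

```python
import random

def simulate(n_agents=20, rounds=40, trials=10, eps=0.2, cutoff=None, seed=0):
    """Agents hold a credence that option B (success prob 0.5 + eps) beats
    option A (success prob 0.5). Each round, agents who favor B try it and
    report their success counts; everyone does a Bayesian update on each
    report, optionally skipping reporters whose credence differs from their
    own by more than `cutoff` (the factionalized variant)."""
    rng = random.Random(seed)
    creds = [rng.random() for _ in range(n_agents)]
    p_good, p_null = 0.5 + eps, 0.5
    for _ in range(rounds):
        # Agents with credence > 0.5 experiment with B and share results.
        reports = []
        for i, c in enumerate(creds):
            if c > 0.5:
                k = sum(rng.random() < p_good for _ in range(trials))
                reports.append((i, k))
        new = []
        for i, c in enumerate(creds):
            odds = c / (1 - c) if c < 1 else float("inf")
            for j, k in reports:
                if cutoff is not None and abs(creds[j] - c) > cutoff:
                    continue  # factionalized variant: distrust distant agents
                # Likelihood ratio of k successes in `trials` draws,
                # under "B is better" vs. "B is no better than A".
                lr = (p_good / p_null) ** k * \
                     ((1 - p_good) / (1 - p_null)) ** (trials - k)
                odds *= lr
            new.append(1.0 if odds == float("inf") else odds / (1 + odds))
        creds = new
    return creds

# Full sharing: credences converge toward the truth (B really is better).
open_creds = simulate(cutoff=None)
# Factionalized: pessimists ignore optimists' reports and can stay stuck,
# so the population can split rather than converge.
closed_creds = simulate(cutoff=0.2)
```

This is only meant to exhibit the mechanism the comment describes: the same evidence stream produces convergence or factions depending solely on whose reports an agent will listen to.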
6Nisan
Become unpersuadable by bad arguments. Seek the best arguments both for and against a proposition. And accept that you'll never be epistemically self-sufficient in all domains.
6Gunnar_Zarncke
When I realized this, it helped me empathize much better with people who seem to know less: they make do with what they've got! I realize this doesn't answer your problem. I'm not sure there is a full answer, but I think some progress can be made on understanding the mechanism you have outlined: see that it works, understand why it works, and notice when it seems to break down.