
The second way of cultivating excitement-based motivation I'll discuss in this sequence is cultivating agency, by which I mean the ability to try to shape the world. The key idea I want to convey here is that the feeling of agency is in many cases a self-fulfilling prophecy, because the hardest part of many things is mustering the energy and initiative to get started. If you think of yourself as agentic, then you’ll continually be looking out for opportunities, and approach the world with a “why not?” attitude. That’ll give you more chances to do well, which will give you more evidence that you’re the sort of person who can make stuff happen, and reinforce your sense of agency.

Why don’t people have that sense by default? One reason is that most people have experienced a huge amount of social pressure towards conformity, especially as children. Doing unusual or novel things often leads to criticism or ridicule, which ingrains a sense of learned helplessness about defying consensus. This is especially strong for teenagers, most of whom spend a huge amount of effort trying to be liked and fit in. And it’s not enough just to be willing to stand out: your success might make others look bad, which creates incentives for them to “cut you down to size”, or to deter you from trying in the first place (e.g. by mocking anyone who cares enough about something to try hard to achieve it).

A more intellectual justification for lacking agency comes from epistemic modesty: the view that you should pay relatively little attention to your own view on most topics, and defer instead to other people. One intuition that often motivates epistemic modesty is the idea that markets are often efficient, in the sense that you can’t consistently beat them. However, people often apply epistemic modesty to conclude that they shouldn’t trust their inside view even in domains that are very different from markets: founding a startup, or doing research, or reasoning about politics, or even finding a partner. In those domains, epistemic modesty often involves thinking that you’re not so special: if others disagree, then there’s little reason to think they’re mistaken rather than you; and if others haven’t taken an opportunity, then there’s probably some reason you shouldn’t (or can’t) take it either.

Certainly there are some people who should pay more attention to epistemic modesty. However, there are at least three clusters of arguments for why epistemic modesty is strongly overrated: epistemic, normative, and social. Many important arguments in the first cluster are laid out by Yudkowsky in his book Inadequate Equilibria. One of his core points is that efficient-market arguments only apply when inefficiencies are exploitable—i.e. when you can benefit from noticing the inefficiency. This is often not true, e.g. in cases where the decision-makers behind the inefficiency don’t have skin in the game.[1] More generally, decisions made by large bureaucracies are often dumb even from the perspectives of the people in those bureaucracies, since their optimization for local incentives can easily make the bureaucracy as a whole dysfunctional. One of the few upsides of covid is that it’s provided us with many straightforward examples of repeated, lethal societal incompetence, in terms of our gain-of-function policies and vaccine development policies and mask policies and lockdown policies and vaccine deployment policies and many others.

A second cluster of arguments against epistemic modesty relates to what you should actually do. Even in the cases where epistemic modesty is the right mindset for making predictions, it’s often a very bad mindset for choosing actions—because the biggest rewards often come from defying consensus. For example, spotting a new technology with untapped potential, and backing yourself to develop it, could improve the lives of millions and earn you billions. Maybe you’re wrong about that type of thing more often than you’re right, but it only takes one win to make up for all the losses. And even when you’re wrong, the lessons learned in trying to beat a consensus are often much more valuable than types of education which involve passively learning the same things other people know.

This can be seen as an extension of the argument from my post on scarcity and abundance. Taking risks was a much worse idea when failure meant immiseration or death. But most westerners now live lives of sufficient abundance that backing ourselves is a far better policy than strong epistemic modesty, even if it leads to some failures. The third cluster of arguments is social: I believe that a large part of the impulse towards epistemic modesty comes from status regulation, driven by the fear of the social consequences of being seen as arrogant. In other words, the huge advantage that often comes from being less afraid of social judgment can be seen as an “exploitable inefficiency” in the world, if you’re willing to put in the emotional work to overcome that fear.

I don’t expect that these three arguments will be sufficient to sway most people away from epistemic modesty, but I do hope that they’ll help nudge people to spot the counterexamples to it that are all around them. Maybe you’ll see people around you deciding to do crazily ambitious things, and succeeding, and instead of sinking into jealousy you’ll think “that could be me”. Maybe you’ll see respected authorities screwing up over and over again without any consequences, like they did during covid, and instead of downplaying each individual case you’ll think “I should go and help fix that”. You shouldn’t permanently inhabit a frame in which the world is blisteringly incompetent, because decisions driven primarily by anger or despair tend to be counterproductive—but you should be able to sometimes take on that frame, and consider it seriously, and use it to inform your actions.

How can you speed up this process, if you don’t want to just wait around for more examples of civilizational incompetence to update you on a visceral level? To finish this post I’ll run through three background ideas which I find useful for cultivating agency:

  1. Positive-sum mindset, based on understanding how much the world has improved throughout history. Seeing how radically safer and more abundant the world has become makes it much more intuitive that the world can become far better still. Crucially, this need not happen via reallocating existing wealth, but instead works best by “growing the pie”—a point conveyed well by many of Paul Graham’s essays.
  2. Definite optimism, based on understanding that many people who changed the world did so by making deliberate plans to do so. Indefinite optimism is the sense that the future will go well, but that it’s not actually necessary or even possible to contribute to that process; whereas definite optimism is about designing and building a better future, rather than just waiting for one to come to pass. The world may seem too vast and complex to deliberately change, but it was cobbled together by accidents and blind selection and people like us, and all of those can be improved upon with focused effort. That’s particularly true for social reality, which often projects an illusion of solidity, even though it’s constituted only by people’s expectations about social conventions. Fantasy wizards can say a few words and reshape reality—but living in a modern society allows you to do the same thing (sometimes on a much larger scale). If you’re not seeing the magic, you’re not looking!
  3. Belief in a heavy-tailed world—i.e. a world dominated by extreme outcomes. This is best understood in the context of startups, where the biggest successes make up almost all the returns of venture capital. It’s a more general principle, though—as exemplified by the fact that the best approaches to improving the world are far better than typical approaches. The world is full of trillion-dollar bills lying on the ground; they’re not easy to grab, but there are huge payoffs if you do.
[1] Note that exploitability is relative to your values: inefficiencies may exist because others don’t care about the potential gains. For example, EAs have found many ways to improve on traditional philanthropy from a utilitarian perspective.

1 comment

I think you've written a great and rational post, but if it in any way confuses or bothers you that most people aren't living by the advice you're giving here, it's probably *because* you're assuming that people are as rational as you.
I think the main reason a lot of people lack agency is low confidence and a fear of sticking out. Logically, original thinking has a lot of potential, but subconsciously, all that most people are going to feel when their thoughts drift outside of the norm is a fear of judgement which quickly pulls them back again.
Simply put, the balance between making oneself an individual and identifying (and sort of merging oneself) with something larger than oneself is determined by one's confidence.

My internal monitoring of abundance vs scarcity is automatic and instinctual; it drives my exploration vs exploitation ratio, my short vs long-term thinking ratio, my estimation of risk and of consequences, and so on.

The trait of being self-fulfilling is very common in everything human, and akin to "the rich get richer and the poor get poorer". Again, it's mostly about confidence (internally and externally, also in forms like faith, fame, reputation, etc), and the evidence need not be logical. If you hesitate in answering your phone, your brain might see it as "evidence" that phone calls are dangerous, and protect you by reinforcing your phobia of phone calls.

In conclusion, I believe that the psychological viewpoint shouldn't be overlooked.
Personally, I find rational thinking to be the easy part, and the conflict between it and my irrational/conditioned self to be the challenging part. So rather than trying to change how my brain works, I exploit how it currently works.

You might already be aware of this perspective on the problem, in which case we simply assign different values to these perspectives according to what works for us personally (i.e. I'm just projecting an internal conflict between System 1 and System 2 thinking).

PS: First time posting on LW, enjoy my deviation as an instance of agency :)