I've been reading the hardcover SSC collection in the mornings, as a way of avoiding getting caught up in internet distractions first thing when I get up. I'd read many of Scott Alexander's posts before, but nowhere near everything posted; and I hadn't before made any attempt to dive into the archives to "catch up" to the seeming majority of rationalists who have read everything Scott Alexander has ever written.
(The hardcover SSC collection is nowhere near everything on SSC, not to mention Scott's earlier squid314 blog on LiveJournal. I'm curious how much shelf space a more complete anthology would occupy.)
Anyway, this has gotten me thinking about the character of Scott Alexander's writing. I once remarked (at a LessWrong meetup) that Scott Alexander "could never be a cult leader". I intended this as a sort of criticism. Scott Alexander doesn't write with conviction in the same way some other prominent rationalist authors do. He usually has the attitude of a bemused bystander who is merely curious about a bunch of things. Some others in the group agreed with me, but took it as praise: compared to some other rationalist authors, Scott Alexander isn't an ideologue.
(now I fear 90% of the comments are going to be some variation of "cults are bad")
What I didn't realize (at the time) was how obsessed Scott Alexander himself is with this distinction. Many of his posts grapple with variations on the question of just how seriously we can take our ideas without going insane, contrasting the holy madman in the desert (who takes ideas 100% seriously) with the detached academic (who takes an intellectual interest in philosophy without applying it to life).
- Beware Isolated Demands for Rigor is the post which introduces and seriously fleshes out this distinction. Scott says the holy madman and the detached academic are two valid extremes, because both of them are consistent in how they call for principles to be applied (the first always applies their intellectual standards to everything; the second never does). What's invalid is when you use intellectual standards as a tool to get whatever you want, by applying the standards selectively.
- Infinite Debt forges a middle path, praising Giving What We Can for telling people that you can just give 10% to charity and be an "Officially Recognized Good Person" -- you don't need to follow your principles all the way to giving away everything, or, alternatively, ignore your principles entirely. By following a simple collectively-chosen rule, you can avoid applying principles selectively in a self-serving (or overly not-self-serving) way.
- Bottomless Pits Of Suffering talks about the cases where utilitarianism becomes uncomfortable and it's tempting to ignore it.
But related ideas are in many other posts. It's a thread which runs throughout Scott's writing. (IMHO.)
This conflict is central to the human condition, or at least the WASP/WEIRD condition. I imagine most of Scott's readers felt similar conflicts around applying their philosophies in practice.
But this is really weird from a decision-theoretic perspective. An agent should be unsure of principles, not sure of principles but unsure about applying them. (Related.)
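As a quick formal sketch of that point (my framing, not anything from Scott's posts): in standard expected-utility terms, uncertainty about principles lives in the agent's credences over candidate utility functions, not in a separate "how much do I apply my values" dial. With credences $p_i$ over candidate utility functions $U_i$, the agent simply picks

$$a^* \;=\; \arg\max_{a} \; \sum_i p_i \, \mathbb{E}\left[U_i(a)\right]$$

Once some $U$ is held with certainty, the optimal act collapses to $\arg\max_a \mathbb{E}[U(a)]$; "fully endorsing $U$ but only applying it halfway" has no natural slot in the formalism.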
It's almost like Scott implicitly believes maximizing his own values would be bad somehow.
Some of this makes sense from a Goodhart perspective. Any values you explicitly articulate are probably not your values. But I don't get the sense that this is what's going on in Scott's writing. For example, when he describes altruists selling all their worldly possessions, it doesn't sound like he intends it as an example of Goodhart; it sounds like he intends it as a legit example of altruists maximizing altruist values.
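One way to see the Goodhart point concretely (a minimal sketch of my own, not Scott's, using made-up numbers): treat your articulated values as a noisy proxy for your true values, and watch what happens when you optimize the proxy as hard as possible.

```python
# Regressional Goodhart in miniature: hard-optimizing a noisy proxy
# for your true values systematically selects for the noise.
import numpy as np

rng = np.random.default_rng(0)
n_options, n_trials = 10_000, 200

gaps = []
for _ in range(n_trials):
    true_value = rng.normal(size=n_options)          # what you actually care about
    proxy = true_value + rng.normal(size=n_options)  # articulated values = true value + error
    best = np.argmax(proxy)                          # pick the option the proxy likes most
    gaps.append(proxy[best] - true_value[best])      # how far the proxy overstates that pick

print(f"on average the proxy overstates the chosen option's true value "
      f"by {np.mean(gaps):.2f} standard deviations")  # reliably positive
```

The option that scores highest on the proxy is disproportionately one where the error term happened to be large, so the winner's true value reliably falls short of its proxy score.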
In contrast, blogs like Minding Our Way give me more of a sense of pushing the envelope on everything; I associate it with ideas like:
- If you aren't putting forth your full effort, it probably means this isn't your priority. Figure out whether it's worth doing at all, and if so, what the minimal level of effort to get what you want is. (Or, if it really is important, figure out what's stopping you from giving it your full effort.) You can always put forth your full effort at the meta-level of figuring out how much effort to put into which things.
- If you repeatedly don't do things in line with your "values", you're probably wrong about what your values are; figure out what values you really care about, so that you can figure out how best to optimize those.
- If you find that you're fighting yourself, figure out what the fight is about, and find a way to best satisfy the values that are in conflict.
In more SSC-like terms, it's like, if you're not a holy madman, you're not trying.
I'm not really pushing a particular side here; I just think the dichotomy is interesting.
I think that approaches based on being a holy madman greatly underestimate the difficulty of being a value maximiser running on corrupted, basic human hardware.
I'd be extremely skeptical of anyone who claims to have found a way to truly maximise their utility function, even if they claim to have avoided all the obvious pitfalls such as burning out and so on.
It would be extremely hard to reconcile "putting forth your full effort" with staying rational enough to notice that you're burning yourself out, or that you're stuck on some suboptimal route because you're not leaving yourself enough slack to notice better opportunities.
The detached academic seems to me an odd way to describe Scott Alexander, who seems to make a really effective effort to spread his values and live his life rationally. Most of the issues he talks about seem pretty practical and relevant to him, even if he often takes an interest in whatever makes him curious and isn't dropping everything to work on AI (or to maximise the number of competent people who would work on AI).
I'm currently nine months into an attempt to move from detached-lazy-academic to making an extraordinary effort.
So far, every attempt to accurately predict how much effort I can put forth without a backlash that makes me worse at it in the next period has failed.
Lots of my plans have failed, so going along with plans that required me to make sacrifices, as taking an idea Seriously would require, would have left me at a serious loss.
What worked best and produced the most results was keeping a curious attitude toward plans and subjects related to my goal, studying to increase my competence in related areas even when I don't see any immediate way they could help, and monitoring how much "weight" I'm putting on the activities that produce the results I need.
I feel I started out unbelievably bad at working seriously on anything, but in nine months I've gotten more results than in the rest of my life (in a broad sense, not just related to my goal), and I feel like I've gone up a couple of levels.
I try to avoid moving toward any state that resembles a "holy madman", for fear of crashing hard, and I notice that what I'm doing already makes me pass for one even among the friends most informed on related subjects, when I don't censor myself to look normally modest and uninterested.
I might just be at such a low level in the skill of "actually working" that anything that would work great for a functional adult with a good work ethic is deadly to me.
But I'd strongly advise anyone trying the holy madman path to actively pump for as much "anti-holy-madman-ness" as they can, since making the full effort to maximise for something seems to me the best way to ensure your ambition burns through any defence your naive, optimistic plans think they have put in place to protect your rationality and your mental health.
Cults are bad; becoming a one-man cult is entirely possible and slightly worse.
Hmm... I guess "holy madman" is too vague a definition to have a rational debate about? I had interpreted it as "sacrifice everything that won't negatively affect your utility function later on". So the interpretation I imagined was someone who won't leave himself an inch of comfort beyond what's needed to keep the quality of his work constant.
I see slack as leaving yourself enough comfort that you'd be ready to use your free energy in ways you can't see at the moment, so I guess I was automatically assuming a "holy madman" would optimise for outp...