Alright, I think it'll make me a more responsible intellectual citizen if I try to distinguish these items a bit based on how I expect to view them in a decade or two. Let's do it.
Well overall, I expect that my current attentional foci are substantially influenced by current news, political narratives, and intellectual fads. I look back at what things I was saying and paying attention to in 2010, and I see few major differences and hard reversals, but I do see a lot of noteworthy omissions, changes of emphasis, and different compressions.
I think (34) will be fairly obsolete in 15 years. I dunno how remote learning and telemedicine have impacted things in the wake of covid, but it's plausible to me that the signaling equilibria will change enough that (34) will at least sound like an outdated opinion.
(29-32) are fairly timeless, but I wouldn't be surprised if fads in news and politics change enough in 15 years that they seem like a questionable focus.
Gods, I hope (23-25) become less necessary to say in 15 years. How much of this incipient cyberpunk weirdtopia do folks need to experience before they expand their horizons a couple centimeters?
I anticipate (21) being painfully more relevant in only 10 years. Unless we somehow get a lot of lucky breaks in a row.
The toxic status quo around news and (social) media just seems entirely unsustainable to me. I expect (21) to be fully out-of-date in 10 years, for better and/or for worse.
It's hard to imagine changing my mind about (19) any time soon, but it's possible. Perhaps I'll want to change the list to include/exclude different works. Or maybe I'll update hard against the value of mainstream mindshare. I doubt it though. See my response to niplav's comment for the generator behind (19).
(To reiterate the disclaimer: items (1-18) were adopted unmodified from John Nerst's blog post)
I get the feeling that (5-9), (18), and maybe (12) and (16) will feel less relevant in 15 years than they do right now. I think their loading with certain culture-war-related valence makes them feel more relevant right now, which is probably partly why they are on Nerst's mind (and mine).
Okay, so that's the pre-hindsight about what I originally wrote. But what about things I omitted?
I could see a world 15 years from now where it looks utterly ignorant to not include a whole paragraph about privacy.
Developments around self-driving cars triggered a gout of Trolley Problem memes. This hasn't actually been such a big deal, but I could imagine some other technology requiring a deep examination and refactoring of our moral intuitions. I tried to keep it pretty broad, but it's possible this refactoring will make my current list look a little weird.
Maybe China will be culturally ascendant in the next 20 years and I will feel the need to explicitly say something about individualism vs collectivism or something.
I might eventually be compelled to put more focus on lifestyle stuff. For example, I might dedicate several bullet points to the importance of diet, exercise, contemplative practice, work-life balance, and writing.
Some number of my family and friends will perma-die in the next 20 years, after which I may be compelled to push the cryonics stuff harder.
In the age of automation, I may feel the need to express niche opinions about economics and political philosophy. I do not yet know what these niche opinions might be.
Echo-chamber awareness, bad-faith detection, the principle of charity, asymmetrical weapons, and so on may become even more important as tools in my everyday epistemic toolkit, in contrast to the more eternal, abstract epistemic principles.
I hope not, but the need to resist Dark Side Epistemology may become urgent and take up a few bullet points.
Nice post.
I mostly agree, but this bit stood out to me:
- Every modern intellectual citizen ought to become familiar with at least some of the major ideas in the rationalist canon. This includes R:AZ, The Codex, Superforecasting, How to Measure Anything, Inadequate Equilibria, and Good and Real.
I am not sure what exactly you mean with "modern intellectual citizen". At the broadest, it could encompass all adults, at the narrowest, it would be limited to college professors & public intellectuals.
I also doubt that this is a productive method of raising the Sanity Waterline. We're here in a place where many people have had their minds pretty strongly changed by these texts, but reading e.g. the reviews of R:AZ on Amazon & Goodreads, I observe that many people read it, say "meh" and go on with their lives – a pretty disappointing result for reading ~2000 pages!
Furthermore, aren't sufficiently intellectual people already familiar with some of the ideas in the "rationalist canon", just by exposure to the scientific method? I think yes, and I also think that the most valuable aspect of these texts is not the ideas in and of themselves, but rather the type/structure of thinking they demand? (E.g. scout vs. soldier mindset).
Thanks, good questions. I had originally written "every responsible intellectual citizen" but that didn't feel quite right. I didn't want so much to morally condemn people who haven't read what I find important, but to highlight the fact that news of general intellectual progress does not seem to move as fast as news of progress in science. So I could forgive someone for not knowing about Fun Theory calculations nowadays, even if they were a circumspect philosopher in the 1980s--they've let themselves fall out of touch, but news travels slowly and communication has changed so it's not totally their fault.
- I also doubt that this is a productive method of raising the Sanity Waterline. We're here in a place where many people have had their minds pretty strongly changed by these texts, but reading e.g. the reviews of R:AZ on Amazon & Goodreads, I observe that many people read it, say "meh" and go on with their lives – a pretty disappointing result for reading ~2000 pages!
Yeah, you're probably right about the Sanity Waterline. I didn't know about those Amazon reviews though :[
- Furthermore, aren't sufficiently intellectual people already familiar with some of the ideas in the "rationalist canon", just by exposure to the scientific method?
Well to illustrate my motivation here, I've occasionally made bets with my most infovorous coworkers, but they would always insist on doing even odds. I tried to explain odds ratios, loss aversion, and the linear utility of small amounts of money, but of course that never worked. But when I'm hanging out with rationalists, this problem doesn't happen.
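To make the arithmetic behind that frustration concrete, here is a toy sketch (my own illustrative numbers, not from the comment above) of why insisting on even odds discards information when the two bettors disagree about the probability:

```python
def fair_odds(p: float) -> float:
    """Odds ratio (my stake : their stake) that gives me zero expected
    value when I believe the event has probability p."""
    return p / (1 - p)

def expected_value(p: float, win_amount: float, lose_amount: float) -> float:
    """My EV for a bet that pays win_amount with probability p and
    costs lose_amount otherwise."""
    return p * win_amount - (1 - p) * lose_amount

# Suppose I think an event is 80% likely and my coworker thinks 50%.
p_mine = 0.80

# Even-odds bet at $10 a side: by my lights this is strongly +EV for me,
# so agreeing to it means my coworker is (by my lights) giving money away.
ev_even = expected_value(p_mine, 10, 10)

# At my fair odds of 4:1, I stake $40 against their $10 and my EV is zero;
# anywhere between our two fair-odds points, we both expect to profit.
odds = fair_odds(p_mine)
ev_fair = expected_value(p_mine, 10, 10 * odds)

print(f"my fair odds at p=0.8: {odds}:1")
print(f"my EV of an even-odds $10 bet: ${ev_even:.2f}")
print(f"my EV at my own fair odds: ${ev_fair:.2f}")
```

The point of the sketch: the odds a bettor will accept encode their credence, which is exactly the information an even-odds convention throws away.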
EDIT: Here's the same frustration from a different angle: suppose I have these three intellectual friends. Alice is a normal-ish physics student who likes to feed her extra-curricular curiosity mostly by reading and listening to Sean Carroll. Bob is a reddit junkie who watches Science YouTube and supplements with Sam Harris and Eric Weinstein's podcasts. Charlie is a tech worker who likes to read Vox for infotainment, and he's been exposed to a handful of EA ideas and a few SSC posts, all of which made him think "whoa, cool", but none of which made him slide down the rabbit hole.
Maybe I can get Alice to make bets with me and to agree that anthropics is an important part of the frontier of philosophy, but for some reason she is still just so weak with futurism--she seems to still be leaning on Jetsons-style archetypes without realizing it. I can have serious, productive conversations with Bob about our coming cyberpunk weirdtopia, but when I bring up prediction markets, he--lacking the background knowledge about EV, odds ratios, and betting--doesn't really seem to get it. I'm arguing politics with Charlie, and I make reference to "the naive view of free will" and he asks me to stop and explain. Oh right, I think and start looking for an alternate approach to what I'm trying to say.
Alice, Bob, and Charlie are all getting some relatively high-quality exposure to the scientific method in action, but whenever I talk to one of them, I end up thinking, gods, when will this concept become more widespread?
- Morality is in the mind and moral realism is mistaken. “What should I want” collapses into “what do I want?” “What is good?” collapses into “what do I like?”
That's not implied by (13). (13) indicates that morality, while not fully realist, is defined at the group level.
Following John Nerst’s noble example: https://everythingstudies.com/2018/07/16/30-fundamentals
Only items after #18 are due to me, the rest are copied from the link above.
a dystopia stuck in some inadequate equilibria. We know what better incentives and saner institutions might look like, but we don't know how to put them in place.