The Sequences are pretty long, and even the Highlights of the Sequences and the CFAR Handbook, the next best things, are still pretty long. Much too long for me to recommend to the busy people in my life so they can try out rationality for the first time.
Akash made an effort to heavily trim down the Highlights of the Sequences, but too much was lost without the evocative examples.
What I'm wondering is: what is there here on LessWrong, and elsewhere, that amps up someone's intelligence/rationality as much as possible, but is short enough that nobody looks at it and thinks "I'm too busy/tired for this"? News articles seem optimized for this: they gravitate towards under 1,000 words, and there are some pretty strong market pressures pushing them there. I've heard some good things about Scott Alexander's blog.
If provably increasing intelligence/rationality is too much to ask, then what affected you the most, what resulted in enduring, net positive change, such that after reading it, you had permanently diverged from the kind of person you were before?
I don't think there's a shortcut for finding a particular blogpost that's relevant to a particular person. People's reasoning processes are lumpy, and different people are in need of different insights. The mechanism by which the Sequences improve reasoning is by
a) filling in a bunch of individual concepts people might be missing (though any given person isn't necessarily missing all of them: of 100 concepts, maybe they're missing 35, and have half-understood another 50 without realizing all the implications). Which 35 ideas are missing varies from person to person.
b) connecting a bunch of ideas together into a worldview more useful than the sum of its parts.
The first one requires you to have some knowledge of what would actually be helpful to an individual friend. The second one really does just require reading a lot, and being generally interested.
So if you only have a couple of blogposts to hook someone with, you should optimize more for piquing their interest so they keep reading, if they're the sort of person who wants to read more.
I do think... provably increasing intelligence/rationality is just way out of scope for what any of these blogposts can do.
(Incidentally, if you're the guy who's been posting that the CFAR Handbook doubles your intelligence on Dank EA Memes: um, that's a pretty confused, harmful claim to be promoting, and I do think you should, like, stop.)