"Follow the right people on twitter" is probably the best option. People will often post twitter threads explaining new papers they put out. There's also stuff like:
can you and others please reply with lists of people you find notable for their high signal to noise ratio, especially given twitter's sharp decline in quality lately?
Here are some Twitter accounts I've found useful to follow (in no particular order): Quintin Pope, Janus @repligate, Neel Nanda, Chris Olah, Jack Clark, Yo Shavit @yonashav, Oliver Habryka, Eliezer Yudkowsky, alex lawsen, David Krueger, Stella Rose Biderman, Michael Nielsen, Ajeya Cotra, Joshua Achiam, Séb Krier, Ian Hogarth, Alex Turner, Nora Belrose, Dan Hendrycks, Daniel Paleka, Lauro Langosco, Epoch AI Research, davidad, Zvi Mowshowitz, Rob Miles
For tracking ML theory progress I like @TheGregYang, @typedfemale, @SebastienBubeck, @deepcohen, @SuryaGanguli.
I listen to these podcasts, which often have content related to AI alignment or AI risk. Any other suggestions?
Other podcasts that have at least some relevant episodes: Hear This Idea, Towards Data Science, The Lunar Society, The Inside View, Machine Learning Street Talk
Here are some resources I use to keep track of technical research that might be alignment-relevant:
How I gain value: these resources help me notice where my understanding breaks down (i.e., what I might want to study), and they get thought-provoking research on my radar.
I haven't kept up with it, so I can't really vouch for it, but Rohin's alignment newsletter should also be on your radar: https://rohinshah.com/alignment-newsletter/
This is probably not the most efficient way to keep up with new stuff, but aisafety.info is shaping up to be a good repository of alignment concepts.
I've lately been thinking I should prioritize keeping up with alignment-relevant progress outside of LessWrong/Alignment Forum a bit more.
I'm curious if people have recommendations that stand out as reliably valuable, and/or have tips for finding "the good stuff" on places where the signal/noise ratio isn't very good. (Seems fine to also apply this to LW/AF)
Some places I've looked into somewhat (though I haven't made major habits around them so far) include:
I generally struggle with figuring out how much to keep up with – there seems to be more than a full-time job's worth of stuff to keep up with, and it's potentially over-anchoring to think about "the stuff people have worked on" as opposed to "stuff that hasn't been worked on yet."
I'm personally coming at this from a lens of "understand the field well enough to think about how to make useful infrastructural advances", but I'm interested in hearing thoughts about various ways people keep-up-with-stuff and how they gain value from it.