Introducing the Fund for Alignment Research (We're Hiring!)
*Cross-posted to the EA Forum.*

The Fund for Alignment Research (FAR) is hiring research engineers and communication specialists to work closely with AI safety researchers. We believe these roles are high-impact, contributing to some of the most interesting research agendas in safety. We also think they offer an excellent opportunity...