I often talk to developers who would prefer not to destroy the world by accident (specifically by accelerating AGI risk), but neither they nor I can decide whether specific companies qualify.
Could someone knowledgeable help? A few short replies could probably change someone's career decisions.
Can you help with future questions?
Please subscribe to this comment. I'll reply to it only when there's a new open question.
Thank you!
Adding: Reply anonymously here
DeepMind in general: wdyt?
I was thinking of the possibility of affecting decision-making, either directly by rising through the ranks (not very likely) or indirectly by being an advocate for safety at an important time and pushing things into the Overton window within an organization.
I imagine Habryka would say that a significant possibility here is that joining an AGI lab will wrongly turn you into an AGI enthusiast. I think biasing effects like that are real, though I also think it's hard to tell in such cases how much you are biased vs. updating correctly on new information, …