I wonder whether, if “solving the alignment problem” seems impossible given the currently invested resources, we should instead focus on different angles of approach.
The basic premise here seems to be:
Not solving the alignment problem → Death by AGI
However, I do not think this is quite right, or at least not the full picture. While it is certainly true that we need to solve the alignment problem in order to thrive with AGI:
Thriving with AGI → Solving the alignment problem
this implication is not bidirectional: we could solve the alignment problem and yet build an AGI without applying that solution, which would still lead to the death of humanity.
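To make the logical structure explicit, here is a minimal propositional sketch (the labels S, A and T are mine, not from the original argument): let S stand for "the alignment problem is solved", A for "the solution is actually applied to the AGI we build", and T for "humanity thrives with AGI". The claim above is then roughly

$$T \Rightarrow S, \qquad \text{but} \qquad S \not\Rightarrow T, \quad \text{since } S \wedge \neg A \text{ can still end in catastrophe.}$$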
Therefore, I think that instead of focussing on...