A great way to visualize the risks of unaligned AGI is the alien life form in the film Life: https://www.youtube.com/watch?v=cuA-xqBw4jE

It starts as a seed entity, quickly adapts, learns new tricks, and grows bigger and stronger, ruthlessly oblivious to human values. Watch it, and instead of the alien, imagine a child AGI.

Hi Gordon!

Thanks for writing this. I am glad you enjoyed HLAI 2018.

I agree that many AI/AGI researchers partially or completely ignore AI/AGI safety. But I have noticed a trend in recent years: it's possible to "turn" these people and get them to take safety more seriously.

Usually the reason for their "safety ignorance" is simply insufficient insight: they haven't spent enough time on the topic. Once they learn more, they quickly see how things can go wrong. Of course, this doesn't work for everyone.

Hope this helped.

Best,

Marek