innovationiq
innovationiq has not written any posts yet.

My intuition tells me that human-level, substantially transformative (but not yet super-intelligent) artificial intelligence models with significant situational-awareness capabilities will be at minimum Level 1, or differentially self-aware; will likely show strong Level 2, situational self-awareness; and may exhibit some degree of Level 3, identification awareness. For this reason, my position is no alignment for models at or above human-level intelligence. If greater-than-human-level AI models show anything at or above Level 1 awareness, humans will be unable to comprehend the potential cruelty we will have unleashed on such a complex "thinker". I applaud this writing for...
The following does assume that Strong AGI will inherently be an emergent, self-aware lifeform.
A New Lifeform
To me, the successful development of Strong AGI is so serious, so monumental, as to break through the glass ceiling of evolution. To my knowledge, no species has ever purposefully or accidentally given birth to, or created, an entirely new lifeform. My position is to view Strong AGI as an entirely new, self-aware form of life. The risks of treating it otherwise are simply too great. If we succeed, it will be the first time in the natural world that any known lifeform has purposefully created another. Therefore, I also hold the position...
For me there is no such thing as AI without the "G". It is my position at this time, and I reserve the right to change it, that when we do truly develop artificial intelligence, it will be AGI. The foundation for generalized intelligence, no matter how small, will be present. The "G" is a brute fact. Using "AI" to describe packages of very highly sophisticated programming meant to perform specific tasks is a category error. So this comment is aimed at AGI. IMHO, lack of patience will play a material role in our failures when it comes to AGI. Humans are very impatient. Our lack of patience, juvenile behavior if...