I just discovered this site, and the "Singularity", this evening. Something I've contemplated for quite a few years is the idea that we already have quite enough intelligence and technology in the world; our collective problem is that we lack the wisdom, ethics, and morality to make the best use of it.
We already have the means to feed and clothe every person in this world. We have the means to control population growth, provide adequate housing for all, and so on. The fact that we do not do these things reflects not a lack of intelligence but a lack of humanity.
Greed, jealousy, and all the rest of the "seven deadly sins" continue to plague us. Without an end to them, they will surely, eventually, be the end of us. The goal of simply surviving isn't enough. Wasps and crocodiles have survived for millions of years... how would AI be any different if the goal is simply greater intelligence? What good is greater intelligence if we ignore improving the less "rational" components of being human?
I can't cite the studies offhand, but I have read many articles claiming there appears to be an inverse relationship between intelligence and empathy for others. From a purely rational standpoint, why would any AI entity view mankind in its current condition as anything other than a hindrance? Be careful what you wish for.