Tarzan,_me_Jane2

Comments

There has never been, so far as I am able to determine, any force so unfriendly to humans as humans themselves. Yet we read, day after day, one very smart man's philosophizing about the essence of humanity, supposedly so that it can be built into the essence of friendly AI (fAI). Wouldn't it be remarkable if tomorrow, or sometime in the near future, someone who has actually been working on designs for fAI or AGI produced a real product and made all the hubris of these responses irrelevant? What is the purpose of an intelligence that can take all the unkind things mankind has been able to do and do them faster and more efficiently? Paper clips may be the answer; certainly humans cannot point to their own record to argue otherwise.

Finally, the fact that one man, however gifted, thinks he is the only possible answer makes one shudder at the potential attitude of the superhuman intelligence he would create. It will not only have the attitude, "I am unlikely to take your advice, or even to take it seriously, so stop wasting your time," as E. Y. said to one poster; it will have that attitude toward its programmers as well, at the level of superhuman effectiveness. I want fAI as much as anyone. All this public rumination is not the approach.