XiXiDu comments on Singularity Institute $100,000 end-of-year fundraiser only 20% filled so far - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (47)
Given his influence, he seems to be worth the time it takes to try to explain to him how he is wrong?
The only way to approach general intelligence may be by emulating human algorithms. The opinion that we are capable of inventing a simple artificial algorithm exhibiting general intelligence is not a mainstream opinion among AI and machine learning researchers. And even if one assumes that all those scientists are not nearly as smart and rational as SI folks, they seem to have a considerable head start when it comes to real-world experience of the field of AI and its difficulties.
I actually share the perception that we have no reason to suspect that we could reach a level above ours without massive and time-costly experimentation (removing our biases merely sounds easy when formulated in English).
I think that you might be attributing too much to an expression uttered in an informal conversation.
What do you mean by "feelings" and "preferences"? The use of intuition seems to be universal, even within the field of mathematics. I don't see how computationally bounded agents could get around "feelings" when making predictions about subjects that are only vaguely understood and defined. Framing the problem in technical terms like "predictive algorithms" doesn't change the fact that making predictions about poorly understood subjects is error-prone.
Yes. He just doesn't seem to be someone whose opinion on artificial intelligence should be considered particularly important. He's a layman making the typical layman guesses and mistakes. I'm far more interested in what he has to say about warps in spacetime!