
V_V comments on What if Strong AI is just not possible? - Less Wrong Discussion

Post author: listic 01 January 2014 05:51PM




Comment author: V_V 08 January 2014 10:27:18PM

If I replicate the brain algorithm of a human, but I do it in some other form (e.g. as a computer program, instead of using carbon based molecules), is that an "AI"?

Possibly, that's a borderline case.

If I make something very similar, but not identical, to the brain algorithm of a human, but I do it in some other form (e.g. as a computer program, instead of using carbon based molecules), is that an "AI"?

In my original reply, my intent was: "provided that no souls or inputs from outside the universe are required to make a functioning human, then we are able to create an AI by building something functionally equivalent to a human, and therefore strong AI is possible".

Even if humans are computable in a theoretical sense, it doesn't follow that it is physically possible, under practical constraints, to build something functionally equivalent on a different type of hardware.
Think of running Google on a mechanical computer like Babbage's Analytical Engine.
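The gap between "computable in principle" and "buildable in practice" can be made concrete with a rough back-of-envelope calculation. The figures below are loose assumptions, not measurements: Babbage's Analytical Engine was designed to take on the order of seconds per arithmetic operation, and common (contested) estimates put the brain's synaptic event rate around 10^14-10^15 per second.

```python
# Back-of-envelope: why theoretical computability does not imply
# practical feasibility on arbitrary hardware.
# All figures are rough, order-of-magnitude assumptions.

engine_ops_per_sec = 0.3   # Analytical Engine: assume ~1 addition per ~3 seconds
brain_ops_per_sec = 1e15   # assume ~1e14 synapses firing at ~10 Hz

# How many engine-seconds to emulate one brain-second (ignoring
# memory limits, which are an even bigger obstacle in practice)
slowdown = brain_ops_per_sec / engine_ops_per_sec

seconds_per_year = 3.15e7
years_per_brain_second = slowdown / seconds_per_year

print(f"slowdown factor: {slowdown:.1e}")
print(f"engine-years per simulated brain-second: {years_per_brain_second:.1e}")
```

Under these assumptions the mechanical computer would need on the order of a hundred million years to emulate a single second of brain activity, which illustrates the comment's point: the same algorithm can be computable in theory yet physically out of reach on the wrong substrate.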