MugaSofer comments on The Level Above Mine - Less Wrong

42 Post author: Eliezer_Yudkowsky 26 September 2008 09:18AM


Comments (387)


Comment author: MugaSofer 24 January 2013 10:05:56AM -1 points

Why would their feeling it help them "react believably to their environment and situation and events"? If they're dumb enough to "run lots of small, stupid, suffering conscious agents on a home computer", I mean.

Of course, give Moore time and this objection will stop applying.

Comment author: DaFranker 24 January 2013 02:58:53PM 1 point

We're already pretty close to making game characters have believable reactions, but only through clever scripting and a human deciding that situation X warrants reaction Y, and then applying mathematically-complicated patterns of light and prerecorded sounds onto the output devices of a computer.

If we can successfully implement a system that has that-function-we-refer-to-when-we-say-"consciousness" and that-f-w-r-t-w-w-s-"really feel pain", then it seems an easy additional step to implement the kinds of events that trigger the latter function, and the kinds of outputs from the former function, that would be believable and convincing to human players. I may be having faulty algorithmic intuitions here, though.

Comment author: MugaSofer 25 January 2013 09:24:54AM 0 points

Well, if they were as smart as humans, sure. Even as smart as dogs, maybe. But if they're running lots of 'em on a home PC, then I must have been mistaken about how smart you have to be for consciousness.