Furslid comments on How sure are you that brain emulations would be conscious? - Less Wrong

Post author: ChrisHallquist 26 August 2013 06:21AM




Comment author: Furslid 24 August 2013 09:37:18PM *  1 point

That is probably the best answer. It has the weird aspect of putting consciousness on a continuum, and one that isn't easy to quantify. If someone with 50% cyber brain cells is only 50% conscious, yet their behavior is identical to that of a 100% biological, 100% conscious brain, that's a little strange.

Also, it means that consciousness isn't a binary variable; for this account to make sense, consciousness must be a continuum. That is an important point regardless of which definition we use.

Comment author: Error 27 August 2013 09:30:14PM 0 points

It has the weird aspect of putting consciousness on a continuum,

I find I feel less confused about consciousness when thinking of it as a continuum. I'm reminded of this, from Heinlein:

"Am not going to argue whether a machine can 'really' be alive, 'really' be self-aware. Is a virus self-aware? Nyet. How about oyster? I doubt it. A cat? Almost certainly. A human? Don't know about you, tovarishch, but I am."

Comment author: Furslid 29 August 2013 06:18:17AM 1 point

Absolutely, I do too. I just realized that the continuum raises another interesting question.

Is the following scale of consciousness correct?

Human > Chimp > Dog > Toad > Any possible AI with no biological components

The biological requirement seems to imply this ordering, but it seems wrong to me.