Furslid comments on How sure are you that brain emulations would be conscious? - Less Wrong
Note that I specifically said in the OP that I'm not much concerned about the biological view being right, but about some third possibility nobody's thought about yet.
This is similar to an argument Chalmers gives. My worry here is that brain damage can do weird, non-intuitive things to a person's state of consciousness, so one-by-one replacement of neurons might do similarly weird things, perhaps slowly causing you to lose consciousness without realizing what was happening.
That is probably the best answer. It has the weird aspect of putting consciousness on a continuum, and one that isn't easy to quantify. If someone with 50% cyber brain cells is 50% conscious, but their behavior is the same as that of a 100% biological, 100% conscious brain, that's a little strange.
Also, it means that consciousness isn't a binary variable. For this to make sense, consciousness must be a continuum. That is an important point to make regardless of the definition we use.
I find I feel less confused about consciousness when thinking of it as a continuum. I'm reminded of this, from Heinlein:
Absolutely. I do too. I just realized that the continuum provides another interesting question.
Is the following scale of consciousness correct?

Human > Chimp > Dog > Toad > Any possible AI with no biological components

The biological requirement seems to imply this ordering. It seems wrong to me.