Juno_Watt comments on How sure are you that brain emulations would be conscious? - Less Wrong
This question is more subtle than that.
Is there any variation in "implementation" that could be completely hidden from outside investigation? Can there be completely undetectable physical differences?
We can put something in a box, and agree not to peek inside the box, and we can say that two such systems are equivalent as far as what is allowed to manifest outside the box. But different kinds of black box will yield different equivalences. If you are allowed to know that box A needs an oxygen supply, and that box B needs an electricity supply, that's a clue. Equivalence is equivalence of a chosen subset of behaviours. No two things are absolutely, acontextually equivalent unless they are physically identical. And to draw the line between relevant behaviour and irrelevant implementation correctly would require a pre-existing perfect understanding of the mind-matter relationship.
I wasn't arguing that differences in implementation are not important. For some purposes they are very important. I'm just pointing out that you are restricted to discussing differences in implementation, so the OP should not be surprised that people who wish to claim that WBEs would not be "conscious" support implausible theories such as "only biological systems can be conscious".
We should not discuss the question of what can be conscious, however, without first tabooing "consciousness" as I requested.
I am not arguing they are important. I am arguing that there are no facts about what is an implementation unless a human has decided what is being implemented.
I don't think the argument requires consciousness to be anything more than:
1) something that is there or not (not a matter of interpretation or convention).
2) something that is not entirely inferable from behaviour.
Fine, but what is it?
What makes you think I know?
If you use the word "consciousness", you ought to know what you mean by it. You should always be able to taboo any word you use. So I'm asking you, what is this "consciousness" that you (and the OP) talk about?
The same applies to you. Any English speaker can attach a meaning to "consciousness". That doesn't imply the possession of deep metaphysical insight. I don't know what dark matter "is" either. I don't need to fully explain what consciousness "is", since:
"I don't think the argument requires consc. to be anything more than:
1) something that is there or not (not a matter of interpretation or convention).
2) something that is not entirely inferable from behaviour."
You repeatedly miss the point of my argument. If you were teaching English to a foreigner, and your dictionary didn't contain the word "consciousness", how would you explain what you meant by that word?
I'm not asking you to explain to an alien. You can rely on shared human intuitions and so on. I'm just asking you what the word means to you, because it demonstrably means different things to different people, even though they are all English users.
I have already stated those aspects of the meaning of "consciousness" necessary for my argument to go through. Why should I explain more?
You mean these aspects?
A lot of things would satisfy that definition without having anything to do with "consciousness". An inert lump of metal stuck in your brain would satisfy it. Are you saying you really don't know anything significant about what the word "consciousness" means beyond those two requirements?