Lumifer comments on [Link] First almost fully-formed human [foetus] brain grown in lab, researchers claim - Less Wrong
You can't talk about ethical boundaries being pushed unless you place that ethical boundary somewhere first. Otherwise we're back to hand-waving: Can I say that because no one "can yet offer any satisfactory theory of consciousness", chewing on a salad is ethically problematic?
Basically, you can't be both worried and unhappy, and completely unspecific :-/
Is there any particular reason to believe that a salad might be capable of consciousness? No.
Is there any particular reason to believe that brains might be capable of consciousness? Yes - namely the fact that most brains insist on describing themselves as such. Does this imply brains are conscious if and only if they insist on describing themselves as such? No. No more than a bird is only capable of flight when it's actually literally soaring in the air.
How can you tell without "any satisfactory theory of consciousness"?
The same way I don't need to understand aerodynamics to know that I have no reason to believe that turtles might be capable of flight. I've never seen a turtle do anything that sits in the neighbourhood of the notion of "flight" in the network of concepts in my head. This type of argument doesn't work against the putative consciousness of foetal brains, since we have good reason to believe that at least brains at a certain stage of development are in fact conscious. To argue that this means we can only have an ethical problem with running dubious experiments on brains at that stage of development is rather like arguing that since you've only ever seen white swans fly, the supposition that black swans might fly too is not justified as such.
You don't need to know the underlying mechanics, but you do need to know what flight is.
You're saying we don't even know what consciousness is.
No one is arguing that. I am saying that if you claim to have a problem, you have to be more specific about what your problem is and what might convince you that it is not a problem.
"Prove to me something I don't know what" is not a useful attitude.
Not in the least. I know what consciousness is because I am a consciousness. A theory of consciousness is needed to tie the concept to the material world, so that you can make statements like "a rock cannot be conscious, in principle".
What might convince me is a satisfactory theory of consciousness. Do I have to provide a full specification of what would be "satisfactory" just to recognize an ethical problem? If so, there is hardly anything about which I could raise an ethical concern, since I'd perpetually be working on epistemic aesthetics until all the necessary puzzles were solved. That is simply not how anyone operates. We proceed with vague concepts, heuristic criteria for satisfactoriness, incomplete theories, and so on. To say that this should be disallowed unless you can unfold your theory's logical substructure in a kind of Principia Ethica is waaay more useless than interpreting ideas through partial theories.
Not "full", but some, yes. Otherwise anyone can squint at anything and say "I think there is an ethical problem here. I can't quite put my finger on it, but my gut feeling ("visceral level") is that there is" -- and there is no adequate response to that.
This looks like an important conversation, as an instance of the limits of replacing words with their definitions to clarify debates.
The fuzziest starting point for "consciousness" is "something similar to what I experience when I consider my own mind". But this doesn't help much. Someone can still claim "So rocks probably have consciousness!", and another can respond "Certainly not, but brains grown in labs likely do!". Arguing from physical similarity, etc. just relies on the other person sharing your intuitions.
For some concepts, we disagree on definitions because we don't actually know what those concepts refer to (this doesn't include concepts like "art", etc.). I'm not sure what the best way to talk about whether an entity possesses such a concept is. Are there existing articles/discussions about that?
If I don't know what I'm referring to when I say "consciousness," it seems reasonable to conclude that I ought not use the term.
What is it, to know what one is referring to? If I see a flying saucer, I may be wrong in believing it's an alien spaceship, but I am not wrong about seeing something, a thing I also believe to be an alien spaceship.
pangel says:
and that is the brute fact from which the conundrum of consciousness starts. The fact of having subjective experience is the primary subject matter. That we have no idea how, given everything else we know about the world, there could be any such thing as experience, is not a problem for the fact. It is a problem for those seeking an explanation for the fact. Ignorance and confusion are in the map, not the territory.
All attempts to solve the problem have so far taken one of two forms:
1. Here is something objectively measurable that correlates with the subjective experience. Therefore that thing is the subjective experience.
2. We can't explain it, therefore it doesn't exist.
Discussion mostly takes the form of knocking down everyone else's wrong theories. But all the theories are wrong, so there is no end to this.
The actual creation of brains-in-vats will certainly give more urgency to the issue. I expect the ethical issues will be dealt with just by prohibiting growing beyond a certain stage.