jkaufman comments on Arguments Against Speciesism - LessWrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
How small a subsystem can experience pleasure or pain? If we developed configurations specifically for this purpose and sacrificed all the other things you normally want out of a brain we could likely get far more sentience per gram of neurons than you get with any existing brain. If someone built a "happy neuron farm" of these, would that be a good thing? Would a "sad neuron farm" be bad?
EDIT: expanded this into a top level post.
I don't think we should be confident that such things are all that matter (indeed, I think that's not true), or that their value is independent of features like complexity (a thermostat program vs. an autonomous social robot).
I would answer "yes" and "yes," especially in expected value terms.