I think you're committing the typical mind fallacy here. It seems you have a lot of hot empathy, so because that is the most visible part of your altruistic cognition, you easily think it's the only one. Some of your thinking seems to be motivated by this.
Mind projection fallacy is when you confuse map with territory and preferences with facts. What I'm doing is assuming other humans are like me - a heuristic which does in fact generally work.
But even so, I did mention:
I don't actually care about Hot Empathy either. What I care about are your preferences - do you care about others as a (non-instrumental) value? Hot Empathy is where most humans derive their altruistic preferences from, but if you derive altruistic preferences via some other route, that works for me.
Does that ameliorate the criticism?
Even these wider concepts don't imply sadism either. Be careful not to confuse them, as that has the potential to insult a lot of people.
Does that mean you are offended? My apologies if so; I should have been more precise with my language. However, I'm not sure why you think I confused sociopathy (lack of guilt, sympathetic pain) with sadism (pleasure via the pain of others). Those two are almost opposites.
Insisting on visible behavioural output means you don't care about paralyzed people.
Of course not. You still have to use the computation, but morally speaking you're interested in the outputs of the computation. In the case of the paralyzed person, you look at their brain, see what their outputs would be if they were in a different situation, and act accordingly.
The reason we can't just define suffering as a specific computation present in the brain is that when we are faced with other minds who use different computations to arrive at roughly the same outputs per input, we won't recognize them as suffering...unless we define suffering in relation to input-output in the first place.
For example, most humans compute altruism via interactions between the amygdala and the vmPFC. Now, if someone doesn't compute altruism that way, but still exhibits altruistic behavior...then isn't it exactly the same thing? Earlier in this conversation, weren't you disturbed when you thought I was presuming to judge a person by their internal states rather than their behavior?
We obviously still look at the computation, but the reason we are looking is to figure out what it would output in response to various inputs. That's what a computation is...a bridge from inputs to outputs.
I'm not sure if I'm explaining this correctly...a computation can't be intrinsically suffering or intrinsically pleasure, and claiming that it is commits some sort of essentialism which doesn't have a name yet...computational essentialism? You could take the exact same computation that represents suffering in one creature and repurpose it entirely by changing the other computations with which it interacts. You can't just point to some computation and say, "this is Suffering, no matter what the surrounding context is".
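To make the point concrete, here is a toy sketch of my own (the function names and numbers are invented for illustration, not anything from neuroscience): the inner computation is literally identical in both agents, but the surrounding context determines whether its firing plays a "suffering"-like or "pleasure"-like role.

```python
def threshold_signal(stimulus, limit=10):
    # The "same computation" in both agents: fires when stimulus exceeds limit.
    return stimulus > limit

def avoider(stimulus):
    # Here the surrounding system uses the signal to drive withdrawal,
    # so it functions like suffering.
    return "withdraw" if threshold_signal(stimulus) else "continue"

def seeker(stimulus):
    # Identical inner computation, but the context inverts its role:
    # firing now drives approach, functioning like pleasure.
    return "approach" if threshold_signal(stimulus) else "continue"

print(avoider(15))  # -> withdraw
print(seeker(15))   # -> approach
```

Pointing at `threshold_signal` alone and labeling it "Suffering" would be the essentialist mistake; its moral meaning only exists in relation to what the rest of the system does with its output.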
I'm sorry, you don't seem to be applying the rigor. From my POV you're taking suffering, taking everything that's important about it, throwing it in the trash can and inventing your own concept that has nothing to do with what people mean when they use the word. Why should I care about this concept you produced from thin air?
Acknowledged. Like I said:
Part of the problem is that, in order to explain my idea, I took certain words and redefined them away from their common usage to suit my purposes. I don't know how to say this using the words we have now, though. And the other problem is that it's sloppy. I haven't thought this through nearly enough.
But your experiential definition of suffering is, by definition, inaccessible. If you define suffering that way, then the word will dissolve later on, much like words like "free will" tend to either dissolve or change definition so drastically that they scarcely seem like the same thing. The definition needs to change because the original definition doesn't make sense. Qualia only apply to you, not to others.
I know from personal experience that suffering sucks, and I want less of it in this universe.
(By the way, this is pretty much the definition of the amygdala-vmPFC brand of "empathy", so I'm not sure why you refer to yourself as "low empathy". Or did you think that by "empathy" I was referring to merely mirroring the affective states of those around you, like how people cry at movies?)
comments this long
Can't be helped, I'm afraid - this is one of those situations where brevity would take more effort. Not to worry, I don't feel offended if people don't reply to my comments, if that's why you felt the need to mention that you might not be able to reply!
Mind projection fallacy is when you confuse map with territory and preferences with facts. What I'm doing is assuming other humans are like me - a heuristic which does in fact generally work.
He said “typical mind fallacy”, not “mind projection fallacy”.
I felt like this draft paper by Anders Sandberg was a well-thought-out essay on the morality of experiments on brain emulations. Is there anything you disagree with here, or think he should handle differently?
http://www.aleph.se/papers/Ethics%20of%20brain%20emulations%20draft.pdf