> If you endorse naive FCToM, you would say "that's just me!" But far more ethically relevant than the emulation is the experience of the many people enslaved in this system.
Can't I say that the emulation is me, and does morally matter (via FCToM), and also the many people enslaved in the system morally matter (via regular morality)?
It seems like you're saying that FCToM implies that if a physical system implements a morally relevant mathematical function, then the physical system itself cannot include morally relevant bits, and I don't see why that has anything to do with FCToM.
> Can't I say that the emulation is me, and does morally matter (via FCToM), and also the many people enslaved in the system morally matter (via regular morality)?
Yes.
What I'm saying is that, to use the language of the debate I referenced, "what kind of paper the equation is written on DOES matter".
> It seems like you're saying that FCToM implies that if a physical system implements a morally relevant mathematical function, then the physical system itself cannot include morally relevant bits, and I don't see why that has anything to do with FCToM.
I'm saying "naive FCToM", as I've characterized it, says that. I doubt "naive FCToM" is even coherent. That's sort of part of my broader point (which I didn't make yet in this post).
Thesis: FCToM is underspecified, since these theories rely on a mapping (which I call the "level of analysis") from physical systems to mathematical functions or computational algorithms (respectively), but do not specify that mapping. (I'm collapsing functional and computational theories of mind, since I think the distinction isn't relevant here.)
I believe this issue has great ethical significance: if we accept a naive version of FCToM, we may end up using a misleading level of analysis and (e.g.) committing massive mind crime. One form of naive FCToM would ignore this issue and say: "if two systems can be described as performing the same computations, then they have the same 'mind' (and hence the same consciousness and the same status as moral patients)".
The reductio ad absurdum: Imagine a future totalitarian society where individual humans are forced to play the role of logic gates in a computer which hosts an emulation of your brain. They communicate via snail mail; severe punishment, social isolation, and redundancy are used to ensure that they perform their tasks faithfully. If you endorse naive FCToM, you would say "that's just me!" But far more ethically relevant than the emulation is the experience of the many people enslaved in this system. Note: this is a thought experiment, and may not be physically realizable (for instance, the people playing the gates may be too difficult to control); I think exploring that issue can provide a complementary critique of FCToM, but I'll skip it for now.
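To make the "level of analysis" worry concrete, here is a minimal toy sketch in Python (my own illustration, with hypothetical names like `transistor_gate` and `human_gate`, not anything from the original argument). Two very different substrates only count as "performing the same computation" relative to some chosen mapping from their physical states to bits, and that mapping is exactly what naive FCToM, as stated, leaves unspecified.

```python
# A minimal sketch (my own toy illustration, not the post's argument): the same
# abstract computation is "implemented" by two very different substrates, but
# only relative to a chosen mapping from substrate states to bits.

def nand(a: bool, b: bool) -> bool:
    """The abstract computation: a single NAND gate."""
    return not (a and b)

# Substrate 1: voltages on wires, with a threshold mapping volts -> bits.
def transistor_gate(v_a: float, v_b: float) -> float:
    high, low = 5.0, 0.0
    bit = lambda v: v > 2.5                       # chosen level of analysis: volts -> bits
    return low if (bit(v_a) and bit(v_b)) else high

# Substrate 2: people exchanging letters, with a different mapping letters -> bits.
def human_gate(letter_a: str, letter_b: str) -> str:
    bit = lambda s: s.strip().upper() == "TRUE"   # chosen level of analysis: letters -> bits
    return "FALSE" if (bit(letter_a) and bit(letter_b)) else "TRUE"

# Under their respective mappings, both substrates realize nand():
assert nand(True, False) == (transistor_gate(5.0, 0.0) > 2.5)
assert nand(True, True) == (human_gate("TRUE", "TRUE") == "TRUE")
```

Nothing in the sketch settles the moral question, of course; the point is just that "implements the same computation" is always relative to a state-to-bit mapping like the `bit` helpers above.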
Historical note: the idea for writing this post, although not the content, is somewhat inspired by a debate between Massimo Pigliucci and Eliezer Yudkowsky on Bloggingheads.tv (EDIT: e.g., jumping in the middle here: https://www.youtube.com/watch?v=onvAl4SQ5-Q&t=2118s ). I think Massimo won that argument.