The argument from marginal cases claims that you can't consistently hold that humans matter morally while animals don't, because no reasonable set of criteria for moral worth cleanly separates all humans from all animals. For example, suppose someone says that suffering only matters when it happens to something with some bundle of capabilities like linguistic ability, compassion, and/or abstract reasoning. If livestock lack these capabilities, however, then some people, such as very young children, probably lack them too.
This is a strong argument, and it avoids the noncentral fallacy. Any set of qualities you value is going to vary across people and animals, and if you lay them out on a continuum there's not going to be a place to draw a line that falls above all animals and below all people. So why do I treat humans as the only entities that count morally?
If you asked me how many chickens I would be willing to kill to save your life, the answer is effectively "all of them". [1] This pins down two points on the continuum that I'm clear on: you and chickens. While I'm uncertain where along it things start to matter significantly, I think it's probably somewhere that includes no or almost no animals but nearly all humans. Drawing this distinction among humans, however, would be incredibly socially destructive, especially given how unsure I am about where the line should go, so I think we end up with a much better society if we treat all humans as morally equal. This means I end up saying things like "value all humans equally; don't value animals" when that's not my real distinction, just the closest Schelling point.
[1] Chicken extinction would make life worse for many other people, so I wouldn't actually do that, but not because of the effect on the chickens.
I also posted this on my blog.
It could be a feature if the secular value is a big house or something, and the sacred value is whatever you might donate to an effective charity for.
It's definitely not a feature if the value is sacred to the individual only because imagining compromising it brings to mind scorn from their peers, and if society has handed it down as a sacred value of that sort since antiquity.
Also, not all values that people treat as lexically lower than their most sacred values (in reality there are probably more than two tiers of values, of course) are things you would want to get rid of. Most of fun theory probably sits much lower than human life on the hierarchy of things that cannot be traded away, and yet you still want concern for it to play a significant role in shaping the future.
And then there are taboo tradeoffs between a certain amount of a thing and a smaller amount of the same thing, where following the kind of thought process I warned against leads you into the territory of clear madness, like choosing specks over torture no matter the number of specks.
A more troubling counterargument to what I said is that no matter what you do you are living an answer to the question, so you can't just ignore it. This is true if you are an effective altruist (one who has already decided against working on existential risk) trying to decide whether to focus on helping humans or helping animals. Then you really do need to do a utilitarian calculus that requires that number.
If I needed that number, I would first try to spend some time around the relevant sort of animal (or the closest approximation I could get), gather as much information as possible about what it was cognitively capable of, and hang out with some animal-loving hippies to counteract social pressure in favor of valuing humans infinitely more than animals. Then I might try to figure things out indirectly, through separate comparisons to a third variable. Perhaps my own time? I don't think I feel any hard taboo tradeoffs at work when I think about how much time I'd spend to help animals versus how much I'd spend to help humans, though maybe I've just worn out the taboo by thinking so much about trading my time for lives. (Edit: that sounded a lot more evil than is accurate. To clarify, killing people to save time does sound horrifying to me, but not bothering to save distant strangers doesn't.)