...at least not if you accept a certain line of anthropic argument.
Thomas Nagel famously challenged the philosophical world to come to terms with qualia in his essay "What Is It Like to Be a Bat?". Bats, with sensory systems so completely different from those of humans, must have exotic bat qualia that we could never imagine. Even if we deduced all the physical principles behind echolocation, even if we could specify the movement of every atom in a bat's senses and nervous system that represents its knowledge of where an echolocated insect is, we would still have no idea what it's like to feel a subjective echolocation quale.
Anthropic reasoning is the idea that you can reason by conditioning on your own existence. For example, the Doomsday Argument says that you would be more likely to exist in the present day if the overall number of future humans were medium-sized rather than humongous; therefore, since you do exist in the present day, there must be only a medium-sized number of future humans, and the apocalypse must be nigh, for values of "nigh" equal to "within a few hundred years or so".
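To make the shape of that update concrete, here is a toy Bayesian version of the Doomsday Argument. All of the specific numbers are illustrative assumptions, not part of the argument itself:

```python
# Toy Bayesian version of the Doomsday Argument.
# Every number below is an illustrative assumption.

# Two hypotheses about how many humans will ever live:
N_MEDIUM = 2e11  # "medium-sized": ~200 billion humans total
N_HUGE = 2e14    # "humongous": a spacefaring future, ~200 trillion humans

MY_BIRTH_RANK = 1e11  # roughly 100 billion humans born so far

# Self-sampling assumption: given N total humans, each birth rank
# is equally likely, so the chance of any particular rank is 1/N.
def likelihood(total_humans):
    return 1.0 / total_humans if MY_BIRTH_RANK <= total_humans else 0.0

# Start with equal priors on the two hypotheses, then update.
prior = 0.5
p_medium = prior * likelihood(N_MEDIUM)
p_huge = prior * likelihood(N_HUGE)
posterior_medium = p_medium / (p_medium + p_huge)

print(f"P(medium future | my birth rank) = {posterior_medium:.4f}")
# ~0.999: an early birth rank pushes nearly all the probability
# onto the hypothesis with fewer future humans.
```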
The Buddhists have a parable to motivate young seekers after enlightenment. They say - there are zillions upon zillions of insects, trillions upon trillions of lesser animals, and only a relative handful of human beings. For a reincarnating soul to be born as a human being, then, is a rare and precious gift, and an opportunity that should be seized with great enthusiasm, as it will be endless eons before it comes around again.
Whatever one thinks of reincarnation, the parable raises an interesting point. Considering the vast number of non-human animals compared to humans, the probability of being a human is vanishingly low. Therefore, chances are that if I could have been an animal, I would have been one. Since I am not, this makes a strong anthropic argument that it is impossible for me to be an animal.
The phrase "for me to be an animal" may sound nonsensical, but "why am I me, rather than an animal?" is not obviously sillier than "why am I me, rather than a person from the far future?". If the doomsday argument is sufficient to prove that some catastrophe is preventing me from being one of a trillion spacefaring citizens of the colonized galaxy, this argument hints that something is preventing me from being one of a trillion bats or birds or insects.
And that something could be that animals lack subjective experience. This would explain quite nicely why I'm not an animal: you can't be an animal, any more than you can be a toaster. So Thomas Nagel can stop worrying about what it's like to be a bat, and the rest of us can eat veal and foie gras guilt-free.
But before we break out the dolphin sausages - this is a pretty weird conclusion. It suggests there's a qualitative and discontinuous difference between the nervous systems of other beings and our own, not just in what capacities they have but in the way they cause experience. It should make dualists a little bit happier and materialists a little bit more confused (though it's far from a knockout proof of either).
The most significant objection I can think of is that what matters is not that we are beings with experiences, but that we know we are beings with experiences and can self-identify as conscious - a distinction that applies only to humans, and maybe to a few species like apes and dolphins rare enough not to throw off the numbers. But why can't we use the reference class of conscious beings if we want to? By the same logic, one might consider it significant only that we are beings who make anthropic arguments, and predict not that there will be no Doomsday, but merely that anthropic reasoning will fall out of favor in a few decades.
But I still don't fully accept this argument, and I'd be pretty happy if someone could find a more substantial flaw in it.
That's an interesting observation.
There's a problem in assuming that consciousness is a 0/1 property; that you're either conscious, or not.
There's another problem in assuming that YOU are a 0/1 property; that there is exactly one atomic "your consciousness".
Reflect on the discussion in the early chapters of Daniel Dennett's "Consciousness Explained", about how consciousness is not really a unitary thing, but the result of the interaction of many different processes.
An ant has fewer of these processes than you do. Instead of asking "What are the odds that 'I' ended up as me?", ask, "For one of these processes, what are the odds that it would end up in me, rather than in an ant?"
According to Wikipedia's entry on biomass, ants have 10-100 times the biomass of humans today.
According to Wikipedia's list of animals by neuron count, ants have 10,000 neurons.
According to that page, and this one, humans have 10^11 neurons.
Information is proportional not to the number of neurons, but to the number of patterns that can be stored in those neurons, which is likely somewhere between N and N^2. I'm gonna call it N log N.
I weigh as much as 167,000 ants. Each of them has ~10,000 × log(10,000) bits of info. I have ~10^11 × log(10^11) bits of info. So I contain as much information as 165 times my body mass's worth of ants.
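Taking those figures at face value - the 167,000-ants-per-human weight estimate included - the arithmetic can be checked directly. A minimal sketch, using base-10 logs as in the estimate above:

```python
import math

# Figures taken from the comment above; all are rough assumptions.
ANT_NEURONS = 1e4              # ants: ~10,000 neurons (cited Wikipedia list)
HUMAN_NEURONS = 1e11           # humans: ~10^11 neurons
ANTS_PER_HUMAN_MASS = 167_000  # the comment's weight-equivalence estimate

# Working assumption from above: information scales as N log N.
def info(n_neurons):
    return n_neurons * math.log10(n_neurons)

ant_info = info(ANT_NEURONS)      # ~4 x 10^4
human_info = info(HUMAN_NEURONS)  # ~1.1 x 10^12

# How many ants' worth of information does one human hold?
ratio = human_info / ant_info
print(f"one human ~ {ratio:.2e} ants' worth of information")

# And per unit of body mass:
per_mass = ratio / ANTS_PER_HUMAN_MASS
print(f"~{per_mass:.0f}x the information of an equal mass of ants")
# -> ~165, matching the figure above
```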
So if we ignore how much longer ants have existed than humans, the odds are better that a random unit of consciousness today would turn up in a human than in an ant.
(Also note that we can only take into account ants in the past if reincarnation is false. If reincarnation is true, then you can't ask about the chances of you appearing in a different time. :) )
If you're gonna then say, "But let's not just compare ourselves to ants; let's ask about turning up in a human vs. turning up in any other species", then you have the dice-labelling problem argued below: You're claiming humans are the 1 on the die.
No, it's proportional to the log of the number of patterns that can be (semi-stably) stored. E.g. n bits can store 2^n patterns.
I'd like to see a lot more justification for this. If each connection were binary (it's not), and connections were possible between all N neurons (they're not), then we would have N^2 bits.
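To see how much the choice of scaling matters here, a quick comparison of the three candidate scalings at ant scale and human scale (purely illustrative; real synapses are neither binary nor all-to-all):

```python
import math

# Storable information in a network of N neurons, under three
# candidate scalings. Numbers are illustrative only.
for n in (1e4, 1e11):  # ant scale, human scale
    linear = n                 # lower bound: ~N bits
    nlogn = n * math.log10(n)  # the compromise used above: N log N
    quadratic = n ** 2         # upper bound: one bit per possible pair
    print(f"N = {n:.0e}: N = {linear:.1e}, "
          f"N log N = {nlogn:.1e}, N^2 = {quadratic:.1e}")

# The choice changes the human/ant ratio enormously: ~10^7 under N,
# ~2.75 x 10^7 under N log N, and ~10^14 under N^2.
```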