The reasoning about huge numbers of beings is a recurring theme here. Knuth's up-arrow notation is often used, with 3^^^3 as the number of beings.
I want to note that if a being is made of 10^30 parts, with 10^30 distinct states for each part, the number of distinct being-states is (10^30)^(10^30) = 10^(3*10^31). That's not a very big number by up-arrow standards; stacking up-arrows quickly gets you to much larger numbers.
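As a sanity check on that arithmetic, here is a minimal sketch (the figures 10^30 parts and 10^30 states are just the illustrative ones above, not a claim about actual beings); since the number itself has far too many digits to hold, we work with its base-10 exponent:

```python
# Pigeonhole arithmetic sketch: a being with 10**30 parts, each part
# having 10**30 distinct states. The number of distinct being-states is
# (10**30)**(10**30) = 10**(30 * 10**30) = 10**(3 * 10**31) -- far too
# large to write out, so we track its base-10 exponent instead.
parts = 10**30
log10_states_per_part = 30              # log10(10**30)
log10_being_states = parts * log10_states_per_part
assert log10_being_states == 3 * 10**31  # i.e. 10**(3 * 10**31) states
```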
To quote from Torture vs. Dust Specks:
- 3^3 = 27.
- 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
- 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.
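The up-arrow recursion in the quote can be written down directly. This is my own sketch (the function name is mine); it only terminates for tiny arguments, and 3^^^3 itself is hopelessly out of computational reach:

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ^(n) b: n = 1 is plain exponentiation,
    and each extra arrow iterates the previous operation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

assert up_arrow(3, 1, 3) == 27                # 3^3
assert up_arrow(3, 2, 3) == 7625597484987     # 3^^3 = 3^27
# up_arrow(3, 3, 3) would be 3^^^3, a tower of 7625597484987 threes --
# do not try to evaluate it.
```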
That's an unimaginably bigger number than 10^(3*10^31). You just can't have 3^^^3 distinct humans (or beings that are to humans as humans are to amoebas, or that repeated a zillion times, or distinct universes, for that matter). By pigeonhole, most of them will be exactly identical to very many others among the 3^^^3 and have exactly identical experiences*.
Of course, our reasoning does not somehow subconsciously impose a reasonable cap on the number of beings and thereby end up rational. I'm not arguing that the gut feeling includes any such consideration. (I'd say it usually just treats substantially different things as incomparable and inconvertible; also, the space of utilities need not be one-dimensional.)
I've made this pigeonhole example to demonstrate a failure mode with really huge numbers, one that can undermine, by an inconceivably huge factor, reasoning that seems rational, utilitarian, and carefully done.
Also, it does seem to me that if reasoning with huge numbers is likely to produce errors, then it can be rational to adopt some constraints or safeguards (e.g. veto approval of torture on the basis of dust specks, veto Pascal's mugging with very huge numbers, perhaps veto conversions between things of very different magnitude in general) as a strategy for when one knows one is likely to process huge numbers incorrectly, not just at the gut-feeling level but at the conscious, pencil-and-paper level as well.
An autopilot may strive to minimize total passenger discomfort over the flight, but also have a hard constraint on the maximum acceleration, in case the discomfort-minimization approach leads to something ridiculous.
* footnote: I don't think many people involved with AI research would count identical copies multiple times. But that is a tangential point. The issue is that when reading about 3^^^3 beings, it is really easy to make the mistake of not even checking whether you are counting identical copies many times. The problem is that 3^^^3 is much, much larger than the numbers we would normally approximate as infinite.
On the counting of 'identical' items: consider a computer system that keeps two copies of all data and re-does every calculation it makes. If it runs an AI, it may seem sensible to count the AI twice when it's two computers in two boxes sitting next to each other, running the same software on the same input. It seems much less sensible if you picture one computer where each chip has two dies, one a mirror copy of the other, placed right on top of it, separated by a very thin layer of dielectric which serves no purpose (the potentials are the same on both sides of it). And it's absurd if you remove the dielectric: it's one computer, just with thicker wires, currents 2x larger, and transistors in parallel pairs.

Counting identical stuff several times is something we do when there's a difference in e.g. location, which renders the stuff not identical. Decrease the spatial separation and the inclination to count identical items twice decreases. Have a giant server farm where next to each server there are two backup servers in an identical state (to recover when one fails), and I think just about any programmer would quickly forget about this minor implementation detail; have two giant server farms on opposite sides of the Earth and you'll for sure feel like counting them twice.
edit: sorry for not being explicit, I kind of assumed the point was clear enough. Improved it.
Also, that's not just for dust specks vs. torture; it goes for all the other examples where the Knuth up-arrows are used to make very huge numbers. Pascal's mugging discussions, for example.