I had trouble understanding this post, because its (apparent) thesis ("you can't have 3^^^3 human-like beings without having duplicates") is never actually stated in the post itself -- it was only Hedonic _Treader's comment that clued me in. Please consider revising to improve clarity.
(Maybe it seemed to you that the reference to "pigeons and holes" in the title was enough, but it wasn't: in fact I was expecting a new thought experiment involving birds, which indeed you seemed to promise here:
I've made this pigeon-hole example to demonstrate[...]
but never delivered.)
Same difficulty. Good work reconstructing the (obvious in retrospect) point. I kept skimming around, wondering if the author was somehow insane, or had neglected to paste some text ...
I'm still left to wonder why it matters that 3^^^3 is such a large number that it's more than the number of possible human-like mind-states.
The reasoning about huge numbers of beings is a recurring theme here. Knuth's up-arrow notation is often used, with 3^^^3 as the number of beings.
I want to note that if a being is made of 10^30 parts, with 10^30 distinct states for each part, the number of distinct being-states is (10^30)^(10^30) = 10^(3*10^31). That's not a very big number; stacking up-arrows quickly gets you to much larger ones.
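The exponent arithmetic can be checked directly (a minimal sketch; the part counts are the hypothetical figures from above, and the integer itself is far too large to materialize, so only its log10 is computed):

```python
# Hypothetical figures from the text: a being made of 10^30 parts,
# each part having 10^30 distinct states.
parts = 10**30

# Number of distinct being-states is (10^30)^(10^30).
# log10 of that count, computed exactly with integers:
# log10((10^30)^(10^30)) = 10^30 * log10(10^30) = 10^30 * 30 = 3 * 10^31
log10_total = parts * 30

assert log10_total == 3 * 10**31  # the total is 10^(3*10^31)
```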
To quote from Torture Vs Dust Specks:
That's an unimaginably bigger number than 10^(3*10^31). You just can't have 3^^^3 distinct humans (or beings that are to humans as humans are to amoebas, or that repeated a zillion times, or distinct universes, for that matter). Most of them will be exactly identical to very many others among the 3^^^3 and have exactly identical experience*.
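To see how quickly the up-arrow tower blows past the state count, here is a rough log-scale comparison (a sketch; only logarithms are manipulated, since the numbers themselves cannot be stored):

```python
import math

log10_3 = math.log10(3)

# log10 of the number of distinct being-states from above: 10^(3*10^31)
log10_states = 3 * 10**31

# Climb the tower 3^^n = 3^(3^^(n-1)):
tower3 = 3**(3**3)               # 3^^3 = 3^27 = 7,625,597,484,987 (exact)
log10_tower4 = tower3 * log10_3  # log10(3^^4), roughly 3.6e12

# 3^^4 still has far fewer digits than the number of being-states...
assert log10_tower4 < log10_states

# ...but log10(3^^5) = 3^^4 * log10(3), and 3^^4 is about 10^(3.6e12),
# so even the *logarithm* of 3^^5 dwarfs 3*10^31.  Compare log-of-logs:
log10_log10_tower5 = log10_tower4 + math.log10(log10_3)
assert log10_log10_tower5 > math.log10(log10_states)

# 3^^^3 = 3^^(3^^3) = 3^^7625597484987 sits unimaginably higher still,
# so by pigeonhole, a population of 3^^^3 beings must contain
# astronomically many exact duplicates.
```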
Of course, our reasoning does not somehow subconsciously impose a reasonable cap on the number of beings and end up rational afterwards. I'm not arguing that gut feeling includes such a consideration. (I'd say it usually just treats substantially different things as incomparable and inconvertible; besides, the space of utilities need not be one-dimensional.)
I've made this pigeon-hole example to demonstrate a failure mode with really huge numbers, one that can undermine, by an inconceivably huge factor, reasoning that seems rational, utilitarian, and carefully done.
Also, it does seem to me that if reasoning with huge numbers is likely to produce errors, then it can be rational to adopt some constraints or safeguards (e.g., veto approving torture on the basis of dust specks, veto Pascal's mugging with very huge numbers, perhaps in general veto conversion between things of very different magnitudes) as a rational strategy, once one is aware that one is likely processing huge numbers incorrectly not just at the gut-feeling level but also at the level of conscious work with pencil and paper.
An autopilot may strive to minimize total passenger discomfort over the flight, but also have a hard constraint on maximum acceleration, in case the discomfort-minimization approach leads to something ridiculous.
* footnote: I don't think many people involved with AI research would count identical copies multiple times. But that is a tangential point. The issue is that when reading of 3^^^3 beings, it is really easy to make the mistake of not even checking whether you do or don't count identical copies many times. The problem is that 3^^^3 is much, much larger than the numbers we would normally approximate as infinite.
On the counting of 'identical' items: consider a computer system that keeps two copies of all data and re-does every calculation it makes. If it runs an AI, it may seem sensible to count the AI twice when it's two computers in two boxes sitting next to each other, running the same software on the same input; much less so if you picture one computer where each chip has two dies, one a mirror copy of the other, placed right on top of it, separated by a very thin layer of dielectric that serves no purpose (the potentials are the same on both sides of it); and it's absurd if you remove the dielectric - it's one computer, just with thicker wires, currents twice as large, and transistors in parallel pairs. Counting identical things several times is something we do when there's a difference in e.g. location, which renders them not identical. Decrease the spatial separation and the inclination to count identical items twice decreases. Have a giant server farm where next to each server there are two backup servers in identical state (to recover when one fails), and I think just about any programmer would quickly forget about this minor implementation detail; have two giant server farms on opposite sides of the Earth and you'll surely feel like counting them twice.
edit: sorry for not being explicit, I kind of assumed the point was clear enough. Improved it.
Also, this applies not just to dust specks vs. torture but to all the other examples where Knuth's up-arrows are used to make very huge numbers: Pascal's mugging discussions, for example.