Reasoning about huge numbers of beings is a recurring theme here. Knuth's up-arrow notation is often used, with 3^^^3 as the number of beings.

I want to note that if a being is made of 10^30 parts, with 10^30 distinct states for each part, the number of distinct being-states is (10^30)^(10^30) = 10^(3*10^31). That is not a very big number by up-arrow standards; stacking up-arrows quickly gets you to much larger numbers.
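For concreteness, here is a minimal Python sketch of that count (purely illustrative; the variable names are mine). It tracks only the base-10 exponent, since the number itself is far too long to ever write out:

```python
# Toy calculation from the paragraph above: a being made of 10^30 parts,
# each part having 10^30 distinct states.  The number of distinct
# being-states is (10^30)^(10^30) = 10^(30 * 10^30) = 10^(3 * 10^31),
# so we track the base-10 exponent rather than the number itself.

parts = 10**30
log10_states_per_part = 30  # each part has 10^30 states

log10_distinct_beings = parts * log10_states_per_part

print(f"distinct being-states = 10^{log10_distinct_beings}")
# The printed exponent equals 3 * 10^31: large, but utterly dwarfed
# by anything built from stacked up-arrows.
```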

To quote from Torture vs Dust Specks:

  • 3^3 = 27.
  • 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
  • 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).

3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall.  You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times.  That's 3^^^3.  It's the smallest simple inconceivably huge number I know.
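As an aside, the up-arrow recursion itself is easy to write down, even though 3^^^3 is hopelessly out of reach. A minimal Python sketch (the function name knuth_arrow is mine, for illustration only):

```python
def knuth_arrow(a, n, b):
    """Evaluate a followed by n up-arrows followed by b, in Knuth's notation.

    One arrow is ordinary exponentiation; each extra arrow applies the
    previous operation right-associatively (e.g. a^^b is a tower of b a's).
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return knuth_arrow(a, n - 1, knuth_arrow(a, n, b - 1))

print(knuth_arrow(3, 1, 3))  # 3^3  = 27
print(knuth_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 3^27 = 7625597484987
# knuth_arrow(3, 3, 3) would be 3^^^3, a tower of 3s that is
# 7,625,597,484,987 levels tall; no physically possible computer
# could evaluate it, so don't call it.
```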

3^^^3 is unimaginably larger than 10^(3*10^31). You simply can't have 3^^^3 distinct humans (or beings that are to humans as humans are to amoebas, or that comparison repeated a zillion times, or distinct universes, for that matter). By the pigeonhole principle, most of them will be exactly identical to very many others among the 3^^^3, and will have exactly identical experiences*.
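The pigeonhole arithmetic is simple; here is a tiny sketch, assuming (as above) at most 10^(3*10^31) possible being-states. The helper name is mine, and the toy numbers merely stand in for quantities far too large to instantiate:

```python
def min_sharing_one_state(beings, distinct_states):
    """Pigeonhole bound: if beings outnumber possible distinct states,
    at least this many beings must be in exactly the same state
    (ceiling division, kept in exact integer arithmetic)."""
    return -(-beings // distinct_states)

# Toy stand-ins for 3^^^3 beings and 10^(3*10^31) possible states:
print(min_sharing_one_state(beings=10**12, distinct_states=10**6))  # 1000000
```

With the real numbers, dividing 3^^^3 by 10^(3*10^31) barely dents it: the count of beings forced into a single identical state is itself inconceivably huge.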

Of course, our intuitive reasoning does not somehow subconsciously impose a reasonable cap on the number of beings and thereby end up rational. I'm not arguing that gut feeling includes any such consideration. (I'd say it usually just treats substantially different things as incomparable and inconvertible; besides, the space of utility need not be one-dimensional.)

I've made this pigeon-hole example to demonstrate a failure mode with really huge numbers, one that can undermine, by an inconceivably huge factor, reasoning that seems rational, utilitarian, and carefully done.

Also, it does seem to me that if reasoning with huge numbers is likely to produce errors, then it can be rational to adopt some constraints or safeguards (e.g. veto approving torture on the basis of dust specks, veto Pascal's mugging with very huge numbers, perhaps veto conversion between things of very different magnitudes in general) when one is aware that one is likely processing huge numbers incorrectly, not just at the level of gut feeling but also at the level of conscious work with pencil and paper.

An autopilot may strive to minimize total passenger discomfort over the flight, but also have a hard constraint on maximum acceleration, in case the discomfort-minimization approach leads to something ridiculous.

* footnote: I don't think many people involved with AI research would count identical copies multiple times. But that is a tangential point. The issue is that when reading about 3^^^3 beings, it is really easy to make the mistake of not even checking whether you do or don't count identical copies many times. The problem is that 3^^^3 is much, much larger than the numbers we would normally approximate as infinite.

On the counting of 'identical' items: consider a computer system that keeps 2 copies of all data and re-does every calculation it makes. If it runs an AI, it may seem sensible to count the AI twice when it's 2 computers in 2 boxes standing next to each other, running the same software on the same input. It seems much less sensible if you picture one computer where each chip has two dies, one a mirror copy of the other, placed right on top of it, separated by a very thin layer of dielectric that serves no purpose (the potentials are the same on both sides of it). And it's absurd if you remove the dielectric: it's 1 computer, just with thicker wires, currents 2x larger, and transistors in parallel pairs.

Counting identical stuff several times is something we do when there's a difference in, e.g., location, which renders the stuff not identical. Decrease the spatial separation and the inclination to count identical items twice decreases. Have a giant server farm where next to each server there are 2 backup servers in an identical state (to recover when one fails), and I think just about any programmer would quickly forget about this minor implementation detail; have two giant server farms on opposite sides of the Earth and you'll for sure feel like counting them twice.

Edit: sorry for not being explicit; I kind of assumed the point was clear enough. I've improved it.

Also, this applies not just to dust specks vs torture, but to all the other examples where Knuth up-arrows are used to make very huge numbers, Pascal's mugging discussions for example.

20 comments

I had trouble understanding this post, because its (apparent) thesis ("you can't have 3^^^3 human-like beings without having duplicates") is never actually stated in the post itself -- it was only Hedonic_Treader's comment that clued me in. Please consider revising to improve clarity.

(Maybe it seemed to you that the reference to "pigeons and holes" in the title was enough, but it wasn't: in fact I was expecting a new thought experiment involving birds, which indeed you seemed to promise here:

I've made this pigeon-hole example to demonstrate[...]

but never delivered.)

Same difficulty. Good work reconstructing the (obvious in retrospect) point. I kept skimming around, wondering if the author was somehow insane, or neglected to paste some text ...

I'm still left to wonder why it matters that 3^^^3 is such a large number that it's more than the number of possible human-like mind-states.

[-] saturn

Because it isn't settled whether harming two different people is worse than harming two identical copies of one person.

I hadn't considered that. I must grant that it's hard to argue about the issue in a way that's universally convincing. It depends quite a bit on what you mean by "identical copies".

Non-diverging, cheap, mergeable copies (e.g. in the context of additional deterministic whole-universe copies) don't have much additional weight for me. But copies that are merely identical in their internal state, yet have (or will have) a different external context and thus a different future, are important.

So 3^^^3 is much larger than anything physical in our universe ... that doesn't stop me from entertaining hypotheticals about 3^^^3 people (with or without collisions).

Maybe the point I can take from this is that it can't possibly matter whether our intuitions fail at the scale of 3^^^3 (likely they do, but we can do explicit math instead); it only matters whether our intuitions fail at the scale of things that are actually possible in the universe.

It's the same thing, as I understand. Two identical copies are two different people.

The root of this might be in determining what is "identical".

If you have two identical copies and one is destroyed/hurt, then the copies are no longer identical.

Perhaps in this case, and maybe others, two identical copies of a person can be worth one, until something changes them, e.g. one getting destroyed.


Sorry about that. The problem, as you said, is that it is fairly obvious in retrospect, and the only way I can see my own post is in this obvious retrospect.

It matters because it is an error to consider the 3^^^3 people without considering whether you even care that most of them are identical; even if the identicalness makes no difference to you, you should at least establish that before proceeding to consider 3^^^3 people.

The issue is that really huge numbers like 3^^^3 are larger than many numbers we would carelessly assume to be infinite, e.g. the number of possible beings.

[-] [anonymous]

I want to note that if a being is made of 10^30 parts, with 10^30 distinct states for each part, the number of distinct being-states is (10^30)^(10^30) = 10^(3*10^31). That is not a very big number by up-arrow standards; stacking up-arrows quickly gets you to much larger numbers.

The relevant question then is: do exactly identical system states, such as brain states, count separately, or is duplication some kind of ontological pointer to one and the same entity in existence-space? Also compare Nick Bostrom's paper on quantity of experience and brain duplication. It seems that different answers to this will make the torture decision in torture vs. dust specks more problematic.

Yep. Though my point is that it is easy to neglect even to consider this question when considering 3^^^3 beings. 3^^^3 is such a huge number that it exceeds almost any finite quantity we would routinely treat as infinite.

I think with identical system states, whether you feel more or less inclined to count them twice really depends on the spatial separation between them versus their size, which is kind of silly. A computer duplicates everything by having a lot of electrons in every RAM cell and every wire, but the 'duplicate' is very nearby, so we ignore the duplication; I have never in my life heard an argument that a computer system's existence should be proportional to the currents in the wires and the charges in the RAM cells. Edit: I wrote more about that in the edit to the post.

I think a lot of folks take this position automatically and don't bother / feel the need to justify it, in general. When most people consider the possibility of Boltzmann brains (and nearby concepts in concept space) and examine their intuitions in light of that possibility, they find that they have an overwhelming intuition to care about duplicates. It also seems to be a requirement for not going crazy, if you take MWI or large universes seriously.

I have never in my life heard an argument that a computer system's existence should be proportional to the currents in the wires and the charges in the RAM cells

Never too late.

The word 'duplicate' is a pretty bad choice. There's no original and copy, and there is no process of duplication; there are simply identical instances. With regard to MWI, there is a continuum of instances, and as far as we know you're dealing with an uncountable infinity. If one subscribes to MWI, one has to stick to caring for one's own world on the basis of some morals that are not intuitively derivable (as is true of any morals, but it is more acute with MWI).

I think it would be rather silly to give more weight to an AI that's running on less efficient hardware which uses larger charges and currents; not only silly but with real-world consequences (an AI subscribing to such a notion could be unwilling to upgrade its hardware, or indeed strive to make even less efficient hardware for itself). Yet the only difference between this AI and an AI that has identical instances on 'separate' computers (which use smaller currents and charges, and thinner wires) is the spatial separation.

[-] see

Hmm, let's say we don't count identical copies as having moral weight, and we assume that Many Worlds is correct.

In that case, I build a device that will utterly annihilate the Earth with 50/50 probability based on a single quantum event: Schrodinger's Really Big Nuke. The event happens/doesn't happen, branching (by MWI) into two universes identical except for that single event, one of which has its Earth immediately annihilated by the device, the other surviving.

By the not-counting-duplicates theory, the moral weight of the annihilation of an entire planet of seven billion thinking beings is zero, because they were all duplicated by the quantum event that caused their destruction.

I think the same quantum event would have to happen without the bomb too; without the bomb it would just the same lead to forking as the influences propagate through interactions between atoms, etc. (and the forking in either case will be into a huge number of observers and will continue forever, except that with the bomb half the observers will be dead). It seems to me you would have to violate conservation laws to actually create more copies for you to morally-neutrally destroy.

Anyway, how would you count an AI running on a computer that has 2x the wire cross-section, capacitors with 2x the surface, 2x the current, etc. (versus another computer)? What if I add a non-functional dielectric, splitting each wire and each transistor in two, resulting in 2 computers (that are almost superimposed on each other)? Why should that change the count?

I would be curious what you think of my comment elsewhere in this thread: http://lesswrong.com/r/discussion/lw/9xw/33_holes_and_1031031_pigeons_or_vice_versa/5vcq

[-] see

I'm currently trying to avoid having opinions on this whole subject. I kept going around in circles thinking about it; I'm now letting my back-brain see if it can come up with any insights. But yours is one of the ideas that passed through my mind.

There's an interesting interaction of "identical copies don't mean anything" with one of the problem-of-identity solutions you see around this site, which is that you should treat copies and simulations of yourself as yourself, indeed in proportion to how closely they resemble you. If an identical- or near-copy of me has moral weight when I'm trying to decide whether to one-box, or defect in the Prisoner's Dilemma, or the like, it would seem to have to have the same weight in questions like this one, or vice-versa.

Agreed.

But don't avoid opinions; you can form some and always preface them with caveats, to get a sword out of that iron.

This point has already been made. Even if it hadn't, this post could have been a comment on Torture vs Dust Specks.

3^^^3. It's the smallest simple inconceivably huge number I know.

See also: the concept of the SII ("Smallest Inconceivable Integer") defined here.

I unfortunately have nothing of interest to contribute to your article, but I enjoyed it more than a simple upvote can convey. This is one of the more interesting articles I've seen on LessWrong recently. Thank you for presenting it.