
Comment author: denimalpaca 22 March 2017 10:08:28PM 0 points [-]

"(1) civilizations like ours tend to self-destruct before reaching technological maturity, (2) civilizations like ours tend to reach technological maturity but refrain from running a large number of ancestral simulations, or (3) we are almost certainly in a simulation."

Case 2 seems far, far more likely than case 3, and without a much more specific definition of "technological maturity", I can't make any statement about case 1. Why does case 2 seem more likely than case 3?

Energy. If we are to run an ancestral simulation that even remotely hopes to correctly simulate phenomena as complex as weather, the scale of the simulation would probably need to be quite large. We would definitely need to simulate the entire earth, moon, and sun, as the physical relationships between these three are deeply intertwined. Now, let's focus on the sun for a second, because it should provide all the evidence we need that such a simulation would be implausible.

The sun has a lot of energy, and to simulate it would itself require a lot of energy. To simulate the sun exactly as we know it would take MORE energy than the sun, because the entire energy of the sun must be simulated and we must account for the energy lost due to heat or other factors as an engineering concern. So just to properly simulate the sun, we'd need to generate more energy than the sun has, which already seems very implausible, given that we can't build a reactor larger than the sun on the earth. If we extend this argument to simulating the entire universe, it seems impossible that humans would ever have the energy needed to simulate all the energy in the universe, so we could only ever simulate a part of the universe, or a smaller universe. This again follows from the fact that perfectly simulating something requires more energy than the thing being simulated contains.
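For a rough sense of the scale involved (figures are approximate, order-of-magnitude only):

    L_sun ≈ 3.8 × 10^26 W   (total radiated power of the sun)
    P_human ≈ 2 × 10^13 W   (total power consumption of human civilization)
    L_sun / P_human ≈ 10^13

So even matching the sun's raw power output, never mind exceeding it, is roughly thirteen orders of magnitude beyond everything humanity currently generates.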

Comment author: gjm 23 March 2017 12:06:35AM 0 points [-]

To simulate the sun exactly as we know it would take MORE energy than the sun, because the entire energy of the sun must be simulated and we must account for the energy lost due to heat or other factors as an engineering concern.

I don't understand this argument. If it's appealing to a general principle that "simulating something with energy E requires energy at least E" then I don't see any reason why that should be true. Why should it take twice as much energy to simulate a blue photon as a red photon, for instance?
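To put rough numbers on that (a back-of-the-envelope illustration): a photon's energy is E = hc/λ, so

    E_blue / E_red = λ_red / λ_blue ≈ 700 nm / 400 nm ≈ 1.75

which is roughly the factor of two across the visible spectrum, yet specifying either photon in a simulation takes the same handful of numbers: a wavelength, a direction, a polarization. If the cost of simulating a photon tracks the information to be represented rather than the energy being represented, there is no reason the bluer photon should cost more.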

(I am sympathetic to the overall pattern of your argument; I also do not expect civilizations like ours to run a lot of ancestral simulations and have never understood why they should be expected to, and I suspect that one reason why not is that the resources to do it well would be very large and even if it were possible there ought to be more useful things to do with those resources.)

Comment author: Lumifer 22 March 2017 02:38:52PM *  0 points [-]

Right, but I am specifically interested in Viliam's views about the scenario where there is no AI, but we do have honest and competent rulers.

Comment author: gjm 22 March 2017 04:36:03PM 0 points [-]

Fair enough; I just wanted to make it explicit that that question has basically nothing to do with anything else in the thread. I mean, Viliam was saying "so it might be a good idea to do such-and-such about superhumanly capable AI" and you came in and said "aha, that kinda pattern-matches to communism. Are you defending communism?" and then said oh, by the way, I'm only interested in communism in the case where there is no superhumanly capable AI.

But, well, trolls gonna troll, and you've already said trolling is your preferred mode of political debate.

In response to comment by gjm on Am I Really an X?
Comment author: math55 22 March 2017 02:47:20AM *  0 points [-]

Part of the reason is that otherwise-normal people who think or feel themselves to be avatars of gods are apparently very much rarer than otherwise-normal people who think or feel themselves to be of a gender different from the one you'd guess from looking at the shape of their body.

How do you know? Granted the people who claim to be the wrong gender are more prominent (not necessarily more common) now, but if we look back as recently as say 50 years ago we'd find them much rarer than people claiming to be avatars. The official PC position as I understand it (this may well be a steelman) is that those people always existed and were simply in the closet/in denial because society wasn't prepared to accept them. But if you claim that's the case how do you know there aren't a similar or greater number of avatars also in the closet?

In response to comment by math55 on Am I Really an X?
Comment author: gjm 22 March 2017 03:14:15AM 0 points [-]

How do you know?

Note that I (deliberately) said "are apparently very much rarer ...". I know that because I encounter (in person, on the internet, in others' accounts of conversations they've had, etc.) a lot more people who say they're trans than people who say they're divine avatars. Whether they're actually much rarer, I don't claim to know.

if we look back as recently as say 50 years ago we'd find them much rarer than people claiming to be avatars.

How do you know that?

how do you know there aren't a similar or greater number of avatars also in the closet?

I don't. But if they're in the closet, then they aren't talking about their experiences as (alleged) avatars, which means that their alleged avatarity doesn't give rise to much need for special terminology. Whereas trans people are (apparently) more common and (certainly) more vocal, so conversations about unusual gender issues do happen, and some terminology is useful for those conversations.

Comment author: Lumifer 22 March 2017 01:13:04AM 0 points [-]

post-Singularity communism

I have no idea what this means.

Comment author: gjm 22 March 2017 03:10:19AM 0 points [-]

It seems you agree with Viliam: see the second paragraph below.

For the obvious reasons I don't think you can find selfless and competent human rulers to make this really work. But conditional on possibility of creating a Friendly superintelligent AI... sure.

Although calling that "communism" is about as much of a central example as calling the paperclip maximizer scenario "capitalism".

Comment author: Lumifer 21 March 2017 06:22:03PM *  0 points [-]

capital is a rather vacuous word. It basically means "stuff that might be useful for something"

Um. Not in economics, where it is well-defined. Capital consists of resources needed for the production of value. Your stack of decade-old manga might be useful for something, but it's not capital. The $20 bill in your wallet isn't capital either.

Comment author: gjm 22 March 2017 01:11:16AM 0 points [-]

None the less, "capital" and "AI" are extremely different in scope and I see no particular reason to think that if "let's do X with capital" turns out to be a bad idea then we can rely on "let's do X with AI" also being a bad idea.

In a hypothetical future where the benefits of AI are so enormous that the rest of the economy can be ignored, perhaps the two kinda coalesce (though I'm not sure it's entirely clear), but that hypothetical future is also one so different from the past that past failures of "let's do X with capital" aren't necessarily a good indication of similar future failure.

Comment author: Lumifer 21 March 2017 06:19:48PM *  0 points [-]

Actually, no, we're (at least, I am) talking about pre-Singularity situations where you still have to dig in the muck to grow crops and make metal shavings and sawdust to manufacture things.

Viliam said that the main problem with communism is that the people at the top are (a) incompetent; and (b) corrupt. I don't think that's true with respect to the economy. That is, I agree that communism leads to incompetent and corrupt people rising to the top, but that is not the primary reason why a communist economy doesn't function well.

I think the primary reason is that communism breaks the feedback loop in the economy where prices and profit function as vital dynamic indicators for resource allocation decisions. A communist economy is like a body where the autonomic nervous system is absent and most senses function slowly and badly (but the brain can make the limbs move just fine). Just making the bureaucrats (human-level) competent and honest is not going to improve things much.

Comment author: gjm 22 March 2017 01:07:20AM 0 points [-]

Maybe I misunderstood the context, but it looked to me as if Viliam was intending only to say that post-Singularity communism might work out OK on account of being run by superintelligent AIs rather than superstupid meatsacks, and any more general-sounding things he may have said about the problems of communism were directed at that scenario.

(I repeat that I agree that merely replacing the leaders with superintelligent AIs and changing nothing else would most likely not make communism work at all, for reasons essentially the same as yours.)

Comment author: gjm 21 March 2017 06:14:47PM 2 points [-]

an anti-coordination game, where you and your copy/estimate try to pick different options

It feels to me as if calling this an anti-coordination game makes good sense when Omega is actually running a simulated copy of you but not when Omega is predicting by radically different means.

Comment author: Lumifer 21 March 2017 05:08:27PM *  0 points [-]

this one

That too :-) I am a big fan of this approach.

For the obvious reasons I don't think you can find selfless and competent human rulers to make this really work.

But conditional on finding selfless and competent rulers (note that I'm not talking about the rest of the population), you think that communism will work? In particular, the economy will work?

Depends on whether you consider the possibility of superintelligent AI to be "realistic".

Aaaaand let me quote you yourself from just a sentence back:

Making a superintelligent AI will make our definitions of ownership (whether private or government) obsolete.

One arm of your choice involves Elon Musk (or equivalent) owning the singularity AI; the other gives every human a 1/7B ownership share of the same AI. How does that work, exactly?

Besides, I thought that when Rapture comes...err... I mean, when the Singularity happens, humans will not decide anything any more -- the AI will take over and will make the right decisions for them -- isn't that so?

Comment author: gjm 21 March 2017 06:05:39PM 0 points [-]

conditional on finding selfless and competent rulers (note that I'm not talking about the rest of the population), you think that communism will work?

If we're talking about a Glorious Post-Singularity Future then presumably the superintelligent AIs are not only ruling the country and making economic decisions but also doing all the work, and they probably have magic nanobot spies everywhere so it's hard to lie to them effectively. That probably does get rid of the more obvious failure modes of a communist economy.

(If you just put the superintelligent AIs in charge of the top-level economic institutions and leave everything else to be run by the same dishonest and incompetent humans as normal, you're probably right that that wouldn't suffice.)

In response to comment by gjm on Am I Really an X?
Comment author: math55 21 March 2017 01:16:32AM 0 points [-]

But I guess what some people actually object to is not in fact the term "cis" but acknowledging the existence of transness at all.

Rather we object to singling out a particular form of insanity and treating it as an alternate form of sanity, and then insisting that everyone else play along with you.

that we say those things without using the term "cis" but that we not have the sort of conversations in which the term would be useful.

So why don't you also have analogous conversations where words like "unavatarilicious" or "non-schizophrenic", or especially "non-autistic" would be useful anywhere near as often?

In response to comment by math55 on Am I Really an X?
Comment author: gjm 21 March 2017 12:24:33PM 0 points [-]

Rather we object to singling out a particular form of insanity and treating it as an alternate form of sanity, and then insisting that everyone else play along with you.

Potayto, potahto.

So why don't you also have analogous conversations where words like "unavatarilicious" or "non-schizophrenic", or especially "non-autistic" would be useful anywhere near as often?

It's hard to be sure. Part of the reason is that otherwise-normal people who think or feel themselves to be avatars of gods are apparently very much rarer than otherwise-normal people who think or feel themselves to be of a gender different from the one you'd guess from looking at the shape of their body. That seems like an important fact. (You might claim that it's true only because of some kind of social-justice-feminist-warrior conspiracy, but I see no good reason to believe that; I think it gets the causality backwards.)

There are words for "non-autistic" that get used in conversations in communities where there are more than averagely many autistic people. I've seen "allistic", for instance. ("Neurotypical" is probably commonest. It's an interesting one because -- see above -- it looks like a broader word that's being used more specifically in a particular context, but I think the actual history is that the first uses of "neurotypical" were actually to mean "not autistic".)

In response to comment by gjm on Am I Really an X?
Comment author: bogus 20 March 2017 06:01:28PM *  0 points [-]

Well, I'm sure there are conversations in which it's useful to talk about people who aren't trans, but I bet in most of them one could either (1) say something like "not trans" or (2) co-opt some term with markedly broader meaning like "normal" or "gender-typical". It's not like we actually need a usage like that of "cis" to mean "not trans". So it's no surprise that some people think insisting on such usage is just about humoring a small minority of people who expressly regard themselves as trans (quite literally, "on the other side" with respect to gender).

In response to comment by bogus on Am I Really an X?
Comment author: gjm 21 March 2017 12:10:53AM 0 points [-]

I guess. It feels to me as if saying "not trans" instead of "cis" when the need arises would foreground transness more rather than less. But I guess what some people actually object to is not in fact the term "cis" but acknowledging the existence of transness at all. In other words, the solution that e.g. Eugine wants to the alleged problem that we say things using a term "cis" that he thinks shouldn't exist isn't that we say those things without using the term "cis" but that we not have the sort of conversations in which the term would be useful.
