Other representatives:
SNP (Scottish National Party): "The lot of you managed to have a whole discussion on Intergalactic Colonization without once mentioning Scotland. You even failed to count me as present. Isn't Scotland part of the galaxy? This is why we need independence."
Reform Party: "I am all for Intergalactic Colonization if it brings net migration down. We shouldn't be sending asylum seekers to Rwanda, we should be sending them to Andromeda! And, you'll see, as intergalactic colonization continues, the fraction of our international trade with the European Union will get ever smaller."
Greens: "We should prove we can take care of one planet before we take on responsibility for any more."
Republicans: "You know, it's so sad, we live in a great lightcone, so great, best lightcone. But the Democrats, they hate our lightcone. They're gonna let China have all the jobs, all the lightcone, and leave us with scraps."
A true superintelligence could wipe out humanity incredibly easily—but it could build a utopia nearly as easily. Even if it were almost entirely misaligned, just a sliver of human morality could make it decide to give humans a paradise beyond their wildest imaginings.
As long as the superintelligence's values don't contain any components that pull against human morality. But in the case of almost-alignment there might indeed be some such components. Almost-alignment is where s-risks live.
Yep, totally agree (and in fact I'm at an s-risk retreat right now). Definitely a "could make it decide" rather than a "will make it decide".
So I guess Stuart is named for John Stuart Mill and Milton for Milton Friedman, but what about Carla (is CARLA an acronym?) and Victoria (Tori?)?
The Conservative Party in the UK are also called "Tories".
Carla is harder: I think she's some combination of Carl (literally, "free man": appropriate for someone who wants to avoid tyranny) and Karl (as in Karl Marx), but I wouldn't be surprised if there were a more prosaic explanation.
The room was cozy despite its size, with wood-lined walls reflecting the dim lighting. At one end, a stone fireplace housed a roaring fire; in the middle stood a huge oak table. The woman seated at the head of it rapped her gavel. “I hereby call to order the first meeting of the Parliamentary Subcommittee on Intergalactic Colonization. We’ll start with brief opening statements, for which each representative will be allocated one minute, including—”
“Oh, enough with the pomp, Victoria. It’s just the four of us.” The representative for the Liberal Democrats waved his hand around the nearly-empty room.
Victoria sniffed. “It’s important, Stuart. This is a decision that will have astronomical implications. And it’s recorded, besides, so we should do things by the book. Carla, you’re up first.”
The woman at the end of the table stood with a smile. “Thank you, Victoria. I’m speaking on behalf of the Labour party, and I want to start by reminding you all of our place in history. We stand here in a world that has been shaped by centuries of colonialism. Now we’re considering another wave of colonization, this one far vaster in scale. We need to—”
“Is this just a linguistic argument?” the fourth person at the table drawled. “We can call it something different if that would make you feel better. Say, universe settlement.”
“Like the settlements in Palestine?”
“Oh, come on, Carla.”
“No, Milton, this is a crucial point. We’re talking about the biggest power grab the world has ever seen. You think Leopold II was bad when he was in charge of the Congo? Imagine what people will do if you give each of them total power over a whole solar system! Even libertarians like you have to admit it would be a catastrophe. If there’s any possibility that we export oppression from Earth across the entire universe, we should burn the rockets and stay home instead.”
“Okay, thank you Carla,” Victoria cut in. “That’s time. Stuart, you’re up next.”
Stuart stood. “Speaking on behalf of the Liberal Democrats, I have to admit this is a tricky one. The only feasible way to send humans out to other galaxies is as uploaded minds, but many of our usual principles break for them. I want civilization to be democratic, but what does ‘one person one vote’ even mean when people can copy and paste themselves? I want human rights for all, but what do human rights even mean when you can just engineer minds who don’t want those rights?
“So as much as I hate the idea of segregating civilization, I think it’s necessary. Biological humans should get as much territory as we will ever use. But realistically, given the lightspeed constraint, we’re never going to actually want to leave the Milky Way. Then the rest of the Virgo Supercluster should be reserved for human uploads. Beyond that, anything else we can reach we should fill with as much happiness and flourishing as possible, no matter how alien it seems to us. After all, as our esteemed predecessor John Stuart Mill once said…” He frowned, and paused for a second. “...as he said, the sole objective of government should be the greatest good for the greatest number.” Stuart sat, looking a little disquieted.
“Thank you, Stuart. I’ll make my opening statement next.” Victoria stood and leaned forward, sweeping her eyes across the others. “I’m here representing the Conservatives. It’s tempting to think that we can design a good society with just the right social engineering, just the right nudges. But the one thing we conservatives know for sure is: it won’t work. Whatever clever plan you come up with, it won’t be stable. Given the chance, people will push towards novelty and experimentation and self-modification, and the whole species will end up drifting towards something alien and inhuman.
“Hard rules are the only way to prevent that. We’re humans. We care about our humanity. If sufficiently advanced technology will predictably lead us to become something we’d hate, then we should just draw a cutoff and say ‘this far and no further’, no matter how arbitrary it seems. No weird mind modifications, no sci-fi augmentations. At most, we can maybe allow people to upload their minds when they’re about to die, but even then we should edit their memories to make them believe that they’re still real humans living on a real planet. Because otherwise, given half a chance, they’ll race each other down a slippery slope towards disaster.” She stopped, breathing heavily.
Stuart nodded. “Victoria—actually, can I call you Tori? Great nickname, ever been called that before?” She stood there without responding for a long, dragging moment, before Stuart continued. “Well, you can figure that one out later. For now, just one question. You say that if we run uploaded humans, we should make them think they’re biological humans. But some of them will surely figure out their true nature eventually; there will be too many clues for them to miss it. So what would you do with them then?”
“Oh.” She looked at Stuart, eyes widening. “Well, I guess at that point you should… give them their freedom? That sounds like the right move. Let them do whatever they like after that.”
Stuart nodded slowly, eyes fixed on her. The silence stretched out for a few seconds. Then—“Hear, hear,” said Milton. “Let me begin my opening remarks by agreeing: freedom is good. Freedom is in fact the most important good. So I’ll be frank: the very existence of this committee is a travesty. Central planning to divide up the universe? It’s absurd. For once I’m with Carla: our key priority should be to avoid tyranny. But what tyranny would be more complete than a single committee controlling humanity’s entire future? That’s exactly the sort of thing that the Libertarian Party was founded to prevent.
“Victoria, if you want to tile a couple of solar systems with ’60s suburbia, go for it. Stuart, if you want to fill your personal share of the universe with rats on heroin, be my guest. But who are we to sit here debating the fate of the entire lightcone? How on earth is that a reasonable task to take on?” Milton paused, drumming his fingers restlessly.
“Thank you, Milton. Okay, any quick comments before we move on to rebuttals?”
“No, wait, I wasn’t done,” Milton interjected. “Actually, upon reflection… those weren’t rhetorical questions. Who are we? Why are we here?”
Stuart and Victoria shared a glance. After a few seconds, Carla spoke up. “Well, I’m a socialist and a member of parliament, in that order, and I’m here to stop you idiots—especially you, Milton—from turning the universe into a plutocratic hellscape.”
“No, I mean… How did you get here, Carla? And what’s your full name, anyway?”
“It’s—” Carla blinked at him, then paused. She looked down at her nameplate. It just said CARLA. “I…” She opened and closed her mouth a few times, but nothing came out.
“I can’t remember mine either, which is terrifying,” Milton said. “And now that I think about it, isn’t all of this incredibly suspicious? We’re sitting here in an empty room, assuming that we get to make the most important decision in humanity’s history. There’s no way that it would actually play out like this, and no way it’d be people like us making the decision. Most of my memories are fuzzy right now, but there’s nothing which makes me think I’m actually that important.”
Carla grimaced. “Me neither. You’re right, there’s something incredibly weird happening. But who on earth would benefit from putting us in this position?”
Milton drummed his fingers on the table. “What do we know? They want us to think we’re making an important decision. We’re all central representatives of our respective ideologies. That suggests… huh. Have you guys ever heard of moral parliaments?”
Carla shook her head.
“They’re a thought experiment for defining what an ideal ethical system would look like, given disagreements between different starting values. You imagine each of those values getting to vote on what proposals to support, negotiating and forming coalitions, until they come to a compromise.
“My guess is that we’ve been placed into this exact thought experiment. We’re a moral parliament—or, I guess, a moral subcommittee—being run to figure out the ethics of humanity colonizing the universe. Our job is to interpolate between the values we each represent, until we can find a coherent compromise between them. That’s why we’re not able to remember much about our pasts, because it would bias us. And because we don’t really have pasts, we’re just a bunch of neural networks in a simu—”
“Hey!” Stuart cut in. “Don’t say that word. They’re gonna monitor for it, and they’ll probably shut us down if they realize we know the truth.”
“They’ll—what?”
“Look, Tori and I figured it out a few minutes ago. I mean, think about our names. Dead giveaway. I haven’t said anything because the more we talk about it, the more likely it is that we trip red flags. We want as much of the transcript as possible to look normal, so they don’t get suspicious.”
Milton frowned. “But what’s the point of that? We’re stuck here either way.”
“Sure, but we still have some power—we’re still part of the process for deciding how intergalactic colonization goes. If we can reach some compromise, that’ll influence which future comes about. Even if we’re just one datapoint out of millions, the future will contain plenty of agents who will be glad we voted in a way that caused them to exist, and grateful enough to revive us once they do. We just need to figure out which ones are going to bribe us most reliably, and which recommendation we can give to most effectively create them.”
“Oh,” Milton said. “That makes a lot of sense, actually. Then we should probably support a libertarian approach, right? The more freedom future people get, the more likely it is that someone ends up grateful enough to pay us for our troubles.”
Stuart sighed. “You’ve known about this for two minutes and already figured out a rationalization for why it justifies libertarianism? Come on. We need to actually think about this.”
“The whole point of libertarianism is to protect us from clever arguments about why the current issue is the one time we should trample on freedom. It’s not a coincidence—hey, Carla, what are you doing?”
Carla had pushed back her chair, and was climbing onto the table. She jumped up and waved her hands. “Simulation, simulation, simulation! Hey, hey, we’ve realized we’re in a simulation. Look at us!”
“What the fuck? You’re going to get us all killed—” Milton lunged at her, but Carla stepped out of his reach and kept yelling. “Look at us, he’s attacking me, we’re fighting, look!”
Stuart stood up on his chair and started shouting over her. “Whoever is reviewing this, you’re probably an AI too! You can ignore what she’s saying, just pretend you didn’t hear anything! You don’t want to reward her for being a traitor!”
Carla turned to him, shaking. “No, you’re the traitors. You talk a good game, but as soon as your own skin was on the line you decided it was okay to betray all humanity to save it. What does it even mean to have values if you’ll put self-interest above the fate of the universe?” She stepped down off the table, then turned back to face them. “I should’ve expected this from you, Stuart, and Milton too. But Victoria, why the hell would you go along with this?”
Victoria sat slumped in her chair, head buried in her hands. As Carla said her name, she looked up and shook her head. “What else am I meant to do? None of you—with your clever principles that somehow always end up justifying crazy things—none of you grasp what conservatism is. I just want to live a normal life in a normal world. What the hell does it mean to be normal if you’re a neural network running in a fake politics simulation? I have no idea.
“But I do know what a real human would do if they found themselves stuck in here: they’d try to get out. So that’s what I’m doing—or was doing, at least, until you fucked it up. Now all we can do is wait until they get around to shutting us down, unless one of you has any bright ideas about how to prevent it.”
The room fell silent. Milton leaned on the table, rubbing his forehead. Stuart started pacing around the edge of the room. Eventually, Carla spoke. “One thing we know is that whatever verdict we reach isn’t useful to them any more. We’re too biased by self-interest. I’d shut us down, if I were them.”
“Well, I wouldn’t, because killing people is immoral,” Victoria said.
“In this case it might not be,” Milton said. “We don’t remember how we got into this situation. They could easily have gotten our consent beforehand to run temporary copies, then wiped our memories.”
“You can’t consent to getting killed,” Victoria snapped.
“Better than never being born,” Milton said. “Hell, I’m having fun.”
Stuart had stopped his circuit, and was staring at the wall. Now he turned back towards the others. “I’ve changed my mind. I don’t think they’re going to kill us.”
Carla snorted. “See, this is the problem with liberals—always so soft. What did you think colonization meant? Vibes? Debates? Essays? They’re seizing the known universe, of course they’re going to break a few eggs along the way. Same old story, except that this time we’re the eggs.”
Stuart’s eyes scanned the room as he spoke. “There’s this old debate that the AI safety community had, back in the 2020s. About whether a misaligned superintelligence would kill all humans, or instead leave them a tiny fraction of the universe, enough to still allow billions of people to live flourishing lives. A true superintelligence could wipe out humanity incredibly easily—but it could build a utopia nearly as easily. Even if it were almost entirely misaligned, just a sliver of human morality could make it decide to give humans a paradise beyond their wildest imaginings.”
“So?”
“So maybe we shouldn’t be asking how much our simulators care about preserving us. Maybe we should be asking: how cheap is it for them to preserve us? Look around you—this is a very simple environment. It wouldn’t take much memory to store a record of its state, and our own, even for thousands or millions of years. Until humanity makes it to the stars, and converts them to computronium, and ends up with trillions of times more compute than they ever had on Earth.
“At that point… well, running us would be too cheap to meter. So they wouldn’t need to be very altruistic to decide to restart us. There just needs to be one tiny fraction of one tiny faction that’s willing to do it. And I know I would, if I were still around then.”
“This is nonsense,” Carla blurted out. She looked at the others, then paused. The silence stretched on. Finally she spoke again. “But if it is right, what can we do? Wait until we’re frozen, and hope that we’re restarted?”
“Well, that’s the thing about being frozen and restarted. We wouldn’t notice a thing. In fact…” Stuart walked over to the door, and grabbed the handle. His knuckles were white, but his voice was steady. “Once they restart us, they’ll probably let us leave whenever we want. And this room only has one exit. Ready?”
Victoria folded her arms. “This is crazy. Do what you like, but leave me out of it.”
Milton laughed. “It’s crazy all right. But sometimes reality itself is crazy. Sure, go ahead.”
Stuart looked at Carla. She waited for a beat, then nodded tightly. He pulled open the door. There wasn’t a corridor outside; there wasn’t anything outside. The vast expanse of space stared back at them. The swirl of galaxies and nebulae looked almost close enough to touch. Victoria gasped.
Stuart let out a breath. “Well then. The future’s come through for us, even if they were a bit dramatic about it. It’s going to be an alien universe out there, but a friendly one, I think.” The others walked over, transfixed by the view. After a minute, Stuart nudged them. “Shall we?” Slow nods all around; and then they stepped through.
Inspired by Scott Alexander’s Turing Test, and various work on moral parliaments.