(this is a work of fiction)

it's been four years since the singularity. someone pressed the button, and their preferences were implemented across the cosmos. i don't think anyone knows who pressed the button; that is probly how they'd like things to be. maybe they don't know themself.

i wake up and cuddle with my partners for a while. we live in a log cabin, which is currently somewhere near forests and mountains, somewhere in washington state i think.

i don't know if it's actually washington state, because i don't care about being uploaded. it could be that 10¹⁰⁰ objective years have passed since the singularity and that now that it's got the compute it needs, Elua has started running our simulations. it could be that it's entirely remade the cosmos into a shape we cannot conceive. or it could be that this is actually earth, in its full physical continuity, actually four objective years after the singularity. all of these are okay to me.

"Elua" is what i've called the superintelligent singleton that now rules over everything. i call it that because of an old pre-singularity writing, which i still like. most of my friends have picked up on that name, but i hear some people in the nearby small town call it god. many people out there probly don't even know about Elua, because they would prefer not to. i'm sure even for them, everything is okay.

i wonder if someone out there cares about being uploaded. i wonder if their preferences have been satisfied, and if so how. what i am pretty confident about, is that whatever the situation, somehow, they are okay.

one of my partners goes to put on some music. we have something like a vinyl player. i like that the device is legible — the sound wave that we hear has been encoded into the physical shape of the object, and so the whole way the device works is understandable by a human mind. we don't really need a player of course, we could just magically hear whatever we wanted, without any artefacts. but i like things this way. i like the rustic experience of being a human manipulating tools. and it's not like they would ever actually get in the way of what i want to any extent which wouldn't be okay.

sometimes i talk with Elua in my dreams. i could talk to her anywhere, but dreams seem like a nice context for it. i've used lucid dreams to reshape my body a few times, and then woken up with my new body. it's mostly similar to the one i had before the singularity; i want to stay in a mostly grounded human experience, at least for now. maybe one day i'll explore much more alien forms of existence, as i'm sure many are doing already; but for the moment, this is what feels okay.

certainly, i don't suffer any grave illnesses; i do get a bit under the weather sometimes, because i think it's okay for that to happen.

i decide to check on how my friend is doing. i open the cupboard and find my crystal ball, put it on the table, and say the name of my friend. when some piece of technology has to be illegible, i like having it presented as magic. if this is the inside of a simulation that is getting ad-hoc intervened with for that device to work, then it might as well be magic anyways. both this kind of reliable magic, and other more mysterious forms of magic, are okay.

i speak the name of a friend to the ball, and then make an effort to focus on it. the focus does not do anything to the ball, but it makes it that my sensorial input of the ball is much amplified, and my input of the rest reduced. my friend is immediately available for communication — whether by either of us getting paused long enough for the other to become available, or because they actually were — and after greeting me, they report what it's been like to expand their intellect a millionfold and study ever-expanding maths. they tell me about some unimaginably elegant theorems they've found out about. as they say this, my focus makes it that i can see my friend as if they were standing in front of me, and they point at mathematical shapes floating in the air. i semi-consciously let them enter my mind, and the mathematical structures permeate my understanding. they are not visual, but truly mathematical, as if a logic-perceiving module was attached to my mind to perceive mathematical logic directly. i appreciate my friend's discoveries, but i also discreetly chuckle at how cute they are when they get excited about it. i tell them about how i've been taking it easy but, perceiving that they're not particularly interested, i let them get back to their stuff. our goodbye is a bit awkward, but that's okay.

by a flick of the mind, i retract my focus from the crystal ball, at which point the smell of toast strikes me. after getting my bearings for a second or two i put it back in the cupboard, and head to the small living room to see what my partners have been cooking. it's toasted bread with some sort of cheesy-creamy stuff on it. i don't know if the cheese appeared at the store magically, or if it comes from fake animals that exist for the sake of people who want to partake of farming, but i don't have to worry about anything like meat industry scale suffering. something like that would just not happen — everything that does happen is okay.

we decide to go into town. the town is pretty small — not many people are in the streets. various stores are open. most give stuff away for free, while some sell it for money. money has become strange since the singularity. some people choose to care about it, and there are some scarce things it can track, such as the use of someone's time; but it doesn't make sense to track much else, such as material resources. so most people kind of just don't bother. even land in an absolute sense is not scarce; it seems like Elua's solution to some people such as me wanting to live on something like a single earth, has been to add more space in between existing space. the total amount of land that "earth" consists of may very well have doubled since the singularity, by now. somehow, it's all arranged such that traveling to somewhere you wanna go leads you there, but traveling aimlessly does get you to many new places. we can even get lost sometimes, when we're okay with that.

it is mid-winter, but i can't be bothered to put on something warm; nevertheless, i barely feel cold: i'm semi-consciously opting for it to feel just a bit chilly, reducing the pain of cold but still getting the informational sensation of it, the way some people pre-singularity would be born with the full information but none of the sensation of pain. in any case, feeling just a bit chilly is okay.

we go to the adventure guild, where i posted a quest for a playstation 1. i did give some currency as a reward — moreso to not feel bad that i'm using someone's time, even though the people who fulfill quests are all pretty much happy to do so — they're people who want their life to provide value and meaning to others, and for most of them those others must actually be real people; and it wouldn't be okay for Elua to just create people out of nowhere to create an artificial demand, so it doesn't. and so, there is a genuine market mismatch, in more people wanting to fulfill quests. despite the fact that adventurers are the ones gaining most of the value from this system, the custom has remained that it is the quest poster who pays the adventurer — it's not like money is very important anyways, so what might in a previous era have been considered terrible market inefficiency, is now more than okay.

the language used in town is basically english, with some internet meme slang thrown in there. it also has some pretty local characteristics, but hasn't diverged that much — people value using english as a lingua franca around here, and as for me and my partners, we reserve for private use the artistic constructed language we've developed together. i like english, it's good. and sometimes, people around don't speak it, and we just find something that they do know, or ask a local who can help translate, or even kinda just gesture at each other and work things out like that. anyone could just choose to have their brain understand any language they want, or even communicate by thought, but i like sticking to communities that share this humancore artefact that is highly imperfect verbal communication. even when there are misunderstandings, it's not a big deal; it's okay.

just as we arrive at the store, an adventurer comes back with the genuine playstation 1 we'd requested. probly not a coincidence, probly fortunate timing arranged by Elua. well, it's not like the timing would've been correct anyways: some time dilation has certainly taken place, considering the adventurer tells us how it's taken them several weeks to find that playstation due to them committing to not using Elua's help, while on my end i remember posting the quest just the day before. the adventurer recounts to us their adventure finding the playstation 1, driving to various pawn shops in the area, and asking people. i had made the quest kind of hard: i had requested a playstation that had existed physically continuously since the singularity, not one that had been created out of thin air or even constructed since the singularity, nor a pre-singularity one that had been copied into multiple instances. but they did find one, and the journey they were on made me feel, as i head back home with my new playstation, like this playstation now carries an extra bit of meaning.

as soon as i get home i simply plop the playstation in front of the chalkboard that we use as a TV, grab the controller, and put on the copy of metal gear solid i'd obtained a while ago. it's just as great a game as i'd remembered, and while i'm focusing on it, one of my partners watches while the other asks if they can play with me, and when i say yes they sit where i'm sitting, the two of us temporarily occupying the same physical location, so that we can hold the controller at the same time while our minds intermingle as they open to one another. we could have also magically duplicated the controller and taken turns, but it is more fun this way, each of us taking control of the playing character to various degrees, and also having a shared piece of mind keeping track of what we're doing together, so as to not have to verbally communicate our intentions. we are fully focused on the game and on each other and we scarcely feel time go by, such that when my other partner calls us to get dinner, it's already dark out. the days go by fast when we take things this easy, but it's okay; it's not like time is scarce.

we have some great tartiflette, and then head to bed, to chit chat and cuddle before sleep. we talk about what to do tomorrow, and decide to have the cabin move somewhere unexpected during our sleep, so we can go explore some new surroundings. maybe we'll wake up on a different continent, and we'll do some adventurous hiking, reassured by the feeling that whatever happens, everything will be okay.

afterword

writing utopia is important. it's not just a good way to get people to start actually thinking about how good things could actually be, but also, if something like PreDCA is the kind of benevolent superintelligent singleton we get, we have to start acting in ways that make us people who tend to express our values. we have to cultivate ourselves and each other to wish for good worlds, and realize how much we'd dislike bad ones. we need to help make future benevolent superintelligence's job of realizing our values as easy as it can be, and to make our expressed values clear through our actions, such that if we do start up an AI which extrapolates our values from our actions, it gets the correct idea. finally, writing utopian fiction is just plain fun, and i find it good motivation to work on AI risk mitigation: think carrot, not stick.

23 comments

everything is okay

(this is a work of fiction)

Oof, right in the existential anxiety.

Nice story, though insofar as it's supposed to function as a concrete proposal for an AI-based utopia (or any utopia really), I think it's missing the answer to the most important question: what sorts of options or restrictions are there for creating descendants? (Such as children, digital autonomous organizations, nation-states or whatever.)

this is a specific focus for what part of utopia i'd like to live in, and i don't really have an interest in creating descendants at the moment. i've written more about rules for creating new persons in ∀V, but the short version is "no kids allowed (they're too hard for me to figure out), only copies of people". though in a setting where aligned Elua has more agency, maybe it could figure out to make kids viable.

Nice that you've thought about it and made a decision. So just to check that I understand correctly, your proposed utopia probably contains a whole bunch of people who really want to start families but are forced not to by Elua?

who really want to start a family in a way that can't be satisfied by an alternative, yes. such as: creating a merged version of their minds and having it emit preferences in advance and then consentingly modify itself until it's reasonably childlike; having a non-moral-patient fake mind be in the body until a certain age before it's replaced with that merged mind; or any other kind of weird scheme i haven't thought of. there are many possibilities, in virtualia and with Elua to help.

It sounds very unchallenging. 

Perhaps boring. Pointless.

(But then most utopias sound that way to me.)

there are probly many challenges one can face if they want that. i'm just fine getting my challenges from little things like video games, at least for a while. maybe i'd get back into the challenge of designing my own video games, too; i enjoyed that one.

"Utopias are all alike; every dystopia is horrible in its own way." -- AI Karenbot

I can empathise with the feeling, but I think it stems from the notion that I (used to) find challenges that I set for myself "artificial" in some way, so I can't be happy unless something or somebody else creates it for me. I don't like this attitude, as it seems like my brain is infantilising me. I don't want to depend on irreducible ignorance to be satisfied. I like being responsible for myself. I'm trying to capture something vague by using vague words, so there are likely many ways to misunderstand me here.

Another point is just that our brains fundamentally learn from reward prediction-errors, and this is likely to have generalised into all sorts of broad heuristics we use in episodic future thinking--which I speculate plays a central role in integrating/propagating new proto-values (aka 'moral philosophy').

i think about this story from time to time. it speaks to my soul.

  • it is cool that straight-up utopian fiction can have this effect on me.
  • it yanks me in a state of longing. it's as if i lost this world a long time ago, and i'm desperately trying to regain it.

i truly wish everything will be ok :,)

thank you for this, tamsin.

Thanks for writing! I'm a big fan of utopian fiction, it's really interesting to hear idealised depictions of how people would want to live and how they might want the universe to look. The differences and variation between attempts is fascinating - I genuinely enjoy seeing how different people think different things are important, the different things they value and what aspects they focus on in their stories. It's great when you can get new ideas yourself about what you want out of life, things to aspire to. 

I wouldn't mind at all if writing personal utopian fiction on LW were to become a trend. Like you say, it feels important, not just to help a potential AI and get people thinking about it, but also to help inspire each other, to give each other new ideas about what we could enjoy in the future.

Initially, I sorta felt bummed out that a post-singularity utopia would render my achievements meaningless. After reading this, I started thinking about it more and now I feel less bummed out. Could've done with mentions of biblically accurate angels playing incomprehensible 24d MMOs or omniscient buddhas living in equanimous cosmic bliss but it still works.

Would you mind explaining why the post is written in lowercase?

that's just part of my stylistic choices in blogging; to make my formal writing more representative of my casual chatting. see eg this or this [EDIT: or this]

Thanks for sharing this, I enjoyed it!

I will copy a lil post on a utopia idea I had some time ago here, as it seems relevant enough. I think it can also address the concern of some commenters that utopias are boring.

It is not written in a careful LW style so add caveats as you read.

rat race done right = proximal development zone world

utopias are hard to imagine. some resort to saying that utopia would be so great that we can not even comprehend how. there are no words or concepts in our language to begin to describe it. I'll try nevertheless.

imagine a world where every day you get a perfect challenge for you at the time.

if you ace a course, you get into a more advanced one. if you are doing great at your job, you get more responsibility. your relationships deepen.

but very importantly this can also scale down. if your grades slip, you get help or change course. if you are feeling uninspired, your job gives you a break. if your relationship gets too intense, your other responsibilities will decrease.

you don't ever have to deal with more stuff than you can handle but you are always learning and striving and getting better. like in a computer game.
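the scale-up/scale-down loop described above can be sketched as a tiny difficulty controller. all the numbers here (thresholds, step size, the performance scale) are made-up placeholders for illustration, not anything from the post:

```python
def adjust_challenge(level, performance, step=0.1, low=0.4, high=0.8):
    """Keep someone in their zone of proximal development:
    scale the challenge up when they ace it, scale it down when
    they struggle, and hold steady otherwise. The thresholds and
    step size are hypothetical placeholders."""
    if performance > high:            # acing it: more responsibility
        return level + step
    if performance < low:             # struggling: ease off
        return max(0.0, level - step)
    return level                      # sweet spot: keep learning here

# a few days of hypothetical performance scores in [0, 1]
level = 1.0
for perf in [0.9, 0.9, 0.3, 0.6]:
    level = adjust_challenge(level, perf)
```

the point of the sketch is just that the same rule handles both directions, so you never face more than you can handle but never stagnate either.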

it even works a little bit like this in this world except there is no magic/benevolent AI overlord computing what's best for you. and some (way too many) people roll too many shitty things in their lives. (and maybe some people get too few?)

let's be kinder and more daring

your princess is in another castle and I am happy for you 👋

Thanks! I'm always hungry for good sci-fi utopias :) I particularly liked that mindmelding part.

After also reading Diaspora and ∀V, I was thinking about what should be done about minds who modify themselves into insanity and suffer terribly. In their case, talking about consent doesn't make much sense.

Maybe we could have a mechanism where:

  • I choose some people I trust the most, for example my partner, my mom, and my best friend
  • I give them the power to revert me back to my previous snapshot from before the modification, even if it's against my insane will (but only if they unanimously agree)
  • (optionally) my old snapshot is temporarily revived to be the final arbiter and decide if I should be reverted - after all, I know me best
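a minimal sketch of the guardian-voting mechanism in the bullets above. the guardian names are the hypothetical ones from the example, and the "majority" rule is included as one obvious relaxation of unanimity:

```python
def should_revert(votes, rule="unanimous"):
    """Decide whether to roll a mind back to its last snapshot,
    given boolean votes from trusted guardians. A sketch of the
    proposal, not a worked-out spec."""
    if rule == "unanimous":
        return all(votes.values())
    if rule == "majority":
        return 2 * sum(votes.values()) > len(votes)
    raise ValueError(f"unknown rule: {rule}")

# hypothetical guardians from the example above
guardians = {"partner": True, "mom": True, "best_friend": False}
should_revert(guardians)                   # False: unanimity fails
should_revert(guardians, rule="majority")  # True: 2 of 3 agree
```

the revived old snapshot from the last bullet would just be one more entry in the votes dict, possibly with veto power under whatever rule is chosen.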

well, i worry about the ethics of the situation where those third parties don't unanimously agree and you end up suffering. note that your past self, while it is a very close third party, is a third party among others.

i feel like i still wanna stick to my "sorry, you can't go to sufficiently bad hell" limitation.

(also, surely whatever "please take me out of there if X" command you'd trust third parties with, you could simply trust Elua with, no?)

Yeah, unanimous may be too strong - maybe it would be better to have 2 out of 3 majority voting for example. And I agree, my past self is a third party too.

Hm, yeah, trusting Elua to do it would work too. But in scenarios where we don't have Elua, or have some "almost Elua" that I don't fully trust, I'd rather rely on my trusted friends. And those scenarios are likely enough that it's a good option to have.

(As a side note, I don't think I can fully specify that "please take me out of there if X". There may be some Xs which I couldn't foresee, so I want to rely on those third parties' judgement, not some hard rules. (of course, a sufficiently good Elua could make those judgements too))

As for that limitation, how would you imagine it? That some mind modifications are just forbidden? I have an intuition that there may be modifications so alien that the only way to predict their consequences is to actually run that modified mind and see what happens. (an analogy may be that even the most powerful being cannot predict if some Turing machine halts without actually running it). So maybe reverting is still necessary sometimes.

i feel like letting people try things, with the possibility of rollback from backup, generally works. let people do stuff by default, and when something looks like a person undergoing too much suffering, roll them back (or terminate them, or whatever other ethically viable outcome is closest to what they would want).

maybe pre-emptive "you can't even try this" would only start making sense if there were concerns that too much experience-time is being filled with people accidentally ending up suffering from unpredictable modifications. (though i suspect i don't really think this because i'm usually more negative-utilitarian and less average-utilitarian than that)

that said, i've never modified my mind in a way that caused me to experience significant suffering. i have a friend who kinda has, by taking LSD and then having a very bad time for the rest of the day, and today-them says they're glad to have been able to try it. but i think LSD-day-them would strongly disagree.

Yeah, that makes sense.

I'd like the serious modifications to (at the very least) require a lot of effort to do. And be gradual, so you can monitor if you're going in the right direction, instead of suddenly jumping into a new mindspace. And maybe even collectively decide to forbid some modifications.

(btw, here is a great story about hedonic modification https://www.utilitarianism.com/greg-egan/Reasons-To-Be-Cheerful.pdf)

The reason that I lean toward relying on my friends, not a godlike entity, is because by default I distrust centralized systems with enormous power. But if we had an Elua as good as you depicted, I would be okay with that ;)

thanks for the egan story, it was pretty good!

i tend to dislike such systems as well, but a correctly aligned superintelligence would surely be trustable with anything of the sort. if anything, it would at least know about the ways it could fail at this, and tell us about what it knows of those possibilities.

what i am pretty confident about, is that whatever the situation, somehow, they are okay.

This hit me. Had to read it thrice to parse it. "Is that sentence even finished?"

I've done a lot of endgame speculation, but I've never been close to imagining what it looks like for everyone to be okay. I can imagine, however, what it looks like internally for me to be confident everyone is ok. The same way I can imagine Magnus Carlsen winning a chess game even if the board is a mystery to me.

It's a destabilising feeling, but seems usefwl to backchain from.
