Previously, I posted a version of The Gift We Give Tomorrow that was designed to be read aloud. It was significantly abridged, and some portions reworded to flow better from the tongue. I recently finished another part of my project: An abridged version of Beyond the Reach of God. This one doesn’t lend itself as well to something resembling “poetry,” so it’s more a straightforward editing job. The original was 3315 words. The new one is currently 1090. I’m still trying to trim it a little more, if possible. GWGT was 1245, which was around 7 minutes of speaking time, and pushing the limit of how long the piece can be.

For those who were concerned, after paring this down into a collection of some of the most depressing sentences I've ever read, I decided it was NOT necessary to end "Gift We Give Tomorrow" on an echo of this post (although I'm leaving in the part where I reword the "Shadowy Figure" to more directly reference it). That reading will end with the original "Ever so long ago."

 

Beyond the Reach of God:

I remember, from distant childhood, what it's like to live in the world where God exists. Really exists, the way that children and rationalists take all their beliefs at face value.

In the world where God exists, he doesn’t intervene to optimize everything. God won’t make you a sandwich. Parents don't do everything their children ask. There are good arguments against always giving someone what they desire.

I don't want to become a simple wanting-thing, that never has to plan or act or think.

But clearly, there's some threshold of horror, awful enough that God will intervene. I remember that being true, when I believed after the fashion of a child. The God who never intervenes - that's an obvious attempt to avoid falsification, to protect a belief-in-belief. The beliefs of young children really shape their expectations - they honestly expect to see the dragon in their garage. They have no reason to imagine a loving God who never acts. No loving parents, desiring their child to grow up strong and self-reliant, would let their toddler be run over by a car.

But what if you built a simulated universe? Could you escape the reach of God? Simulate sentient minds, and torture them? If God's watching everywhere, then of course trying to build an unfair world results in God intervening - stepping in to modify your transistors. God is omnipresent. There's no refuge anywhere for true horror.

Life is fair.

But suppose you ask the question: Given such-and-such initial conditions, and given such-and-such rules, what would be the mathematical result?

Not even God can change the answer to that question.

What does life look like, in this imaginary world, where each step follows only from its immediate predecessor? Where things only ever happen, or don't happen, because of mathematical rules? And where the rules don't describe a God that checks over each state? What does it look like, the world of pure math, beyond the reach of God?

That world wouldn't be fair. If the initial state contained the seeds of something that could self-replicate, natural selection might or might not take place. Complex life might or might not evolve. That life might or might not become sentient. That world might have the equivalent of conscious cows, that lacked hands or brains to improve their condition. Maybe they would be eaten by conscious wolves who never thought that they were doing wrong, or cared.

If something like humans evolved, then they would suffer from diseases - not to teach them any lessons, but only because viruses happened to evolve as well. If the people of that world are happy, or unhappy, it might have nothing to do with good or bad choices they made. Nothing to do with free will or lessons learned. In the what-if world, Genghis Khan can murder a million people, and laugh, and be rich, and never be punished, and live his life much happier than the average. Who would prevent it?

And if the Khan tortures people to death, for his own amusement? They might call out for help, perhaps imagining a God. And if you really wrote the program, God *would* intervene, of course. But in the what-if question, there isn't any God in the system. The victims will be saved only if the right cells happen to be 0 or 1. And it's not likely that anyone will defy the Khan; if they did, someone would strike them with a sword, and the sword would disrupt their organs and they would die, and that would be the end of that. 

So the victims die, screaming, and no one helps them. That is the answer to the what-if question.

...is this world starting to sound familiar?

Could it really be that sentient beings have died, absolutely, for millions of years... with no soul and no afterlife... not as any grand plan of Nature. Not to teach us about the meaning of life. Not even to teach a profound lesson about what is impossible.

Just dead. Just because.

Once upon a time, I believed that the extinction of humanity was not allowed. And others, who call themselves rationalists, may yet have things they trust. They might be called "positive-sum games", or "democracy", or “capitalism”, or "technology", but they’re sacred. They can't lead to anything really bad, not without a silver lining. The unfolding history of Earth can't ever turn from its positive-sum trend to a negative-sum trend. Democracies won't ever legalize torture. Technology has done so much good, that there can't possibly be a black swan that breaks the trend and does more harm than all the good up until this point.

Anyone listening, who still thinks that being happy counts for more than anything in life, well, maybe they shouldn't ponder the unprotectedness of their existence. Maybe think of it just long enough to sign themselves and their family up for cryonics, or write a check to an existential-risk-mitigation agency now and then. Or at least wear a seatbelt and get health insurance and all those other dreary necessary things that can destroy your life if you miss that one step... but aside from that, if you want to be happy, meditating on the fragility of life isn't going to help.

But I'm speaking now to those who have something to protect.

What can a twelfth-century peasant do to save themselves from annihilation? Nothing. Nature's challenges aren't always fair. When you run into a challenge that's too difficult, you suffer the penalty; when you run into a lethal penalty, you die. That's how it is for people, and it isn't any different for planets. Someone who wants to dance the deadly dance with Nature needs to understand what they're up against: Absolute, utter, exceptionless neutrality.

And knowing this might not save you. It wouldn't save a twelfth-century peasant, even if they knew. If you think that a rationalist who fully understands the mess they're in, must be able to find a way out - well, then you trust rationality. Enough said.

Still, I don't want to create needless despair, so I will say a few hopeful words at this point:

If humanity's future unfolds in the right way, we might be able to make our future fair(er). We can't change physics. But we can build some guardrails, and put down some padding.

Someday, maybe, minds will be sheltered. Children might burn a finger or lose a toy, but they won't ever be run over by cars. A super-intelligence would not be intimidated by a challenge where death is the price of a single failure. The raw universe wouldn't seem so harsh; it would be only another problem to be solved.

The problem is that building an adult is itself an adult challenge. That's what I finally realized, years ago.

If there is a fair(er) universe, we have to get there starting from this world - the neutral world, the world of hard concrete with no padding. The world where challenges are not calibrated to your skills, and you can die for failing them.

What does a child need to do, to solve an adult problem?

29 comments

Young children really expect to see a dragon in their garage.

I'm not sure this will work well when being read aloud, and if you haven't already read the dragon post.

"The beliefs of young children actually shape their expectations. If a child believes there is a dragon in their garage, they really expect to see one there."

Or the Carl Sagan work it's a reference to, presumably.

For the intended audience, most people should be familiar with that. (There are going to be dramatically more obscure references over the course of the night.) But for general audiences, yeah.

Gonna go meta here.

What's this for? Who is it for? Would the virtues be an appropriate essay for spoken-word? Would you say that rationalists other than Yudkowsky deserve a spot in your oration session?

I am thinking about this at the meta level, but I'm thinking about it for a private group of people and for the time being am going to stay mysterious. It's also the sort of artistic project that benefits from feedback and input, but ultimately will work because of a particular vision I have. I suspect that if I tried to describe that vision, it would sound a little weird, but when I present the finished product it'll make sense.

(I did get extensive feedback from the intended audience tonight)

Afterwards, if the event is successful (I think it will be), I will make a post about it, where I describe what problem I was trying to solve, why I was trying to solve it, how I went about it, and what the results were. Then I'll open it up for discussion of what people might want to change if they want to replicate the event.

I actually did consider the virtues. They didn't end up quite fitting (despite looking like they might have in multiple ways that were each rather clever).

[anonymous]

You have my attention - I look forward to seeing what you're putting together.

I think you will be satisfied, but I feel I should point out that it will be along a (slightly) different axis than you are probably anticipating.

[anonymous]

This was pretty good. I like what you're doing here. Beyond the Reach of God is the very first essay I ever read on LessWrong, and it still remains to this day my favorite essay, and, in my opinion, the most important to understand. This may simply be because I still haven't completely gotten over my "existential angst" phase from my teenage years, but I do think realizing that we live in an unprotected universe is one of the most important truths any human can face.

I actually wrote an extended version of Beyond the Reach of God a few months ago out of fun (and boredom), with a few paragraphs of my own tacked on. I haven't shown it to anyone. I mostly extended the "universe-allows-horrible-things-to-happen" aspects of the essay and cut down the rationalist parts, so my version is twice as "dark" and depressing as the original. But I think it packs more of an emotional punch, at least for me.

Anyway, I really like what you're doing here. Not much else to say.

Thanks. Additional paragraphs wouldn't be useful for this project obviously, but I'd be curious to see what you added.

I know tastes differ, but... my favorite single paragraph in all of LW didn't make the cut :-( Maybe it doesn't translate to spoken word very well?

And if the Khan tortures people horribly to death over the course of days, for his own amusement perhaps? They will call out for help, perhaps imagining a God. And if you really wrote that cellular automaton, God would intervene in your program, of course. But in the what-if question, what the cellular automaton would do under the mathematical rules, there isn't any God in the system. Since the physical laws contain no specification of a utility function - in particular, no prohibition against torture - then the victims will be saved only if the right cells happen to be 0 or 1. And it's not likely that anyone will defy the Khan; if they did, someone would strike them with a sword, and the sword would disrupt their organs and they would die, and that would be the end of that. So the victims die, screaming, and no one helps them; that is the answer to the what-if question.

That was an unfortunate, arbitrary call. The whole piece is filled with good lines, and somehow they need to be trimmed out to the minimum necessary to convey the idea. There are some other borderline sections that I might swap out for it, but honestly they probably need to be removed to begin with. (Specifically, the line about the cows and wolves - that can be skipped over to the line about humans. But evolution is a recurring theme for the night as well [An Alien God is also getting abridged. Very seriously abridged] and I'd like to keep that tied in.)

I may be underestimating how long people's attention span is, but I'd prefer to risk overshooting the trimming process. But I'll try reading it with and without and see if it feels too long. If I were to include it, it'd be trimmed something like this:

And if the Khan tortures people to death, for his own amusement? They might call out for help, imagining a God. And if you really wrote the program, God would intervene, of course. But in the what-if question, there isn't any God in the system. The victims will be saved only if the right cells happen to be 0 or 1. And it's not likely that anyone will defy the Khan; if they did, someone would strike them with a sword, and the sword would disrupt their organs and they would die, and that would be the end of that.

So the victims die, screaming, and no one helps them. That is the answer to the what-if question.

I think removing it was a fine call. It is a nice paragraph, of course, but as you said, the entire thing is nicely written. And the next paragraph:

God would prevent it from ever actually happening, of course. At the very least, he’d visit some shade of gloom in Khan's heart. But in the mathematical answer to the question “What if?”, there is no God in the axioms.

conveys the same idea well enough.

If I were adding in the torture paragraph, it'd probably actually replace the "shade of gloom" paragraph. I think they end up closely stacked against each other - if you're saving the three lines from "gloom" then the slight length of "torture" isn't too bad. Then it's just "which is more powerful," which I'm still mulling over.

The trimmed version looks fine to me, but you don't have to include the paragraph just because one person likes it! It's completely your call.

If God is watching everywhere, then of course trying to build an unfair world results in the God will intervening

Strike "will", or change to "...will result in God's intervention"

After practicing reading a few times, I decided that some sections really need extra verbiage to retain their power. Added some things back in.

I also think I may need to adjust the tone slightly, to match the style of speech that I would normally use when saying something that I'm truly passionate about. Right now, when I read it, it's obvious that I'm reading from a script instead of speaking from the heart.

What's the intended audience for this? It might be worth editing the language to be less Less Wrong focused. If you want an average audience to follow along at the pace you're speaking, you don't want bits where they pause and think "Wait, "fun-theoretic?""

Audience, for now, is a group of friends who've read the sequences. (If this is successful within that group, I'll consider optimizing it for others, but even then it's really intended to be a bonding ritual for people who already share our memes, rather than a way to spread them to newcomers).

"This piece is written for those who have something to protect" - shouldn't this be edited not to mention writing?

I removed the word "post." I thought about altering the word 'written,' but speeches and stories are written just as much as blog posts. "This was written" seemed pretty straightforward for what it is.

The alternative would be:

"But I'm speaking now, for those who have something to protect."

Which I actually kinda like, now that I look at it. Hmm.

What's wrong with

"This piece is for those who have something to protect"

?

Nothing, actually.

I also really like the sound of that alternative. It's very powerful and personal, and the traditional hemming and hawing about active-versus-passive voice is, in this rare case, genuinely adding emotional voice.

What sort of feels wrong to me is that this piece is EXPLICITLY Eliezer talking (whereas the speakers in Gift We Give Tomorrow are sort of generic people). It feels sort of presumptuous to pretend to be him. Of course, if I'm actually concerned about that then there's more I need to be worried about than this line.

In the second paragraph, I would strike this line:

Parents don't do everything their children ask.

as it seems to interrupt the flow.

[This comment is no longer endorsed by its author]

I think it helps establish the overall parent-child metaphor, which is important to the work.

After reading it over again, I think I agree with you.