NancyLebovitz comments on Open Thread, November 1 - 7, 2013 - Less Wrong Discussion

5 Post author: witzvo 02 November 2013 04:37PM

Comment author: gattsuru 05 November 2013 12:47:16AM *  3 points [-]

At least to me, it's increasingly difficult to distinguish between a paradise machine and wireheading, and I dislike wireheading. Each shard of the Equestria Online simulation is built to be as fulfilling (of values through ponies and friendship) as possible, for the individual placed within that shard.

That sounds great! ... what happens when you're wrong?

I mean, look at our everyman character, David. He's set up in a shard of his own, with one hundred and thirty-two artificial beings perfectly formatted to fit his every desire and want, and with just enough variation and challenge to keep him from being bored. It's not real variation, or real challenge, but he wouldn't experience that in the real world, either, so it's a moot point. But look at the world he values. His challenges are the stuff of sophomore programming problems. His interpersonal relationships include a score counter for how many orgasms he gives or receives.

Oh, his lover is sentient and real, if that helps, but look at that relationship specifically. Butterscotch is created as just that little bit less intelligent than David is -- whether this is because David enjoys teaching, or because he's wrapped around the idea of women being less powerful than he is, or both, is up to the reader. Her memories are sculpted to exactly fit David's desires, and a few of the memories David has of her she never actually experienced -- so that the real Butterscotch wouldn't have to have suffered the unpleasant things CelestAI used to manipulate David into liking and protecting her.

There are, to a rough approximation, somewhere between five hundred billion and one trillion artificial beings in the simulation by the time most of humanity uploads, and that number will only scale up over time. Let's ignore, for now, the creepiness of creating artificial sentients who value being people who make your life better. We're making artificial beings optimized for enjoying the slaking of our desires, and I would be surprised if that happened to also be optimal for what we as a society would really like.

Lars is even worse: he is actively modified so that he never stops wanting his life of debauchery -- see the obvious overlap with the guy modifying himself to never get bored with a million years of catgirl sex.

At a deeper level, what if your own values are wrong?

The basic example, brought up in the Rules of The Universe document, is a violent psychopath. Upon being uploaded, CelestAI would quite happily set our psychopath up in a private shard with one hundred and fifty artificial ponies, all of which are perfectly molded to value being shot, stabbed, lit on fire, and violated in a way that is as satisfying as possible to a Dexter villain.

Or I can provide a personal example. I can go both ways, preferring guys, and was an unusually late bloomer. I can look back through time at the values of an earlier version of myself, and remember how they changed. Even in a fairly tolerant society and even with a very collaborative environment, this was not something that came according to my values or without external stimulus. ((There is a political-position version of this, but for the sake of brevity I'll just mention that it's possible. More worryingly, I'm not sure there's a way to formalize this concern, as much as it hits me at a gut level. For the most part, value drift is something we don't want.))

Or, for an in-story example:

Ybbx ng jung unccraf gb Unaan / 'Cevaprff Yhan'. Gur guvat fur inyhrf zbfg, ng gur raq bs gur fgbel, vf oryvrivat gung fur qvq abg znxr n zvfgnxr hayrnfuvat PryrfgNV. Naq PryrfgNV vf dhvgr pncnoyr bs fubjvat ure whfg gur orfg rknzcyrf bs ubj guvatf ner orggre. Vg qbrfa'g znggre jung gur ernyvgl vf, naq vaqrrq gur nhgube gryyf hf gung Unaan pbhyq unir qbar orggre. Zrnajuvyr, Unaan vf xrcg whfg ba gur obeqre bs zvfrenoyr nf gur fgbel raqf.

It's a very good dystopia -- I'd rather live there than here, and heck, it even beats a good majority of conventional fluffy-cloud-heaven afterlives -- but it's still got a number of really creepy issues.

Comment author: NancyLebovitz 05 November 2013 01:05:49AM 4 points [-]

A little fiction on related topics: "Hell Is Forever" by Alfred Bester -- what if your dearest wish is to create universes? You're given a pocket universe to live in forever, and that's when you find out that your subconscious keeps leaking into your creations (at the object level, not the natural-law level), and you don't like your subconscious.

Saturn's Children by Charles Stross. The human race is gone. All that's left is robots, who were built to be imprinted on humans. The vast majority of robots are horrified at the idea of recreating humans.