
Comment author: The_Jaded_One 30 December 2016 11:58:47PM *  0 points [-]

Interesting, I'd like to think/talk more about how different types of people might get into Cryonics, and how they might do on the other side.

One expectation I have is that the people who tend to self-select into cryo are probably the people with the most to gain from it.

I think the binding constraint on how good paradise can be is how much you can modify yourself and still realistically say that it is you. If you are a fairly average person from today with simple tastes and interests, there is perhaps not much room for you to grow and still be "you".

If you have more exotic tastes and more sophisticated ambitions, you have more room to grow. The more frustrated and stifled you feel by contemporary society, the more you'll benefit from having all those constraints lifted. Dream big.

Comment author: Stuart_Armstrong 31 December 2016 06:41:57AM 1 point [-]

I suspect that a few people will end up as celebrities for exploring interesting areas of mindspace, and they may spark various fashions among people who would not have expected to change much.

Comment author: WalterL 29 December 2016 07:23:10AM 1 point [-]

Thanks for writing this! I appreciate it.

Comment author: Stuart_Armstrong 29 December 2016 10:38:10AM 0 points [-]

Cheers!

Comment author: vakusdrake 28 December 2016 10:11:24PM *  1 point [-]

That is the default, because most people are OK with that :-)

See, that makes it kind of a weird thing for that character to have brought up then. After all, why bring something up if it only applies to people who, by definition, wouldn't care?

Also, you don't really explain how religious fanaticism is going to be effectively suppressed via economics. You still haven't given any plausible way for that to happen. What are you going to do, try to starve them to death if they don't deconvert? That wouldn't work for a number of reasons. And I'm not just talking about a small number of cultists: barring unprecedented cultural changes between now and 2064, there will still be millions if not billions of people interested in maintaining oppressive religious cultures by the time the singularity rolls around.

As for them dwindling out, that seems unlikely given (1) their high birth rates and (2) the fact that an outside world so alien will be easy to demonize. They will likely become more extreme, both because many of them will perceive this as the end times and because of the isolation such an alien external world would encourage. Not to mention they would see how religious communities that had contact with the outside invariably fell to sin, making the need for isolation even more important as far as they're concerned.

Comment author: Stuart_Armstrong 29 December 2016 10:35:47AM 0 points [-]

Caveat: I hadn't thought all these things through as much at the time, so there are ideas I'm developing during this conversation.

So I'd see this world as maintaining the possibility for surprise and objections. They could have informed Grant of how they would drop info into him. Instead, they expected it to come up at some point (which it did), and, depending on his reaction at the time (which could vary with the manner it was brought up), they would change their interaction with him. This also gives people the possibility of manipulating the interaction with him, as the lady talking with him did. People still have choices, and those choices have consequences and are not all pre-ordained.

And I'm not arguing that the world will become perfectly enlightened and in agreement with my values (or the extended cluster of my values) by 2064, just that the AIs will have tools to achieve their goals by then. Religious ideologies change all the time, and economic power is one strong force driving such change (note how the truly stable religious communities only work by maintaining themselves at the Dunbar number). When it becomes exceedingly expensive not to be an upload, while the uploads run cognitive circles around you and boast of properly stable religious communities within them, the temptation to join will be large. And the possibility of anonymous forks can be offered then or later.

Comment author: The_Jaded_One 28 December 2016 12:01:40PM *  0 points [-]

Yeah maybe I am putting myself into Grant's shoes a bit too much. Modifying your own algorithm is a bit like messing with system files on Linux/Windows.

"What can possibly go wrong if I just chmod the System files to 777 so that I have full access to all of them?"

...

computer dies horribly
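(For concreteness, and purely as an illustration — the exact target here is my guess at what "the System files" means — the command being joked about is roughly:

    sudo chmod -R 777 /etc /usr /bin    # recursively give everyone read/write/execute on the system files

Programs that check permissions then stop cooperating; sudo, for instance, refuses to run at all once /etc/sudoers is world-writable, which is one of the faster routes to "computer dies horribly".)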

I suspect that most people who are in the rational-o-sphere would be super cautious too, but perhaps one could build Grant's presingularity life up a bit. Maybe he won the lottery and decided to outright buy cryo at an older age, for example? Maybe he doesn't have all that rational-o-sphere knowledge?

Comment author: Stuart_Armstrong 28 December 2016 09:34:10PM *  0 points [-]

So, this is non-canon, but I pictured Grant as black, partially self-taught, middle-manager career, some nerdish hobbies but many not, and overconfident in his own abilities. He chose cryonics because his overconfidence overrode his absurdity heuristic. But as I said, this is non-canon and subject to change if I ever flesh him out more.

Comment author: The_Jaded_One 28 December 2016 04:38:29PM 0 points [-]

needing to present all the ideas within a certain length, especially

btw is this self-imposed? Or just time constrained?

Comment author: Stuart_Armstrong 28 December 2016 09:28:25PM 0 points [-]

Readability for most people :-) it might be too long already.

Comment author: vakusdrake 28 December 2016 02:39:01PM 1 point [-]

See, we know the powers didn't just drop knowledge into Grant's mind because he'd be okay with it. The whole reason he found out about it was that someone else brought up the fact that the powers are presumably doing this to everybody by default. It just wouldn't make sense for them to bring it up unless it was the default.

Also, the powers clearly seem to care about human autonomy; otherwise they would just use super-persuasion to get everybody to agree to live the best possible human life.
Plus, how exactly is economics in a post-scarcity world going to stop religious fanaticism?

Comment author: Stuart_Armstrong 28 December 2016 09:27:38PM 0 points [-]

That is the default, because most people are OK with that :-)

The point about economics is that this is something that already undermines current religious cults; so there are tools, broadly within human autonomy, that can undermine this way of thinking.

On the question of autonomy, it seems they generally respect human autonomy, but do override this when they judge it necessary (e.g. resetting Grant). But when they do, they try to violate autonomy as little as they can. If they judge that self-reinforcing, unhappy, socially conservative communities, without forks, are detrimental to human flourishing, then they can subtly undermine them (until at least the possibility of forks is accepted). The question is whether the situation is sufficiently dire to require such interventions.

Comment author: vakusdrake 28 December 2016 12:34:34PM *  2 points [-]

Hmm, there are two problems I have with this utopia (though as a whole it's the best utopia I've ever seen conceived).

  • Firstly, the default is for the powers to just insert the truth into your mind without consent or even letting you know, which I imagine many people like myself would find pretty terrifying, even if it's done in a way that we ourselves might have willingly chosen. In fact, I imagine most non-transhumanists would have pretty strong preferences against self-modification and having their minds tampered with.

  • Secondly, I really doubt the portrayed efficacy of many of the safeguards against humans making things shitty. For instance, I really doubt many hardcore social conservatives would even want to get uploaded in the first place, and given their insular nature, external social pressure wouldn't do much to get them to change; if anything it would feed their persecution complex.
    A lot of the safeguards against what you call the culture trap kind of seem like they would only work on people who were already somewhat socially progressive.

I just don't think there's any solution that preserves human autonomy and still eliminates oppressive insular cultures. Plus, the more alien the outside world becomes, the easier it will be to demonize. Even if you're forcefully inserting the truth into their minds, they will just end up saying it's the voice of the devil or something to get people not to listen to it. The only solution is to forcefully change their beliefs, but that would mean dropping the idea of human autonomy entirely.

Comment author: Stuart_Armstrong 28 December 2016 02:16:00PM 0 points [-]

Good points. I'd imagine the Powers chose the "drop knowledge into Grant's mind" approach because he wouldn't object to it; they may have other means for other people.

As for the general point, I don't know, but the Powers can be exceedingly manipulative when they need to be, and respecting baseline human autonomy, even when made more rigorous, is not much of an obstacle for a superintelligence. Standard economic forces, without direction, have been very effective even in our world...

Comment author: ete 26 December 2016 05:19:11PM *  1 point [-]

I also know a good number of people from subcultures where nature is the foundational moral value, a few from ones where family structures are core (who'd likely be horrified by altering them at will), and some from the psychonaut space where mindstates and exploring them is the primary focus. I'd also guess that people for whom religions are central would find that the idea of forked selves committing things they consider sins breaks this utopia for them. These groups seem to have genuine value differences, and would not be okay with a future which does not carve out a space for them. "Adventure" and a bunch of specifics here point at a very wide region of funspace, but one centered around our culture to some extent.

There's some rich territory in the direction of people who want reality to be different in reasonable ways coming together to work out what to do. The suffering reduction vs nature preservation bundle seems the largest, but there's also the complex value vs raw qualia maximization split. Actually, this kinda fits into a 2x2?

Moral plurality is a moral necessity, and signalling it clearly seems crucial, since we'd be taking everyone else along for the ride.

Edit: This is touched on by characters exchanging values, and that seems very good.

Comment author: Stuart_Armstrong 28 December 2016 10:44:39AM 1 point [-]

If I were to impose this utopia on everyone, without negotiating with their values, it would go something like: everyone has the right to not have forks, everyone has the right to have forks, and everyone has the right to have their forks either "sin" or be sinless. And the Powers would act against groups that tried to make conditions on forks into conditions of belonging (they'd also discourage forks from acting together; if you create forks to sock-puppet how great you are, they will feel free to let that information leak).

From their perspective, this allows those people to consciously and fully knowledgeably live a sinless life, rather than being compelled to merely by social pressure.

Now, there's going to be value negotiations, but this system has already done quite a bit to accommodate multiple values.

Comment author: The_Jaded_One 28 December 2016 12:59:35AM *  5 points [-]

Great story, I read all of it.

What I liked: this seems to be one of the first serious attempts, in either fiction or academic writing, at dealing with how to enhance human minds without corrupting or destroying them, especially under recursive self-modification.

It is breaking new ground in that direction as far as I can tell.

I liked the way Boon broke down its problems logically, it was a good depiction of a bootstrapping intelligence.

A few criticisms:

  • Lots of tell where show would be better, though show would also be much longer

  • Some of Grant's actions seemed really dumb or odd, but maybe they would have made more sense if we knew a bit more about who he is and what his previous life experience was.

  • Waking up from cryosleep and playing chase the goblin seems really odd to me. Maybe it's just me but damn I would want to savor that moment.

  • Grant insisting on modifying his own mind RIGHT NOW WITH NO SAFEGUARDS BECAUSE HELL IT'S MY MIND BITCH seems incredibly stupid; I want to see some explanation and backstory to justify how someone could be rational enough to sign up for cryo but dumb enough to ask for that.

  • Some things that were supposed to be amazing and fantastic seemed weird and icky, the weird orgy thing for example. I have had a three-way sexual experience and it was absolutely magical, but the writing about the orgy made me want to be sick. I think it may partly be a show-vs-tell problem, and partly that one can just do a lot better in terms of the scenario. Capturing a single human sexual experience between two people is hard work; tbh I wouldn't know how to do it (I lack writing talent).

Overall, fantastic, amazing, please do more and please teach me how to write!

Comment author: Stuart_Armstrong 28 December 2016 10:32:55AM 1 point [-]

Thanks for the compliments and the constructive criticisms! As you can tell, some of the problems are imposed by the structure of the story (needing to present all the ideas within a certain length, especially). If I write further stories set in this world, I'll try and address your points.

One minor counter: I think Grant's behaviour with self modification is actually sensible, seen from his own perspective. He can't trust that others won't overwrite key parts of him, and his very first self-modification action is to cautiously modify himself so he doesn't foolishly modify himself.

I also suspect his granddaughter was a bit manipulative there, giving him full control in a way that encouraged destructive modification. She could have given him a self-modification format with more training wheels. Instead, she chose to give him what he asked for, not what he wanted.

Comment author: NancyLebovitz 26 December 2016 02:44:13PM 0 points [-]

Thanks-- I had fun reading it, and it's definitely more exuberant than most utopias.

I wasn't crazy about the sandworm challenge for getting to be politically influential-- wouldn't it make more sense to work one's way up by being influential in smaller groups?

Probably too much for this story, but there'd also be basic research going on and changing things.

Comment author: Stuart_Armstrong 26 December 2016 03:07:32PM 0 points [-]

In story: the challenge is to join the City. Becoming influential afterwards is another task.

Out of story: the point of the sandworms is to suggest the breadth of possibilities going on in this world.
