Armok_GoB comments on How to Become a 1000 Year Old Vampire - Less Wrong

55 [deleted] 02 October 2013 05:07AM


Comment author: Armok_GoB 03 October 2013 12:37:07AM 13 points [-]

This article makes some great points; however, I think you are other-optimizing. Specifically, these seem more like techniques for Unlocking Massive Latent Potential (which most people don't have), or for curing lazy or spoiled but already awesome people. That's very much worth writing an article about, since those are probably where most potential rationalists will come from, but it's not the same as a universal formula for awesomeness.

That wouldn't be a problem - social/environmental stimulation and diversity of experience are good for you even if they don't turn you into a badass. However, many of the techniques are dangerous if tried by a median human: getting rid of de-stressing activities and entertainment, or taking on more responsibilities than you can handle, can burn you out; overloading yourself with hard things risks outright trauma and injury; and quitting your job could leave someone in permanent financial ruin and unemployment.

What I suspect has happened here is the same kind of selection effect as in books on how to get rich written by extremely rich people: just because almost all members of desirable group X did Y doesn't mean doing Y is a good idea; you never hear about the many more people who did Y, failed, and ended up in a much worse position than if they had just stuck with the status quo. Being a member of an elite doesn't just select for strategy; it also selects heavily for talent and luck, and different strategies may be optimal depending on how much talent and luck you have.

Comment author: [deleted] 03 October 2013 01:50:10AM *  12 points [-]

Ha, it's only very recently that people have started accusing me of Latent Destiny.

I've been thinking about this, and I think that most people who have Latent Destiny do not believe that they do, and never achieve greatness because of that. People like that need to know that it's possible, and need some inspiration, like the OP.

As for everyone else, humans are pretty robust and will at least exercise basic judgement before doing stupid life-wrecking things based on a bad reading of an internet article written by someone called Nyan Sandwich. And if they don't exercise such judgement, someone else's dangerous advice would get them if mine did not.

But that's beside the point, because losing a few medians (perfect word for them, thank you) is an easy price to pay for another hero. If dangerous advice is necessary to create heroes, such dangerous advice is good, because we need more heroes, and we are at a point in history where the instrumental value of people dominates the intrinsic.

The selection effect comment is interesting; there's probably some truth to it. I'll think about that.

You are almost certainly right about this not working for everyone. Still, applied well it could produce gains in most.

Comment author: TheOtherDave 03 October 2013 02:20:35AM 6 points [-]

we are at a point in history where the instrumental value of people dominates the intrinsic.

On your view, has this ever been false? If we can justify treating people as means rather than ends now, I can't imagine a time when we couldn't justify it.

Comment author: [deleted] 03 October 2013 05:05:14AM 3 points [-]

Yes, in the grand scheme of things, the overwhelming value of historically existing humans is in enabling our glorious future, and this has always been true so far.

(Not to understate the awesomeness produced so far, just that so much more is possible)

At least, that's, like, my opinion, man.

Comment author: TheOtherDave 03 October 2013 02:35:54PM 3 points [-]

OK, cool.

And sure, I was just making sure I understood your position; your emphasis on the point in history we're at left me uncertain whether there was some other point where it wasn't true.

I now feel compelled to add that, to the extent that I expect you to act accordingly, I will avoid letting you have power over anyone I care about, as I distrust the willingness to sacrifice existing people in order to enable a vision of our glorious future. That sort of thing has an iffy hit rate with humans; we tend to be overconfident about the specific details of our visions of glorious futures.

Comment author: Armok_GoB 03 October 2013 10:23:34PM 2 points [-]

Oh. Yeah, that's obvious in hindsight. Looks like I got so carried away making a clever point that I forgot to be consequentialist. Again.

Comment author: katydee 03 October 2013 03:26:45AM *  2 points [-]

I think it might make more sense to just believe that Latent Destiny isn't real.

If dangerous advice is necessary to create heroes, such dangerous advice is good, because we need more heroes, and we are at a point in history where the instrumental value of people dominates the intrinsic.

This seems very dubious.

Comment author: [deleted] 04 October 2013 01:41:29AM 1 point [-]

It seems pretty clearly real to me. (Where "Latent Destiny" is some set of characteristics observable in advance of greatness that are necessary for greatness, and not widely seen outside of great people) Can you expand on why you think it is not real?

Or do you just mean that it's best to act as if it's not real? If so, that implies that the optimal decision for people without Latent Destiny is the same as for those with it. Why would that be?

Comment author: katydee 04 October 2013 04:58:52AM 0 points [-]

It isn't clear to me that we're at a point in history where heroes are unusually needed, so I'm not particularly in favor of prescribing dangerous advice to people in the hopes of producing more heroes.

As for "Latent Destiny," it seems acquired to me.

Comment author: shminux 03 October 2013 10:40:22PM 0 points [-]

we are at a point in history where the instrumental value of people dominates the intrinsic.

Is this because of the (presumed) looming AI x-risk or something?

Comment author: [deleted] 03 October 2013 11:17:08PM 2 points [-]

Yes, there's that. There's also the positive side; we have not won yet.

(We've won when the universe has been torn apart and rebuilt for our benefit, or a powerful and reliable process that doesn't need human help is doing so. At that point we can focus on living valuable lives as people. For now we have to focus on being agents; steering the future towards that win.)

Comment author: Moss_Piglet 03 October 2013 11:47:20PM 2 points [-]

(We've won when the universe has been torn apart and rebuilt for our benefit, or a powerful and reliable process that doesn't need human help is doing so. At that point we can focus on living valuable lives as people. For now we have to focus on being agents; steering the future towards that win.)

Whenever I read people talking about post-singularity utopias it makes me really glad I'll probably die before they can be brought to fruition.

Can you actually imagine life where innate talent and learned skill had no value outside of social posturing or self-gratification? Where a Godlike machine secretly planned out your life according to some formula of ideal living? Where no-one suffered or died at all unless they chose to and real power is essentially non-existent? Where you'll be, in all likelihood, locked into a computer simulation while the real world is being torn apart for materials to increase your jailer's intelligence and extend its lifespan?

Even Hanson's cockroach-topia sounds better than that.

Comment author: [deleted] 04 October 2013 12:49:21AM *  4 points [-]

Can you actually imagine life where innate talent and learned skill had no value outside of social posturing or self-gratification? Where a Godlike machine secretly planned out your life according to some formula of ideal living? Where no-one suffered or died at all unless they chose to and real power is essentially non-existent? Where you'll be, in all likelihood, locked into a computer simulation while the real world is being torn apart for materials to increase your jailer's intelligence and extend its lifespan?

In case you are simply making a mistake: If we win, it is hardly likely that we would build something that sucked that much. The scenario you describe is not winning, and not what anyone actually wants.

How about a world where random things that you could do nothing about did not just kill you. Where hard work and learning actually could push you as far as you wanted to go. etc.

Read the Fun Theory Sequence

Comment author: Moss_Piglet 04 October 2013 04:11:12PM *  -2 points [-]

Read the Fun Theory Sequence

I did that several years before I joined, actually. Although obviously we've taken away different things from it.

How about a world where random things that you could do nothing about did not just kill you. Where hard work and learning actually could push you as far as you wanted to go. etc.

That's actually the problem, though; the 'unfairness' of chance destruction and the hard limits which no individual can transcend in a lifetime are exactly the sorts of things which I find valuable. In my view, "[h]ard work pushing you" is not a choice any more than there's a choice between eating and starvation; those who strive will extend their will into the future while those who don't will be annihilated and forgotten. If there are safeties to turn off, a door to get out of, or a 100%-completion "Golden Ending" available, then it's not life but a game. I'd like to see humanity grow up and stop playing around with games, on a metaphorical level anyway.

I know that this is not a mainstream view and that few would choose to live in my personal utopia, but on the flip side why should I change my aesthetics just because they'd lose in a headcount?

Comment author: Desrtopa 08 October 2013 05:10:17PM 5 points [-]

those who strive will extend their will into the future while those who don't will be annihilated and forgotten.

Those who strive can also be annihilated and forgotten, and those who don't can extend their will into the future given sufficiently fortuitous starting conditions.

If that's also part of a utopia you'd desire, so be it, but there are no points for pretending the chance destruction of our reality is fairer than it is.

Comment author: hairyfigment 04 October 2013 05:00:05PM 0 points [-]

Because your aesthetics are evil, possibly self-contradictory as a result, and sound ill-informed. (Picture from here, if the first link doesn't work).

Comment author: Nisan 04 October 2013 08:12:05PM 8 points [-]

Trigger warning: The parent comment links to a photograph of a screaming child covered in blood.

Comment author: Moss_Piglet 04 October 2013 05:18:02PM *  4 points [-]

Thanks for the pic, but I'm curious as to why you thought an 'evil' person would be bothered by it.

Edit: Actually, while I've got you here, can you let me know where I've been contradictory? I'd like to fix that sort of thing going forward.

Comment author: hairyfigment 04 October 2013 07:13:49PM 6 points [-]

The part that puzzles me most is the way you admit the existence of "chance destruction" before claiming "those who strive will extend their will into the future". In the real world, it looks like 5.5 million children die annually between the ages of 4 weeks and 5 years. I picked that range because I'd expect many of them to try forming models of the world and optimizing it as best they can, before the world unceremoniously deletes them.

Comment author: hairyfigment 04 October 2013 05:21:59PM 2 points [-]

Like I said, I don't think you're evil. I think you espoused an evil position or preference, which may well contradict other and more real preferences. I think you're looking at a ridiculously small sliver of the real world and using that to conclude that the status quo is fine. (Or you're pattern-matching wildly, which has now grown in probability.)

Comment author: [deleted] 04 October 2013 11:19:17PM 0 points [-]

See Yvain on how certain 'dystopias' only look like dystopias from a First World person's perspective.

Comment author: hairyfigment 04 October 2013 05:14:05PM 4 points [-]

For the down-voters:

It's scary to believe your leaders may secretly be, uh, not so sad if you die. But all you have to do is listen to them, and they'll tell you.

Can we change this? Maybe. But the first step in changing reality is facing it, no matter how ugly and frightening it is.

Comment author: pragmatist 08 October 2013 04:23:12PM 0 points [-]

Consider incorporating Nisan's trigger warning into your post.

Comment author: lmm 13 October 2013 09:42:00AM 2 points [-]

Can you actually imagine life where innate talent and learned skill had no value outside of social posturing or self-gratification?

Pretty sure I'm there already.

Where a Godlike machine secretly planned out your life according to some formula of ideal living?

Sounds comforting.

Where no-one suffered or died at all unless they chose to

Yes, a thousand times yes.

and real power is essentially non-existent?

Hmm. If I could only gain happiness from having power over others, I would be forced to consider myself evil. That aside, in these post-singularity utopias I could have an arrangement with other people where we took turns, or gambled. Or I could lord it over a nation of entities that looked and acted like humans but weren't really.

Is that a better deal for me personally? Depends on whether I'm powerful at the moment. But it's certainly an improvement for people on average.

Where you'll be, in all likelihood, locked into a computer simulation while the real world is being torn apart for materials to increase your jailer's intelligence and extend its lifespan?

Agreed that this is bad. Post-singularity utopian AI should not do this.

Comment author: TheOtherDave 04 October 2013 01:41:03AM *  2 points [-]

Can you state clearly what's wrong with the world you describe... that is, what you would prefer instead?
If not, do you think someone much smarter than you might be able to state it clearly?
Either way... why not assume a world like that, instead?

For my own part, the things you describe don't sound particularly bad, weighted language notwithstanding.
They're certainly better than what we currently live with.

Comment author: MugaSofer 13 October 2013 03:46:08PM 1 point [-]

Can you actually imagine life where innate talent and learned skill had no value outside of social posturing or self-gratification?

... as opposed to ... helping other people?

Where a Godlike machine secretly planned out your life according to some formula of ideal living?

I'm having trouble seeing the downsides to that. Oh, is "some formula" supposed to imply it's arbitrary and false?

Where no-one suffered or died at all unless they chose to and real power is essentially non-existent?

I don't even know what you mean by this one.

Where you'll be, in all likelihood, locked into a computer simulation while the real world is being torn apart for materials to increase your jailer's intelligence and extend its lifespan?

"Locked"?

But, to be fair, plenty of people don't like the idea of living in a simulation. Probably didn't play enough videogames as kids : P

... but seriously, folks, if this turns out to be an issue, you don't have to keep posthumanity in simulations. Many people assume we won't. It's just easier to simulate awesome things than build them, that's all.

Even Hanson's cockroach-topia sounds better than that.

OK, that's a really excellent name for it. Upvoting just for that.

Comment author: Dorikka 04 October 2013 01:01:32AM 1 point [-]

Hire a different world-builder, then. :)

Comment author: shminux 04 October 2013 02:27:18AM -1 points [-]

Hmm, by "we" I assume that you mean "transhumansts"? Or some subset of them?

We've won when the universe has been torn apart and rebuilt for our benefit

I am in general dubious about "post-singularity utopias", as Moss_Piglet put it. I am all for rebuilding the universe, once "we" know what "we" are doing, but I am skeptical that it will, say, put an end to all suffering, or achieve some similarly sweeping goals. It just pattern matches too closely to every end-of-the-world myth ever. But that's a different discussion, maybe during some meetup.

Comment author: [deleted] 04 October 2013 02:44:59AM *  4 points [-]

Hmm, by "we" I assume that you mean "transhumansts"? Or some subset of them?

"We" is an elaborate ruse to sugar coat the fact that I really mean "I".

I am in general dubious about "post-singularity utopias", as Moss_Piglet put it.

It is annoying when people pose obviously bad solutions and then use that to argue against trying to solve anything. You may not mean that, but it would help the rest of us if serious critics would carefully distinguish themselves from crackpots.

I am skeptical that it will, say, put an end to all suffering, or achieve some similarly sweeping goals.

One of our meetups should be a workshop on the limits and potential of post-singularity superintelligence, so that we can work out ambiguities on whether such things are possible.

It just pattern matches too closely to every end-of-the-world myth ever.

That is of course totally concerning. I don't know what to do about it.

But that's a different discussion, maybe during some meetup.

There's a meetup every weekend, you know. We've been having good discussions recently.