ialdabaoth comments on How can I reduce existential risk from AI? - Less Wrong

46 Post author: lukeprog 13 November 2012 09:56PM


Comment author: ialdabaoth 11 November 2012 11:41:20PM 2 points [-]

Regarding "making money" / "accumulating wealth": Why is wealth in my hands preferable to wealth in someone else's hands?

Comment author: Pablo_Stafforini 11 November 2012 11:54:48PM 14 points [-]

Because it's extremely unlikely that a random person will be at least as concerned with existential risk as you are.

Comment author: ialdabaoth 12 November 2012 01:07:21AM 5 points [-]

But why is it likely that I'll be better at doing anything about it? Just because I try to be rational doesn't mean I'm any good at it - especially at something where we have no idea what the correct actions are. How do I know that my efforts will even have a positive dot-product with the "do the right thing" vector?

Comment author: Kaj_Sotala 12 November 2012 10:31:05AM 7 points [-]

The average person has zero interest in fighting existential risk. It's very easy to do better than average, if the average is zero. Even if you've only spent fifty hours (say) familiarizing yourself with the topic, that's already much better than most.

Comment author: [deleted] 12 November 2012 11:32:02AM *  1 point [-]

The average person has zero interest in fighting existential risk.

This strikes me as, ahem, an inflationary use of the term zero. Try negligible instead. :-)

EDIT: Turns out it was an inflationary use of the term average instead. :-) Retracted.

Comment author: Kaj_Sotala 12 November 2012 12:16:27PM 1 point [-]

Well, if we measure interest by what they actually do, then I hold by the "zero".

Comment author: [deleted] 12 November 2012 12:57:30PM *  5 points [-]

EY's interest in fighting existential risks is strictly greater than 0 as far as I can tell; is someone else cancelling that out in the average? (Or by average did you actually mean 'median'?) The number of arms the average human has is strictly less than 2.

Comment author: Kaj_Sotala 12 November 2012 02:18:26PM *  5 points [-]

I meant median.

Comment author: Giles 12 November 2012 01:57:10AM 0 points [-]

Do you feel the odds improve if you choose "become stronger" first?

Comment author: ialdabaoth 12 November 2012 02:31:51AM 2 points [-]

No, because I am perpetually trying to "become stronger", and I still have yet to solve the problem that I'm just "trying" (due to an infinite regress of "trying to try to try to try...").

Comment author: Giles 12 November 2012 02:52:38AM 1 point [-]

What kinds of approach have you tried, and which have resulted in any progress? (at least according to their own metric - e.g. you make progress in "becoming better at math" if you become better at math, even if you're not sure this has improved your general rationality)

Comment author: ialdabaoth 12 November 2012 04:26:44AM 1 point [-]

Well, I've managed to disabuse myself of most cognitive biases clustered around the self-serving bias, by deliberately training myself to second-guess anything that would make me feel comfortable or feel good about myself. This one worked pretty well; I tend to be much less susceptible to 'delusional optimism' than most people, and I'm generally the voice of rational dissent in any group I belong to.

I've tried to make myself more 'programmable' - developed techniques for quickly adjusting my "gut feelings" / subconscious heuristics towards whatever ends I anticipated would be useful. This worked out really well in my 20's, but after a while I wound up introducing a lot of bugs into it. As it turns out, when you start using meditation and occult techniques to hack your brain, it's really easy to lock yourself into a stable attractor where you no longer have the capacity to perform further useful programming. Oops.

Comment author: Giles 12 November 2012 05:24:29AM 1 point [-]

I'm not sure if this advice will be useful to you, but I think what I'd do in this situation would be to stick to standard techniques and avoid the brain hacking, at least for now. "Standard techniques" might be things like learning particular skills, or asking your friends to be on the lookout for weird behavior from you.

One other thing - though I may have misunderstood you here - you say you've trained yourself not to feel good about yourself in certain circumstances. To compensate, have you trained yourself to feel better about yourself in other circumstances? I'd guess there's an optimal overall level of feeling good about yourself, and our natural overall level is probably about right (but not necessarily right about the specifics of what to feel good about).

Comment author: ialdabaoth 12 November 2012 05:48:05AM *  3 points [-]

No, in general I've trained myself to operate, as much as possible, with an incredibly lean dopamine mixture. To hell with feeling good; I want to be able to push on no matter how bad I feel.

(As it turns out, I have limits, but I've mostly trained myself to push through those limits with shame and willpower rather than with reward mechanisms, to the point that reward mechanisms generally don't even really work on me anymore - at least, not to the level other people expect them to.)

A lot of this was a direct decision, at a very young age, to never exploit primate dominance rituals or competitive zero-sum exchanges to get ahead. It's been horrific, but... the best metaphor I can give is from a story called "The Ones Who Walk Away From Omelas".

Essentially, you have a utopia that is powered by the horrific torture and suffering of a single innocent child. At a certain age, everyone in the culture has it explained to them how the utopia works, and is given two choices: commit fully to making the utopia worth the cost of that kid's suffering, or walk away from utopia and brave the harsh world outside.

I tried to take a third path, and say "fuck it. Let the kid go and strap me in."

So in a sense, I suppose I tried to replace normal feel-good routines with a sort of smug moral superiority, but then I trained myself to see my own behavior as smug moral superiority so I wouldn't feel good about it. So, yeah.

Comment author: Giles 12 November 2012 06:13:38AM 4 points [-]

Are you sure this is optimal? You seem to have goals, but you've thrown away three potentially useful tools: reward mechanisms, primate dominance rituals, and zero-sum competitions. Obviously you've gained grit.

Comment author: Strange7 13 November 2012 10:10:36PM 1 point [-]

Problem being that Omelas doesn't just require that /somebody/ be suffering; if it did, they'd probably take turns or something. It's something specific about that one kid.

Comment author: roystgnr 13 November 2012 11:16:23PM 1 point [-]

Even if benthamite's excellent answer wasn't true, and every random person was at least as concerned with existential risk as you are, it would still be useful for you to accumulate wealth. The economy is not a zero-sum game; producing wealth for yourself does not necessitate reducing the wealth in the hands of others.

Comment author: Benja 11 November 2012 11:50:10PM 1 point [-]

It isn't... if those other hands would give at least as much of it to x-risk reduction as you will.