jimrandomh comments on Siren worlds and the perils of over-optimised search - Less Wrong

Post author: Stuart_Armstrong 07 April 2014 11:00AM


Comment author: jimrandomh 14 May 2014 03:20:05PM 1 point [-]

What are the prerequisites for grasping the truth when it comes to AI risks?

Ability to program is probably not sufficient, but it is definitely necessary. But not because of domain relevance; it's necessary because programming teaches cognitive skills that you can't get any other way, by presenting a tight feedback loop where every time you get confused, or merge concepts that needed to be distinct, or try to wield a concept without fully sharpening your understanding of it first, the mistake quickly gets thrown in your face.
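The feedback loop described here can be made concrete with a toy sketch (my own illustration, not from the thread): two concepts that look interchangeable, a price and a string of digits, get merged, and the mistake is thrown in your face at the first use instead of propagating silently.

```python
# Toy illustration (hypothetical example): conflating a number with a
# string of digits fails at the point of use, not somewhere downstream.
def total(prices):
    return sum(prices)

def try_total(prices):
    try:
        return total(prices)
    except TypeError:
        # The merged-concepts mistake surfaces immediately as a type error.
        return None

ok = try_total([1.50, 2.25, 3.00])        # concepts kept distinct: works
confused = try_total([1.50, "2.25", 3.00])  # "2.25" looks like a price but isn't one
```

In a domain without that loop, the equivalent confusion could sit unnoticed indefinitely; here it costs seconds.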

And, well... it's pretty clear from your writing that you haven't mastered this yet, and that you aren't going to become less confused without stepping sideways and mastering the basics first.

Comment author: [deleted] 14 May 2014 06:02:05PM 1 point [-]

it's necessary because programming teaches cognitive skills that you can't get any other way, by presenting a tight feedback loop where every time you get confused, or merge concepts that needed to be distinct, or try to wield a concept without fully sharpening your understanding of it first, the mistake quickly gets thrown in your face.

On a complete sidenote, this is a lot of why programming is fun. I've also found that learning the Coq theorem-prover has exactly the same effect, to the point that studying Coq has become one of the things I do to relax.
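The same tight loop shows up in a one-line proof. (Lean syntax here as an illustration of my own; Coq behaves the same way.)

```lean
-- Each step either type-checks or is rejected on the spot:
-- a wrong lemma name or a fuzzy statement fails immediately.
example (a b : Nat) : a + b = b + a := Nat.add_comm a b
```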

Comment author: Lumifer 14 May 2014 03:32:36PM 1 point [-]

programming teaches cognitive skills that you can't get any other way

That looks highly doubtful to me.

Comment author: trist 14 May 2014 04:54:10PM -1 points [-]

You mean that most cognitive skills can be taught in multiple ways, and you don't see why those taught by programming are any different? Or do you have a specific skill taught by programming in mind, and think there are other ways to learn it?

Comment author: Lumifer 14 May 2014 05:06:50PM 3 points [-]

There are a whole bunch of considerations.

First, meta. It should be suspicious to see programmers claiming to possess special cognitive skills that only they can have -- it's basically a "high priesthood" claim. Besides, programming became widespread only about 30 years ago. So, which cognitive skills were very rare until that time?

Second, "presenting a tight feedback loop where ... the mistake quickly gets thrown in your face" isn't a unique-to-programming situation by any means.

Third, most cognitive skills are fairly diffuse and cross-linked. Which specific cognitive skills can't you get any way other than through programming?

I suspect that what the OP meant was "My programmer friends are generally smarter than my non-programmer friends" which is, um, a different claim :-/

Comment author: Nornagest 14 May 2014 05:29:20PM 5 points [-]

I don't think programming is the only way to build... let's call it "reductionist humility". Nor even necessarily the most reliable; non-software engineers probably have intuitions at least as good, for example, to say nothing of people like research-level physicists. I do think it's the fastest, cheapest, and currently most common, thanks to tight feedback loops and a low barrier to entry.

On the other hand, most programmers -- and other types of engineers -- compartmentalize this sort of humility. There might even be something about the field that encourages compartmentalization, or attracts people who are already good at it; engineers are disproportionately likely to be religious fundamentalists, for example. Since that's not sufficient to meet the demands of AGI problems, we probably shouldn't be patting ourselves on the back too much here.

Comment author: Lumifer 14 May 2014 05:58:10PM 0 points [-]

Can you expand on how you understand "reductionist humility", in particular as a cognitive skill?

Comment author: Nornagest 14 May 2014 06:33:58PM 4 points [-]

I might summarize it as an intuitive understanding that there is no magic, no anthropomorphism, in what you're building; that any problems are entirely due to flaws in your specification or your model. I'm describing it in terms of humility because the hard part, in practice, seems to be internalizing the idea that you and not some external malicious agency are responsible for failures.

This is hard to cultivate directly, and programmers usually get partway there by adopting a semi-mechanistic conception of agency that can apply to the things they're working on: the component knows about this, talks to that, has such-and-such a purpose in life. But I don't see it much at all outside of scientists and engineers.

Comment author: [deleted] 14 May 2014 06:44:02PM 1 point [-]

IOW, realizing that the reason you get fat if you eat a lot is not that you've pissed off God and he's taking revenge, as certain people appear to alieve.

Comment author: Lumifer 14 May 2014 07:01:16PM 0 points [-]

internalizing the idea that you and not some external malicious agency are responsible for failures.

So it's basically responsibility?

...that any problems are entirely due to flaws in your specification or your model.

Clearly you've never had to chase bugs through third-party libraries... :-) But yes, I understand what you mean, though I am not sure in what way this is a cognitive skill. I'd probably call it an attitude common to professions in which randomness or external factors don't play a major role -- sure, programming and engineering are prominent here.

Comment author: Nornagest 14 May 2014 07:23:32PM *  0 points [-]

So it's basically responsibility?

You could describe it as a particular type of responsibility, but that feels noncentral to me.

Clearly you never had to chase bugs through third-party libraries...

Heh. A lot of my current job has to do with hacking OpenSSL, actually, which is by no means a bug-free library. But that's part of what I was trying to get at by including the bit about models -- and in disciplines like physics, of course, there's nothing but third-party content.

I don't see attitudes and cognitive skills as being all that well differentiated.

Comment author: TheAncientGeek 14 May 2014 07:33:24PM -1 points [-]

But randomness and external factors do predominate in almost everything. For that reason, applying programming skills to other domains is almost certain to be suboptimal.

Comment author: Lumifer 14 May 2014 07:36:07PM 1 point [-]

But randomness and external factors do predominate in almost everything.

I don't think so, otherwise walking out of your door each morning would start a wild adventure and attempting to drive a vehicle would be an act of utter madness.

Comment author: TheAncientGeek 14 May 2014 05:54:14PM *  0 points [-]

Much of the writing on this site is philosophy, and people with a technology background tend not to grok philosophy, because they are accustomed to answers that can be looked up or figured out by known methods. If they could keep the logic chops and lose the impatience, they might make good philosophers, but they tend not to.

Comment author: Nornagest 14 May 2014 05:58:54PM 0 points [-]

If they could keep the logic chops and lose the impatience, they child,might be,are good philosophers, but they tend not to.

Beg pardon?

Comment author: [deleted] 14 May 2014 03:21:01PM -1 points [-]

And, well... it's pretty clear from your writing that you haven't mastered this yet, and that you aren't going to become less confused without stepping sideways and mastering the basics first.

People have been telling him this for years. I doubt it will get much better.