Eliezer_Yudkowsky comments on Raising the Sanity Waterline - Less Wrong

112 Post author: Eliezer_Yudkowsky 12 March 2009 04:28AM


Comment author: RobinHanson 12 March 2009 01:04:15PM 10 points

I suspect you are right; the issue isn't that these people haven't "learned" the relevant abstractions or tools. They just don't have enough incentive to apply those tools in these contexts. I'm not sure you can "teach" incentives, so I'm not sure there is anything you can teach that will achieve the stated goal. So I'd ask the question: how can we give people incentives to apply their tools to cases like religion?

Comment author: Eliezer_Yudkowsky 12 March 2009 06:49:40PM 6 points

I think there's a question of understanding here, not just incentives. The knowledge of minds as cognitive engines, or of the principle of the bottom line, is the knowledge that, in full generality, you can't draw an accurate map of a city without seeing it or having some other kind of causal interaction with it. This is one of the things readers have cited as the most important lesson they learned from my writing on OB. And it's the difference between being told an equation in school to use on a particular test, versus knowing under what (extremely general) real-world conditions you can derive it.

Like the difference between being told that gravitational acceleration is 9.8 m/s^2 and being able to use that to answer written test questions about gravity, or maybe even predict the fall of clocks off a tower, but never thinking to apply it to anything except gravity. Versus being able to do and visualize the two steps of integral calculus that take you from constant acceleration A to 1/2 A t^2, which is much more general than gravity.
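The two integration steps mentioned above can be sketched numerically; this is an illustrative Python fragment, not part of the original comment, and the function names are made up for the example:

```python
# Two integrations of constant acceleration A (starting from rest):
#   dv/dt = A      ->  v(t) = A * t
#   dx/dt = v(t)   ->  x(t) = (1/2) * A * t**2

def velocity(A, t):
    """First integral: constant acceleration gives linear velocity."""
    return A * t

def position(A, t):
    """Second integral: linear velocity gives quadratic position."""
    return 0.5 * A * t ** 2

# Gravity is just one instance of the general rule, with A = 9.8 m/s^2:
print(velocity(9.8, 2.0))  # speed after 2 s of free fall, in m/s
print(position(9.8, 2.0))  # distance fallen after 2 s, in m
```

The derivation never mentions gravity at all; plugging in 9.8 is the last, least interesting step.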

If you knew on a gut level - as knowledge - that you couldn't draw a map of a city without looking at it, I think the issue of incentives would be a lot mooter. There might still be incentive questions about whether or not to communicate that understanding, whether or not to talk to others about it, and so on; but on a gut level, you yourself would just know.

Comment author: pjeby 12 March 2009 07:42:36PM 7 points

Even if you "just know", this doesn't grant you the ability to perform an instantaneous search-and-replace on the entire contents of your own brain.

Think of the difference between copying code and function invocation. If the function is defined in one place and then reused, you can make one change and get a multitude of benefits from doing so.

However, this relies on the original programmer having recognized the pattern, and then consistently using a single abstraction throughout the code. But in practice, we usually learn variations on a theme before we learn the theme itself, and don't always connect all our variations.
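The contrast between copied code and a shared abstraction can be sketched like this (a minimal illustration; the tax example and all names are hypothetical, not from the original comment):

```python
# Copied code: the same pattern written out twice. Changing the rate
# means finding and editing every copy -- a search-and-replace.
def tax_on_book(price):
    return price + price * 0.08   # rate duplicated here...

def tax_on_toy(price):
    return price + price * 0.08   # ...and duplicated here

# Function invocation: the pattern recognized and defined once.
# One change to TAX_RATE updates every caller automatically.
TAX_RATE = 0.08

def with_tax(price):
    return price + price * TAX_RATE

print(with_tax(10.0))  # 10.8
```

The catch, as the paragraph above notes, is that this only works if the pattern was recognized and factored out *before* the copies were made.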

And this limitation applies equally to our declarative and procedural memories. If there's no shared abstraction in use, you have to search-and-replace... and the brain doesn't have many "indexes" to do the searching with -- you're usually limited to searching by sensory information (which, fortunately, can include emotional responses) or by existing abstractions. ("Off-index" or "table scan" searches are slow and unlikely to be complete anyway -- think of trying to do a search-and-replace on uses of the "visitor" pattern, where each application has different method names, none of which include "visit" or use "Visitor" in a class name!)
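The visitor-pattern point can be made concrete with a sketch (hypothetical class and method names, chosen so that no identifier contains "visit"):

```python
# Two independent applications of the visitor pattern. Neither uses
# "visit" or "Visitor" in any identifier, so a textual search for the
# pattern's usual vocabulary finds nothing -- you would have to
# recognize the shape (double dispatch) by reading each one.

class Circle:
    def accept_renderer(self, renderer):      # app 1's dispatch hook
        return renderer.handle_circle(self)

class ScreenRenderer:
    def handle_circle(self, shape):
        return "drew a circle"

class Invoice:
    def run_export(self, exporter):           # app 2's dispatch hook
        return exporter.emit_invoice(self)

class CsvExporter:
    def emit_invoice(self, doc):
        return "invoice,csv"

print(Circle().accept_renderer(ScreenRenderer()))
print(Invoice().run_export(CsvExporter()))
```

Both snippets are the "same" pattern, but nothing lexical connects them -- which is the analogy for variations learned before the theme.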

It seems to me that your and Robin's view of minds still contains some notion of a "decider" -- some part of you that can just look, see that something's wrong, and then refuse to execute that wrongness.

But if mind is just a self-modifying program, then not only are we subject to getting things wrong, we're also subject to recording that wrongness, and perpetuating it in a variety of ways... recapitulating the hardware wrongs on a software level, in other words.

And so, while you seem to be saying, "if people were better programmers, they'd write better code"... it seems to me you're leaving out the part where becoming a better programmer has NO effect...

On all the code you've already written.