anonym comments on A Nightmare for Eliezer - Less Wrong

0 Post author: Madbadger 29 November 2009 12:50AM




Comment author: anonym 29 November 2009 07:00:15AM 2 points [-]

I see your point, but I don't think either of those is (or should be) embarrassing. Higher-level aspects of intelligence, such as capacity for abstraction and analogy, creativity, etc., are far more important, and we have no known peers with respect to those capacities.

The truly embarrassing things to me are things like paying almost no attention to global existential risks, having billions of our fellow human beings live in poverty and die early from preventable causes, and our profound irrationality as shown in the heuristics and biases literature. Those are (i.e., should be) more embarrassing limitations, not only because they are more consequential but because we accept and sustain those things in a way that we don't with respect to WM size and limitations of that sort.

Comment author: DanArmak 29 November 2009 05:41:59PM 3 points [-]

Higher-level aspects of intelligence, such as capacity for abstraction and analogy, creativity, etc., are far more important, and we have no known peers with respect to those capacities.

What do you think of the suggestion that you feel they are more important in part because humans have no peers there?

Comment author: anonym 30 November 2009 12:14:03AM 1 point [-]

That's an astute question. I think I almost certainly do value those things more than I otherwise would if we did have peers. Having said that, I believe that even if we did have peers with respect to those abilities, I would still think that, for example, abstraction is more important, because I think it is a central aspect of the only general intelligence we know in a way that WM is not. There may be other types of thought that are more important, and more central, to a type of general intelligence that is beyond ours, but I don't know what they are, so I consider the central aspects of the most general intelligence I know of to be the most important for now.

Comment author: DanArmak 30 November 2009 12:41:38AM 0 points [-]

abstraction is more important, because I think it is a central aspect of the only general intelligence we know in a way that WM is not.

In what way is that? I don't see why abstraction should be considered more important to our intelligence than WM. Our intelligence can't go on working without WM, can it?

Comment author: anonym 30 November 2009 01:32:10AM 0 points [-]

I can imagine life evolving and general intelligence emerging without anything much like our WM, but I can't imagine general intelligence arising without something a lot like (at least) our capacity for abstraction. This may be a failure of imagination on my part, but WM seems like a very powerful and useful way of designing an intelligence, while abstraction seems much closer to a precondition for intelligence.

Can you conceive of a general intelligence that has no capacity for abstraction? And do you not find it possible (even if difficult) to think of general intelligence that doesn't use a WM?

Comment author: wedrifid 30 November 2009 06:09:25PM 0 points [-]

Can you conceive of a general intelligence that has no capacity for abstraction? And do you not find it possible (even if difficult) to think of general intelligence that doesn't use a WM?

Particularly since our most advanced thinking relies far less on working memory. Advanced expertise brings with it the ability to manipulate highly specialised memories in what would normally be considered long-term memory. It doesn't replace WM, but it comes close enough for our imaginative purposes!

Comment author: DanArmak 30 November 2009 05:56:38PM 0 points [-]

I agree with you about intelligences in general. I was asking about your statement that

it is a central aspect of the only general intelligence we know

i.e. that WM is less important than abstraction, in some sense, in the particular case of humans - if that's what you meant.

Comment author: anonym 02 December 2009 07:42:04AM 0 points [-]

I mean just that abstraction is central to human intelligence, and to general intelligence, in a way that seems necessary (integral and inseparable) and part of the very definition of general intelligence, whereas WM is not. I can imagine something a lot like me that wouldn't use WM, but I can't imagine anything remotely like me, or any other kind of general intelligence, that doesn't have something very much like our ability to abstract. But I think that's pretty much what I've said already, so I'm probably not helping and should give up.

Comment author: Gavin 29 November 2009 09:39:36PM 1 point [-]

They may be far more important because we have no peers. That's what makes it a competitive advantage.

Comment author: DanArmak 29 November 2009 11:34:43PM 1 point [-]

That makes them important in our lives, yes, but anonym's comment compares us against the set of all possible intelligences (or at least all intelligences that might one day trace their descent from us humans). If so, there should be an argument for their objective or absolute importance.

Comment author: anonym 30 November 2009 12:22:50AM 1 point [-]

I don't think they are objectively or absolutely the most important with respect to all intelligences, only to the most powerful intelligence we know of to this point. If we encountered a greater intelligence that used other principles that seemed more central to it, I'd revise my belief, as I would if somebody outlined on paper a convincing theory for a more powerful kind of intelligence that used other principles.

Comment author: wedrifid 29 November 2009 07:56:53AM *  0 points [-]

Yeah, those are rather worse! I guess it depends just how tragic and horrific something can be and still be embarrassing!