fubarobfusco comments on "Stupid" questions thread - Less Wrong

40 Post author: gothgirl420666 13 July 2013 02:42AM

Comment author: fubarobfusco 13 July 2013 05:46:35PM 2 points [-]

One approach: Think of two terms or ideas that are similar but want distinguishing. "How is a foo different from a bar?" For instance, if you're looking to learn about data structures in Python, you might ask, "How is a dictionary different from a list?"

You can learn if your thought that they are similar is accurate, too: "How is a list different from a for loop?" might get some insightful discussion ... if you're lucky.
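The dictionary/list question does have a crisp answer, for the record. A minimal Python sketch of the distinction (toy data, illustrative names):

```python
# A list is an ordered sequence indexed by position; a dict maps
# arbitrary hashable keys to values.
ages_list = [("alice", 34), ("bob", 27)]  # pairs; lookup by key means scanning
ages_dict = {"alice": 34, "bob": 27}      # keyed; lookup is a hash probe

# Finding "bob" in the list requires scanning for a matching key:
bob_from_list = next(age for name, age in ages_list if name == "bob")

# Finding "bob" in the dict is direct, and O(1) on average:
bob_from_dict = ages_dict["bob"]

assert bob_from_list == bob_from_dict == 27
```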

Comment author: SaidAchmiz 14 July 2013 07:26:23AM 2 points [-]

Of course, if you know sufficiently little about the subject matter, you might instead end up asking a question like

"How is a browser different from a hard drive?"

which, instead, discourages the expert from speaking with you (and makes them think that you're an idiot).

Comment author: Kaj_Sotala 14 July 2013 08:11:06AM 1 point [-]

I think that would get me to talk with them out of sheer curiosity. ("Just what kind of a mental model could this person have in order to ask such a question?")

Comment author: SaidAchmiz 14 July 2013 04:26:54PM 6 points [-]

Sadly, reacting in such a way generally amounts to grossly overestimating the questioner's intelligence and informedness. Most people don't have mental models. The contents of their minds are just a jumble; a question like the one I quoted is roughly equivalent to

"I have absolutely no idea what's going on. Here's something that sounds like a question, but understand that I probably won't even remotely comprehend any answer you give me. If you want me to understand anything about this, at all, you'll have to go way back to the beginning and take it real slow."

(Source: years of working in computer retail and tech support.)

Comment author: Kaj_Sotala 14 July 2013 06:48:26PM *  3 points [-]

Even "it's a mysterious black box that might work right if I keep smashing the buttons at random" is a model, just a poor and confused one. Literally not having a model about something would require knowing literally nothing about it, and today everyone knows at least a little about computers, even if that knowledge all came from movies.

This might sound like I'm just being pedantic, but it's also that I find "most people are stupid and have literally no mental models of computers" to be a harmful idea in many ways - it equates a "model" with a clear explicit model while entirely ignoring vague implicit models (that most of human thought probably consists of), it implies that anyone who doesn't have a store of specialized knowledge is stupid, and it ignores the value of experts familiarizing themselves with various folk models (e.g. folk models of security) that people hold about the domain.

Comment author: ChristianKl 15 July 2013 09:28:37AM 4 points [-]

Literally not having a model about something would require knowing literally nothing about it, and today everyone knows at least a little about computers, even if that knowledge all came from movies.

Even someone who has no knowledge about computers will use a mental model if he has to interact with a computer. It's likely that he will borrow a mental model from another field. He might try to treat the computer like a pet.

If people don't have any mental model in which to fit information they will ignore the information.

Comment author: Error 15 July 2013 12:50:10PM *  1 point [-]

It's likely that he will borrow a mental model from another field. He might try to treat the computer like a pet.

I think...this might actually be a possible mechanism behind really dumb computer users. I'll have to keep it in mind when dealing with them in future.

Comparing to Achmiz above:

Most people don't have mental models.

Both of these feel intuitively right to me, and lead me to suspect the following: A sufficiently bad model is indistinguishable from no model at all. It reminds me of the post on chaotic inversions.

Comment author: ChristianKl 15 July 2013 09:19:31PM 2 points [-]

Both of these feel intuitively right to me, and lead me to suspect the following: A sufficiently bad model is indistinguishable from no model at all.

Mental models are the basis of human thinking. Take the original cargo cultists. They had a really bad model of why cargo was dropped on their island. Even so, they used that model to do really dumb things.

A while ago I was reading a book about mental models. It investigates how people deal with the question: "You throw a steel ball against the floor and it bounces back. Where does the energy that moves the ball into the air come from?"

The "correct answer" is that the ball contracts when it hits the floor and then expands, and that energy then brings the ball back into the air. In the book they called this the phenomenological primitive of springiness.

A lot of students had the idea that somehow the ball transfers energy into the ground and then the ground pushes the ball back. The idea that a steel ball contracts is really hard for them to accept because in their mental model of the world steel balls don't contract.
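The springiness answer can even be made quantitative. A toy sketch (not from the book; just standard physics with an illustrative restitution value): the ball rebounds with speed e·v for a coefficient of restitution e, so the rebound height is e² times the drop height.

```python
def rebound_height(drop_height_m: float, restitution: float) -> float:
    """Height after one bounce: impact speed is v = sqrt(2*g*h), the
    rebound speed is e*v, and height scales with speed squared,
    so h' = e**2 * h."""
    return restitution ** 2 * drop_height_m

# A steel ball on a hard floor is nearly elastic (e close to 1; value illustrative):
print(round(rebound_height(1.0, 0.9), 2))  # -> 0.81: most of the stored energy returns
```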

If you simply tell such a person the correct solution, they won't remember it. Teaching a new phenomenological primitive is really hard and takes a lot of repetition.

As a programmer, the phenomenological primitive of recursion is obvious to me. I had the experience of trying to teach it to a struggling student and had to discover how hard it is to teach from scratch. People always want to fit new information into their old models of the world.
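For comparison, here is the kind of minimal example I mean (a sketch, not the actual lesson): the recursive definition next to the loop shape that students usually already have a model for.

```python
def factorial(n: int) -> int:
    """n!, defined in terms of itself -- the 'primitive' of recursion."""
    if n <= 1:                    # base case: stops the self-reference
        return 1
    return n * factorial(n - 1)   # recursive case: shrink toward the base

def factorial_loop(n: int) -> int:
    """The same computation in the loop shape beginners reach for first."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

assert factorial(5) == factorial_loop(5) == 120
```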

People black out information that doesn't fit into their models of the world. This can lead to some interesting social engineering results.

A lot of magic tricks are based on faulty mental models by the audience.

Comment author: Kaj_Sotala 16 July 2013 09:02:08AM *  1 point [-]

Which book was that? Would you recommend it in general?

Comment author: SaidAchmiz 15 July 2013 02:31:40PM 1 point [-]

This reminds me of the debate in philosophy of mind between the "simulation theory" and the "theory theory" of folk psychology. The former (which I believe is more accepted currently — professional philosophers of mind correct me if I'm wrong) holds that people do not have mental models of other people, not even unconscious ones, and that we make folk-psychological predictions by "simulating" other people "in hardware", as it were.

It seems possible that people model animals similarly, by simulation. The computer-as-pet hypothesis suggests the same for computers. If this is the case, then it could be true that (some) humans literally have no mental models, conscious or unconscious, of computers.

If this were true, then what Kaj_Sotala said —

Literally not having a model about something would require knowing literally nothing about it

would be false.

Of course we could still think of a person as having an implicit mental model of a computer, even if they model it by simulation... but that is stretching the meaning, I think, and this is not the kind of model I referred to when I said most people have no mental models.

Comment author: ChristianKl 16 July 2013 02:57:52PM *  1 point [-]

Simulations are models. They allow us to make predictions about how something behaves.

Comment author: SaidAchmiz 16 July 2013 03:36:55PM 0 points [-]

The "simulation" in this case is a black box. When you use your own mental hardware to simulate another person (assuming the simulation theory is correct), you do so unconsciously. You have no idea how the simulation works; you only have access to its output. You have no ability to consciously fiddle with the simulation's settings or its structure.

A black box that takes input and produces predictive output while being totally impenetrable is not a "model" in any useful sense of the word.

Comment author: Kaj_Sotala 16 July 2013 09:05:22AM 0 points [-]

and that we make folk-psychological predictions by "simulating" other people "in hardware", as it were.

How does this theory treat the observation that we get better at dealing with the kinds of people we have experience with? (E.g. I get along better with people of certain personality types because I've learned how they think.) Doesn't that unavoidably imply the existence of some kind of models?

Comment author: NancyLebovitz 16 July 2013 02:41:29PM 0 points [-]

If people don't have any mental model in which to fit information they will ignore the information.

I'm pretty sure this is correct.

Comment author: Kaj_Sotala 16 July 2013 09:00:55AM 0 points [-]

Thanks, that's a good point.

Comment author: SaidAchmiz 14 July 2013 07:30:25PM 3 points [-]

Fair enough. Pedantry accepted. :) I especially agree with the importance of recognizing vague implicit "folk models".

However:

it implies that anyone who doesn't have a store of specialized knowledge is stupid

Most such people are. (Actually, most people are, period.)

Believe you me, most people who ask questions like the one I quote are stupid.

Comment author: ChristianKl 15 July 2013 09:17:19AM 2 points [-]

Sadly, reacting in such a way generally amounts to grossly overestimating the questioner's intelligence and informedness. Most people don't have mental models. The contents of their minds are just a jumble; a question like the one I quoted is roughly equivalent to

Most people do have mental models in the sense the word gets defined in the decision theory literature.