Comment author: AnneC 31 July 2008 07:11:55PM 13 points

Anne, feel free not to answer this one: What do you know about neurotypicals that neurotypicals don't know about themselves?

Wow, that's an interesting one. I don't think I can make a valid general statement that some particular thing is true of ALL nonautistic people but that none of them know about themselves, so I won't even attempt that.

However, the thing that does come to mind in response to your question (and I don't know if this counts but I'll put it forth anyway) is that I do find myself often aware when (nonautistic) people are making certain assumptions about reality that are transparent to them because they happen so automatically, but apparent to me because I don't make those assumptions.

I'm sure I make other assumptions (as all humans, insofar as I know, use heuristics to some extent), but it's pretty evident that my heuristic set is somewhat atypical, and judging from the cog-sci stuff I've read, some of this could probably relate to a difference in how low-level perceptual information is processed.

E.g., there have been times when people have commented on something I've done, "You must have spent a lot of time on that!" or even "Too much effort" (as one teacher wrote on a project I did in high school), when in fact I haven't necessarily spent a lot of time on said thing, or put in what I'd consider to be heroic amounts of effort. I've also had the opposite experience, wherein I've tried very hard to do something for a long time, and still not been able to, and gotten numerous comments regarding how I could do it if only I "tried harder" or "relaxed".

To me, this says that many (mostly non-autistic) people tend toward a particular way of perceiving and processing certain kinds of information, and are hence presuming that certain things are going to be relatively easier or more difficult based on the assumptions their processing style encourages. And it also tells me that in those cases, I am sometimes more aware of how their processing style might be working than they are -- that is, what variables they might be ignoring without realizing it.

Hopefully this doesn't come across as horribly presumptuous -- I'm perfectly aware that this can go in the other direction. Where I see there being potential here (as far as helping further an understanding of cognition goes) is in the fact that minds with at least somewhat different basic assumption-sets can sometimes point out these assumption sets across cognitive style gaps, leading to a greater meta-awareness of the kinds of assumptions that tend to get made and what their consequences can be.

Comment author: AnneC 31 July 2008 05:16:58PM 19 points

Commenting on the autism thing (as I've got an insider's perspective there): one thing that strongly characterized my experience growing up was being consistently "mis-read" by those around me. While I (and, I'd wager, most others on the autistic spectrum) do have some "standard" reactions to things (like laughing when amused, smiling when happy, etc.), I don't always emote in visibly standard ways. This led a lot of people, while I was growing up, to believe that I "didn't care" in situations where I cared deeply, that I had intentions I didn't have, that I was sad/lonely when in fact I was just neutrally preoccupied with something, etc.

I also tend(ed) to get read as "nervous" a lot because I can be fidgety and have difficulty speaking (or, in some cases, talk a mile a minute simply because I don't have much vocal modulation) -- and while like everyone I get anxious occasionally, I am probably no more generally anxious than average, and despite being introverted, I am definitely not "shy".

Anyway, even before I found out I was on the spectrum, I had figured out that I was (what I termed) "differently mapped" -- as in, I'd realized that my outward signals didn't mean the same things that people assumed them to. Earlier, in around fourth grade, I'd determined that I might actually be an alien because of how disconnected I felt from those around me and how often I was called "weirdo". I soon decided that it was scientifically infeasible for me to actually have come from outer space, but still, in communicating with other autistics, I have been amazed at how common it is for us to wonder as children whether we're "not entirely human". There's even some thought that "changeling" mythology (in which young children are said to have been "replaced" by elves or faerie babies, whose qualities perplex or annoy the parents) is based in early observations of autistics and other atypical children.

Also, regarding the "autistics anthropomorphize less" thing: my experience as a youngster was subjectively similar to what I've seen termed "panpsychism". That is, I didn't really distinguish between "live" and "non-live" things at all, or between humans and nonhuman animals -- everything was "potentially alive" as far as I was concerned. I've since learned otherwise (due to learning about brains and nerves and such), and I no longer wonder if objects like pencils and Lego blocks feel pain, but I definitely still feel a kind of "psychological unity" with nonhuman animals, especially cats, as their actions make a lot of sense to me for some reason.

I've confirmed that I am not unique in this among the autistic population; several others have described similar experiences (I know one autistic kid who, upon determining that the electronic pokemon plush he'd just gotten at the store didn't light up the way it was supposed to, decided to keep it anyway because he figured it still needed a home). Which is interesting to me, as the stereotype seems to be that autistics see the whole world (including the people in it) as "dead" and "empty". My experience was the precise opposite of this; I perceived the whole world as vibrant and suffused with great depth and beauty and complexity, to the point where humans didn't always stand out as the most interesting thing in my environment (which is probably why I was seen as "oblivious to other people" at times).

Nevertheless, I certainly wouldn't describe autistics as "actually alien", as we evolved here on Earth just like everyone else. We're just a particular variation of human. And I do actually think that despite the lack of a simplistic "psychological unity" that can be fully detected on the basis of outward expressions, there is definitely a deeper psychological unity. Autistic humans and nonautistic humans alike can feel happy, sad, frustrated, angry, appreciative of beauty, disgusted, etc., even if we show these emotions in different ways and in response (sometimes) to different experiences.

One of the great challenges of what I'd call "social progress" is that of figuring out how humans with different cognitive styles can learn to communicate with one another and recognize that different but equally "valid" minds do in fact exist already within the human population. I also think it is probably relevant to AI research to look at how humans who *are* cognitively different in various ways end up coming to understand one another, because this does happen at least occasionally. I've noticed that in relating to other autistics I experience a lot more of what feels like the ability to take accurate "short cuts" to mutual understanding, and it occurred to me a while back that perhaps that "short cut" feeling is what many nonautistic people experience all the time with the majority of those around them.

Comment author: AnneC 20 February 2008 08:13:44AM 7 points

Perhaps you mean that, in characteristics where humans are known to vary, one should suspend judgment / assume the default probability distribution, rather than assuming the person is known to be average?

Yes. I put notions like "humans are generally vulnerable to Death by Hemlock" in a different class than notions like "Girls don't like science". For one thing, the stakes are a lot higher in the former case: you don't harm a female by declining to assume that she dislikes science, but you might kill a human by feeding them hemlock under the assumption that you "need more data". There's plenty of empirical data on the effects of hemlock poisoning in entities you'd likely classify as "human" (for the purpose of this exercise), after all, and it seems pretty clear that hemlock ingestion is much more hazardous than not being subjected to the assumption that you hate science because you have a uterus.

Again, I think it's important to see the kind of categorization you dislike as 'inept categorization', including attempts to infer from the category things that have already been observed and hence ought properly to be screened off; rather than 'forbidden categorization'.

No argument from me there.

Comment author: AnneC 20 February 2008 07:04:27AM 14 points

there's no point in trying to make someone match the average (or mere stereotype) of the female-human cluster if you already have access to more detailed information about her than that.

I would say that there's little to no point in trying to make someone match the average/stereotype even if you don't have access to more detailed information about her than that. Or, at the very least, people should be capable of maintaining awareness of the information that someone is female without their connotations of what "female" means blocking their ability to take in new data about that person.

As an engineer, I've come across an unsettling number of assumptions that "engineering needs women because they're so much better at multitasking and working in groups" -- e.g., my presence in engineering is welcomed on the basis of supposed "positives" that I don't actually provide. So while patting themselves on the back for earning Diversity Points, some folks are simultaneously holding female engineers responsible for providing the Wanted Stereotypical Ability. And meanwhile, the real (and useful) abilities that J. Random Engineer Who Happens To Be Female might provide get ignored, or not believed to exist until the engineer in question performs a sufficient number of Extraordinary Superhero Feats to get branded "The Exception".

What you're objecting to isn't so much the shortcut, it seems to me, as the way-too-short, much-shorter-than-necessary cut. "Playing with spaceship Lego" isn't an atomically detailed description of you either, but it's more information than "female (human)".

Yes, exactly. I remember always feeling kind of weird as a kid because I tended to identify more with male characters in stories (because I had more in common with them interest-wise and personality-wise), and yet, I knew I supposedly belonged to a category called "female". Hence, I really liked it when I came across "tomboy" characters or girls who were good at math and science (like Meg Murry from "A Wrinkle In Time"), because reading about those characters gave me a bit of a "cognitive dissonance vacation". I know some people dismiss the impact of fiction on culture, but since fiction is something that culture both produces and is influenced by, I have always appreciated it when authors can successfully manage to realistically portray a character that subverts particular stereotypes -- such works can have the curious effect of reassuring particular segments of the population that yes, they do, in fact, exist.

Also, this post makes me think of this entry in the TV Tropes wiki: "You Know What They Say About X" (a corollary of which could be Positive Discrimination)

Comment author: AnneC 20 February 2008 04:45:30AM 28 points

Mainly I see categories as useful only as "shorthand", and then only along very particular vectors.

For example, one category that includes people like me (at least along one particular axis) is "female". To me, all this really means is that I'm physiologically configured in a particular way that influences what kinds of bathrooms I can use and what kinds of doctors I need to see. In that respect, "female" is a useful and descriptive category.

But in other respects, it isn't at all useful. As a youngster I went through a phase of "not seeing myself as female" -- not because I hated my physical form (I don't) but because everything that people seemed to associate with "females" didn't fit me. As a female, I was expected (by my surrounding culture) to like pink things, to want to wear dresses, to prefer "domestic" games to construction toys or computers. I was also expected to have certain kinds of social skills I didn't have, as well as certain cognitive tendencies. Etc. So my initial reaction was to wonder whether or not I was a "real girl" in the first place.

Eventually, though, my brain did a sort of flip and I realized that the problem wasn't that I was "inauthentically female", but that people were taking the things about me that were actually female (e.g., aspects of my physiology) and using those things as a basis for assuming a whole bunch of other things. And my reaction at that point was one of indignation: why can't a Real Girl play with the spaceship Lego and wear pants on special occasions (instead of annoying, uncomfortable dresses)?

So, I'm quite familiar with the phenomenon described in this post. It's actually kind of surprising to learn (as I have fairly recently) that many people actually memorize a category definition and then attempt to force-fit reality into it, rather than just gathering a lot of data over time and then (when necessary for the sake of practicality or shorthand) applying category-labels to some members of that data set along particular specified vectors.

In other words, if I'm applying for a job, the fact that I have ovaries shouldn't be a factor (unless the job happens to be something like "egg donor", but that's not something I really see myself getting into). But if I suddenly start experiencing weird abdominal pain, the fact that I have ovaries (and other female internals) becomes pertinent information. The category is context-specific and I think a lot of problems come in when people try to "universalize" categories across all contexts and along all vectors.

In response to Superhero Bias
Comment author: AnneC 02 December 2007 06:09:15PM 0 points

Eli: I wouldn't argue otherwise, though I think that sometimes people think they are applying "utilitarianism" when in fact they are just going with "what sounds good" to them based on very limited information.

This is probably just something I should think harder about and go off and make a post on my own blog about, but one thing I've noticed recently is that many people claim that their preferred option provides the most utility, when in fact, there are other options they haven't even considered. So as I conceded above, the nursing home scenario (in which people don't even consider that perhaps nursing homes aren't necessary, or that they might do more harm than good) is probably a manifestation of "lazy thinking", but it seems as if there's a particular kind of lazy thinking that the structure of utilitarian/consequentialist thinking enables, if not the actual content.

That is, what looks like the "greatest good" on the surface might not actually be, and people need to be wary of assuming they've "found the answer" to a particular problem, particularly if "the answer" entails treating some groups of people in ways that would be considered unethical if applied to other groups. E.g., a person can't just stick their mother-in-law in a nursing home because they find her irritating; however, if she has any trouble whatsoever managing daily living tasks, society will justify and enable nursing-home placement almost without question.

Hmm. In some respects this actually relates back to the original post, in the sense that people might be tempted to choose a socially acceptable/enabled option and feel that their job is "done", when in fact, they could add more value to more people's lives by taking the reputational risk of an "outside the box" option.

In response to Superhero Bias
Comment author: AnneC 02 December 2007 09:26:14AM 2 points

Prakash: Good example, I can see that.

Nick: Institutional bias comes to mind. A lot of people think that some groups (the elderly, people with particular disabilities) "naturally belong" in institutions, when the fact is that institutions are completely unnecessary. There is no form of care provided IN an institution that cannot be provided in the community (often for lower cost, though I don't have exact figures on hand right now). And institutions themselves tend to be internally structured in such a way that power imbalances, abusive situations (see the Stanford Prison Experiment), and "learned helplessness" are perpetuated.

I'm not saying that a "proper" utilitarian (whatever that means) would agree that all old people need to be put in nursing homes "for the good of the community", but there are people who believe that institutional care saves money and represents an appropriate option for people with certain kinds of health needs. This leads to a situation in which some health problems (pneumonia, infections, etc.) run the risk of being associated with the mere fact of being a certain kind of person as opposed to associated with an institutional environment. Yet very few people seem willing to consider alternatives to nursing homes, since they see such facilities as fundamentally part of the landscape and not potential sources of problems.

That's the first example that came to mind, though I'll give that people's reluctance to seek alternatives may simply be a result of lazy thinking as opposed to specifically "utilitarian" thinking.

A second (semi-related) example is the use of intensive behavioural therapy to "cure" homosexual tendencies in boys who seemed to exhibit same-sex affections and "effeminate" behaviour. It wasn't until 1973 that activists managed to get homosexuality removed from the Diagnostic and Statistical Manual of Mental Disorders. When young men were subjected to "therapies" intended to make them heterosexual, the basic idea behind these therapies was that only the outward result mattered -- the notion of subjective internal distress was not even considered, in light of the pervasive social sense that homosexuality was unhealthy, aberrant, sinful, etc. That is, the "consequence" of ensuing heteronormative behaviour was deemed much more important than whether the treatment led to depression or other mental health issues. It certainly wasn't the psychiatrists or researchers who came up with the idea that forcing people to outwardly conform to social norms (like heterosexual behaviour) could result in internal strife; it was the actual individuals being subjected to constant pathologization.

I guess what I'm saying is, while I do think utilitarian/consequentialist thinking has its place (to the extent that I understand it), it is somewhat vulnerable to a tendency to support prevailing social and structural norms even when those norms are destructive and damaging. (And beware the fallacy of the excluded middle -- I am certainly not saying that the utility-minded and consequentialists among us are all evil and blinded to social injustice; I'm just saying that there are traps people need to watch out for. Incompleteness applies to philosophies as well as to mathematics.)

In response to Superhero Bias
Comment author: AnneC 02 December 2007 03:11:55AM 1 point

I realize people have to make decisions regarding how to best distribute their own resources, and I agree that "whichever cause saves the most lives" is a far, far better choice criterion than "whichever cause is more likely to make others look admiringly at me". That's a no-brainer as far as I'm concerned. Of course we each have to deal with, if not a literal lack of sufficient resources, the difficulty of figuring out how to distribute resources effectively where they are needed. My comment was meant simply to caution people against assuming too quickly that they know they're dealing with a zero-sum tradeoff, which could lead them to make a decision that effectively ends up saving fewer people.

I'm sure there's a component of having to avoid spending too much time seeking more information as people keep dying left and right (as they wait for you to make up your mind), but given that, I think my point still stands. Not every situation is as clear-cut as it might seem to be at first, and plus, when it comes to the information people have available about possible charitable causes, there's a lot of "noise" to sort through (case in point: some celebrities are very gung-ho about "curing autism", to the point where I doubt that some of them even realize that autism isn't fatal!)

In response to Superhero Bias
Comment author: AnneC 02 December 2007 02:03:52AM 0 points

One must also be wary of what might be termed the "zero-sum game" bias. In real life, how often does it really occur that a person is faced with either saving X people OR saving Y people (where Y is greater than X)?

The notion of human lives as some sort of currency to be paid in exchange for fate's favor seems like something stemming primarily from the world of mythology and story problems, rather than something stemming from practical reality. While of course people shouldn't let themselves be "blinded by their own greatness" to the point where they merely save whatever group of people they believe will signal the most virtue on their part, it is important to keep one's mind attuned to the fact that sometimes, it is possible to save everyone.

(And, while I do think from a values standpoint that "Whoever saves a single life, it is as if he had saved the whole world", this does not imply that I believe that people who can save lives should feel "satisfied" once they have saved a single life. And I don't think saving lots of lives is good because of some abstract "additive utility" that cannot ever be subjectively appreciated by any entity; I think saving lots of lives is good because individuals and their unique perspectives are irreplaceable, and from the perspective of each person who is saved, the entire world has been saved -- seeing as they can only continue to experience the world if they are saved!)

Comment author: AnneC 11 October 2007 03:52:40AM 1 point

Spinoza suggested that we first passively accept a proposition in the course of comprehending it, and only afterward actively disbelieve propositions which are rejected by consideration.

That sounds like what Sam Adams was saying at the Singularity Summit -- the idea of "superstition" being essential to learning in some respects.
