Today's post, The Psychological Unity of Humankind, was originally published on 24 June 2008. A summary (taken from the LW wiki):

Because humans are a sexually reproducing species, human brains are nearly identical. All human beings share similar emotions, tell stories, and employ identical facial expressions. We naively expect all other minds to work like ours, which causes problems when we try to predict the actions of non-human intelligences.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Optimization and the Singularity, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


As Razib Khan, Tim Tyler, and Nick Tarleton explain in the comments, the post is more wrong than right. I vaguely remember that some other posts in the sequences were also shown to be wrong; maybe we should make a list.

Not, of course, that we should only check the bottom line, but it seems to me that even if the strongest ev psych claims about gender, the strongest race realist claims about race, and the strongest social constructionist claims about culture were all correct, human minds would still occupy a very small portion of mindspace. The moral, then, would seem to be correct: none of our biological, cultural, or individual histories have shaped our intuitions for dealing with the sort of minds we might be able to create.

Could it have no "sense of self"? Could it be more like a swarm with implausibly uncanny optimization capabilities than a Mind from one of the Culture novels? Perhaps it would be like a "hegemonic swarm" from one of those books.

Is human civilization as a whole such an entity? I can adopt a mindset in which all of human civilization and culture looks like a "soulless, monstrous, hegemonic swarm."

What would entities that do have a sense of self, but no compatible concepts of sex or mammalian politics, seem like?

I read in The Selfish Gene that some spiders are wired with the evolutionarily stable strategy of always yielding when confronted by an invader in a territorial dispute. This works because all the spiders are so wired, so costly territorial battles are avoided. It's evolutionarily stable because a contrarian spider born into such a population would get the tar beaten out of it. It seems that human beings, by contrast, tend to defend their territory. Perhaps there's a race of aliens who are great explorers because members of their species are wired like those spiders and so are always being pushed outwards. If they encounter us, they might expect us to vacate our planet. This thought makes "invasion" stories a smidgen more likely, though it's just a smidgen in comparison to all the other implausibility in such stories.
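As a minimal sketch of that stability argument (not from the post or the book; the strategy names and the payoff values V and C below are illustrative assumptions), here is a quick simulation of a "resident always yields" convention. A lone contrarian who escalates as a resident does worse than the conventional spiders around it whenever fights cost more than the territory is worth:

```python
# Sketch of why an "intruder wins, resident yields" convention can be
# evolutionarily stable when fights cost more than the territory is worth.
# Payoff values and strategy names are illustrative assumptions.
import random

V = 10   # value of holding the territory
C = 30   # cost of losing an escalated fight (assumed > V)

def contest(resident, intruder):
    """Return (resident_payoff, intruder_payoff) for one territorial dispute.

    'conventional' spiders follow the convention: yield when resident,
    escalate when intruding. 'contrarian' spiders always escalate.
    """
    res_escalates = (resident == "contrarian")
    int_escalates = True  # both strategies escalate as the intruder
    if res_escalates and int_escalates:
        # Both fight; winner is random, loser pays the cost C.
        if random.random() < 0.5:
            return V, -C
        return -C, V
    # Resident yields; intruder takes the territory unopposed.
    return 0, V

def average_payoff(strategy, trials=100_000):
    """Average payoff of one 'strategy' spider living among conventional
    spiders, playing resident and intruder equally often."""
    total = 0.0
    for _ in range(trials):
        if random.random() < 0.5:
            r, _ = contest(strategy, "conventional")
            total += r
        else:
            _, i = contest("conventional", strategy)
            total += i
    return total / trials

print("conventional:", average_payoff("conventional"))  # ~ V/2 = 5.0
print("contrarian:  ", average_payoff("contrarian"))    # ~ (V + (V - C)/2)/2 = 0.0
```

The choice C > V is what drives the result: escalated fights are more expensive than the territory is valuable, so any deviation from the shared convention that produces extra fights is selected against, regardless of which convention the population happens to share.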

[anonymous]

EY mentions kids and embryonic development, but then writes "human brains are nearly identical." Either kids and embryos aren't human beings or this is a false statement. Embryo brains, kid brains, and adult brains are not nearly identical. Perhaps one could exclude people with traumatic brain injuries as not representative, but embryos and kids are as representative as it gets.

No, the only really alien intelligence on this planet is natural selection, of which I have already spoken... for exactly this reason, that it gives you true experience of the Alien. Evolution knows no joy and no anger, and it has no facial expressions; yet it is nonetheless capable of creating complex machinery and complex strategies. It does not work like you do.

If you want a real alien to gawk at, look at the other Powerful Optimization Process.

I recall comments made by Kasparov after his defeat by Deep Blue, something along the lines of having confronted another order of intelligence.

I suspect that looking at another Powerful Optimization Process is useful for getting a better sense of the distance, but possibly goes too far, somewhat like contemplating a light-year to get a sense of the distances involved in landing on Mars.

Another social intelligence is likely to have mechanisms for dealing with something like politics. (If you have ever lived with a dog and a parrot, you have a couple of points of comparison: a very slightly alien intelligence and a slightly more alien one, though both are very close to us in comparison to evolution.)

I wonder if this implies that what Eliezer and most AI researchers are currently working with is of the same order of alienness as evolution.
