The Outside View Of Human Complexity

14 Andy_McKenzie 08 October 2011 06:12PM

One common question: how complex is some aspect of the human body? In addition to directly evaluating the available evidence for that aspect, one fruitful tactic in making this kind of prediction is to analyze past predictions about similar phenomena and assume that the outcome will be similar. This is called reference class forecasting, and is often referred to on this site as "taking the outside view." 

First, how do we define complexity? Loosely, I will consider a more complex situation to be one with more components, either in total number or type, which allows for more degrees of freedom in the system considered. Using this loose definition for now, how do our predictions about human complexity tend to fare? 

Point: Predictions about concrete things have tended to overestimate our complexity

Once we know of a phenomenon's theoretical existence, but before it has been systematically measured, our predictions about measurable traits of the human body tend to err on the side of being more complex (i.e., more extensive or variable) than reality turns out to be. 

1) Although scholars throughout history have tended to think that human brains must be vastly different from those of other animals, on the molecular and cellular level there have turned out to be few differences. As Eric Kandel relates in his autobiography (p. 236), "because human mental processes have long been thought to be unique, some early students of the brain expected to find many new classes of proteins lurking in our gray matter. Instead, science has found surprisingly few proteins that are truly unique to the human brain and no signaling systems that are unique to it." 

2) There turned out to be fewer protein-coding genes in the human genome than most people expected. We have data on this by way of an informal betting market in the early 2000s, described here ($) and here (OA). The predictions ranged from 26,000 to 150,000, and the lower-bound prediction won, even though it probably wasn't low enough! As of 2008, the number predicted by Ensembl was in the 23,000s. (As an aside, humans don't have the largest genome in terms of number of nucleotides either, not by far. That title currently belongs to a canopy plant, sequenced by Pellicer et al.)

3) Intro neuro texts (including one co-written by the aforementioned Kandel) claim that there are 10-fold (or more) more glial cells than neurons in the human brain. Since glia play crucial support roles and can even propagate information signals, this is not a trivial claim: it would vastly increase the processing power of the brain. But when the ratio of glial to neuronal cells has actually been measured, it turns out to be around one to one in most species, including humans (see here and here). 

Counterpoint: Categories we use to explain the function of our bodies have tended to be more arbitrary than we recognize

1) One active area of research is determining whether the distinguishing characteristics between what we consider cell "types" are quantitative or qualitative (i.e., differences of degree rather than of form). Consider, for example, the continuum between the "classical" M1 and "alternative" M2 macrophages, which contributes to whether those immune cells will be pro- or anti-tumor. Or consider the gradient of pluripotency in stem cells. If cell types fall on a spectrum, depending upon the transcripts or proteins they contain at any given moment, then cells may be able to have many more kinds of interactions at different points in time.

2) Although we found fewer human genes than most geneticists expected, components of genes (exons) have been found to be able to combine in many ways, a phenomenon called alternative splicing. One article (here) found that, of genes with multiple exons, more than 90% are alternatively spliced. Specifically, these researchers found ~67,000 alternatively spliced transcripts from ~20,000 genes. Since these alternatively spliced transcripts have different nucleic acid sequences, they could (and probably do) have quite different functions. 
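The combinatorics behind those transcript counts can be sketched in a few lines. This is a toy model of my own (the exon names are hypothetical, not real data): constitutive exons are always kept, while each optional "cassette" exon can be included or skipped, so n optional exons allow up to 2^n distinct transcripts from a single gene.

```python
from itertools import combinations

# Toy model of alternative splicing (hypothetical exon names, not real data).
# Constitutive exons are always kept; each optional "cassette" exon may be
# included or skipped, so n optional exons allow up to 2**n transcripts.
exons = ["E1", "E2", "E3", "E4", "E5"]
optional = ["E2", "E3", "E4"]  # E1 and E5 are treated as constitutive here

def splice_variants(exons, optional):
    """Enumerate transcripts formed by skipping any subset of optional exons."""
    variants = []
    for k in range(len(optional) + 1):
        for skipped in combinations(optional, k):
            variants.append([e for e in exons if e not in skipped])
    return variants

variants = splice_variants(exons, optional)
print(len(variants))  # 2**3 = 8 distinct transcripts from one gene
```

Note that at ~67,000 transcripts from ~20,000 genes, the observed average is only about 3.4 transcripts per gene, so real genes clearly use only a tiny corner of the combinatorial space this sketch allows.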

3) The chromatin state of a given portion of the genome, i.e., where it falls on the spectrum from euchromatic to heterochromatic, seems able to explain a large percentage of the variance in whether or not a gene in that region is expressed. For example, one study (here) shows a strikingly high correlation between the ability of one transcription factor to bind to DNA and the chromatin state of that region of DNA (see figure 3). The fact that these chromatin states can be transmitted between generations via germ cells is also a fascinating finding, with implications that increase the complexity of human biology as compared to the "static DNA" model. 
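To make "explains a large percentage of the variance" concrete (with invented numbers, not data from the study): the squared Pearson correlation between a chromatin accessibility score and a measured binding signal gives the fraction of binding variance that chromatin state accounts for.

```python
# Hedged toy example (all numbers invented): "variance explained" is the
# squared Pearson correlation between two measurements.
accessibility = [0.1, 0.4, 0.5, 0.9, 1.2, 1.5]   # hypothetical DNase signal
binding       = [0.2, 0.5, 0.4, 1.0, 1.1, 1.6]   # hypothetical TF occupancy

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson_r(accessibility, binding)
print(round(r ** 2, 2))  # fraction of binding variance "explained" (~0.96 here)
```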

Synthesis: When to expect more or less complexity

The above is far from systematic, but I think it portrays the trends. The known unknowns have tended to end up lower in complexity than we've predicted. But unknown unknowns continue to blindside us, unabated, adding to the total complexity of the human body. 

Why do we tend to over-estimate the complexity of known unknowns in the human body? People who study biological processes want to find more "degrees of freedom" in their systems, so that the phenomenon they're studying can have more explanatory power. The standard reason for this is that they want their results to have an impact in preventing or curing diseases, while the cynical ("Hansonian") reason is that they want to attract more status and funding. The real answer is probably a mix of both, but either way, the result is that we tend to over-estimate the complexity of the known unknowns. 

Why does it take so long to recognize the vast number of unknown unknowns? I think the best explanation for this is the standard, "Kuhnian" one, that shifting a paradigm is difficult. Adding an entirely new facet to any established scientific discipline requires slow-moving institutional support, and human biology is no exception. Look, for example, at the history of neurogenesis. Another explanation is technological, that we just don't have the capacity to observe certain things until we reach a given level of engineering success. We could not have known about histone-based epigenetics until we had the capacity to visualize cells at the level of electron microscopy (see pdf).  

The next time someone uses an argument like "the human body is so complex," try to notice whether they are referring to the way that the human body and biology work in general, or to one particular aspect of the human body. If they're referring to the general issue, at scales from the atomic to the molecular to the tissue level, they're right: there's loads we don't understand and probably lots of important stuff we don't even know about. But if they're referring to a particular as-yet-unmeasured aspect of the human body, past history suggests that that particular phenomenon is likely to be less complex than you might guess. 

References

Kandel, E. In Search of Memory: The Emergence of a New Science of Mind. amazon

Pennisi, E. 2003 Low Number Wins the GeneSweep Pool. abstract.

Human Genome Information Project. 2008 How Many Genes Are in the Human Genome?. link

Pellicer J. 2010 The largest eukaryotic genome of them all? abstract. doi: 10.1111/j.1095-8339.2010.01072.x

Kandel E, et al. Principles of Neural Science. amazon

Azevedo FA, et al. 2009 Equal numbers of neuronal and nonneuronal cells make the human brain an isometrically scaled-up primate brain. pubmed

Ma J, et al. 2010 The M1 form of tumor-associated macrophages in non-small cell lung cancer is positively associated with survival time. doi:10.1186/1471-2407-10-112

Hough SR, Laslett AL, Grimmond SB, Kolle G, Pera MF (2009) A Continuum of Cell States Spans Pluripotency and Lineage Commitment in Human Embryonic Stem Cells. PLoS ONE 4(11): e7708. doi:10.1371/journal.pone.0007708

Toung JM. 2011 RNA-sequence analysis of human B-cells. abstract. doi:10.1101/gr.116335.110

John S, et al. 2011 Chromatin accessibility pre-determines glucocorticoid receptor binding patterns. doi:10.1038/ng.759.

Olins DE and Olins AL. 2003 Chromatin history: our view from the bridge. pdf

Wheeler A. A Brief History and Timeline: Adult mammalian neurogenesis. link.  

Complexity: inherent, created, and hidden

8 Swimmer963 14 September 2011 02:33PM

Related to: inferential distance, fun theory sequence.

“The arrow of human history…points towards larger quantities of non-zero-sumness. As history progresses, human beings find themselves playing non-zero-sum games with more and more other human beings. Interdependence expands, and social complexity grows in scope and depth.” (Robert Wright, Nonzero: The Logic of Human Destiny.)

What does it mean for a human society to be more complex? Where does new information come from, and where in the system is it stored? What does it mean for everyday people to live in a simple versus a complex society?

There are certain kinds of complexity that are inherent in the environment: that existed before there were human societies at all, and would go on existing without those societies. Even the simplest human society needs to be able to adapt to these factors in order to survive. For example: climate and weather are necessary features of the planet, and humans still spend huge amounts of resources dealing with changing seasons, droughts, and the extremes of heat and cold. Certain plants grow in certain types of soil, and different animals have different migratory patterns. Even the most basic hunter-gatherer groups needed to store and pass on knowledge of these patterns. 

But even early human societies had a lot more than the minimum amount of knowledge required to live in a particular environment. Cultural complexity, in the form of traditions, conventions, rituals, and social roles, added to technological complexity, in the form of tools designed for particular purposes. Living in an agricultural society with division of labour and various different social roles required children to learn more than if they had been born to a small hunter-gatherer band. And although everyone in a village might have the same knowledge about the world, it was (probably) no longer possible for all the procedural skills taught and passed on in a given group to be mastered by a single person. (Imagine learning all the skills to be a farmer, carpenter, metalworker, weaver, baker, potter, and probably a half-dozen other things.)

This would have been the real beginning of Robert Wright’s interdependence and non-zero-sum interactions. No individual could possess all of the knowledge/complexity of their society, but every individual would benefit from its existence, at the price of a slightly longer education or apprenticeship than their counterparts in hunter-gatherer groups. The complexity was hidden; a person could wear a robe without knowing how to weave it, and eat from a clay bowl without knowing how to shape it or fire it in a kiln. There was room for that knowledge in other people’s brains. The only downside, other than the slightly longer investment in education, was a small increase in inferential distance between individuals.

Writing was the next step. For the first time, a significant amount of knowledge could be stored outside of anyone’s brain. Information could be passed on from one individual, the writer, to a nearly unbounded number of others, the readers. Considering the limits of human working memory, significant mathematical discoveries would have been impossible before there was a form of notation. (Imagine solving polynomial equations without pencil and paper.) And for the first time, knowledge was cumulative. An individual no longer had to spend years mastering a particular skill in an apprenticeship, laboriously passing on any new discoveries one at a time to their own apprentices. The new generation could start where the previous generation had left off. Knowledge could stay alive in writing almost indefinitely, without having to pass through a continuous line of minds. (Without writing, the scientific and mathematical knowledge of ancient Greek society, however extensive, could not have later been rediscovered by any other society.) Conditions were ripe for the total sum of human knowledge to explode, and for complexity to increase rapidly.

The downside was a huge increase in inferential distance. For the first time, not only could individuals lack a particular procedural skill, they might not even know that the skill existed. They might not even benefit from the fact of its existence. The stock market contains a huge amount of knowledge and complexity, and provides non-zero-sum gains to many individuals (as well as zero-sum gains to some individuals). But understanding it requires enough education and training that most individuals can’t participate. The difference between the medical knowledge of professionals and that of uneducated individuals is huge, and I expect that many people suffer because, although someone, somewhere, knows how their medical problems could be avoided or solved, they themselves don’t. Computers, aside from being really nifty, are also incredibly useful, but learning to use them well is challenging enough that a lot of people, especially older people, don’t or can’t.

(That being said, nearly everyone in Western nations benefits from living here and now, instead of in an agricultural village 4000 years ago. Think of the complexity embodied in the justice system and the health care system, both of which make life easier and safer for nearly everyone regardless of whether they actually train as professionals in those domains. But people don’t benefit as much as they could.)

Is there any way to avoid this? It’s probably impossible for an individual to have even superficial understanding in every domain of knowledge, much less the level of understanding required to benefit from that knowledge. Just keeping up with day-to-day life (managing finances, holding a job, and trying to socialize in an environment vastly different from the ancestral one) can be trying, especially for individuals on the lower end of the IQ bell curve. (I hate the idea of intelligence, something not under the individual’s control and thus unfair-seeming, being that important to success, but I’m pretty sure it’s true.) This might be why so many people are unhappy. Without regressing to a less complex kind of society, is there anything we can do?

I think the answer is quite clear, because even as societies become more complex, the arrow of daily-life-difficulty-level doesn’t always go in the same direction. There are various examples of this: computers becoming more user-friendly over time, for instance. But I’ll use an example that comes readily to mind for me: automated external defibrillators, or AEDs.

A defibrillator uses electricity to interrupt an abnormal heart rhythm (ventricular fibrillation is the typical example, hence de-fibrillation). External means that the device acts from outside the patient’s body (pads with electrodes on the skin) rather than being implanted. Most defibrillators require training to use and can cause a lot of harm if they’re used wrong. The automated part is what changes this. AEDs will analyze a patient’s heart rhythm, and they will only shock if it is necessary. They have colorful diagrams and recorded verbal instructions. There’s probably a way to use an AED wrong, but you would have to be very creative to find it. Needless to say, the technology involved is ridiculously complex and took years to develop, but you don’t need to understand the science involved in order to use an AED. You probably don’t even need to read. The complexity is neatly hidden away; all that matters is that someone knows it. There weren’t necessarily any ground-breaking innovations involved, just old inventions packaged in a user-friendly format.

The difference is intelligence. An AED has some limited artificial intelligence in it, programmed in by people who knew what they were talking about, which is why it can replace the decision process that would otherwise be made by medical professionals. A book contains knowledge, but has to be read and interpreted in its entirety by a human brain. A device that has its own small brain doesn’t. This is probably the route where our society is headed if the arrow of (technological) complexity keeps going up. Societies need to be livable for human beings.

That being said, there is probably such a thing as too much hidden complexity. If most of the information in a given society is hidden, embodied by non-human intelligences, then life as a garden-variety human would be awfully boring. Which could be the main reason for exploring human cognitive enhancement, but that’s a whole different story.

Say Not "Complexity"

34 Eliezer_Yudkowsky 29 August 2007 04:22AM

Once upon a time...

This is a story from when I first met Marcello, with whom I would later work for a year on AI theory; but at this point I had not yet accepted him as my apprentice.  I knew that he competed at the national level in mathematical and computing olympiads, which sufficed to attract my attention for a closer look; but I didn't know yet if he could learn to think about AI.

I had asked Marcello to say how he thought an AI might discover how to solve a Rubik's Cube.  Not in a preprogrammed way, which is trivial, but rather how the AI itself might figure out the laws of the Rubik universe and reason out how to exploit them.  How would an AI invent for itself the concept of an "operator", or "macro", which is the key to solving the Rubik's Cube?
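One way to make the idea of an "operator" or "macro" concrete (a sketch I'm adding, not from the essay) is to model each primitive move as a permutation of positions. Composing moves yields a new permutation, and a commutator of two moves is a classic source of macros in cube solving, because it disturbs only the pieces both moves touch. A toy four-position puzzle:

```python
# Hedged sketch: a "macro" as a composition of primitive moves, each modeled
# as a permutation of positions (a tuple p where piece at position i moves
# to position p... i.e., result[i] = p[q[i]] applies q first, then p).

def compose(p, q):
    """Permutation composition: apply q first, then p."""
    return tuple(p[i] for i in q)

def inverse(p):
    """Inverse permutation."""
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

# Tiny toy "puzzle" with 4 positions and two primitive moves.
identity = (0, 1, 2, 3)
a = (1, 0, 2, 3)  # swap positions 0 and 1
b = (0, 2, 1, 3)  # swap positions 1 and 2

# The commutator macro a b a^-1 b^-1: a new, reusable operator built from
# primitives, affecting only the region where a and b overlap.
macro = compose(compose(a, b), compose(inverse(a), inverse(b)))
print(macro)  # (2, 0, 1, 3): a 3-cycle on positions 0, 1, 2, fixing position 3
```

An AI that discovered such compositions on its own, and noticed that they have small, predictable effects, would be inventing macros rather than being handed them.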

At some point in this discussion, Marcello said:  "Well, I think the AI needs complexity to do X, and complexity to do Y—"

And I said, "Don't say 'complexity'."

Marcello said, "Why not?"
