Sorry about my lack of clarity: by "complex" I mean "intricately ordered", rather than the simple disorder generally expected of an entropic process. To taboo both this and "aligned" (which I'd replace with "following the same pattern as"):
I’d like to make the case that emergent complexity is where…
- a whole system is more intricately ordered than the sum of its parts
- a system follows more closely the pattern of a macroscopic phenomenon than it follows the pattern of any of its component parts.
By a macroscopic phenomenon, I mean any (or all) of the following:
1. Another physical feature of the world that it fits, like roads aligning with a map and its terrain (and obstacles).
2. Another instance of something that appears to fulfil a similar purpose, despite being made of entirely different materials or arriving by entirely different paths (as with convergence).
3. A conceptual feature of the world, like a purpose or function.
So, we can more readily understand an emergent phenomenon in relation to some other macroscopic phenomenon than we can by merely inspecting its cells in isolation. In other words, it is useful to identify the 20+ varieties of eyes as "eyes" (2) even though they are not at all the same at the cellular level. It is also meaningful to understand that they perform a function or purpose (3), and that they fit the physical world (by reflecting it relatively accurately) (1).
This is an error I see people making over and over. That different theory may be a useful new development! But that is what it is: a new development, not a defence of the original theory.
I think this is the crux of our disagreement. Yudkowsky was denying the usefulness of a term entirely because some people use it vaguely. I am trying to provide a less vague and more useful definition of the term—not to say Yudkowsky is unjustified in criticising the use of the term, but that he is unjustified in writing it off completely because of some superficial flaws in presentation, or some unrefined aspects of the concept.
An error I see happening often is throwing the baby out with the bathwater. I've read people on Less Wrong (even Yudkowsky, I think, though I can't remember where, sorry) write in support of ideas like "Error Correction" as a virtue, and of Bayesian updating, whereby we take criticism as an opportunity to refine a concept rather than writing it off completely.
I am trying to take part in that process, and I think Yudkowsky would have been better served had he done the same—suggested a better definition that is useful.
Thanks for your comment, but I think it misses the mark somewhat.
While googling to find someone who expresses a straw-man position in the real world is a form of straw-manning itself, this comment goes further, misrepresenting a colloquial use of the word "magical" as a claim of literal (supernatural) magic.
While I haven't read the book referenced, the quotes provided do not give enough context to claim that the author doesn't mean what he obviously means (to me at least): that the development of an emergent phenomenon seems magical. Does it not seem magical? Seeming magical is not a claim that something is not reducible to its component parts; it just means it's not immediately reducible without some thorough investigation into the mechanisms at work. Part and parcel of the definition of emergence is that it is a non-magical (bottom-up) way of understanding phenomena that seem remarkable (magical), which is why he uses a clearly non-supernatural system like an anthill to illustrate it.
Despite all this, the purpose of the post was to give a clear definition of emergence that doesn't fall into Yudkowsky's strawman, not to claim that no one has ever used the word loosely in the past. As conceded in the preamble (paraphrasing), I don't expect something written 18 years ago to perfectly reflect the conceptual landscape of today.
Thanks, and yes, I did scan over the comments when I first read the article and noted many good points, but when I decided to write I wanted to focus on this particular angle and not get lost in an encyclopaedia of defences. I'm very much in the same camp as the first comment you quote.
I appreciate your take on Yudkowsky's overreach, and the historical context. That helps me understand his position better.
The semantic stop-sign is interesting; I do appreciate Yudkowsky coming up with these handy handles for ideas that often crop up in discussion. Your two examples make me think of the fallacy of composition, in that emergence seems to be a key feature of reality that, at least in part, makes the fallacy of composition a fallacy.
Thanks for your well considered comment.
Could you explain what exactly you mean by "complex" here?
So, here I'm just stating the requirement that the system adds complexity, and that it is not merely categorically different. Heat, for instance, could be seen as categorically different to the process it "emerged" from, but it would not qualify as "emergent" because it is clearly entropic, reducing complexity. An immune system, by contrast, is built on top of an organism's complexity; it is a more complex system because it includes all the complexity of the system it emerged from plus its own complexity (or, to use your code example, all the base code plus the new branch).
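To make that concrete with a toy sketch (all names and the crude complexity measure are hypothetical illustrations, not anyone's actual model): the "emergent" layer contains the base system's complexity and adds its own, rather than replacing it.

```python
# A minimal, purely illustrative sketch: the emergent layer is "base + branch",
# not a replacement for the base.

class Organism:
    """The base system: some collection of interacting parts."""
    def __init__(self, cells):
        self.cells = cells                       # the base complexity

    def complexity(self):
        return len(self.cells)                   # crude stand-in for "order"


class OrganismWithImmuneSystem(Organism):
    """The emergent layer is built on top of the base, not instead of it."""
    def __init__(self, cells, immune_rules):
        super().__init__(cells)                  # keeps all the base complexity
        self.immune_rules = immune_rules         # and adds its own structure

    def complexity(self):
        return super().complexity() + len(self.immune_rules)


base = Organism(cells=["muscle", "nerve", "skin"])
emergent = OrganismWithImmuneSystem(base.cells, immune_rules=["detect", "attack"])
print(base.complexity(), emergent.complexity())  # 3 5: more ordered, not merely different
```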
The second part is more important to my particular way of understanding emergence.
What does "aligned" mean in this context?
I think I could potentially make this clearer, as it seems "alignment" comes with a lot of baggage and has been worn out by general (vague) usage, making its correct usage seem obscure and difficult to place. By "aligned with" I mean not merely "related to" but "following the same pattern as"; that pattern might be a function the system performs, or a physical or conceptual shape it shares. So the slime mold and the Tokyo rail system share a similar shape: they have converged on a similar outcome because they are aligned with the same pattern (efficient transport given a particular map).
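If it helps, here is a small sketch of that sense of "alignment" as convergence on a shared pattern rather than a causal relation: two entirely different growth procedures, given the same "map" of points, arrive at the same efficient network. The coordinates and the use of a minimum spanning tree as a stand-in for "efficient transport" are my illustrative assumptions, not a model of slime molds or of rail planning.

```python
# Two unrelated procedures, one shared pattern: given the same map, a global
# "cheapest edge anywhere" builder and a local "grow outward from one point"
# builder converge on the same efficient network (a minimum spanning tree).

from itertools import combinations
from math import dist

points = {"A": (0, 0), "B": (6, 1), "C": (5, 5), "D": (1, 5), "E": (4, 2)}
edges = {(u, v): dist(points[u], points[v]) for u, v in combinations(points, 2)}

def kruskal(points, edges):
    """Global process: repeatedly add the cheapest edge that joins two components."""
    parent = {p: p for p in points}
    def find(p):
        while parent[p] != p:
            p = parent[p]
        return p
    tree = set()
    for (u, v), _ in sorted(edges.items(), key=lambda item: item[1]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.add(frozenset((u, v)))
    return tree

def prim(points, edges, start="A"):
    """Local process: grow from one point, always extending by the cheapest frontier edge."""
    visited, tree = {start}, set()
    while len(visited) < len(points):
        (u, v), _ = min(
            ((e, w) for e, w in edges.items() if (e[0] in visited) != (e[1] in visited)),
            key=lambda item: item[1],
        )
        tree.add(frozenset((u, v)))
        visited.update((u, v))
    return tree

# Different processes, same map, same resulting pattern:
print(kruskal(points, edges) == prim(points, edges))  # True
```

The point is only that "aligned with" describes the shared outcome (the network's shape on that map), not any relation between the two processes that produced it.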
Cells that a toe consists of are different than cells that a testicle or an eye consist of.
I think we're in agreement here. My point is that the eye or testicle performs a (macroscopic) function, and the cells they are made of are less important than that function. Of the 20+ different varieties of eyes, none are made of the same cells, yet it still makes sense to call them eyes because they align with the function. Eyes are essentially cell-agnostic, so long as they converge on the function.
Again, thanks for the response, I'll try to think of some edits that help make these aspects clearer in the text.
Thanks Jonas, that's really nice of you to say, and a great suggestion. I've had a look at doing sequences here. Now that I have more content, I'll take your request and run with it.
For now, over on the site I have the posts broken up into curated categories that work as rudimentary sequences, if you'd like to check them out. Appreciate your feedback!
Thanks? (Does that mean it’s well structured?) You’re the second person to have said this. The illustrations are original, as is all the writing.
As I mentioned to the other person who raised this concern, the blog I write (the source) is an outlet for my own ideas, using chat would sort of defeat the purpose.
I can assure you that the words and images are all original. I'm quite capable of vagueifying something myself, and I don't have a content quota to meet; I'm just trying to present ideas I've had, so it would be quite antithetical to the project to get chat to write it.
By "aligned" I'm not meaning "related to", I mean "maps to the same conceptual shape", "correlated with" or "analogous to". So the nutrient pathways of slime molds are aligned with the Tokyo rail system, but they are not related (other than by sharing an alignment with a pattern). Whereas peanut butter is related to toast, but it's not aligned with it.
But I appreciate the feedback, if you're able to point to something specifically that's vague, I'll definitely get in there and tighten it up.
The "Soldier Mindset" flag is a fair enough call, I guess this could be seen as persuasion (a no-no). Perhaps, I would rather frame it as bypassing emotions (that are acting as barriers to understanding) in order to connect. Correctly understanding the other person's position, or core beliefs, you actually have to let go of your own biases, and in the process might actually become more open to their position.
Thanks for your comment. I appreciate your points, and I see that Yudkowsky does appreciate some use of higher-level abstractions as a pragmatic tool that is not erased by reductionism. But I still feel like you're being a bit too charitable. I re-read the "it's okay to use 'emerge'" parts several times, and as I understand it he doesn't mean to refer to a higher-level abstraction; he's using it in the general sense of "whatever byproduct comes from this", in which case it would be just as meaningful to say "heat emerges from the body", which does not reflect any definition of emergence as a higher-level abstraction. I think the issue comes into focus with your final point:
But it is not correct to say that acknowledging intelligence as emergent doesn't help us predict anything. If emergence can be described as a pattern that recurs across different realms, then it can help us predict things through the use of analogy. If, for instance, we can see that neurones are selected and strengthened based on use, we can transfer some of our knowledge about natural selection in biological evolution to provide fruitful questions to ask, and research to do, on neural evolution. If we understand that an emergent system has reached equilibrium, it can help us to ask useful questions about what new systems might emerge on top of it, questions we might not otherwise ask if we did not recognise the shared pattern.
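To sketch the analogy I'm gesturing at (the numbers, decay rate, and pruning threshold below are arbitrary assumptions for illustration, not neuroscience or evolutionary biology): "variation plus differential retention" is the shared pattern, whether you read the loop as synapses strengthened by firing or as traits retained by reproductive success.

```python
# A toy sketch of the shared pattern: whatever gets "used" is strengthened and
# retained; the rest decays and is pruned. All constants are arbitrary choices.

import random
random.seed(0)

def select_by_use(strengths, rounds, usage):
    """Strengthen items by use, decay everything, prune what falls below threshold."""
    for _ in range(rounds):
        for name in list(strengths):
            strengths[name] = 0.8 * (strengths[name] + 0.5 * usage(name))
            if strengths[name] < 0.1:
                del strengths[name]            # not retained
    return strengths

synapses = {f"s{i}": 1.0 for i in range(10)}   # ten hypothetical connections
frequently_fired = {"s1", "s4", "s7"}          # the "environment" favours these

surviving = select_by_use(
    synapses,
    rounds=20,
    usage=lambda name: 1.0 if name in frequently_fired else random.random() * 0.04,
)
print(sorted(surviving))  # only the frequently used connections remain
```

The substrate differs between the two readings; the loop (vary, select, retain) is what carries over and what suggests the fruitful questions.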
A question I often ask myself is: "If the world itself is to become increasingly organised, at some point do we cease to be autonomous entities on a floating rock, and become instead like automatic cells within a new vector of autonomy (the planet as super-organism)?" This question only comes about if we acknowledge that the world itself is subject to the same sorts of emergent processes that humans and other animals are (although not exactly: a planet doesn't have much of a social life, and that could be essential to autonomy). I find these predictions based on principles of emergence interesting and potentially consequential.