
Emergence

An essential concept in the systems sciences is designated by the term emergence, from the Latin emergere, the "arising" or "coming out" of a new phenomenon or property. A system is said to have emergent properties if these properties are generated in the interaction of the system's components, which by themselves do not show these properties. The concept is often - and somewhat misleadingly - associated with the Aristotelian phrase of a whole that is more than the sum of its parts, and is usually framed in terms of a micro/macro-difference: the difference between micro-level interaction - the interaction among the components of a system - and the macro-level phenomenon - the behavior or property of the system as such.

Segregation as a first example

A famous example of the emergence of system properties is the phenomenon of segregation as considered by the economist Thomas Schelling, who around 1970 was interested in the settlement preferences of residents from different ethnic backgrounds in American cities. He proposed to explain the observed segregation patterns - often showing a distinct separation of, for instance, inhabitants of European and Afro-American origin - with a simple model designed on a checkerboard, which later became an early instance of what is now called multi-agent simulation. Schelling distributed two kinds of coins (pennies and dimes) in random order on the checkerboard and then considered each coin in respect to its adjacent neighbor coins. On a checkerboard (i.e. in two dimensions), each position not lying at the edge of the board has eight adjacent neighbors. If the positions on the edges are considered to border the positions on the respective opposite edge of the board - left to right and bottom to top - this applies to all positions on the board. Mathematically, the resulting figure is called a torus.
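To make the neighborhood definition concrete, the following minimal Python sketch computes the eight wrap-around neighbors of a position (the function name and the grid representation are our own illustrative choices, not Schelling's):

    def torus_neighbors(x, y, w, h):
        """Return the eight adjacent neighbors of (x, y) on a w-by-h torus."""
        return [((x + dx) % w, (y + dy) % h)
                for dx in (-1, 0, 1)
                for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]

On an 8-by-8 board, torus_neighbors(0, 0, 8, 8) includes (7, 7): the corner wraps around to the opposite corner, so every position has exactly eight neighbors.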

Schelling defined the following rule: if a coin had less than a specified percentage of neighboring coins of its own kind, for instance 50%, it was to be transferred to an empty space on the checkerboard. He kept repeating (iterating) this simple rule, starting with the coin in the upper left corner of the board, moving down to the lower right corner, and starting over again until no more coins could be moved, since all coins were positioned according to the specified percentage. As a result, this process creates striking segregation patterns like the one shown in the right image below (from a computer simulation after 14 passes over all dots; the left image shows the random distribution at setup).
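The whole model fits in a few lines of Python. The sketch below reuses torus_neighbors from above; the parameter names, the board representation, and the random choice of an empty target cell are our own assumptions (Schelling moved his coins by hand):

    import random

    def run_schelling(w=8, h=8, threshold=0.5, fill=0.8, sweeps=100):
        """Iterate Schelling's rule until no coin moves (or until `sweeps`
        passes over the board). Cells hold 'P' (penny), 'D' (dime) or
        None (empty)."""
        n_coins = int(w * h * fill) // 2 * 2            # even number of coins
        cells = ['P', 'D'] * (n_coins // 2) + [None] * (w * h - n_coins)
        random.shuffle(cells)
        grid = {(x, y): cells[y * w + x] for y in range(h) for x in range(w)}

        for _ in range(sweeps):
            moved = False
            for y in range(h):                           # upper left ...
                for x in range(w):                       # ... to lower right
                    coin = grid[(x, y)]
                    if coin is None:
                        continue
                    occupied = [grid[p] for p in torus_neighbors(x, y, w, h)
                                if grid[p] is not None]
                    # move if the share of like neighbors is below the threshold
                    if occupied and occupied.count(coin) / len(occupied) < threshold:
                        target = random.choice(
                            [p for p, c in grid.items() if c is None])
                        grid[target], grid[(x, y)] = coin, None
                        moved = True
            if not moved:                                # fixed point: all coins content
                break
        return grid

Printing the grid at setup and again after convergence reproduces the qualitative contrast between the two images described above.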

An interesting and somewhat counterintuitive further result of this simulation is that similar segregation patterns still emerge if the "tolerance threshold" - the required percentage of coins of the same kind - is lowered to 40% or even 30%. The patterns are less distinct, but segregation is clearly discernible, indicating that even relatively "tolerant" neighborhoods can create what is called "racial segregation" - a property that, in this case, is not observable in any of the interacting components. Segregation hence is an emergent property.
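In terms of the sketch above, this variation amounts to nothing more than lowering the threshold parameter:

    grid_strict   = run_schelling(threshold=0.5)    # distinct clusters
    grid_tolerant = run_schelling(threshold=0.3)    # weaker, but still segregated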

Emergence in the digital age

Scientists have used the term emergence for quite some time. In the 19th century, several phenomena were suspected to be caused by an interaction of underlying components creating new properties on the system level, while it remained unclear how these properties were brought about in detail. The fluidity or transparency of water, for instance, remained enigmatic, since it was known that water consists of hydrogen and oxygen, with neither of these substances showing any of these qualities on its own. Even more astounding in this respect is the fire-extinguishing quality of water, with hydrogen being highly flammable and oxygen vigorously supporting combustion. In being associated with the Aristotelian notion of a whole being more than the sum of its parts, these phenomena carried somewhat vague and at times even unscientific connotations. It remained unclear whether something - and if so, what - was missing in the explanations the term emergence provided. When 20th-century science managed to explain some of these properties with the help of quantum mechanics or biochemistry, the term was largely discarded in the natural sciences. In some other disciplines it remained in use, but was confined to denoting hard-to-grasp phenomena like consciousness or life.

The systems science pioneer Ludwig von Bertalanffy, for instance, based his conception of the (then) new science of systems on this concept, emphasizing, however, the need for a scientific interpretation of the term.

“In emergent evolution every step: atom, molecule, colloidal unit, biokyl, cell, cellular organism, colony of organisms, marks the attainment of new peculiarities which, in contrast to resultants, cannot be derived from the subordinate elements. [...] We must try to establish a new standpoint which – as opposed to mechanism – takes account of organic individuality and wholeness, but – in contrast to vitalism – treats it in a manner which admits of scientific investigation.” (Bertalanffy 1933: 46)

Bertalanffy, Ludwig von (1933). Modern Theories of Development: An Introduction to Theoretical Biology. Oxford University Press.

Beginning in the 1960s, with the rising interest in self-organizing processes and in particular with the advent of computer science, the discussion of emergence resurfaced. At first in the context of research on cellular automata (CA), and later in what was to become agent-based modeling (ABM), the term proved useful to denote pattern formation and similar global (or macro-level) effects of local (or micro-level) interactions.

With these kinds of computer-based research, the claim that no hidden or overlooked causes are involved in the generation of emergent phenomena became testable. The macro-level effect - the up-to-20-cell pattern in the rule-22 CA, for instance - can arise from nothing more than the recursive interaction of micro-level causes - the interaction of just three neighboring cells. The actual process of aggregation, leading from local neighborhood interaction to global patterns, might remain difficult to follow in detail, even in the case of simple CA interactions. But in the simulation of these micro-level interactions - which, when needed, can simply be repeated step by step in slow motion - the computer unmistakably confirms the simple causality of these phenomena. In simulation, the strict determinism becomes evident, although the phenomenon itself might still appear complex and somewhat enigmatic when assessed without artificial help.
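An elementary CA of this kind takes only a few lines to simulate. The sketch below implements rule 22 in Wolfram's standard encoding; the grid width, the number of steps, and the ring-shaped (wrap-around) boundary are our own illustrative choices:

    RULE = 22   # Wolfram code: bit n of 22 is the successor of neighborhood n

    def step(cells):
        """One synchronous update of a row of 0/1 cells on a ring."""
        n = len(cells)
        return [(RULE >> (4 * cells[(i - 1) % n]
                          + 2 * cells[i]
                          + cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 31
    row[15] = 1                                   # a single live cell
    for _ in range(16):
        print(''.join('#' if c else '.' for c in row))
        row = step(row)

Each cell's next state depends on nothing but its own state and the states of its two neighbors, yet the printed rows form an intricate, self-similar triangular pattern.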

With respect to this conditional comprehensibility of emergent phenomena, Mark Bedau (1997) suggested defining “weak” (i.e. roughly: simulatable) emergence as “explanatory incompressible”, meaning that an analytical explanation cannot be provided without “crawling the micro-causal web by way of simulation”. In other words, (simulated) emergent properties can be considered causally determined, but are not reducible other than by way of computation. They are “reducible in principle, but not in practice” (Bedau 2008). Another pioneer of modeling and simulation, Joshua Epstein (2006), expressed a similar idea more plainly with the phrase “If you didn’t grow it, you didn’t explain it”. The term emergence hence seems to have earned its place in the digital age.