THE RE-EMERGENCE OF “EMERGENCE”: A VENERABLE CONCEPT IN SEARCH OF A THEORY

Peter A. Corning, Ph.D.
Institute for the Study of Complex Systems
119 Bryant Street, Suite 212, Palo Alto, CA 94301 USA
Tel: 1-650-325-5717  Fax: 1-650-325-3775
E-mail: [email protected]  Website: www.complexsystems.org

© Complexity (2002) 7(6): 18-30

Despite its current popularity, “emergence” is a concept with a venerable history and an elusive, ambiguous standing in contemporary evolutionary theory. This paper briefly recounts the history of the term and details some of its current usages. Not only are there radically varying interpretations about what emergence means but “reductionist” and “holistic” theorists have very different views about the issue of causation. However, these two seemingly polar positions are not irreconcilable. Reductionism, or detailed analysis of the parts and their interactions, is essential for answering the “how” question in evolution — how does a complex living system work? But holism is equally necessary for answering the “why” question — why did a particular arrangement of parts evolve? In order to answer the “why” question, a broader, multi-leveled paradigm is required. The reductionist approach to explaining emergent complexity has entailed a search for underlying “laws of emergence.” Another alternative is the “Synergism Hypothesis,” which focuses on the “economics” — the functional effects produced by emergent wholes and their selective consequences. This theory, in a nutshell, proposes that the synergistic (co-operative) effects produced by various combinations of parts have played a major causal role in the evolution of biological complexity. It will also be argued that emergent phenomena represent, in effect, a subset of a much larger universe of combined effects in the natural world; there are many different kinds of synergy, but not all synergies represent emergent phenomena.
Key Words: Emergence, evolution, natural selection, synergy

INTRODUCTION

If “complexity” is currently the buzzword of choice for our newly minted millennium — as many theorists proclaim — “emergence” seems to be the explication of the hour for how complexity has evolved. Complexity, it is said, is an emergent phenomenon. Emergence is what “self-organizing” processes produce. Emergence is the reason why there are hurricanes, and ecosystems, and complex organisms like humankind, not to mention traffic congestion and rock concerts. Indeed, the term is positively awe-inspiring. As physicist Doyne Farmer observed: “It’s not magic...but it feels like magic.”[1] Among other things, emergence has been used by physicists to explain Bénard (convection) cells, by psychologists to explain consciousness, by economists and investment advisors to explain stock market behavior, and by organization theorists to explain informal “networks” in large companies. Indeed, a number of recent books view the evolutionary process itself as a self-organizing, emergent phenomenon (see below). But what is emergence? What does it explain, really? And why is it so readily embraced, in spite of its opacity, by reductionists and holists alike? There are very few terms in evolutionary theory these days — not even “natural selection” — that can command such an ecumenical following. Though emergence may seem to be the “new, new thing” — after the title of the recent bestseller by Michael Lewis about high technology in Silicon Valley — in fact it is a venerable term in evolutionary theory that traces back to the late 19th and early 20th centuries. It was originally coined during an earlier upsurge of interest in the evolution of wholes, or, more precisely, what was viewed unabashedly in those days as a “progressive” trend in evolution toward new levels of organization culminating in mental phenomena and the human mind.
This long-ago episode, part of the early history of evolutionary theory, is not well known today, or at least not fully appreciated. Nonetheless, it provides a theoretical context and offers some important insights into what can legitimately be called the re-emergence of emergence.

THE ORIGIN OF EMERGENCE

According to the philosopher David Blitz in his definitive history of emergence entitled, appropriately enough, Emergent Evolution: Qualitative Novelty and the Levels of Reality (1992),[2] the term “emergent” was coined by the pioneer psychologist G. H. Lewes in his multi-volume Problems of Life and Mind (1874-1879).[3] Like many post-Darwinian scientists of that period, Lewes viewed the evolution of the human mind as a formidable conundrum. Some evolutionists, like Alfred Russel Wallace (the co-discoverer of natural selection), opted for a dualistic explanation. The mind is the product of a supernatural agency, he claimed. But Lewes, following the lead of the philosopher John Stuart Mill, argued that, to the contrary, certain phenomena in nature produce what he called “qualitative novelty” — material changes that cannot be expressed in simple quantitative terms; they are emergents rather than resultants. To quote Lewes:

    Every resultant is either a sum or a difference of the cooperant forces; their sum, when their directions are the same — their difference, when their directions are contrary. Further, every resultant is clearly traceable in its components, because these are homogeneous and commensurable... It is otherwise with emergents, when, instead of adding measurable motion to measurable motion, or things of one kind to other individuals of their kind, there is a co-operation of things of unlike kinds...The emergent is unlike its components in so far as these are incommensurable, and it cannot be reduced to their sum or their difference (p. 413).
Years earlier, John Stuart Mill had used the example of water to illustrate essentially the same idea: “The chemical combination of two substances produces, as is well known, a third substance with properties different from those of either of the two substances separately, or of both of them taken together” (p. 371).[4] However, Mill himself had an illustrious predecessor. In fact, both Mill and Lewes were resurrecting an argument that Aristotle had made more than 2000 years earlier in a philosophical treatise, later renamed the Metaphysics, about the significance of “wholes” in the natural world. Aristotle wrote: “The whole is something over and above its parts, and not just the sum of them all...” (Book H, 1045:8-10). (We will return to Aristotle’s famous catch-phrase later on.) So the ontological distinction between parts and wholes was not exactly a new idea in the 19th century. The difference was that the late Victorian theorists framed the parts-wholes relationship within the context of the theory of evolution and the challenge of accounting for biological complexity. The basic quandary for holistic theorists of that era was that evolutionary theory as formulated by Darwin did not allow for radically new phenomena in nature, like the human mind (presumably). As every first-year biology student these days knows, Darwin was a convinced gradualist who frequently quoted the popular canon of his day, natura non facit saltum — nature does not make leaps. (The phrase appears no less than five times in The Origin of Species.) Indeed, Darwin rejected the very idea of sharp discontinuities in nature. In The Origin, Darwin emphasized what he called the “Law of Continuity,” and he repeatedly stressed the incremental nature of evolutionary change, which he termed “descent with modification.”[5] Darwin believed that this principle applied as well to the evolution of the “mind”. 
In the Descent of Man, he asserted that the difference between the human mind and that of “lower” animals was “one of degree and not of kind” (I p. 70).[6] Many theorists of that era viewed Darwin’s explanation as unsatisfactory, or at least incomplete, and emergent evolution theory was advanced as a way to reconcile Darwin’s gradualism with the appearance of “qualitative novelties” and, equally important, with Herbert Spencer’s notion (following Lamarck) of an inherent, energy-driven trend in evolution toward new levels of organization. Emergent evolution had several prominent adherents, but the leading theorist of this school was the comparative psychologist and prolific writer, Conwy Lloyd Morgan, who ultimately published three volumes on the subject, Emergent Evolution (1923), Life, Mind and Spirit (1926) and The Emergence of Novelty (1933).[7-9] (Other theorists in this vein included Samuel Alexander, Roy Wood Sellars, C.D. Broad, Jan Smuts, Arthur Lovejoy and W. M. Wheeler. Jan Smuts, a one-time Prime Minister of South Africa, deserves special note because his volume, Holism and Evolution (1926), advanced the concept of “holistic selection” — the idea that wholes of various kinds might be units of selection in nature.[10] It was a prescient precursor to such later concepts as David Sloan Wilson’s “trait group selection,” John Maynard Smith’s “synergistic selection” and my Synergism Hypothesis – see below.) The main tenets of Lloyd Morgan’s paradigm will sound familiar to modern-day holists: quantitative, incremental changes can lead to qualitative changes that are different from, and irreducible to, their parts. By their very nature, moreover, such wholes are unpredictable. Though higher-level, emergent phenomena may arise from lower-level parts and their actions, there may also be “return action,” or what Lloyd Morgan also called “supervenience” (“downward causation” in today’s parlance).
But most important, Lloyd Morgan argued that the evolutionary process has an underlying “progressive” tendency, because emergent phenomena lead in due course to new levels of reality. It was a grand vision, but what did it explain? As Blitz observes, it was not a causal theory. “Emergent evolution related the domains studied by the sciences of physics, chemistry, biology, and psychology — a philosophical task not undertaken by any one of them — but did not propose mechanisms of change specific to any one of them — a scientific task which philosophy could not undertake” (p. 100).[2] Indeed, Lloyd Morgan ultimately embraced a metaphysical teleology that portrayed the evolutionary process as an unfolding of inherent tendencies, which he associated with a creative divinity (shades of Spencer, Henri Bergson, Pierre Teilhard de Chardin and other orthogenetic and “vitalistic” theorists, not to mention some of today’s complexity theorists). In short, emergent evolution in Lloyd Morgan’s hands was not really a scientific theory, though the boundary line was not so sharply delineated back then. But far more damaging to the cause of emergent evolution was the rise of the science of genetics in the 1920s and 1930s and the triumph of an analytical, experimental approach to biology. In its most strident form, reductionism swept aside the basic claim of emergent evolutionists that wholes had irreducible properties that could not be fully understood or predicted by examining the parts alone. Critics like Stephen C. Pepper, Charles Baylis, William McDougall, Rudolf Carnap and Bertrand Russell claimed that emergent qualities were merely epiphenomena and of no scientific significance. Russell, for instance, argued that analysis “enables us to arrive at a structure such that the properties of the complex can be inferred from those of the parts” (pp.
285-286).[11] While the reductionists conceded that it was not currently possible, in many cases, for science to make such inferences and predictions, they insisted that this shortcoming was a reflection of the state of the art in science and not of some superordinate property in nature itself. In time, it was said, reductionism would be able to give a full accounting for emergent phenomena.

THE SUBMERGENCE OF EMERGENCE

Under this theoretical onslaught, the doctrine of emergent evolution went into a prolonged eclipse, although it never succumbed completely to the promissory notes proffered by the reductionists. During the decades that followed, the Aristotelian argument that wholes have distinctive, irreducible properties “re-emerged” in several other venues (though often with different terminology). In the 1930s, for example, embryologist Joseph Needham advanced the idea of “integrative levels” in nature and argued for “the existence of [different] levels of organization in the universe, successive forms of order in a scale of complexity and organization” (p. 234).[12] A decade later, the biologist Julian Huxley, a principal architect of the “modern synthesis” in evolutionary biology, sought to define evolution as “a continuous process from star-dust to human society.” Among other things, Huxley asserted that “now and again there is a sudden rapid passage to a totally new and more comprehensive type of order or organization, with quite new emergent properties, and involving quite new methods of further evolution” (p. 120).[13] Biologist Alex B. Novikoff also defended the idea of emergent levels of reality in a much-cited 1945 article in Science entitled “The Concept of Integrative Levels in Biology.”[14] The growth of the new science of ecology in the 1930s also stimulated an interest in whole systems and macro-level relationships. Among the pioneer ecologists — such as Charles Elton, A.G. Tansley, Raymond Lindeman, G.
Evelyn Hutchinson and others — there was much talk about how the natural world is an integrated “economy”, a biological “community” and even, for some theorists, a “quasi-organism” (Tansley). Ironically enough, the seminal concept of an “ecosystem” — which has since become a centerpiece of modern ecology — was originally conceived by Tansley in the context of his belated conversion to reductionism. “Wholes,” he wrote, “are in analysis nothing but the synthesized actions of the components in associations.” (For an in-depth history of ecology, see Donald Worster’s Nature’s Economy: A History of Ecological Ideas, 1977.)[15] A much broader reaffirmation of the importance of wholes in nature occurred in the 1950s with the rise of “general systems theory.” Inspired especially by the writings of biologist Ludwig von Bertalanffy,[16,17] the systems movement was to that era what complexity theory is today, and the Society for General Systems Research, founded in 1956, provided an interdisciplinary haven for the beleaguered band of holistic theorists of that era. (The organization was later renamed The International Society for the Systems Sciences.) Indeed, the Society’s yearbook — General Systems — was a beacon (and a treasure-trove) for the systems movement for more than a generation. It included the contributions of such luminaries as Kenneth Boulding, Ralph Gerard, Anatol Rapoport, W. Ross Ashby, Heinz von Foerster, Russell Ackoff, Stafford Beer, Donald T. Campbell, Herbert Simon, George Klir, Robert Rosen, Lawrence Slobodkin, Paul Weiss, James Grier Miller and many others.
(Herbert Simon’s 1962 article on “The Architecture of Complexity” was seminal, along with Paul Weiss’s 1969 article on “Determinism Stratified.”)[18,19]

“RE-EMERGENCE”

It is difficult to attach a date to the re-emergence of emergence as a legitimate, mainstream concept, but it roughly coincided with the growth of scientific interest in the phenomenon of complexity and the development of new, non-linear mathematical tools — particularly chaos theory and dynamical systems theory — which allowed scientists to model the interactions within complex, dynamic systems in new and insightful ways. Among other things, complexity theory gave mathematical legitimacy to the idea that processes involving the interactions among many parts may be at once deterministic yet for various reasons unpredictable. (One oft-noted constraint, for instance, is the way in which initial conditions — the historical context — may greatly influence later outcomes in unforeseeable ways.) One of the benchmarks associated with the re-emergence of emergence was the work of Nobel psychobiologist Roger Sperry [20-23] on mental phenomena and the role of what he was the first to call “downward causation” in complex systems like the human brain. (Donald Campbell may have coined the term independently.)[24] Sperry spoke of the need for “new principles” of “cognitive and emergent causation and top down determinism.” To illustrate, he used the metaphor of a cart wheel rolling downhill; the rim, the spokes, the hub, indeed, all of its atoms are compelled to go along for the ride. Sperry also employed Lloyd Morgan’s term, “supervenience.” Meanwhile, in physics Hermann Haken and his colleagues broke new ground with “synergetics” — the science of dynamic, “cooperative” phenomena in the physical realm (though he later ventured into neurological and cognitive phenomena as well).
Over the past 20-odd years, synergetics has produced a large body of holistic theory.[25-30] Likewise, the Nobel physicist Ilya Prigogine’s work in non-equilibrium thermodynamics, especially his concept of “dissipative structures,” represents yet another holistic approach to the rise of complexity in nature.[31-37] In the U.S., much of the recent work on the subject of emergence has been fueled by the resources and leadership of the Santa Fe Institute. Beginning in the mid-1980s, the Institute’s annual Proceedings have contained many articles related to this subject, and a number of the scholars who are associated with the Institute have published books on complexity and emergence. (See especially the volumes by Stuart Kauffman, John Casti, and John Holland; also the two popular books by science writers Roger Lewin and Mitchell Waldrop.)[1, 38-44] Kauffman, for instance, theorizes that life is an emergent phenomenon in the sense that it represents a “spontaneous crystallization” of pre-biotic molecules that can catalyze networks of reactions. Life is a collective property of a system of interacting molecules, says Kauffman: “the whole is greater than the sum of its parts” (1995, pp. 23-24). Likewise, Holland published an entire book devoted to the subject, entitled Emergence: From Chaos to Order (1998).

WHAT DOES EMERGENCE MEAN?

Despite the recent proliferation of writings on the subject, it is still not clear what the term denotes or, more important, how emergence emerges. One problem is that the term is frequently used as a synonym for “appearance”, or “growth”, as distinct from a parts-whole relationship. Thus, one of the dictionaries I consulted defined the term strictly in perceptual terms and gave as an example “the sun emerged from behind a cloud.” Even the Oxford English Dictionary, which offered four alternative definitions, gives precedence to the version that would include a submarine which submerges and then re-emerges.
It is not surprising, then, that the overwhelming majority (close to 100%) of the new journal articles on “emergence” and “emergent” that are identified each week by my computer search service involve such subjects as the emergence of democracy in Russia, the emergence of soccer as a school sport in the U.S., the emergence of the Internet, the emergence of mad cow disease, and the like. I have deliberately played on this conflation of meanings in this article to illustrate the point, but even avowed complexity theorists commonly use the term (perhaps unwittingly) in both ways. Thus, the subtitle of Mitchell Waldrop’s book Complexity (1992) is The Emerging Science at the Edge of Order and Chaos.[1] Unfortunately, some theorists seem to take the position that emergence does not exist if it is not perceived; it must be apparent to an observer. But what is a “whole” — how do you know it when you see it, or don’t see it? And is the mere perception of a whole — a “gestalt” experience — sufficient, or even necessary? John Casti, like Lewes and Morgan, associates emergence with dynamic systems whose behavior arises from the interactions among their parts and cannot be predicted from knowledge about the parts in isolation.[41] “The whole is bigger than the sum of its parts,” echoes editor Michael Lissack in the inaugural issue (1999) of the new journal Emergence.[45] John Holland [43], by contrast, describes emergence in reductionist terms as “much coming from little” and imposes the criterion that it must be the product of self-organization, not centralized control. Indeed, Holland tacitly contradicts Casti’s criterion that the behavior of the whole is irreducible and unpredictable. Holland’s approach represents reductionism of a different kind — more like Herbert Spencer’s search for a universal “law” of evolution than Bertrand Russell’s focus on identifying the parts. (Holland does not stand alone these days, as we shall see.)
Perhaps the most elaborate recent definition of emergence was provided by Jeffrey Goldstein in the inaugural issue of Emergence.[46] To Goldstein, emergence refers to “the arising of novel and coherent structures, patterns and properties during the process of self-organization in complex systems.” The common characteristics are: (1) radical novelty (features not previously observed in the system); (2) coherence or correlation (meaning integrated wholes that maintain themselves over some period of time); (3) a global or macro “level” (i.e., there is some property of “wholeness”); (4) it is the product of a dynamical process (it evolves); and (5) it is “ostensive” — it can be perceived. For good measure, Goldstein throws in supervenience — downward causation.

Goldstein’s definition is hardly the last word on this subject, however. One indication of the ambiguous status that the term currently holds in complexity science is the discordant dialogue that occurred in an on-line (Internet) discussion of the topic hosted by the New England Complex Systems Institute (NECSI) during December 2000 and January 2001. Here are just a few abbreviated (and paraphrased) excerpts:

* Emergence has more to do with concepts and perceptions;
* Emergence arises when an observer recognizes a ‘pattern’;
* Perception is irrelevant – emergence can occur when nobody is there to observe it;
* The mind is an emergent result of neural activity;
* In language, meaning emerges from combinations of letters and words;
* A society is an emergent, but it is in turn composed of emergent collections of cells;
* When water boils and turns to steam, this is emergence — something new in the macro-world emerges from the micro-world;
* Temperature and pressure are emergents — macro-level averages of some quantity present in micro-level phenomena;
* Emergence involves a process. Thus, economists can say that a recession emerges;
* It’s like a dynamical attractor, or the product of a ‘deep structure’ — a pre-existing potentiality;
* Another participant responded to this with: “I don’t know what a deep structure is, but it feels good to say it;”
* Still another objected that dynamical attractors are mathematical constructs – they say nothing about the underlying forces;
* Emergence requires some form of ‘interaction’ — it’s not simply a matter of scale;
* Others disagreed – if the properties of the whole can be calculated from the parts and their interactions, it is not emergence;
* Emergents represent rule-governed creativity based on finite sets of elements and rules of combination;
* Emergence does not have logical properties; it cannot be deduced (predicted);
* Another participant replied, maybe not, but once observed, future predictions are possible if it is deterministic;
* Another discussant asserted that a ‘very simple example’ is water, and its properties should in principle be calculable by detailed quantum-level analysis;
* A discussant familiar with quantum theory disagreed – given the vast number of “choices” (states) that are accessible at the quantum level, one would, in effect, have to read downward from H2O to make the right choice;
* Yet another discussant pointed out that quantum states are always greatly affected by the boundary conditions — the environment;
* Finally, one discussant disputed the entire concept of emergence – it’s all in the eye of the beholder – if we cannot even know that there is a real world, that hydrogen and oxygen actually exist, how can we ‘know’ what they do in combination?

In short, contradictory opinions abound. There is no universally acknowledged definition of emergence, nor even a consensus about such hoary (even legendary) examples as water. And if emergence cannot be defined in concrete terms — so that you will know it when you see it — how can it be measured, or explained?
As Jeffrey Goldstein noted in his Emergence article, “emergence functions not so much as an explanation but rather as a descriptive term pointing to the patterns, structures or properties that are exhibited on the macro-scale” (p. 58).[46] Editor Michael Lissack, in his own inaugural Emergence article, acknowledged that it is “less an organized, rigorous theory than a collection of ideas that have in common the notion that within dynamic patterns there may be underlying simplicity that can, in part, be discovered through large quantities of computer power...and through analytical, logical and conceptual developments...” (p. 112).[45] (Well, not always – see below.)

SYNERGY IN NATURE

How can we sort all of this out? The place to start, I believe, is with the more inclusive (and more firmly established) concept of “synergy”. This concept has been treated in depth elsewhere by this author.[47-53] (See also the two volumes on the evolution of complexity by Maynard Smith and Szathmáry.)[54,55] So here I will be brief. Broadly defined, synergy refers to the combined (cooperative) effects that are produced by two or more particles, elements, parts or organisms – effects that are not otherwise attainable. In this definition, synergy is not “more” than the sum of the parts, just different (as Aristotle long ago argued). Furthermore, there are many different kinds of synergy. One important category involves what can be called “functional complementarities” – effects produced by new combinations of different parts. Water is an obvious example, but so is sodium chloride — ordinary table salt. NaCl is composed of two elements that are toxic to humans by themselves, but, when they are combined, the resulting new substance is positively beneficial (in moderate amounts). Another commonplace example is Velcro, where the two opposing strips, one with many small hooks and the other with loops, are able to create a secure bond with one another.
Another important form of synergy – in living organisms and complex social organizations alike – involves the division of labor (or what could perhaps more felicitously be called a “combination of labor”). Anabaena provides an unusual example. Anabaena is a cyanobacterium that engages in both photosynthesis and nitrogen fixing. However, these two processes are chemically incompatible. So Anabaena has evolved a way of compartmentalizing these two functions. The nitrogen fixing is done in separate heterocysts, and the products are then passed through filaments to other cells.[56] Likewise, there are many different kinds of “symbiosis” between two or more different species in the natural world that involve a division/combination of labor. Thus, virtually all animals that subsist on plant materials, including some 2,000 species of termites, 10,000 wood-boring beetles and 200 Artiodactyla (deer, camels, antelope, etc.), are absolutely dependent upon the cellulases provided by endosymbiotic bacteria, protoctists or fungi for the breakdown of the cellulose in plants into usable nutrients.[57] Still another form of synergy involves what I refer to as a “synergy of scale” — an aggregation of interchangeable, like-kind parts that produce unique cooperative effects (say a river, or a sand pile). Indeed, many synergies of scale produce yet another form of synergy commonly known as “threshold effects” (say a flood, or an avalanche). An elegant example involves the Volvocales, a primitive order of green algae that form colonies of different sizes, from a handful of cells to quasi-organisms with several dozens to hundreds of functionally-integrated cells. As it happens, Volvocales are subject to predation from filter feeders, and a detailed study some years ago by the biologist Graham Bell documented that Volvox, the largest of the Volvocale species, is virtually immune from filter feeders.
[58] The reason, as it turned out, was that there is an upper limit to the prey size that the filter feeders can consume. In a similar vein, in the orb web spider, Metabus gravidus, 15-20 females are able to produce a synergy of scale when they band together to build a giant collective web that can span a stream where their prey are especially abundant.[59] These and many other forms of synergy – such as joint environmental conditioning, information-sharing and joint decision-making, animal-tool “symbioses”, gestalt effects, cost- and risk-sharing, convergent effects, augmentation or facilitation (e.g., catalysts), and others – are discussed in several recent and forthcoming publications by this author.[48-53] It should also be stressed that, far from being vague or ephemeral, synergistic effects are, as a rule, very concrete and eminently measurable. To cite one of the many examples in the publications cited above, during the bitterly cold Antarctic winter emperor penguins (Aptenodytes forsteri) huddle together in dense colonies, sometimes numbering 10,000 or more, for months at a time. In so doing, they are able to share precious body heat and provide insulation for one another. A careful study of this collective behavior many years ago showed that these animals were thereby able to reduce their individual energy expenditures by up to 50 percent.[60] Similarly, in a comparative study of reproduction among southern sea lions (Otaria byronia) during a single breeding season, it was documented that only one of 143 pups born to gregarious group-living females died before the end of the season, compared to a 60 percent mortality rate among solitary mating pairs. The main reasons were that pups in colonies were protected from harassment and infanticide by subordinate males and were far less likely to become separated from their mothers and die of starvation.[61] In short, functional synergies are the source of many “economies” in the natural world.
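The “threshold effect” flavor of synergy of scale can also be illustrated computationally. The following sketch — a standard one-dimensional “sandpile” toy model, offered here as an illustrative assumption rather than anything drawn from the studies cited above — shows how identical, interchangeable parts (grains) produce no visible effect individually, yet once a critical threshold is crossed they trigger a qualitatively new combined effect (an avalanche):

```python
# Minimal 1-D "sandpile" sketch of a threshold effect. The model and
# its parameters (THRESHOLD, grid size) are illustrative assumptions,
# not taken from the article. Each site holds grains; when a site
# reaches THRESHOLD it "topples," shedding one grain to each
# neighbor, which may topple in turn -- an avalanche.

THRESHOLD = 2  # grains a site can hold before toppling (assumed)

def relax(grid):
    """Topple unstable sites until the grid is stable.
    Returns the avalanche size (total number of topplings)."""
    n = len(grid)
    topplings = 0
    unstable = True
    while unstable:
        unstable = False
        for i in range(n):
            if grid[i] >= THRESHOLD:
                grid[i] -= THRESHOLD
                # shed one grain to each neighbor; grains shed past
                # the edges of the grid are simply lost
                if i > 0:
                    grid[i - 1] += 1
                if i < n - 1:
                    grid[i + 1] += 1
                topplings += 1
                unstable = True
    return topplings

def add_grain(grid, site):
    """Drop a single grain and return the size of any resulting avalanche."""
    grid[site] += 1
    return relax(grid)
```

Dropping the first grain onto an empty grid produces no avalanche at all; a second grain at the same site crosses the threshold and triggers one — a combined effect that no single grain produces alone, which is the sense in which a flood or an avalanche is a synergy of scale.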
A crucial corollary of this point is that the synergistic effects produced by “wholes” provide a definitive answer to the charge that wholes are merely “epiphenomena” — nothing more than an expression of their parts. In a nutshell, a whole exists when it acts like a whole, when it produces combined effects that the parts cannot produce alone. Moreover, the synergies produced by wholes provide a key to understanding “why” complex systems have evolved. (We will return to this crucial point shortly.) And if there is any doubt about the matter, one can test for the presence of synergy by removing an important part and observing the consequences — a test first suggested by Aristotle in the Metaphysics (Book H 1043b-1044a). I call it “synergy minus one.” As a thought experiment, imagine the consequences if you were to remove the gut symbionts from a ruminant animal. Or imagine the consequences for an automobile of removing, say, a wheel, or the fuel supply, or the ignition key, or the driver for that matter. Of course, there are also a great many cases where the removal of a single part may only attenuate the synergy; you may have to remove more than one part to destroy the synergy completely. (Call it synergy minus n.) Thus, if you take away a chrome strip from a car, it may only affect the sale price.

RE-DEFINING EMERGENCE

Accordingly, some of the confusion surrounding the term “emergence” might be reduced (if not dissolved) by limiting its scope. Rather than using it loosely as a synonym for synergy, or gestalt effects, or perceptions, etc., I would propose that emergent phenomena be defined as a “subset” of the vast (and still expanding) universe of cooperative interactions that produce synergistic effects of various kinds, both in nature and in human societies. In this definition, emergence would be confined to those synergistic wholes that are composed of things of “unlike kind” (following Lewes’s original definition).
It would also be limited to “qualitative novelties” (after both Lewes and Lloyd Morgan) — i.e., unique synergistic effects that are generated by functional complementarities, or a combination of labor. In this more limited definition, all emergent phenomena produce synergistic effects, but many synergies do not entail emergence. In other words, emergent effects would be associated specifically with contexts in which constituent parts with different properties are modified, re-shaped or transformed by their