I got this great letter of pushback:
question:
"2. This 'cause' has to be at-least-as-complex as this universe. The most complex unit that appears in this universe is the human personality. The 'cause' must therefore be at-least-as-personal as we are." [note: this is a quote from https://www.Christianthinktank.com/nextseat.html]
This assumption is unwarranted. I can give you an example of this very thing happening. As a chess player and a computer programmer, I can write a chess program that plays chess better than I do.

Great response!--I really appreciate your taking the time to push back like this...and for advancing some possible counter-examples to my thesis above...
My response to your first example is one of respectful and cordial disagreement (surprise, surprise): that
(1) this is not an example of 'more complexity from less complexity';
(2) this is not an example of 'more complexity' at all.
(1) This is not an example of 'more complexity from less complexity':
The reason I conclude this is that I consider your comparison elements (you vs. a chess-playing computer) to be incomplete specifications, since you are not the only factor, cause, or agent involved in the production of that chess-playing computer.

If you had said, for example, that you ALONE could produce a computer that could BY ITSELF:
- invent metallurgy (and all the required precedent arts and sciences)
- invent the semiconductor, logic gate, and transistor (and all the required precedent technologies)
- design fabrication and miniaturization technologies needed to make the computer chips (and all the required precedent engineering and technologies)
- discover and harness electrical forces (and all the required precedent scientific discoveries)
- derive Boolean algebra (and all the required precedent mathematics and logics)
- invent/create the various logic designs embedded in all the components of a computer system (computational, addressing, sequencing, data storage/fetch, i/o, bus, interrupts, peripherals)...plus all the antecedent, required technologies
- write all the software for the computer, including OS, drivers, applications, I/O, chess-board transformations, etc (and all the required precedent software and algorithms)
- purchase, gather, and assemble a computer
- program (and do the mystical science of debugging!) a second computer, by itself, to ALSO play chess
...then I might consider the comparison more accurate...(smile)
In other words, the amount of complexity that ACTUALLY was required to produce the computer was/is vastly more than your personal programming skills...In many ways, it required a complexity level equal to huge amounts of corporate human learning, education, science, art, technology, invention, discovery, and engineering (and even some sales and marketing...) to produce just the computer itself and the tools YOU would use to produce the chess-playing program. [BTW, I worked on a chess-playing project during my graduate studies in computer science, so I know at least the basics about some of these issues.]
This means that it is not your chess-playing ability alone that is to be compared with the computer's chess-playing ability (and, actually, even you recognized this implicitly in your mention that you were 'both' a programmer and a chess player--your chess playing abilities would not be enough by themselves to even program the computer--they would need to be supplemented by at least your programming skills...another example of how the mismatch is present).
Even at this level of you-only, the mismatch is still present--the combo of your 'programming skills' and 'chess-playing skills' would need to literally produce a computer that had BOTH 'programming skills' AND 'chess-playing skills' for the comparison to be a little closer. [I hope this is not too terse.]
So, I think the example is not close enough to reality to demonstrate 'more complexity from less'--in fact, it would certainly look like the opposite.
[This, by the way, would likely lead us to abandon looking at ANY human creation as an example of this, since all of these will presuppose HUGE AMOUNTS of 'antecedent' complexity, and a HUGE NUMBER of antecedent intelligent agents, due to the nature of scientific and technological discovery...I am reminded of all the fanfare that accompanied Stanley Miller's creation of the basic building blocks of protein many decades ago. Some hailed it as 'proof' that the complex could emerge from the simple, but the context of Miller's intellect, learning, design, plus all the antecedent requirements necessary to carefully reproduce what they considered the 'primeval conditions', led others to realize the 'complexity factor' was very much necessary to the production of the 'complex from the simple.' This charge could likely be lodged against ANY human creation/artifact, so I suspect you would need to look outside human endeavors to locate something WITHOUT the backdrop of human scientific achievement. Thus, human artifacts would be subject to what we might call the 'artifactual problem'. In other words, there were VASTLY MORE agents involved than just you.]

(2) This is not an example of 'more complexity' at all:
Even if we unfairly restrict the comparison to be between your chess-playing ability (leaving out your programming ability, and all the human knowledge background required to make the computer to begin with) and the computer's derivative chess-playing ability, I still don't think there is an increase in complexity at all.

My reason for thinking this is that the computer doesn't play chess any differently (or more 'intelligently') than you do--it just does 'more of it' and/or does it 'more consistently'. It does not play chess more "complex-ly" than you.
Since you brought it up, I will assume you know enough about how this works in an "AI" context. You (the programmer) will build a 'weighting' procedure in the software, by which to determine the relative "value to success" of a possible board position (creatable by a single move of yours). You decide--on the basis of your personal experience and skills--what the relative values of having a Knight in the center of the board would be versus having a Rook in the center of the board. The computer then uses a 'brute force' approach and calculates all the "values" of all possible board positions, reachable within 3-5-7 moves from the current board position.
But the way it evaluates these moves is identical to the way you would. It is neither "more complex" nor "less" (actually, I think you would probably also evaluate board positions in light of certain 'strategic' possibilities (a la Kasparov vs. Deep Blue) or in light of certain 'psychological' motives--"faking an opponent out" or "misleading them"). But in any event, the machine's entire process of board evaluation would NOT be 'more complex' than yours. It may evaluate MORE possible futures than you, but this is quantity, not complexity. It does "more of it" not 'more complex'...
Likewise, it may beat you (assuming you don't use the "Undo function" without regard to fairness and equity...smile) by applying your exact rules "more consistently", but this again is not more complex, but rather "more of it". You may not be able to properly apply your skills after 72 solid hours of play, but the computer would. But the computer would not be doing something 'more complex' than you, but only the "same thing" as your normal level of ability.
In other words, "better" (in this case meaning "beating you") does NOT equate to "plays more complex-ly than you" at all. It means (in the case of this chess-playing example), simply "plays more, or faster".
Accordingly, I think the chess-playing example doesn't work well as a counter-example....
Furthermore, the basic unit of programming is merely a dozen or so
logic gates. Yet computers can be programmed to awesome levels of complexity.
At one level, this example fails for the same reason as the above: the knowledge required to construct compilers from simple logic gates (I ASSUME you are talking about software here, but the same would apply to constructing workstation-class CPU's, killer I/O subsystems, or massively parallel computing complexes), and then applications that used the compilers would be much greater than anything the final program (however complex) would be/exhibit. In other words, we are face-to-face again with the Artifactual problem.
But two new issues can be seen in this case, and they are the issues of (a) 'relationships' as 'components' and (b) "cause vs. components"
(a) 'relationships' as 'components'
The point here is that any larger assembly of simpler items contains much more than just the items. The relationships between the items need to be recognized as critical 'components' also. For example, let me use hardware logic gates.
A two-bit, binary adder circuit (see diagram below) can be built from 7 logic gates (3 XORs, 3 ANDs, and 1 OR). But the logic gates have to be placed in a certain arrangement and, in some cases, in a certain sequence. If I place the wrong type of logic gate where another should be (e.g., replacing the OR gate with an AND gate for the last step of determining the carry digit), the adder doesn't function correctly. Likewise, if I route the wrong input to a gate (e.g., the low-order digit d1 into the AND gate for the high-order digit), the adder fails.
[diagram: two-bit binary adder built from 3 XOR, 3 AND, and 1 OR gates]
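[For the curious, here is a small Python sketch of the adder just described--a half adder for the low digit feeding a full adder for the high digit, using exactly 3 XORs, 3 ANDs, and 1 OR. The 'broken' variant swaps the final OR for an AND, the very mistake mentioned above:]

```python
# Gate-level sketch of the two-bit adder described in the text.

def XOR(a, b): return a ^ b
def AND(a, b): return a & b
def OR(a, b):  return a | b

def two_bit_adder(a1, a0, b1, b0):
    """Add the two-bit numbers (a1 a0) + (b1 b0) -> (carry, s1, s0)."""
    # Half adder for the low-order digits (1 XOR, 1 AND):
    s0 = XOR(a0, b0)
    c0 = AND(a0, b0)
    # Full adder for the high-order digits (2 XORs, 2 ANDs, 1 OR):
    p = XOR(a1, b1)
    s1 = XOR(p, c0)
    carry = OR(AND(a1, b1), AND(p, c0))   # the OR gate the text mentions
    return carry, s1, s0

def broken_adder(a1, a0, b1, b0):
    """Same wiring, but the final OR replaced by an AND--the
    arrangement mistake described above."""
    s0 = XOR(a0, b0)
    c0 = AND(a0, b0)
    p = XOR(a1, b1)
    s1 = XOR(p, c0)
    carry = AND(AND(a1, b1), AND(p, c0))  # wrong gate in the carry path
    return carry, s1, s0

# 2 + 3 = 5 (binary 101) with the correct arrangement; the broken
# arrangement, with identical gates except one, gets it wrong.
assert two_bit_adder(1, 0, 1, 1) == (1, 0, 1)
assert broken_adder(1, 0, 1, 1) != (1, 0, 1)
```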
These relationships, sequences, arrangements are not part of the logic gates themselves, but are additional components of the adder. They cannot be 'touched', 'tasted', or 'sensed' in any way (except perhaps visually, through 'looking' at the spatial layout?), but are absolutely essential to the logic. [Another example of where the arrangement and/or relationships between the items are crucial to the overall 'function' of the whole would be music. A sonata is a collection of notes/tones, but if those notes/tones are sounded in a different order, the music is NOT the same--even though the tones/notes are identical in themselves.]
The relevance of this example to our discussion is that any determination of "what went into something" needs to include all the relevant components--both 'things' and 'relationships" (in addition to the immense context of physical and mathematical law!). It is probably apparent that the relationship elements are likely to be more complex than even the elements/atoms themselves. For example, in the case of the simple adder above, there are 17 'connections' between the various inputs, gates, and outputs. And, in the real world, there are many other complex relationships that must 'exist' for this to work also--there are clocking mechanisms, distance relationships, signaling conventions, etc...and we are back to starting with something vastly more complex than a two-bit adder.
And, if we imagine that somehow we can eventually build something so much more complex than even all the background, we can dispel that notion simply by observing that the number of constituents 'under the complexity' grows non-linearly.
For example, a one-bit adder only takes 2 gates (1 AND and 1 XOR) and 6 connections (see diagram below). The move from 1-bit to 2-bits increased the number of gates from 2 to 7, and the number of connections from 6 to 17--a non-linear increase. [Plus, the number of possible arrangement mistakes increased non-linearly also, emphasizing the increase in the importance of "non-material" elements of the whole.]
[diagram: one-bit adder circuit built from 1 XOR and 1 AND gate]
So, construction of higher levels of complexity requires much more 'substrate' than might at first appear, and as the complexity of the whole increases, the "number of elements" increases faster still.
(b) "cause vs. components"

But a more problematic issue for this example is that the logic gates are components of the adder--NOT its 'cause' per se. The logic gates don't 'wiggle on over close to one another', then arrange themselves in the required configuration, then 'grow' attachment connections, and finally develop clock timing, signaling protocols, etc. They are not CAUSES, but rather COMPONENTS. Anything materially "complex" is composed of things "simpler", but it isn't caused/created by those components. It takes other agencies, influences, and/or forces to 'assemble' or 'transform' an unassembled 'pile of things' into a functional whole. This might be called the 'compositional problem' of such possible examples. ["Self-assembly" and "self-organizing" theories are clearly inapplicable in this case, BTW, since this situation could not even remotely be considered a case of a "dissipative system, far from equilibrium"!]

So this example doesn't seem to work either.
Finally, all of matter is made up of electrons, protons, and neutrons.
(Even counting the exotic particles there are less than 2 dozen of them).
Yet there are some 100+ elements, each with its own particular behaviour,
and all made from just 3 things!
This possible counter-example falls short for the 'compositional' issue just discussed. The basic particles constitute the atoms, but don't create them per se.

And, since atoms actually cohere because of relationships between the particles (e.g., the strong nuclear force, the electromagnetic force) and not the particles themselves (remember our example of the 'relationships' between the logic gates), whatever 'created' or 'caused' the existence of any particular atom must have orchestrated the very spatial arrangement of the particles, for the slightest deviation in distance would cause the nucleus to implode or explode, and/or cause the electron shells to collapse or disperse.
And, in a very real sense, the atom 'looks like' the binary adder we made above--the 'larger' the atom, the more relationships that exist between the particles. The interaction, for example, between the electrons in the various shells (where electrons can simply 'trade places' with others--and emit energy in the process), illustrates the massive number of interrelationships that exist among those 'few particles'. [Some of these relationships, of course, are manifested as transitory sub-atomic particles, but this only increases the number of components also.]
So, although this might not suffer from the 'artifactual problem' (although there would be some Anthropic cosmologists and physicists who might argue otherwise--smile), it definitely suffers from the 'compositional problem.'
Your assumption leaves out an important ingredient: time. Time + simplicity = complexity.
I know you are about to give your example of how this works below (i.e.,
your fractal illustration), but let me make a few general statements about
this principle first. [Note: your statement is quite terse, of course,
so I may be misunderstanding your meaning...if so, please let me know
and I will try to correct this.]
1. "Time" per se is the enemy of complexity, not its friend. Any system at or near equilibrium--left alone for a 'time'--will degenerate according to the Second Law of Thermo.2. What I understand you to mean by this is that "simple systems, if allowed to operate over some period of time, will develop into more complex systems". But this is manifestly untrue in most cases:
a. Isolated systems, at or near equilibrium, will degenerate to a less orderly/less organized state (i.e., higher entropy)--never go the other way to 'higher complexity':

"Let us again restrict ourselves to an isolated system, that is, one cut off from its surrounding environment and into which no new energy or matter flows. In such a system, energy states always tend to even out, that is, achieve an equilibrium...In short, an equilibrium is characterized by an absolute minimum of free energy and a consequent maximization of entropy." [NS:CERCN:23-24]

"In short, thermodynamics' second law strongly inhibits ordered structures in isolated systems. Consequently, the apparent contradiction between the observed universal order and the theoretical physical laws cannot be easily resolved in terms of the usual methods of equilibrium thermodynamics or even equilibrium statistical mechanics...we need to appeal to non-equilibrium systems." [NS:CERCN:51]

"Although the destruction of order always prevails in a system in or near thermodynamic equilibrium, the construction of order may occur in a system far from equilibrium." [NS:CERCN:59]

b. It's not even true in a number of cases of biotic forms, nor is it obvious to evolutionists from the historical evidence how 'order' or 'complexity' arose--it's a puzzle to mainstream science:

"When an iron bar is heated at one end, the other end will eventually warm until the temperature of the whole bar becomes equal...The reverse phenomenon--namely, a uniformly warm iron bar suddenly becoming hot at one end and cold at the other--has never been observed." [NS:CERCN:20f]
"Complexity itself, however, is insufficient to demonstrate the direction of evolution, at least not biological evolution. Not all species have become increasingly complex: sponges, roaches, spiders, and bees, among numerous invertebrates, are trapped in an endless cycle of perfected daily routines and thus have remained virtually unchanged for eons." [NS:CERCN:32]"Here we face another curious consequence of Darwin's way of looking at life: despite the power of molecular genetics to reveal the hereditary essences of organisms, the large-scale aspects of evolution remain unexplained, including the origin of the species. There is 'no clear evidence...for the gradual emergence of any evolutionary novelty,' says Ernst Mayr, one of the most eminent of contemporary evolutionary biologists. New types of organisms simply appear upon the evolutionary scene, persist for various periods of time, and then become extinct. So Darwin's assumption that the tree of life is a consequence of gradual accumulation of small hereditary differences appears to be without significant support. Some other process is responsible for the emergent properties of life, those distinctive features that separate one group of organisms from other-fishes and amphibians, worms and insects, horsetails and grasses. Clearly something is missing from biology. It appears that Darwin's theory works for the small-scale aspects of evolution: it can explain the variations and the adaptations within species that produce fine-tuning of varieties to different habitats. The large-scale differences of form between types of organism that are the foundation of biological classification systems seem to require another principle than natural selection operating on small variations, some process that gives rise to distinctly different forms of organism. [NS:HLCS:viii]
"Darwin turned the argument for God's existence (i.e., from design) on its head: Nature produces contraptions...The creationists so animating one another, the lay public, and our contemporary court system today rest uneasy with Darwin's heritage. Natural selection, operating on variations which are random with respect to usefulness, appears a slim force for order in a chaotic world. Yet the creationists' impulse is not merely misplaced religion. Science consists in discovering that point of view under which what did occur is what we have good grounds to expect might have occurred. Our legacy from Darwin, powerful as it is, has fractures as its foundations. We do not understand the sources of order on which natural selection was privileged to work. As long as our deepest theory of living entities is the genealogy of contraptions and as along as biology is the laying bare of the ad hoc, the intellectually honorable motivation to understand partially lying behind the creationist impulse with persist." [CS:OOSSE:643]
c. "Open" systems, at or near equilibrium, remain the same or degenerate unless there is a (relative) massive source of energy 'pushing' the system to a far-from-equilibrium state (and keeping it there, without destroying the system). In some cases, this outside source of lower-entropy energy (i.e., more 'complex'!) can produce spatial organizational structures that exist only as long as the external source of energy continues."Thus, once again we conclude that an energy flow through an open system is an absolute necessity if order is to be created from disorder." [NS:CERCN:47]"In short, thermodynamics' second law strongly inhibits ordered structures in isolated systems. Consequently, the apparent contradiction between the observed universal order and the theoretical physical laws cannot be easily resolved in terms of the usual methods of equilibrium thermodynamics or even equilibrium statistical mechanics...we need to appeal to non-equilibrium systems. For with departures from equilibrium, like the breaking of symmetry, new things can be created." [NS:CERCN:51]
"The more complex and intricate the structure, generally the more energy intake (per unit mass) needed for sustenance" [NS:CERCN:52]
"Nor can an open system near equilibrium evolve spontaneously to new and interesting structures. But should those fluctuations become too great for the open system to damp, the system will then depart far from equilibrium and be forced to reorganize. Such reorganization generates a kind of 'dynamic steady state,' provided the amplified fluctuations are continuously driven and stabilized by the flow of energy from the surroundings, namely, provided the energy flow rate exceeds the thermal relaxation rate." [NS:CERCN:52]
"Beyond certain instability thresholds, systems can dramatically change, fostering the spontaneous creation of an entire hierarchy of new structures displaying surprising amounts of coherent behavior. Such highly ordered, dissipative structures can be maintained only through a sufficient exchange of energy with their surroundings; only this incoming-outgoing energy flow can support the requisite organization for an ordered system's existence. The work needed to sustain the system far from equilibrium is the source of the order" [NS:CERCN:56]
"Although the destruction of order always prevails in a system in or near thermodynamic equilibrium, the construction of order may occur in a system far from equilibrium." [NS:CERCN:59]
"The phenomenon is also sometimes called 'self-organization,' although that term and others like it (those with the prefix 'self-') are deceptive in that such ordering is actually occurring not by itself, as though by magic, but only with the introduction of energy." [NS:CERCN:61]
d. The amount of free, well-organized, external energy required to create (temporary in most cases) order is HUGE, and is essentially 'draining' the Universe--'destroying order' elsewhere, and destroying more than it creates:

"the thermodynamics of open systems allow a system's entropy to remain constant or even to decrease. Here, then, is the gist of non-equilibrium thermodynamics: Localized, open systems can be sites of emergent order within a global (i.e., universal) environment that is largely and increasingly disordered." [NS:CERCN:27]

"In other words, a large ensemble of particles in the absence of gravity will tend to disperse, yet in the presence of gravity will tend to clump; either way, the net entropy increases...Thus (in the collapse of a cloud into a star), the increase in entropy of the surrounding interstellar environment is more than a hundred times greater than the computed decrease in entropy of the newly structured star...Clearly, the star has become an ordered clump of matter at the considerable expense of the rest of the Universe." [NS:CERCN:73]
"Such is the nature of change: The emergence of order and the growth of complexity, everywhere and on all scales, do exact a toll-and that toll means a Universe sinking further into an ever-disordered realm of true chaos." [NS:CERCN:78]
3. And, strictly speaking, what is actually being produced in these far-from-equilibrium phase states is spatial order and previously unpredicted behavior--NOT 'complexity' (and certainly not any 'new stuff'). In the case of the Benard instability of water, it's still water. And the spatial structure only exists until it has 'helped' the material get back to the stable, equilibrium condition:
"The newly formed structures themselves, then, are Nature's way of trying to return the system to equilibrium." [NS:CERCN:62]4. And the "emergence of order" in this situation is NOT automatic at all (and it is very fragile--if the outside energy force exceeds some thresholds, the entire system can be destroyed):"In system theory, complexity means not only nonlinearity but a huge number of elements with many degrees of freedom." [PS:TIC:3]
"Mathematically, this procedure is well known as the so-called 'adiabatic elimination' of fast relaxing variables, for instance, from the master equation describing the change of probabilistic distribution in the corresponding system. Obviously, this elimination procedure enables an enormous reduction in the degrees of freedom" [PS:TIC:67]...giving rise to "new spatio-temporal structure" [p.66] Note: an 'enormous reduction in the degrees of freedom' means a decrease in COMPLEXITY (by the above definition) and yet an increase in spatial ORDER. These dissipative structures are NOT necessarily more 'complex' (from a systems theory standpoint) than their at-equilibrium counterparts, but rather less so.
"In general, to summarize, a dissipative structure may become unstable at a certain threshold and break down, enabling the emergence of a new structure. As the introduction of corresponding order parameters results from the elimination of a huge number of degrees of freedom, the emergence of dissipative order is combined with a drastic reduction of complexity." [PS:TIC:68]
"Nature abounds in complex structures. But defining and measuring complexity is not a trivial problem. Entropy is not a good measure, since it increases with disorder (so a dead organism would be more complex than a living one). A properly defined complexity measure C should reach its maximum at some intermediate levels between the order of a perfect crystal and the disorder of a gas." [NS:SLCPB:34] [Note: order is NOT the same as 'complexity']
"As a further example, imagine an utterly frozen crystal at absolute zero temperature. Ignoring quantum fluctuations, such a system can have only one possible configuration for its many molecular parts; thus, its entropy is zero, its [spatial/morphological] order high. On the other hand, a gas at ordinary (room) temperature displays much randomness in the distribution of its atoms and molecules; it is said to be disorderly, to have high entropy" [NS:CERCN:25] [The preceding definition of complexity would make the crystal state 'ordered' but not 'complex']
"I hope this [example] gives you pause. We are searching for laws that suffice to yield orderly dynamics. Our Boolean networks are nonequilibrium, open thermodynamic systems. Since a little network with only 200 light bulbs can twinkle away for an eternity without repeating a pattern, order is in no way automatic in nonequilibrium, open thermodynamic systems." [PS:AHU:82]5. The first-to-mind examples of simple-to-complex are the least persuasive, since the complexity is often already there to start with. A human being for example is a minute subset of the genetic possibilities present in their gene-base (among other relationships). There is so much more unrealized 'complexity' in a human than ever makes it out into the light of day--but any complexity that does materialize in life was already there in the genetic and metabolic material, the adaptive tools of the individual, and the range of choices made by that individual, in selecting what influences would be operative upon itself:"That means that a married couple has the possibility of producing over 70 trillion different children by this process [meiosis] alone. No wonder, apart from identical twins, no two siblings look exactly alike. This is the process of random assortment...Recombination, even more than random assortment is the process responsible for the uniqueness of different individuals of the same species. New combinations of genes or chromosomes are constantly formed. The odds that the end-products of any two series of meiotic division will be the same are practically nil. It has been estimated that each human individual is capable of producing more different germ cells that there are atoms in the universe (estimated at 10**80)." [NS:NLBC:54.57]
6. We are beginning to realize that the 'simple' things are much more complex than we first thought, and that what appears as 'complex' behavior of theirs might indeed be present in their makeup already. The complex behavior is not 'created' in those situations, but rather was potential in them already:

"These studies [of Elizabeth Ko] show that genotype and environment do not determine cell state in bacteria. Changes of state can occur spontaneously, without any defined internal or external cause. By definition, these changes are epigenetic phenomena: dynamic processes that arise from the complex interplay of all the factors involved in cellular activities, including the genes...Genetic determinism in developmental biology is a type of preformationism. It regards the information in the genome as the sufficient cause of the developmental process that gives rise to a particular type of organism. This perspective leaves no room for emergent phenomena. However, the studies of Ko et al. tell us that even cells as simple as bacteria have dynamic properties that cannot be understood from a determinist point of view. A constant genome and uniform external environment do not themselves provide a causal basis for diversification of state in progeny." [NS:SLCPB:63]

"There is new experimental evidence, initiated by John Cairns in 1988, that unicellular organisms such as bacteria and yeasts can indeed change their DNA in a directed, adapted manner." [NS:HLCS:37]
"As Sherrington observed, the cytoskeleton may act as the nervous system of single-cell organisms. Paramecia, for example, can apparently learn, remember, and exhibit adaptive responses such as avoidance and habituation which involve movement performed by coordinated actions ('metachronal waves') of hundreds of hair-like appendages called cilia"[CS:JCS_1.1.98f]
7. In fact, complexity (in both abiotic and biotic systems) doesn't arise from 'time' at all, but from the interactions of the components and elements themselves--they are pre-built somehow to exhibit higher-order behavior (i.e., the complexity is already "there" before it becomes visible in far-from-equilibrium states) when subjected to those conditions:

"Biological information transfer [e.g. RNA] is based on chemical complementarity, the relationship that exists between two molecular structures that fit one another closely. Images such as lock and key, mold and statue, are often used to illustrate such a relationship. In the chemical realm, complementarity is a more dynamic phenomenon than these images suggest. The two partners are not rigid. When they embrace, they mold themselves to each other to some extent. Furthermore, the embrace leads to binding. Its degree of intimacy is such that electrostatic interactions and other short-range physical forces act strongly enough to prevent the association from being disrupted by thermal jostling...Base pairing, the support of the genetic language, is the most spectacular manifestation of chemical complementarity in biology. But it is only one of many. Every facet of life depends on molecules that 'recognize' each other. Self-assembly, the phenomenon whereby complex structures are formed from a number of parts, rests on complementarity relationships between the parts, as did the assembly of furniture in the old days, except that chemical parts even provide their own glue" [NS:VD:83]

"Since complexity emerges from the interactions of the individual units, interactions must somehow be present in our measurements." [NS:SLCPB:42]
"This simple definition allows us to make clear what characterizes complexity: the emergence of interactions among different units and their conflict with randomness. Order and disorder find a compromise right at the critical point." [NS:SLCPB:43]
8. What this creates for specialists in this field is the problem of context--there must be some 'principle or force of change' outside the individual systems or components that creates a context in which this emergence can materialize. The fact that order appears on such a massive scale--in spite of the huge 'costs' to the universe and the improbabilities of it--leads them to look for 'larger forces' (or at least "better forces") at work:

"Darwin supposed that living systems evolve by mutations that cause small modifications in the properties of the organism. Is this graceful property of minor changes hard to achieve? Or is it, too, part of order for free? A pure Darwinist might argue that this kind of graceful stability could arise only after a series of evolutionary experiments, yet this begs the question. We are trying to explain the origin of the very ability to evolve! However life started, with nude replicating RNA molecules or with collectively autocatalytic sets, this stability cannot be imposed from outside by natural selection. It must arise from within as a condition of evolution itself." [PS:AHU:80]

"The uniformity and ubiquity of life's essential biochemistry, despite the rich diversity of resulting biological types, speaks volumes about the likelihood of an underlying factor, principle, or process--if we could only find it--that drives change." [NS:CERCN:39]
"Molecules more complex than life's elementary acids and bases are even less likely to be synthesized by chance alone. For example, the simplest operational protein, insulin, comprises fifty-one amino acids linked in a specific order along a molecular chain. Using probability theory, we can estimate the chances of randomly assembling the correct number and order of acids; given that twenty amino acids are known to partake of life as we known it, the answer is 20**(-51), which equals approximately 10**(-66). As the inverse of this is obviously a great many permutations, the twenty amino acids could be randomly assembled trillions upon trillions of times for every second in the entire history of the Universe and still not achieve by chance the correct composition of this protein. Clearly, to assemble larger proteins and nucleic acids, let alone a human being, would be vastly less probable if it had to be done randomly, starting only with atoms or simple molecules. Not at all an argument favoring creationism, spiritualism, mysticism, and the life, rather it is once again the natural agents of order that tend to tame chance." [NS:CERCN:40]
9. But of course this takes us back to some type of "artifactual" problem--our 'simple' systems are not really 'simple' at all; they require highly-organized (i.e., complex) energy from the outside to manifest any structural order, and the laws/forces of change under which they operate are exceptionally complex in themselves. In fact, this looks more like complexity producing 'less complexity' if ALL the contextual factors are taken into consideration (remember the number of agents involved in the computer issue?). [Note also that this theme of "complexity produces less complexity" would also be commensurate with the Second Law of Thermodynamics...]
So, if we now look at the standard/textbook examples of emergent phenomena
(at the physical level), we might be a little underwhelmed:
To be sure, there are TONS of examples in living systems,
but I have already pointed out that living systems already manifest massive
complexity, before we even get to cases of emergent behavior. The gap between
these categories of behavior just doesn't seem adequate to warrant such a
grand conclusion that basically "things run uphill" (often enough, and
long enough, to produce durable and extensive complexity)...
Let me make a few observations about these examples:
1. There seems to be nothing really 'spontaneous' about these phase changes. They occur 'rapidly', but they are completely predictable (once we have seen the first one). Water anywhere and anytime will exhibit this phase change, given identical conditions. This behavior is built-in, not 'spontaneous' as in 'never before happened'.

2. This very reproducibility indicates that this behavior is not even 'novel' or really 'emergent'. Water a billion years ago and water a billion years in the future will manifest the same behavior. It might have been novel to us in 1900 (when Benard was doing his work), but it was not 'novel to water'! There is no additional tangible complexity change in water in this case. It's just still water...time (and in this case, energy) didn't make it any more complex--we just noticed it acting more complex than we had ever seen water 'act' before.
3. In each case, the system had to be operated on from the outside, with the addition of lower-entropy energy. It didn't 'self-do' anything; it was operated on by an external source of much higher organization, substantially lower entropy, and much greater 'force'. This sounds suspiciously like 'the cause must be at least as complex as...' to me (at least for physical systems).
4. If anything, this demonstrates that order/complexity is much more difficult to create/sustain/achieve than would be suggested by the formula 'time plus simplicity = complexity'. It certainly wouldn't support the formula at all.
Okay, let me try to net the preceding section out right quick:
1. Time by itself is an enemy of complexity--things run down.

2. Physical systems--both open and isolated--invariably run down in near-equilibrium conditions.
3. Open systems, in far-from-equilibrium conditions can exhibit changes in spatial arrangement of the material, but this is not 'novelty', 'self-anything', or 'material change' in the matter under the stress--this behavior would have been "present" in the material since the beginning of time.
4. Any such temporary changes are due to the operation of an external force/energy, of higher complexity and lower entropy than the system itself.
5. Cases of the production of complexity in living systems cannot be used as counter-examples, since there is more than enough pre-existent complexity in living systems (which we are just now discovering!) to account for these novelties, and since any basic reproductive process (through time) is largely that of replication (a change in quantity) instead of 'change/improvement' (a change in quality).
6. The source/origination of large-scale observed order in the universe is unknown in mainstream science today, and older, more 'from simplicity' approaches are beginning to be questioned and sometimes abandoned.
7. The examples offered as textbook cases, although fascinating, show only behaviors that are "new to us"-NOT behaviors "new to the material". There is no evidence whatever in these examples of a change in water, iron, or laser materials "from the simple to the complex".
8. The very reproducibility of these examples documents the reality that this 'complexity' was already built into the material/system--it has not 'arisen', 'emerged', or 'been spontaneously created' in the way these terms are sometimes mystically used in the literature.
So, I think I have to respectfully disagree with the formula 'time
plus simplicity equals complexity'--the data seems otherwise to me...
[At the same time, I do not want to denigrate in any way,
the significant work being done by those working in the field of 'emergent
behaviors'. The work of the Santa Fe Institute, and of Brian Goodwin, I
hold in highest esteem and watch with eagerness. Their more organism-centered
approach will yield much more reasonable results than previous research
paradigms, in my opinion. But at the same time, I want to suggest we temper
their "enthusiasm" somewhat, in order to avoid the more mystical, Bergsonian-worldview
assumptions (of which they themselves are trying to avoid).]
..........................................................................
This is how fractals work. With a fractal, you can condense a complex
image into just an equation. Where did all the information go? The answer
is, instead of storing it in the third dimension (i.e. as bits and bytes
in computer memory) you have stored it in the 4th dimension (i.e. time).
To recover that information requires time to operate the equation. This
sounds all radical and stuff, but really it's quite mundane.
You give this as an example of 'time plus simplicity equals complexity' but I am afraid that I am not at all sure I understand what you mean, friend...and what I THINK you mean, I cannot see any relevance to this discussion about 'complexity arising from simplicity without prior greater complexity'...But I'll at least try to bring out some of the elements of the topic, in hopes that something will 'emerge' (smile)...
1. I think you might be mixing your metaphors here. The image is not condensed into the equation, but the equation 'generates' the image. [I am assuming you are talking about geometrical self-similarity and not statistical self-similarity, of course, since you are referring to 'equations'.]

2. Strictly speaking, you cannot condense a complex fractal image into "just an equation". Fractal images are described by more than just an equation. There are various dimension numbers [note: not dimensions like third and fourth, though] required/used to describe a fractal image (i.e., fractal dimension, topological dimension, embedding dimension), and different measurement methods for each of these (e.g., Hausdorff-Besicovitch, Iterative, Non-integer). I suspect that the way you are using 'time' in your example (i.e., applying the fractal equation to successive edge points, which were already generated in a previous iteration) is much like the 'screen saver' version, in which finer levels of edge-transformation are being performed. If this is what you mean, this 'time' element is captured in the fractal dimension numbers and NOT in just the equation. The entire image is "identical to" the specifying equation + dimension numbers.
3. I don't really understand how you are using 'information' in your example, unfortunately. The fractal image does require time (or at least iteration) to be graphed, but I am not sure the image contains any more 'information' than that contained in the equation + dimension numbers. If you are talking about the visual or spatial arrangements, these are either (a) exhaustively specified in the mathematics; or (b) strictly a function of the perceiver (and therefore, not relevant to our discussion on 'real complexity'). So, I don't think the 'information went anywhere'--it is still "present" in the mathematical specifications. I personally don't consider the graph of an equation to contain any more 'information' than the mathematical formulation of the equation, so I am afraid I don't know what you mean by 'information going somewhere' here.
4. At most, successive iterations of edge-transformations would be replication of simplicity, and not creation of complexity per se. The very notions of self-similarity and scalability of (mathematical) fractals should indicate that we have 'quantitative' increase of simplicity (i.e., the same formula applied 'linearly' to single points on the edge), instead of something qualitatively 'new' (i.e., unlike the original fractal equation).
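[A tiny illustration of this 'replication of simplicity': the escape-time recipe below iterates z -> z*z + c (the Mandelbrot formula--my choice of example, not necessarily the one you had in mind). It can generate arbitrarily detailed images, yet every 'pixel' is just the same simple equation applied again; the grid size and iteration cap here are arbitrary:]

```python
# Escape-time iteration of z -> z*z + c: the whole "image" is just
# one simple rule, re-applied point by point.

def escape_time(c, max_iter=30):
    """Count iterations of z = z*z + c before |z| exceeds 2."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

# A coarse character "graph" of the set: more time/iteration only
# means more applications of the identical formula.
for im in range(12, -13, -2):
    row = ""
    for re in range(-40, 21):
        row += "#" if escape_time(complex(re / 20, im / 10)) == 30 else "."
    print(row)
```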
Perhaps I have misunderstood you, friend...and perhaps I am reading
much more into your few words than I should, so you might need to email
me back and set me straight on what you meant by this...thanks.
................................................
Summary:
1. The example of the chess-playing computer illustrates the problem of Artifacts--that huge amounts of complexity and large numbers of agents are required to perform any such task. In other words, very much more complexity was very much required to create this 'simpler' complexity.

2. The example of building a computer from logic gates illustrated the problem of Components--that the context in which elements are 'placed' is HUGELY complex, and the relationships between the 'simples' are quite complex in themselves. [The example also showed that a vast amount of complexity is required even to construct a single-digit binary adder.]
3. The example of matter being made up of basic building blocks illustrated the Compositional problem again.
4. Complexity is exceptionally difficult to get out of 'simplicity', and the examples often advanced for this are questionable as to their relevance to the question. The formula 'time plus simplicity equals complexity' (simply stated) does not seem to be supported at any significant scale by the data of physics. In fact, the low-entropy, carefully applied, external forces required to even generate short-term spatial order in these examples argued somewhat for the position that 'the cause must be more ordered/complex than'...
5. The fractal example seemed to me off-center a bit, because it is more a matter of 'flat replication' than anything 'really emergent', and the informational 'identity' between the mathematical specification of the fractal (i.e., equation plus dimension numbers) and the graphical representation of that specification suggests that there is no loss or gain of 'complexity' involved in this case.
There are other issues/challenges/difficulties involved in my original
position (e.g., 'required causal complexity') of course, but at least these
suggested counter-examples do not seem to derail the position, in my opinion...
I do congratulate you on the website. It is refreshingly rational. At least you are trying to play by the rules.
Hey, thanks! But on this side of the thinking/writing, it is anything but 'refreshing'--I am exhausted by having to think all this stuff (smile)...but hey, I am pretty far-from-equilibrium--as any of those who know me will testify--so maybe I CAN produce some 'order' (but I will need to go eat some lower-entropy pizza right now)...smile
Thanks for the question and for stimulating the discussion,
Warmly,
Glenn Miller
-------------------------------------------------------
[ .... notuphill.html ........ ]