"Everything is simpler than you think and at the same time more complex than you imagine." - Johann Wolfgang von Goethe

Complexity arises spontaneously in nature through processes such as self-organization. Emergent phenomena are common, as are emergent traits that are not reducible to basic components, interactions, or properties.

Complexity does not, therefore, imply the existence of a designer or a design. Complexity does not imply the existence of intelligence and sentient beings. On the contrary, complexity usually points towards a natural source and a random origin. Complexity and artificiality are often incompatible.

Artificial designs and objects are found only in unexpected ("unnatural") contexts and environments. Natural objects are totally predictable and expected. Artificial creations are efficient and, therefore, simple and parsimonious. Natural objects and processes are not.

As Seth Shostak notes in his excellent essay, titled "SETI and Intelligent Design," evolution experiments with numerous dead ends before it yields a single adapted biological entity. DNA is far from optimized: it contains inordinate amounts of junk. Our bodies come replete with dysfunctional appendages and redundant organs. Lightning bolts emit energy all over the electromagnetic spectrum. Pulsars and interstellar gas clouds spew radiation over the entire radio spectrum. The energy of the Sun is ubiquitous over the entire optical and thermal range. No intelligent engineer - human or not - would be so wasteful.

Confusing artificiality with complexity is not the only terminological conundrum.

Complexity and simplicity are often, and intuitively, regarded as two extremes of the same continuum, or spectrum. Yet, this may be a simplistic view, indeed.

Simple procedures (codes, programs), in nature as well as in computing, often yield the most complex results. Where does the complexity reside, if not in the simple program that created it? A minimal number of primitive interactions occur in a primordial soup and, presto, life. Was life somehow embedded in the primordial soup all along? Or in the interactions? Or in the combination of substrate and interactions?
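
This is easy to demonstrate computationally. The sketch below is an illustrative toy, not part of the essay's argument: it runs Wolfram's elementary cellular automaton Rule 30, an update rule that fits in one line of code yet generates an intricate, effectively unpredictable pattern from a single "on" cell.

```python
# Rule 30: a one-line update rule that produces complex, aperiodic patterns.
RULE = 30

def step(cells):
    """Apply the elementary cellular automaton rule to one row of cells."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=79, generations=30):
    row = [0] * width
    row[width // 2] = 1          # a single "on" cell is the entire initial condition
    for _ in range(generations):
        print("".join("#" if c else "." for c in row))
        row = step(row)

if __name__ == "__main__":
    run()
```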

Complex processes yield simple products (think about products of thinking such as a newspaper article, or a poem, or manufactured goods such as a sewing thread). What happened to the complexity? Was it somehow reduced, "absorbed, digested, or assimilated"? Is it a general rule that, given sufficient time and resources, the simple can become complex and the complex reduced to the simple? Is it only a matter of computation?

We can resolve these apparent contradictions by closely examining the categories we use.

Perhaps simplicity and complexity are categorical illusions, the outcomes of limitations inherent in our system of symbols (in our language).

We label something "complex" when we use a great number of symbols to describe it. But, surely, the choices we make (regarding the number of symbols we use) teach us nothing about complexity, a real phenomenon!

A straight line can be described with three symbols (A, B, and the distance between them) - or with three billion symbols (a subset of the discrete points which make up the line and their inter-relatedness, their function). But whatever the number of symbols we choose to employ, however complex our level of description, it has nothing to do with the straight line or with its "real world" traits. The straight line is not rendered more (or less) complex or orderly by our choice of level of (meta) description and language elements.

The simple (and ordered) can be regarded as the tip of the complexity iceberg, or as part of a complex, interconnected whole, or, holographically, as encompassing the complex (the same way all particles are contained in all other particles). Still, these models merely reflect choices of descriptive language, with no bearing on reality.

Perhaps complexity and simplicity are not related at all, either quantitatively, or qualitatively. Perhaps complexity is not simply more simplicity. Perhaps there is no organizational principle tying them to one another. Complexity is often an emergent phenomenon, not reducible to simplicity.

The third possibility is that somehow, perhaps through human intervention, complexity yields simplicity and simplicity yields complexity (via pattern identification, the application of rules, classification, and other human pursuits). This dependence on human input would explain the convergence of the behaviors of all complex systems onto a tiny sliver of the state (or phase) space (a sort of mega attractor basin). According to this view, Man is the creator of simplicity and complexity alike, but they do have a real and independent existence thereafter (the Copenhagen interpretation of Quantum Mechanics).

Still, these twin notions of simplicity and complexity give rise to numerous theoretical and philosophical complications.

Consider life.

In human (artificial and intelligent) technology, everything and every action has a function within a "scheme of things." Goals are set, plans are made, and designs help to implement the plans.

Not so with life. Living things seem to be prone to disoriented thoughts, or to the absorption and processing of absolutely irrelevant and inconsequential data. Moreover, these laboriously accumulated databases vanish instantaneously with death. The organism is akin to a computer that processes data using elaborate software and then turns itself off after 15-80 years, erasing all its work.

Most of us believe that what appears to be meaningless and functionless supports the meaningful and functional and leads to them. The complex and the meaningless (or at least the incomprehensible) always seem to resolve to the simple and the meaningful. Thus, if the complex is meaningless and disordered then order must somehow be connected to meaning and to simplicity (through the principles of organization and interaction).

Moreover, complex systems are inseparable from their environment whose feedback induces their self-organization. Our discrete, observer-observed, approach to the Universe is, thus, deeply inadequate when applied to complex systems. These systems cannot be defined, described, or understood in isolation from their environment. They are one with their surroundings.

Many complex systems display emergent properties. These cannot be predicted even with perfect knowledge about said systems. We can say that the complex systems are creative and intuitive, even when not sentient, or intelligent. Must intuition and creativity be predicated on intelligence, consciousness, or sentience?

Thus, ultimately, complexity touches upon very essential questions of who we are, what we are for, how we create, and how we evolve. It is not a simple matter, that...

TECHNICAL NOTE - Complexity Theory and Ambiguity or Vagueness

A Glossary of the terms used here

Ambiguity (or indeterminacy, in deconstructivist parlance) arises when a statement or string (word, sentence, theorem, or expression) has two or more distinct meanings, either lexically (e.g., homonyms) or because of its grammar or syntax (e.g., amphiboly). It is the context that helps us choose the right or intended meaning ("contextual disambiguation," which often leads to a focal meaning).

Vagueness arises when there are "borderline cases" in the existing application of a concept (or a predicate). When is a person tall? When does a collection of sand grains become a heap (the sorites, or heap, paradox)? Fuzzy logic truth values do not eliminate vagueness - they only assign continuous values ("fuzzy sets") to concepts ("prototypes").
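
As a minimal illustration of such continuous truth values, the sketch below assigns a graded degree of "tallness" to a height. The cut-off points (160 cm and 190 cm) are arbitrary choices made for the example, not values drawn from the fuzzy-logic literature.

```python
def tall_membership(height_cm: float) -> float:
    """Continuous truth value for the vague predicate 'tall'.
    Below 160 cm -> 0.0, above 190 cm -> 1.0, a linear ramp in between
    (the cut-off points are illustrative choices only)."""
    low, high = 160.0, 190.0
    if height_cm <= low:
        return 0.0
    if height_cm >= high:
        return 1.0
    return (height_cm - low) / (high - low)

for h in (150, 165, 175, 185, 195):
    print(f"{h} cm -> tall to degree {tall_membership(h):.2f}")
```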

Open texture is when there may be "borderline cases" in the future application of a concept (or a predicate). While vagueness can be minimized by specifying rules (through precisification, or supervaluation), open texture cannot, because we cannot predict future "borderline cases."

It would seem that a complexity theory formalism can accurately describe both ambiguity and vagueness:

Language can be construed as a self-organizing network, replete with self-organized criticality.
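
Self-organized criticality is usually illustrated with the Bak-Tang-Wiesenfeld sandpile, sketched below purely as an analogy: grains are dropped at random, overloaded sites topple onto their neighbors, and the system settles into a critical state in which avalanches of every size occur. Grid size, threshold, and grain count are arbitrary illustrative parameters.

```python
# Bak-Tang-Wiesenfeld sandpile: the system drives itself toward a critical
# state without any external tuning of parameters.
import random

SIZE, THRESHOLD = 20, 4

def topple(grid):
    """Relax the grid after one grain is added; return the avalanche size."""
    avalanche = 0
    unstable = True
    while unstable:
        unstable = False
        for i in range(SIZE):
            for j in range(SIZE):
                if grid[i][j] >= THRESHOLD:
                    grid[i][j] -= THRESHOLD
                    avalanche += 1
                    unstable = True
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < SIZE and 0 <= nj < SIZE:  # grains pushed off the edge are lost
                            grid[ni][nj] += 1
    return avalanche

grid = [[0] * SIZE for _ in range(SIZE)]
avalanches = []
for _ in range(5000):
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    grid[i][j] += 1                                            # drop a single grain at random
    avalanches.append(topple(grid))

print("largest avalanche sizes:", sorted(avalanches)[-5:])
```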

Language can also be viewed as a Production System (Iterated Function Systems coupled with Lindenmayer L-Systems and Schemas to yield Classifier Systems). To use Holland's vocabulary, language is a set of Constrained Generating Procedures.
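
To make the "production system" reading concrete, here is a minimal Lindenmayer-system rewriter. The two-symbol grammar is a standard textbook toy (Lindenmayer's algae model), chosen purely for illustration, and is not offered as a model of natural language.

```python
# A minimal Lindenmayer (L-) system: every symbol is rewritten in parallel
# by a fixed production rule, and a short axiom unfolds into a long string.
RULES = {"A": "AB", "B": "A"}   # toy productions (illustrative only)

def rewrite(string: str) -> str:
    """Apply all productions simultaneously, as L-systems require."""
    return "".join(RULES.get(symbol, symbol) for symbol in string)

axiom = "A"
for generation in range(7):
    print(f"{generation}: {axiom}")
    axiom = rewrite(axiom)
```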

"Vague objects" (with vague spatial or temporal boundaries) are, actually, best represented by fractals. They are not indeterminate (only their boundaries are). Moreover, self-similarity is maintained. Consider a mountain - where does it start or end and what, precisely, does it include? A fractal curve (boundary) is an apt mathematical treatment of this question.

Indeterminacy can be described as the result of bifurcation leading to competing, distinct, but equally valid, meanings.
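
The logistic map gives the simplest picture of such a bifurcation, used here only as an analogy for competing meanings: below a critical parameter value the iterates settle on one long-run value; just above it, two distinct but equally valid values coexist.

```python
# Logistic map x -> r*x*(1-x): below r ~ 3 the iterates settle on one value;
# just above it they alternate between two (the simplest bifurcation), and at
# r = 3.5 the long-run values have split again, into four.
def settle(r, x=0.5, burn_in=1000, keep=8):
    for _ in range(burn_in):
        x = r * x * (1 - x)
    values = []
    for _ in range(keep):
        x = r * x * (1 - x)
        values.append(round(x, 4))
    return sorted(set(values))

for r in (2.8, 3.2, 3.5):
    print(f"r = {r}: long-run values ~ {settle(r)}")
```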

Borderline cases (and vagueness) arise at the "edge of chaos" - in concepts and predicates with co-evolving static and chaotic elements.

(Focal) meanings can be thought of as attractors.

Contexts can be thought of as attractor landscapes in the phase space of language. They can also be described as fitness landscapes with optimum epistasis (interdependence of values assigned to meanings).

The process of deriving meaning (or disambiguating) is akin to tracing a basin of attraction. It can be described as a perturbation in a transient, leading to a stable state.
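
That picture can be sketched with any contracting map: differently perturbed starting points (standing in for candidate readings) all pass through a transient and drain into the same stable state. The map and the starting values below are arbitrary illustrative choices.

```python
# Disambiguation as settling into an attractor: perturbed starting points
# ("candidate readings") converge on the same fixed point under iteration.
from math import cos

def settle(x, tolerance=1e-10, max_steps=1000):
    """Iterate x -> cos(x) until the transient dies out."""
    for _ in range(max_steps):
        nxt = cos(x)
        if abs(nxt - x) < tolerance:
            return nxt
        x = nxt
    return x

readings = [0.1, 0.9, 1.5]            # perturbed initial conditions
for r in readings:
    print(f"start {r} -> stable state {settle(r):.6f}")
# All three transients end at the same attractor (about 0.739085).
```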