
Thoughts on the Edge of Chaos

M.M.Taylor and R.A. Pigeau

 

1. Overview

2. Basic Ideas: Information and structure, Attractors and Repellors

3. Basic Ideas: Catastrophe

4. Structure and Chaos

5. Six Kinds of Replication

6. Categories and Logic

7. Surprise and importing structure

8. Replication

 

Basic Ideas

Several foundational notions underlie this discussion.

Information and Structure

The concept of structure as a measure of information or entropy (and vice versa) is central. Any dynamic system has a state that changes. Its changes of state occur within a "state space" whose boundaries represent the range of states the system could possibly occupy. The information that an observer obtains by knowing the state depends on the probability distribution of states that the observer had before the observation, and the structure of the system (for that observer) depends on the difference between that information and the maximum information that could have been provided by the state space of the system. In symbols: H = -Σ_i p_i log p_i is the information gained, where p_i is the prior probability of the system being in state i, and R = M - H is the structure (redundancy) of the system, where M is what the information would have been if all the p_i had been equal. (These formulae are the simplest forms; interactions and conditional information complicate them, but are not necessary to the immediate argument.)
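As a concrete illustration of these two quantities, the short Python sketch below (added here; it is not part of the original text) computes H and R for a discrete distribution over system states. It assumes base-2 logarithms, so the quantities come out in bits, and takes M = log2 N for a system with N possible states; the four-state distributions are invented purely as examples.

    import math

    # Information gained by observing the state, H = -sum_i p_i log2 p_i,
    # computed over the observer's prior probabilities p_i (0 log 0 taken as 0).
    def information(p):
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    # Structure (redundancy) R = M - H, where M = log2(N) is the information
    # the observation would have given if all N states were equally probable.
    def structure(p):
        return math.log2(len(p)) - information(p)

    # A four-state system: the more uneven the observer's prior distribution,
    # the less information the observation yields and the more structure the
    # system has for that observer.
    uniform = [0.25, 0.25, 0.25, 0.25]
    skewed = [0.70, 0.15, 0.10, 0.05]
    print(information(uniform), structure(uniform))  # 2.0 and 0.0 bits
    print(information(skewed), structure(skewed))    # about 1.32 and 0.68 bits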

Structure from non-equilibrium energy flow

Structure develops from strong energy flows in systems that are far from thermodynamic equilibrium, and is maintained through feedback. The system whose structure is maintained by this energy flow is obviously not isolated, and thus the conventional analysis of ever-increasing entropy (decreasing structure) is not directly applicable. Negative entropy can be imported to compensate for the increase of internal entropy. Indeed, a strong energy flow is the only way a complex system can maintain a constant level of structure, although the flow itself may (and usually will) change the details of the structure.

The energy flow in itself provides no specific information to determine the form of the structure that it maintains, and thus informational analyses can be performed as if the structure were isolated. To make this more concrete: the food one eats is necessary to provide the energy that enables one to think, but it does not affect the nature of what one thinks (except by routes different from those through which it provides nutrition).

Attractors, repellors, and chaos

Feedback within an isolated structure (regenerative feedback) leads the structure from one state to another in an internally-defined way that produces either a trajectory or a map in state space (a trajectory represents a continuous change of state, whereas a map represents successive states in a discrete series). The functions that determine the trajectory or map are inherent in the structure, and are non-linear. We will consider mainly maps, since they may be derived from trajectories in the continuous case, and are complete representations of the successive states in the discrete case.

The map of a particular point in state space is called an orbit. A forward orbit represents the future of the system starting from the given point, a backward orbit the past. Forward orbits must be unique, whereas backward orbits can be multivalued (several past states could have given rise to the present one). The initial state that induces an orbit may be chosen freely within the state space, but after a while most orbits will have converged close to an attractor. Attractors are of three kinds: fixed, periodic, or chaotic. A given function, even a very simple one such as x_n = k x_(n-1) (1 - x_(n-1)) where 0 <= x <= 1, can result in all three possibilities, depending on the value of the parameter and the initial value of the variable (values of x outside this range tend to grow without bound unless k is smaller than 1 / |x - x²|, in which case they converge to zero). To give some specific examples: if k = 0.5, x approaches zero; if k = 2.0, x approaches 0.5; if k = 3.3, x cycles between 0.8236... and 0.4794...; and if k = 3.7, the value of x varies chaotically, as shown in Fig. 1.

Fig. 1. The bifurcation route to chaos, showing successive period doubling of the function x_n = r x_(n-1) (1 - x_(n-1)) for r between 2.9 and 4. (From http://www.pa.msu.edu/~bauer/applets/Chaos-Feigenbaum/feig.html)
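The specific cases just mentioned can be checked directly by iterating the map. The Python sketch below is an added illustration: the starting value x0 = 0.2 and the transient length are arbitrary choices, and almost any starting value inside (0, 1) leads to the same attractor.

    # Iterate x_n = k * x_(n-1) * (1 - x_(n-1)), discard a long transient,
    # and return the next few states (rounded for display).
    def logistic_orbit(k, x0=0.2, n_transient=1000, n_keep=6):
        x = x0
        for _ in range(n_transient):
            x = k * x * (1.0 - x)
        orbit = []
        for _ in range(n_keep):
            x = k * x * (1.0 - x)
            orbit.append(round(x, 4))
        return orbit

    for k in (0.5, 2.0, 3.3, 3.7):
        print(k, logistic_orbit(k))
    # k = 0.5: fixed point at 0          k = 2.0: fixed point at 0.5
    # k = 3.3: period-2 cycle, 0.4794 and 0.8236
    # k = 3.7: no repetition; the orbit is chaotic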

Between the value of k for which the attractor is a single value and that for which the attractor is chaotic, there is a sequence of bifurcation values of k at which the period of the attractor successively doubles. These values of k are related by Feigenbaum's number: if k_n is the lowest value of k at which the period of the attractor is 2^n, then (k_n - k_(n-1)) / (k_(n+1) - k_n) -> 4.6692016.... Feigenbaum's number is a universal constant like π, which appears whenever the route to chaos is through a sequence of period doublings, not only for the simple parabolic recursion shown in Fig. 1.
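The approach of this ratio to its limit can be seen, at least roughly, with a numerical sketch such as the one below (added here; not part of the original argument). It detects the period of the attractor after a long transient and bisects, within hand-chosen brackets, for the lowest k giving each period. Precision is limited because convergence is very slow near the bifurcation points, so the ratio only comes out approximately.

    # Estimate the period of the logistic-map attractor at parameter k:
    # discard a long transient, then look for the smallest p with x_(n+p) ~ x_n.
    # Returns None if no period up to max_period is found (e.g. chaos).
    def attractor_period(k, max_period=64, n_transient=100000, tol=1e-6):
        x = 0.5
        for _ in range(n_transient):
            x = k * x * (1.0 - x)
        ref = x
        for p in range(1, max_period + 1):
            x = k * x * (1.0 - x)
            if abs(x - ref) < tol:
                return p
        return None

    # Bisect for the lowest k in [lo, hi] whose attractor has reached
    # target_period (longer periods and chaos count as "reached").
    def doubling_point(target_period, lo, hi, steps=30):
        for _ in range(steps):
            mid = 0.5 * (lo + hi)
            period = attractor_period(mid)
            if period is None or period >= target_period:
                hi = mid
            else:
                lo = mid
        return hi

    k1 = doubling_point(2, 2.9, 3.2)    # near 3.0
    k2 = doubling_point(4, 3.2, 3.5)    # near 3.4495
    k3 = doubling_point(8, 3.5, 3.56)   # near 3.5441
    print((k2 - k1) / (k3 - k2))        # roughly 4.7, approaching 4.6692...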

The attractors of a dynamic system's map define its possible self-consistent states, no matter how complex the system. A state on the attractor always leads to another state on the attractor. If the system state is externally perturbed slightly away from an attractor, it will (almost always) return to that attractor. The region of state space within which the state will move toward an attractor is called the basin of attraction of that attractor.

In addition to attractors, there are self-consistent maps called repellors. As with an attractor, a state on a repellor will always lead to another state on the repellor, but if the system state is externally perturbed from a repellor, it will not return, but will move to an attractor. Repellor maps usually form the boundary between basins of attraction, but may occur totally enclosed within a single basin of attraction. If a repellor bounds more than two basins of attraction, it has the strange property that every point on it is on the boundary of all the basins that it bounds. This fact has two consequences: (i) the repellor has a fractal geometry within the state space, and (ii) the dynamics of the repellor are chaotic.
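A standard illustration of this property, not drawn from the present text, is Newton's method for z³ = 1 in the complex plane: there are three attracting roots, and the boundary between their basins is a fractal repellor on which the dynamics are chaotic, every boundary point adjoining all three basins. The Python sketch below labels points of a small region near the boundary by the basin into which they fall.

    import cmath

    # Newton's-method map for z^3 = 1: three attracting roots, hence three
    # basins of attraction, whose common boundary is a fractal repellor.
    ROOTS = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

    # Index of the cube root of 1 that the Newton orbit of z approaches,
    # or None if it has not settled (points on or near the repellor).
    def basin(z, n_iter=60):
        for _ in range(n_iter):
            if abs(z) < 1e-12:           # Newton step undefined at z = 0
                return None
            z = z - (z**3 - 1) / (3 * z**2)
        for i, root in enumerate(ROOTS):
            if abs(z - root) < 1e-6:
                return i
        return None

    # Character map of a region straddling the basin boundary: the three
    # labels interleave ever more finely as the fractal boundary is approached.
    symbols = {0: '.', 1: '+', 2: 'o', None: '?'}
    for row in range(24):
        im = 0.5 - row * (1.0 / 23)
        line = ''
        for col in range(64):
            re = -1.6 + col * (1.4 / 63)
            line += symbols[basin(complex(re, im))]
        print(line)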

Chaotic repellors seem to be central to a consistent view of cognition. They represent states of alertness, which permit the system to fall rapidly into basins of attraction that correspond to externally provided information.

 
