next up previous
Next: Pendulum Bifurcation Diagram Up: Driven Oscillatory Systems and Previous: Nonlinear Behavior

Chaotic Behavior

What is chaos? Colloquially, it is a state of disorder or confusion. But in physics it has a more specific (albeit in some sense related) definition: a system is said to be chaotic if an arbitrarily small change in its initial conditions results in an exponentially divergent final state.

So far in this course we have mostly been integrating the equations of motion for systems that give deterministic solutions; i.e., for a given set of initial conditions, we can use these equations to determine the state of the system for all time, and small changes in those initial conditions result in small changes in the final state. For instance, consider the orbit of the Earth around the Sun: one could imagine changing Earth's orbital parameters slightly, and the result would be that Earth settles into a slightly different orbit; it would not, for example, rapidly get shot out beyond the orbit of Pluto. Hence this is a non-chaotic system.

Even for the orbital problem, however, the behavior can quickly become quite complicated. Consider setting up a problem with three bodies, all mutually interacting, with a particular set of initial positions and velocities. Let's say two bodies are orbiting each other calmly, when the third body falls into the system. The near-collision of the third body and one of the first two will result in a large deflection of their directions into new orbits. Eventually (if all three objects are bound gravitationally) they will come together again, resulting in another set of large deflections into another set of new orbits, and so on. Now, each deflection varies strongly with the distance of closest approach. A small difference in that distance will change the deflection angle appreciably, which will then change the time and manner of subsequent interactions, and so on. The resulting motion is very complex; it is in fact chaotic. In particular, if we sent in the third particle on a slightly different orbit, that particle could end up very far from where it did previously, because of the strong perturbations inherent in the close approaches.

Rather than examining chaotic orbital dynamics in two dimensions, it is easier to get a flavor of what is going on by examining systems in one dimension. As an example, let us consider a situation that arises in simple models of ecology and economics: the logistic equation.

Consider a population of organisms whose rate of reproduction depends somehow upon the number density of individuals (similar to the Lotka-Volterra equations we had earlier, but only for a single species). Clearly, the rate of reproduction depends upon the number of individuals already present as these are the ultimate means of reproduction. The rate at which these individuals reproduce is also frequently dependent on the availability of resources, which in turn is proportional to the number of individuals using up those resources. We can represent this behavior schematically as

\begin{displaymath}
P_{n+1} = P_n (a - b P_n),
\end{displaymath} (11)

where $P_n$ is the population of generation $n$, $a$ is a coefficient describing how the population would grow with time if no other constraints (such as limited resources) applied, and the $b P_n$ term describes how the present population limits the available resources. If $b=0$, we have the usual exponential growth expected of a non-limited population:
\begin{displaymath}
P_n = P_0 a^n,
\end{displaymath} (12)

where $n$ is the number of generations evolved.

If we adopt a non-zero $b$, however, the equation exhibits more interesting behavior. To cast this equation into the form usually employed, let $P_n = (a/b) x_n$ and let $r=a/4$:

\begin{displaymath}
x_{n+1} = 4 r x_n (1 - x_n).
\end{displaymath} (13)

It is easy to show that if we restrict $0 \leq r \leq 1$, then the population is bounded by $0\leq x_n\leq 1$ (assuming the initial population is within these bounds). Equation (13) therefore maps a point in the interval $[0,1]$ onto another point in the same interval; this mapping is known as the logistic map. For a given initial condition $x_0$, we can determine the state of the system at all future times. The set $\{x_n\}$, generated by applying equation (13) recursively for many generations, is called the trajectory, or orbit, of the system.
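Generating such a trajectory takes only a few lines of code. Here is a minimal Python sketch; the function name and the particular values of $x_0$, $r$, and the iteration count are illustrative choices, not prescribed by the text:

```python
def logistic_trajectory(x0, r, n_steps):
    """Return the orbit [x_0, x_1, ..., x_n] of the logistic map
    x_{n+1} = 4 r x_n (1 - x_n), equation (13)."""
    orbit = [x0]
    x = x0
    for _ in range(n_steps):
        x = 4.0 * r * x * (1.0 - x)
        orbit.append(x)
    return orbit

# Example: for r = 0.7 the trajectory settles onto a single
# fixed-point attractor, x* = 1 - 1/(4r)
orbit = logistic_trajectory(0.5, 0.7, 50)
```

Note that every iterate stays within $[0,1]$, as claimed above, provided $0 \leq r \leq 1$ and the starting value lies in that interval.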

Figure 3: Trajectories for various values of the parameter $r$ in the logistic map

The results of solving equation (13) are shown in Figure 3. Here the logistic equation has been solved for a range of $r$ values, and it is clear that varying $r$ can give more or less complex behavior. For some $r$ values, the solution oscillates at first, and then settles down to a constant value. This value is known as an attractor for the system: no matter what the starting value, the solution settles down to, or is attracted to, a given value. The set of initial points which are attracted to this value is called the basin of attraction. Think of dropping a marble into a basin; wherever you drop it, it will fall toward the bottom as if it were attracted to it.

For other values of $r$, the solution oscillates between several values in what is known as a limit cycle. For $r=0.75$, the attractor is a limit cycle of period 2. As we increase $r$, the behavior becomes even more complex, and the attractors acquire periods of 4 or larger.

We can describe the behavior of this system as we increase $r$ by using a bifurcation diagram. A bifurcation diagram shows all the attractor values at any given value of $r$.

Figure 4: Bifurcation diagram for the logistic map. The top region displays the values of $x$ which are visited by the trajectory for a given value of the parameter $r$. The lower curve is the value of the Lyapunov exponent as a function of $r$.

The top curve in Figure 4 shows a bifurcation diagram for our logistic equation. For each value of $r$, we iterate the logistic map for some fairly large number of iterations to allow the initial transient effects to damp out, and then plot the points $x$ attained over some additional number of iterations. Varying $r$ then yields a plot of the values of $x$ visited by the dynamical system (the orbits) as a function of $r$.
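The procedure just described can be sketched in a few lines of Python. The iteration counts `n_transient` and `n_keep`, and the starting value, are illustrative choices (any "fairly large" numbers will do):

```python
def bifurcation_points(r, n_transient=500, n_keep=100, x0=0.5):
    """Iterate the logistic map at parameter r, discard the initial
    transient, and return the x values visited by the attractor."""
    x = x0
    for _ in range(n_transient):      # let transients damp out
        x = 4.0 * r * x * (1.0 - x)
    points = []
    for _ in range(n_keep):           # record the attractor values
        x = 4.0 * r * x * (1.0 - x)
        points.append(x)
    return points

# Near r = 0.8 the attractor has period 2: after rounding away
# floating-point noise, only two distinct values survive
attractor = sorted(set(round(x, 6) for x in bifurcation_points(0.8)))
```

Repeating this over a grid of $r$ values and plotting every returned point against $r$ produces the top panel of Figure 4.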

For values of $r<0.75$, there is an attractor of period 1, as you can see from Figure 3. When $r$ is increased, however, the line of $x(r)$ bifurcates into an attractor of period 2: the solution oscillates between two values of $x$. For $r$ slightly greater than 0.862, the line bifurcates again to an attractor of period 4. This period doubling continues as we further increase $r$, until the attractor has a very large period indeed. But look in detail at $r$ on $[0.90,0.92]$, as shown in the expanded view in Figure 5: at $r=0.907$ we are back down to a period of six! Hence there can be ``islands'' of simpler behavior within the chaotic motion.

How does this relate to our original definition of chaos? Well, imagine that two populations are started with slightly different values of $x$. In general, the value of $x$ attained after a long period of time will be any one of the attractor values accessible to it in the bifurcation diagram. If $r$ is small (i.e., less than 0.75), then the small difference in initial populations will damp out and both will converge to the single attractor value. However, as $r$ becomes large, the late-time value of $x$ will be (effectively) randomly selected from all the attractor values accessible to each system. For large $r$, since there is a wide range of attractor values, even very small initial differences can result in large deviations in the final value. This is the essence of the physical definition of chaos.

Returning to the example discussed earlier, the population of an organism after many generations does not depend on the initial population so much as on the parameter $r$, which describes the interplay between population growth and limits on the population. Thus, for any initial population $x_0$ between 0 and 1 and a value of $r<0.75$, the population will converge to the same steady state value. For $0.75 < r < 0.862$, any value of $x_0$ converges to a solution with period 2. This means that if the time step for each iteration is 1 year, then the population of the organism is the value of the upper curve one year, the lower curve the next year, the upper again the next, etc. Thus, the population has a 2 year cycle. For $r$ slightly greater than 0.862, the population has a 4 year cycle. Values of $r$ corresponding to the shaded areas of the logistic map indicate that the population from year to year is effectively chaotic.

Figure 5: Same as Figure 4, but blown up for a smaller range in $r$ that exhibits interleaved chaotic and non-chaotic behavior.

By now you should be convinced that even a fairly simple dynamical system can exhibit extremely complex behavior. To get an idea of the exponential divergence of orbits discussed above, try the following experiment. Choose a value of $r$ large enough to be in the ``chaotic'' regime. Now pick an initial condition for one orbit (e.g. $x=0.5$) and another, similar, initial condition for another (say $x=0.5001$). How long does it take for the two orbits to diverge significantly as a function of $r$?
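The suggested experiment is easy to automate. Below is a minimal Python sketch; the initial offset, divergence threshold, and step limit are arbitrary illustrative choices:

```python
def steps_to_diverge(r, x0=0.5, dx0=1e-4, threshold=0.1, max_steps=10000):
    """Iterate two nearby initial conditions under the logistic map and
    return the number of steps until their separation exceeds `threshold`,
    or None if they never diverge (non-chaotic behavior)."""
    xa, xb = x0, x0 + dx0
    for n in range(1, max_steps + 1):
        xa = 4.0 * r * xa * (1.0 - xa)
        xb = 4.0 * r * xb * (1.0 - xb)
        if abs(xa - xb) > threshold:
            return n
    return None

# In the chaotic regime (e.g. r = 1) the two orbits separate after only
# a few dozen iterations; at small r (e.g. r = 0.7) they converge to the
# same attractor and never diverge.
```

Scanning `steps_to_diverge` over a range of $r$ values gives a direct, qualitative answer to the question posed above.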

We can quantify how fast chaotic trajectories diverge using the Lyapunov exponent $\lambda$, defined by

\begin{displaymath}
\vert\Delta x_n\vert = \vert\Delta x_0 \vert e^{\lambda n},
\end{displaymath} (14)

where $\Delta x_0$ is the difference between the initial conditions of the two trajectories, and $\Delta x_n$ is the difference between the corresponding points in the respective trajectories after $n$ iterations. If $\lambda<0$, the two trajectories will converge to the same limit cycle, but if $\lambda>0$, the trajectories will diverge exponentially, implying chaotic behavior.

For our simple logistic map, unfortunately, the magnitude of the separation ceases to increase once it reaches of order one half (why?), so we need a better method to characterize the divergence. Take the logarithm of the previous equation to write

\begin{displaymath}
\lambda = {1\over n} \ln \left\vert {{\Delta x_n}\over {\Delta x_0}}\right\vert
\end{displaymath} (15)

and note that
\begin{displaymath}
{{\Delta x_n}\over {\Delta x_0}} = {{\Delta x_1}\over {\Delta x_0}} {{\Delta x_2}\over {\Delta x_1}} {{\Delta x_3}\over {\Delta x_2}} \cdots {{\Delta x_n}\over {\Delta x_{n-1}}},
\end{displaymath} (16)

so that
\begin{displaymath}
\lambda = {1\over n} \sum_{i=0}^{n-1} \ln \left\vert {{\Delta x_{i+1}}\over {\Delta x_i}}\right\vert.
\end{displaymath} (17)

We are really interested in the limit as $\Delta x_i \rightarrow 0$, in which the ratio of two successive separations becomes the derivative of the iterated function $f(x) = 4 r x (1 - x)$:
\begin{displaymath}
{{dx_{i+1}}\over{dx_i}} = f^\prime(x_i) = 4 r (1 - 2x_i),
\end{displaymath} (18)

and so finally the Lyapunov exponent can be defined by
\begin{displaymath}
\lambda = {1\over n} \sum_{i=0}^{n-1} \ln \left\vert f^\prime(x_i)\right\vert.
\end{displaymath} (19)

To get a feeling for the general behavior of the system, it is best to compute the sum after the initial transient behavior has died away.
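Equation (19) translates directly into code. The Python sketch below is one possible implementation; the iteration counts and the starting value are illustrative choices (the transient is discarded first, as recommended above):

```python
import math

def lyapunov(r, x0=0.3, n_transient=500, n_sum=100000):
    """Estimate the Lyapunov exponent of the logistic map at parameter r
    by averaging ln|f'(x_i)| = ln|4r(1 - 2 x_i)| along the orbit,
    per equation (19), after discarding the initial transient."""
    x = x0
    for _ in range(n_transient):      # let transient behavior die away
        x = 4.0 * r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_sum):
        total += math.log(abs(4.0 * r * (1.0 - 2.0 * x)))
        x = 4.0 * r * x * (1.0 - x)
    return total / n_sum

# lyapunov(1.0) comes out close to ln 2 (fully chaotic, exponential
# divergence); lyapunov(0.7) is negative (orbits converge to a fixed point).
```

Evaluating this estimator over a grid of $r$ values reproduces the lower curves of Figures 4 and 5.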

A plot of the Lyapunov exponent as a function of $r$ is shown as the lower curves in Figures 4 and 5. Note the behavior of $\lambda$ in relation to the behavior of the logistic map. $\lambda$ increases from negative values to zero at a bifurcation, and then falls to more negative values. In those regions where chaotic behavior occurs, it rises to a positive value, indicating exponential divergence.

Romeel Dave 2005-03-28