Arpan Dey

# Chaos Theory And Fractals

Updated: Aug 28, 2022

The edge of a coastline, the growth of a population, the weather: what links and explains so many disparate phenomena is an entirely new science (although technically, chaos theory is a branch of mathematics) – the science of order within disorder, the science of chaos. And although chaos theory is the science of unpredictability, the rules of complexity are universal, and apply to *all* dynamical systems, regardless of their constituents. In short, chaos theory is the study of the underlying rules and order in chaotic, apparently random systems. At this point, please note that a complex system is not necessarily chaotic, and the reverse is also true: a chaotic system need not be complex. Even a simple system can give rise to chaos.

It was difficult to establish chaos as a mainstream science, but today its importance is clear. Chaos theory is a young science. As James Gleick says in his book *Chaos*: "Only a new kind of science could begin to cross the great gulf between knowledge of what one thing does – one water molecule, one cell of heart tissue, one neuron – and what millions of them do." Chaos theory, as is evident from the name, deals with complex dynamical systems, which show chaotic behavior – behavior that cannot be predicted easily. Chaotic dynamics essentially depend on two things: expansion and recurrence, and most systems that show both will behave chaotically. Complexity arises when, roughly speaking, there are competing effects – like gravity trying to crush everything together while the expansion of the universe tries to blow everything apart.

So far in physics, we have mostly considered idealized situations and used perturbation techniques to get approximate results. But the complex Nature around us doesn't work that way. Prediction is a messy business, and nothing can be predicted with 100% certainty. Still, prediction is important in modern science. Predicting complex behavior by solving a simple set of equations is not possible, even in theory; we must use chaos theory, or nonlinear dynamics, for that. Chaos theory is a new way of looking at Nature. To understand chaos, we need to understand fractals. Fractals are intricate, complex patterns, some of which repeat endlessly, and you can see them everywhere in Nature. All of these patterns arise from some underlying rules. Something as beautiful and complex as a fern can be constructed mathematically – I am talking about Barnsley's fern. There are many other examples of fractal patterns in Nature: galaxies, trees, lightning, snowflakes and more.

One of the best ways to understand chaos theory is to look at animal populations. Let us assume that the equation *x_next = rx(1 − x)* represents the growth of a population. Here, *x_next* is the population for the next year, while *x* is the population for the current year. *r* represents a rate of growth, which may change. The term *(1 − x)* keeps the growth within bounds: as *x* increases, *(1 − x)* falls. (Note that, in this model, for convenience, the population is expressed as a fraction between zero and one, where zero represents extinction and one the upper limit.) If the population falls below a certain level one year, it is liable to increase the next. But if it rises too high, competition for space and resources will tend to bring it back within bounds. When *r* is between 0 and 1, the population ultimately goes extinct. Between *r = 1* and *r = 3*, after some initial fluctuations, the population converges to a single equilibrium value. At about *r = 3*, the graph of population versus growth rate bifurcates: the population no longer converges to a single value but fluctuates between two values. For greater values of *r*, the bifurcations come faster and faster, and after a quick succession of period doublings the graph becomes chaotic. By this is meant that, for those values of *r*, the population fluctuates unpredictably and never settles into periodic behavior. However, on closer inspection, it is evident that the graph becomes predictable again at certain points within the chaotic region.
These can be referred to as 'windows of order amidst the chaos.' Investigating further, Mitchell Feigenbaum found that, on dividing the width of each bifurcation section by that of the next one, the ratio converges to a constant value, now known as the Feigenbaum constant, 4.669... For all bifurcation diagrams of this kind, no matter what function is used, this number is the same. The chaotic portion of the graph is actually a fractal: on zooming in, it repeats the same pattern endlessly. However, a fractal need not always be self-similar, that is, it need not reveal identical patterns on zooming in. Even the coastline of Great Britain is a fractal. A fractal, roughly speaking, is a complex pattern that has a measure of roughness.
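The period-doubling cascade described above is easy to check numerically. Here is a minimal Python sketch (an illustration added here, not part of any standard library; the function name and the transient length are arbitrary choices): iterate the map long enough to kill the transient, then count how many distinct values the population cycles through.

```python
# The logistic map x_next = r*x*(1 - x). For each growth rate r, run off a
# long transient, then collect the values the population settles into.
def logistic_orbit(r, x0=0.5, transient=500, keep=64):
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    values = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        values.add(round(x, 6))   # rounding collapses a cycle to its points
    return sorted(values)

print(logistic_orbit(0.8))        # [0.0]  -> extinction
print(len(logistic_orbit(2.5)))   # 1      -> single equilibrium
print(len(logistic_orbit(3.2)))   # 2      -> period-2 cycle
print(len(logistic_orbit(3.5)))   # 4      -> period-4 cycle
```

Pushing *r* past roughly 3.57 makes the count explode: the orbit never repeats, which is exactly the chaotic region of the bifurcation diagram.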

One of the most important predictions of chaos theory is that systems with *slightly* different initial conditions give rise to *hugely* different results. Technically, this is called sensitivity to initial conditions. The most popular example is the butterfly effect: a butterfly flapping its wings can set off a chain of events that ends up creating a thunderstorm in some distant place. This is only an example; the idea applies to everything in our universe. Tiny changes in the initial conditions produce results that diverge wildly from each other and are, thus, unpredictable.
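Sensitivity to initial conditions can be demonstrated with the same logistic map in its fully chaotic regime, *r = 4*. A minimal Python sketch (the starting value, perturbation size and step count are illustrative choices):

```python
# Two populations started 1e-10 apart, iterated with the fully chaotic
# logistic map (r = 4). The tiny initial gap is amplified exponentially.
def iterate(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = iterate(0.2)
b = iterate(0.2 + 1e-10)
print(abs(a - b))   # far larger than the initial 1e-10 difference
```

After only fifty iterations the two trajectories bear no resemblance to each other, even though they started a ten-billionth apart – a tabletop butterfly effect.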

The applications of chaos theory in weather prediction are widely known. Edward Lorenz wanted to predict weather conditions. He used three differential equations: *dx/dt = σ(y − x)*; *dy/dt = ρx − y − xz*; and *dz/dt = xy − βz*. Here, σ is the ratio of the fluid's viscosity to its thermal diffusivity, ρ is related to the difference in temperature between the top and bottom of the system, and β is a geometric factor set by the width and height of the box (the entire system is assumed to take place in a three-dimensional box). In addition, there are three time-evolving variables: x (the rate of convective flow), y (the horizontal temperature variation) and z (the vertical temperature variation). For a certain set of values of σ, ρ and β, the computer, on predicting how the variables would change with time, drew out a strange pattern (now referred to as the Lorenz attractor). Basically, the computer plotted how the three variables change with time in a three-dimensional space. The curves traced out by the computer seem to be 'attracted' to two points. Also, in the attractor, no paths cross each other. This is because, if a loop were formed, the path would continue forever in that loop and become periodic and predictable. Thus, each path is an infinite curve in a finite space. (Though this idea seems strange, it is exactly what a fractal demonstrates: a fractal continues infinitely, though it can be represented in a finite space.) What is interesting about the Lorenz attractor (and many other systems) is that it is, simultaneously, chaotic yet stable (perhaps the wording is not fully correct, but you get the idea). No matter what perturbations the system is exposed to, one always gets back the same infinite, complex fractal, which is itself chaotic.
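We can trace Lorenz's equations numerically ourselves. A rough Python sketch using a simple Euler integrator with Lorenz's classic parameter values (the step size, run length and starting point are arbitrary choices for illustration; a real simulation would use a better integrator):

```python
# Euler integration of Lorenz's three equations with his classic
# parameter values (sigma = 10, rho = 28, beta = 8/3).
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)        # dx/dt = sigma(y - x)
    dy = rho * x - y - x * z    # dy/dt = rho*x - y - xz
    dz = x * y - beta * z       # dz/dt = xy - beta*z
    return x + dx * dt, y + dy * dt, z + dz * dt

x, y, z = 1.0, 1.0, 1.0
for _ in range(5000):           # 50 time units
    x, y, z = lorenz_step(x, y, z)
print(x, y, z)                  # never settles down, but never escapes either
```

Plot the sequence of (x, y, z) points and the famous two-lobed 'butterfly' shape appears: the trajectory winds around the two attracting regions forever without ever repeating itself.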

Another interesting phenomenon in chaos theory is mode locking, or entrainment. In *Chaos*, Gleick writes: "This phenomenon, in which one regular cycle locks into another, is now called entrainment, or mode locking. Mode locking explains why the Moon always faces the Earth, or more generally why satellites tend to spin in some whole number ratio of their orbital period: 1 to 1, or 2 to 1, or 3 to 2. When the ratio is close to a whole number, nonlinearity in the tidal attraction of the satellite tends to lock it in. Mode locking occurs throughout electronics, making it possible, for example, for a radio receiver to lock in on signals even when there are small fluctuations in their frequency. Mode locking accounts for the ability of groups of oscillators, including biological oscillators, like heart cells and nerve cells, to work in synchronization. A spectacular example in Nature is a Southeast Asian species of firefly that congregates in trees during mating periods, thousands at one time, blinking in a fantastic spectral harmony." Although the term sounds technical, the idea is not difficult to understand. Simply put, when two oscillating systems couple, the result is synchronization and stability. Another example: when an audience starts clapping, everyone initially claps at their own pace, but after a few seconds the clapping spontaneously becomes synchronized and everyone (unconsciously, of course) claps together. Really amazing, isn't it? The underlying message is the same: *it is not all random and chaotic; there is some order within the chaos.* It may, for instance, seem that a chaotic system behaves randomly and unpredictably, but there are some underlying rules, which may be simple. In fractals, we get a complex pattern, but it is mostly formed by following a simple iteration (as we will see).

We will now talk about a very interesting fractal, the Sierpiński triangle, and how to generate it by playing the 'chaos game.' Three non-collinear points (say, A, B and C) are chosen on a plane such that they form an equilateral triangle. A random starting point (say, P) is chosen anywhere on the plane. The game proceeds by following certain simple rules. A die is rolled. If the outcome is 1 or 2, the point halfway between P and A is marked. Similarly, if the outcome is 3 or 4, the midpoint of the line segment joining P and B is marked; for 5 or 6, the midpoint of the segment joining P and C is marked. As the game continues, the midpoint of the segment joining the last point obtained with A, B or C (depending on the outcome) is marked. If this is continued for long enough, the collection of all the marked points resembles a beautiful fractal called the Sierpiński triangle. The Sierpiński triangle has an infinite perimeter, because the pattern continues infinitely. However, its area tends to zero, since most of it is just empty space with no solid surface. The Sierpiński triangle behaves like an attractor: all points on the plane seem to be drawn into a certain pattern, away from the empty triangular regions. Such a system is *not* sensitive to initial conditions. No matter where we choose the starting point to be, we always get back the same pattern, provided we plot points as per the rules of the chaos game.
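The chaos game takes only a few lines of code. A minimal Python sketch (picking a vertex at random stands in for rolling the die; the coordinates and starting point are illustrative choices):

```python
import random

# The chaos game: repeatedly jump halfway toward a randomly chosen vertex
# of an equilateral triangle. The marked points trace out the Sierpinski
# triangle. (Choosing a vertex at random replaces rolling the die.)
def chaos_game(n=10000, seed=1):
    rng = random.Random(seed)
    vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 3 ** 0.5 / 2)]  # A, B, C
    px, py = 0.3, 0.3                    # arbitrary starting point P
    points = []
    for _ in range(n):
        vx, vy = rng.choice(vertices)    # pick A, B or C at random
        px, py = (px + vx) / 2, (py + vy) / 2
        points.append((px, py))
    return points

points = chaos_game()
```

Scatter-plot the returned points and the triangle-within-triangle pattern appears; after the first few jumps, no point ever lands inside the empty central regions.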

Intuitively, this fractal, with an infinite perimeter, is 'more' than a one-dimensional pattern, but 'less' than a two-dimensional figure, since its area tends to zero. In fact, the fractal dimension of the Sierpiński triangle lies between 1 and 2. Yes, fractals can have fractional dimensions. Though this idea seems absurd, the dimension of a fractal is basically a measure of its roughness. If a one-dimensional line is scaled by one half, its mass is also scaled by one half, since two such halves reproduce the original line. Similarly, if we scale the side of a square by one half, its mass is scaled by one fourth, since it takes four squares (each with sides half the length of the original) to reconstruct the original square. One fourth is just one half raised to the power of two, and this exponent is the dimension of the square: two. In the same way, a cube is three-dimensional because if its side is scaled by one half, the mass is scaled by one eighth (one half raised to the third power), and it takes eight copies of the smaller cube to rebuild the original. For a Sierpiński triangle, on scaling it by one half, we get a similar but smaller pattern, three of which, arranged correctly, give back the original triangle. Thus, the mass has been scaled by one third. Following the above line of reasoning, one half raised to some power *x* should equal one third, and this *x* is the dimension of the Sierpiński triangle. Solving gives *x = log 3 / log 2 ≈ 1.585*, the fractal dimension of the Sierpiński triangle. The chaos game may be played with more than three points to generate more complex fractals. Another interesting fractal is the Sierpiński carpet. It can be generated by dividing a square into a 3×3 grid of nine smaller squares and removing the middle one. This operation is then repeated on the eight remaining squares, and so on, infinitely. When the same construction is carried out on a three-dimensional cube, the Menger sponge is formed. Interestingly, it has an infinite surface area but zero volume.
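The whole scaling argument boils down to one formula: if shrinking a shape by a factor *s* yields *n* self-similar copies of itself, its similarity dimension *d* satisfies *n = s^d*, so *d = log n / log s*. A quick Python check of the three fractals just mentioned (the function name is my own shorthand):

```python
import math

# If shrinking a shape by a factor s produces n self-similar copies,
# its similarity dimension d satisfies n = s**d, so d = log(n)/log(s).
def similarity_dimension(n_copies, scale):
    return math.log(n_copies) / math.log(scale)

print(similarity_dimension(3, 2))    # Sierpinski triangle: ~1.585
print(similarity_dimension(8, 3))    # Sierpinski carpet:   ~1.893
print(similarity_dimension(20, 3))   # Menger sponge:       ~2.727
```

Each result sits strictly between the dimensions of the shapes the fractal is 'more than' and 'less than', which is exactly the intuition above.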

Now, we will look at some applications of chaos theory. Chaos theory has numerous applications in medicine. The idea is that mathematical tools can help biologists and physiologists understand the complex systems of the human body *without a thorough knowledge of local detail*. Chaos theory has successfully explained the sudden, aperiodic and chaotic behavior of the heart called ventricular fibrillation. According to chaos theory, fibrillation is the result of disorder in a complex system like the human heart: though all the individual parts of the heart seem to work perfectly, the system as a whole becomes chaotic, and fatal to human life. Ventricular fibrillation is not a behavior that returns to stable conditions on its own; rather, the fibrillating state is itself 'stable chaos.' Fractal geometry also allows the formation of bounded curves of great length, and that is how the lungs manage to accommodate so large a surface area inside so small a volume, which in turn increases the efficiency of the respiratory system. Fractal geometry has also been used to model the dynamics of HIV, the virus responsible for AIDS. Bone fractures are fractal, and even the surface structures of cancer cells display fractal properties; perhaps this property can be exploited to detect cancerous cells at an early stage. Fractal patterns exist throughout the body, from the tissues to the way blood vessels branch.

Nigel Lesmoir-Gordon writes in his book *Introducing Fractal Geometry*: "It is entirely conceivable that the low level of fractal complexity in modern inner cities is a strong contributing factor to the high incidence of depression reported in these kinds of environment." This may be why we are still fascinated by the complex architecture of ancient times.

Before we end this blog, let us address two common misconceptions about chaos theory and fractals. Number one: chaos theory does *not* prove that Nature is random and unpredictable. It is, in theory, possible to make a prediction, but for that you would need to know the initial conditions of the system to an extremely high degree of accuracy, which is *practically* impossible. Number two: while fractals model natural things like snowflakes, coastlines and ferns beautifully, a snowflake, the coastline of Great Britain or a fern is not *really* a fractal. A natural snowflake does not repeat *infinitely* if we care to zoom in.