The Chaotic Mechanisms of Scientific Revolution
Perhaps it is human nature to enjoy small pockets of order in a universe that is tending towards chaos. Many of us seek tidiness in our rooms and organisation within our filing systems, and enjoy products of ever greater complexity and functionality. We feel safe in a society with laws and controls, and in our houses: structures so boldly displaying straight edges, symmetry and right angles. Maybe this is not just because our lives are made easier or safer, but also because this kind of order, and the evolution of technology, politics and society, reminds us that as a race we have a direction. However, while enjoying the products of such evolution and our ongoing development, it is easy to forget that much of what has become rooted in our culture is a product of revolution, not just evolution. The Industrial Revolution provided the basis for our cars, and the Digital Revolution gave us our computers and mobile phones, yet it is not often considered that such monumental changes were not always intuitive, and indeed often opposed the accepted mainstream of scientific and ideological thought.
It is the aim of this thesis to discuss the nature of scientific revolutions, and to propose not only that revolutions are inevitable but that the direction of human development is no clearer now than it has ever been. Such is the chaotic nature of progress.
What is Scientific Progress?
To begin exploring such profound claims, we must first define what is meant by scientific progress. This is a topic subject to much philosophical as well as scientific debate, and indeed the ambiguity of any such definition is exactly what provides the basis for elaboration in this essay. In an effort to avoid the depths of abstract speculation, a general outline will be given here, provided the reader is aware that this content, and indeed the content of the whole essay, is intended to be more thought-provoking than documentary.
A widely accepted view, and one advanced by the famous philosopher of science Karl Popper, is that it is the task of the scientist to propose a theory that is not only a general statement of how all things of a certain kind behave in the same way, but one that also forecasts what future observations or experiments will show. In this light we must never say that a theory has been proved, only that it has so far withstood testing and can go on making further statements and predictions. If, however, a test goes against the predictions of the theory, then it has been disproved, and one must start over with a new theory that not only encompasses all the implications of the old theory but also makes provision for the conditions that disproved it.
To take an example from history: when Isaac Newton first proposed his theory of gravitation, over three centuries ago, it was subjected to enormous amounts of scrutiny and testing. The fact that it passed all these tests with flying colours did not help the theory much when the first discrepancies began appearing towards the end of the nineteenth century. As our technology improves, so does our ability to perform more searching experiments, and so eventually we gain enough accuracy to find such small discrepancies, in this case overthrowing Newton’s rule and eventually leading to the birth of Einstein’s theory. So we often see a zig-zag interaction between theory and experiment: the improvement of our equipment until it is sensitive enough to test a theory beyond its limits of prediction eventually leads to a new theory, better than the last, which in turn has implications for developing our technology. Here we can link science with technology, and hence with product development, considering that our products today are born of the same principles of technological advancement.
Most important here, however, is the concept that one cannot speak of progress as progress in a particular direction. At times, as with the fall of Newton’s theory, we make discoveries that sharply reduce the knowledge we have, knowledge that was without doubt progressing in a certain direction. Yet it is these processes of knowledge reversal and resettling that lead to the jumps in understanding that later bloom into new and revolutionary science.
Chaos and Self-Organised Criticality
Bak, Tang and Wiesenfeld are the pioneers of a very influential and renowned experiment, first conducted in 1987 and known as the “sand-pile game”. As simple as it sounds, this involved sprinkling grains one at a time onto a table top and watching what happens. As the grains pile up, it is easy to imagine a broad mountain of sand slowly creeping skywards. Obviously, though, things cannot continue in this way. As the pile grows, its sides become steeper, and the likelihood of an avalanche being triggered by the next falling grain increases. When this happens, sand grains slide downhill to a flatter region below, reducing the height of the mountain until it grows again through the addition of more grains, so that the pile forever fluctuates between growing and shrinking. By measuring the severity of these avalanches, and how frequently they occurred, Bak, Tang and Wiesenfeld hoped to find out what the typical size of an avalanche was.
To the average reader this might seem an unproductive experiment, but in fact it had immense implications. By running the experiment on a computer, and highlighting the areas of the sand pile that had become steep, it was found that as the pile grew the scattering of danger spots increased until a dense skeleton of instability ran through the pile. Here was a clue to the peculiar behaviour: a grain falling on a danger spot can, through sequential action, trigger sliding at other nearby danger spots. If the distribution of these spots is sparse and isolated, then a single grain has only limited repercussions. If, however, a grain falls on a pile riddled with danger spots, its consequences become diabolically unpredictable. It might merely trigger a few tumblings, or it might start a catastrophic chain reaction involving legions of grains. The hypersensitive state into which the computer sand pile organises itself is known as the critical state.
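The mechanism described above is simple enough to sketch in a few lines of code. What follows is a minimal, illustrative version of the Bak, Tang and Wiesenfeld computer model, not their actual program; the grid size, grain count and toppling threshold are arbitrary choices for illustration. A grain lands on a random site, and any site that reaches the threshold topples, passing one grain to each neighbour and possibly setting off a chain reaction of further topplings.

```python
import random

def run_sandpile(size=20, grains=5000, threshold=4, seed=1):
    """Minimal sand-pile sketch: drop grains one at a time onto a
    grid; any site reaching the threshold topples, sending one grain
    to each neighbour (grains falling off the edge are lost).
    Returns the avalanche size (number of topplings) per grain."""
    grid = [[0] * size for _ in range(size)]
    rng = random.Random(seed)
    avalanche_sizes = []
    for _ in range(grains):
        x, y = rng.randrange(size), rng.randrange(size)
        grid[x][y] += 1
        topplings = 0
        unstable = [(x, y)] if grid[x][y] >= threshold else []
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < threshold:
                continue  # already relaxed by an earlier toppling
            grid[i][j] -= threshold
            topplings += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= threshold:
                        unstable.append((ni, nj))
        avalanche_sizes.append(topplings)
    return avalanche_sizes
```

Running this, most grains cause no avalanche at all, while a rare few unleash cascades involving hundreds of topplings; there is no typical size, which is exactly the behaviour described in the next paragraph.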
It was found, when measuring the size of these avalanches and how frequently they occurred, that there was no typical size for any shift. That is, one simply could not predict, even with an accurate depiction of the sand pile, what effect the next grain would have. At this point it might seem fruitless to try to draw any conclusions from such a model. However, if one considers the frequency of past avalanches in conjunction with their severity, the relationship follows a power law. This profound result indicates, for example, that if you double the size of an avalanche, it becomes four times less likely. By looking at the past record of avalanches and their sizes, Bak, Tang and Wiesenfeld noted exactly this: there were lots of records of small shifts, a smaller number of larger shifts, and a tiny number of cataclysmic downfalls.
An interesting characteristic of such power laws is that they are scale-invariant: the character of the power law does not change depending on which sizes of avalanche you consider. Shifts on a microscopic scale have the same probability distribution as those considered large, and in fact using the labels “small” and “large” goes against the very essence of scale invariance. Unfortunately, it is human nature to try to fit phenomena to defined scales, in order to find some comfortable comprehension of their magnitude. Indeed, this has in the past had detrimental effects on the understanding of complex phenomena. Take, for example, the occurrence of earthquakes. The movement and friction of plates beneath the earth’s surface causes a build-up of stresses that eventually leads to a slipping release of energy. For half a century scientists have been trying to grasp the highly complex nature of the shifting continental plates in an effort to predict where and when the next big one might occur. But by doing this they are already asking the wrong questions. By ignoring all the tiny, almost undetectable slips that occur all the time and focussing attention on the larger earthquakes, we are already assigning the phenomenon a scale, suggesting that there is something special about the big shakers and that there is a typical size. However, apart from the fact that they cause a lot more damage (which is exactly why we give them special attention), the large earthquakes are no different in nature from the small ones that are ignored. The distribution of earthquakes still follows a scale-invariant power law, the Gutenberg-Richter law, which suggests that if you double the amount of energy released in an earthquake, it becomes four times less likely to occur.
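Scale invariance is easy to verify numerically. The short sketch below assumes the illustrative exponent quoted above, so that the relative probability of an event of size s goes as s^-2; the point is that doubling the size cuts the probability by the same factor of four whether the starting size is one or a million, which is precisely why the labels “small” and “large” carry no special meaning.

```python
def relative_probability(s, exponent=2.0):
    """Relative probability of an event of size s under the
    illustrative power law P(s) ~ s**(-exponent)."""
    return s ** -exponent

# The doubling ratio is the same at every scale:
for s in (1, 10, 1000, 1_000_000):
    ratio = relative_probability(2 * s) / relative_probability(s)
    print(s, ratio)  # always approximately 0.25
```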
By treating the catastrophic occurrences as different from the others, scientists may have been prevented from understanding just how simple the process that generates earthquakes really is. Indeed, the build-up of stresses in the earth’s crust is not dissimilar to the build-up of steep regions in the sand-pile game; both are systems that have organised themselves into a critical state.
In his book entitled “Ubiquity”, Mark Buchanan proposes that the peculiar and exceptionally unstable organisation of the critical state is ubiquitous in our world. Many examples of scale invariance can be given, including that of earthquakes as mentioned above. Take the size of forest fires in relation to their likelihood: a survey by the US Fish and Wildlife Service revealed that if you double the area covered by a fire, it becomes 2.48 times as rare. In 1996 two researchers from the University of Oxford found the same beautiful power-law pattern in outbreaks of measles. In 1998 the physicist Gene Stanley found that in the financial markets price changes become about sixteen times less likely each time you double their size. It would seem that the idea of self-organised criticality is a widely applicable explanation for the workings of things, irrespective of their nature and scale.
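These examples can be placed on a common footing. If doubling the size of an event makes it k times rarer, the implied power-law exponent is log2(k); the sketch below simply applies this conversion to the rarity factors quoted above (the dictionary keys are paraphrases, not figures beyond those already given).

```python
import math

# "k times rarer per doubling" corresponds to P(s) ~ s**(-log2(k)).
examples = {
    "earthquakes (4x rarer per doubling of energy)": 4,
    "forest fires (2.48x rarer per doubling of area)": 2.48,
    "market moves (16x rarer per doubling of size)": 16,
}
for name, k in examples.items():
    print(f"{name}: exponent = {math.log2(k):.2f}")
```

The earthquakes come out with exponent 2, the forest fires about 1.31, and the market moves 4; the differing exponents do not matter to the argument, since every case shares the scale-invariant power-law form.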
So what of chaos? Although chaos theory has its origins more than a century ago, scientists only realised its true importance in the 1980s. Chaos describes the behaviour of certain non-linear dynamical systems that are highly sensitive to initial conditions. Take, for example, a trail of smoke creeping through the air. Tiny influences along the way, such as a pocket of hotter air or a tiny gust, can have an extraordinary effect on the future shape and passage of that smoke. In the context of the earth’s atmosphere, chaos leads us to the “butterfly effect”: the paradoxical conclusion that a butterfly flapping its wings in Spain might just lead to a severe thunderstorm forming over New York a couple of weeks later.
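This sensitivity can be made concrete with the logistic map, a standard textbook example of a chaotic non-linear system (the map and its parameters are an illustration chosen here, not something from the essay). Two trajectories whose starting points differ by one part in ten billion soon bear no resemblance to each other.

```python
def logistic_trajectory(x0, r=4.0, steps=100):
    """Iterate the logistic map x -> r*x*(1-x), which is chaotic
    for r = 4: nearby starting points separate exponentially."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)           # one trajectory
b = logistic_trajectory(0.2 + 1e-10)   # started a hair's breadth away
diffs = [abs(x - y) for x, y in zip(a, b)]
# The two runs agree closely for a few dozen steps, after which the
# initial 1e-10 gap has been amplified to order one and all
# resemblance is lost.
```

At r = 4 the separation roughly doubles each step on average, so the tiny initial gap needs only a few dozen iterations to grow to macroscopic size: the butterfly effect in miniature.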
This sensitivity, and the unpredictability it produces, invites a comparison with criticality as discussed above. Chaos may indeed explain why a tiny cause can greatly change the details of the future, but while it can explain unpredictability it cannot explain the upheavals of complex systems. Chaos can take place in regions of equilibrium (for example, the movement of gas inside a balloon), but it is not inside a balloon that we will see thunderstorms. The origin of upheavals such as these, and of earthquakes or the avalanches of the sand-pile game, lies in the distinction between simpler equilibrium systems and complex, non-equilibrium systems. Indeed, criticality belongs to a rapidly growing field of non-equilibrium physics that, it is hoped, can explain both unpredictability and upheavals by studying the natural patterns that evolve in networks of interacting things.
Furthermore, there is an importance in this distinction between equilibrium and non-equilibrium. In coming to consider complex systems, scientists have come to appreciate that in the world around us, history matters. In engineering and science, one cannot understand the nature or composition of an object without referring to the full history of its making. In an environment where nothing changes (like the inside of a balloon) history does not matter; by contrast, one can only make sense of the infinitely intricate shape of a snowflake by following the history of its formation through slow freezing. It is problems such as these in non-equilibrium physics that point to the reason why, in a world governed by such simple laws, we see such complexity.
Is Science Chaotic?
Having digressed to the topics of self-organised criticality and chaos, perhaps a proposition is now in order: can science be compared to such models? To evaluate this claim we need to justify how the network of scientific thought is analogous to a physical, complex, non-equilibrium system like the rocky plates of the earth’s crust or the sand pile. Unlike these examples, though, scientific ideas are intangible, and so much less accessible to having their effects quantified. There is, however, one method we can consider, stemming from an intriguing suggestion by Thomas Kuhn in his book “The Structure of Scientific Revolutions”:
“if I am right that each scientific revolution alters the historical perspective of the community that experiences it, then that change of perspective should affect the structure of post-revolutionary textbooks and research publications. One such effect — a shift in the distribution of technical literature cited in the footnotes to research reports — ought to be studied as a possible index to the occurrence of revolutions.”
The implication of this statement is that the impact of a scientific idea presented in a publication can be gauged by monitoring how many times it is subsequently cited. If a paper has only a few citations then clearly it has done little to cause a stir within the scientific community, or to spark new lines of research. If, however, it has thousands of citations in later publications, then it has plainly had a huge impact on the structure of science and its members. Indeed, such a study was carried out in 1998 by the physicist Sidney Redner. The interesting discovery was that for the 400,000 cited papers studied, the distribution of citations followed a scale-invariant power law. The regular pattern was this: double the number of citations, and the number of papers receiving that many falls by about a factor of eight. This is exactly what would be expected if, like the sand-pile game or the earth’s crust, the network of ideas had organised itself into a critical state.
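Redner's doubling pattern pins down the exponent: falling by a factor of eight per doubling is the signature of a power law with exponent three, since two cubed is eight. The sketch below uses hypothetical round numbers, purely for illustration, to show how steeply the counts fall across just a few doublings.

```python
# Double the citation count and the number of papers receiving it
# falls by about a factor of 8, i.e. n(c) ~ c**(-3), since 2**3 == 8.
# Hypothetical starting figures, for illustration only:
bands = []
count, citations = 8000, 10
for _ in range(4):
    bands.append((citations, count))
    citations *= 2
    count //= 8
# bands: [(10, 8000), (20, 1000), (40, 125), (80, 15)]
```

Four doublings take the count from thousands of papers down to a bare handful, mirroring the sand pile's many small slips and rare cataclysms.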
By identifying this pattern, Kuhn’s suggestion points to a remarkable similarity between the fabric of science and a whole array of other complex systems that show scale-invariant power-law distributions. Thus the analogy continues: each individual idea introduced to the scientific community is like a single grain falling on the sand pile of current knowledge. Depending not on the profundity of the idea but on where in the field of knowledge it lands, it might simply settle and add more knowledge to the pile. This is what Kuhn refers to as “normal science”: the scientific evolution of current trends that attempts to force nature into the relatively fixed boundaries of a popular paradigm. However, the new idea might instead cause a shift in other ideas, which in turn causes other scientists to rethink their own ideas to accommodate the new knowledge, which might spark off further tumblings, until a full-blown “intellectual avalanche” causes a mass upheaval of popular paradigms and research trends. Just as the sand piles high and the tectonic plates build up stress through physical friction, so the network of prevailing knowledge holds together through intellectual friction; and just as sand tumbles under the action of gravity and earthquakes occur through the inexorable drift of the earth’s plates, so scientific revolution is driven by intellectual curiosity. Furthermore, when such a revolutionary shift does occur, its effects propagate only as far as absolutely necessary. When a portion of the sand pile becomes too steep, the sand slips only until it has arranged itself just below the threshold; similarly, in an earthquake the rocks slip until the friction is just barely enough to bring things back to a halt.
Kuhn suggests the same of the structure of scientific knowledge, pointing out that scientists are, after all, humans who struggle with their share of prejudice, blind ambition, timidity and bias, all the while craving certainty for their own ideas, neatly fitted within their treasured paradigm. Any reversal of knowledge travels only as far as it needs to in order to allow the resumption of normal science, where once again new theories can be developed piece by piece.
This idea of self-organised criticality in scientific progress is, of course, not met with universal acclaim. The field of non-equilibrium complex systems is still in its infancy, and the ubiquity of the critical state in the world around us is certainly an unconventional belief in today’s scientific community. Nevertheless, the type of revolutionary change explained by this approach is consistent with many physical and intellectual systems. Indeed, perhaps this in itself is a perfect example of the struggle between intellectual curiosity and intellectual friction, with authors like Kuhn and Buchanan providing the strain against more popular “normal” theories of revolution, where big ideas cause big changes and we know which direction we are going.
The citation method for gauging the impact of scientific thought seems sensible, since it even takes into account the mass of publications that have little or no impact on the sand pile of knowledge. In his paper “…et augebitur scientia”, J. R. Ravetz points out that these invisible pieces of research, which seem to serve no purpose except to exist as statistics, forbid us from claiming that science is progressing simply because work is being published. Yet maybe these are just extra grains adding to the instability of our network of ideas; nobody knows when they might become part of the next avalanche of thought. The Einsteinian revolution only started when Einstein began puzzling over a quirky feature of Maxwell’s equations; it was almost a mere curiosity that ultimately led to the revision of several hundred years of physics and the theory of relativity.
So are these revolutions predictable? Surely we know that the sand pile will always topple, and that earthquakes will continue to occur. The problem is that wherever contingency is dominant, a tiny event can shunt the future irrevocably down a complex chain of events. This is true for the examples given in this thesis, as well as in human history. Yet the former cases also share the character of the critical state, reflected in the simple, statistical, scale-invariant power laws that reveal a profound hypersensitivity inherent in the system and the absence of any expected magnitude for the next event. So even though chains of events in these systems may not be predictable, it is not true that nothing is predictable. It is in the statistical pattern emerging from many chains of events that we might hope to discover the laws for things historical.
As we advance through this developing world at break-neck speed, we must remember that although we might be adding the fuel, we are not holding the wheel. To think that our technology, science and products have a predetermined direction is to deny the very mechanisms that caused the upheavals so rooted in the history of our development. Furthermore, if the notion of self-organised criticality is to be applied in this context, then there is no typical magnitude of revolution, and the largest changes could occur at any time, depending on where they were initiated. Amongst all this speculation and complexity, one thing is certain, though: we should expect the unexpected.
KUHN, T., 1996. The Structure of Scientific Revolutions (3rd ed.). University of Chicago Press.
BONDI, H., 1973. What is Scientific Progress? The Herbert Spencer Lectures. Chicago Press.
MONOD, J. L., 1973. On the Molecular Theory of Evolution. The Herbert Spencer Lectures. Chicago Press.
BUCHANAN, M., 2000. Ubiquity. Phoenix: Guernsey Press.
HARRE, R., 1974. The Problems of Scientific Revolution. Oxford: Clarendon Press.
HORWICH, P., 1993. Thomas Kuhn and the Nature of Science. MIT Press.
RAVETZ, J. R., 1976. …et Augebitur Scientia (and knowledge shall increase). Chicago Press.
POPPER, K., 2002. Conjectures and Refutations: The Growth of Scientific Knowledge. Routledge.