What distinguishes "complexity economics"? W. Brian Arthur offers a short readable overview in "Foundations of complexity economics" (Nature Reviews Physics 3: 136–145, 2021).
This is a personal essay, rather than a literature review. For example, Arthur explains how the modern research agenda for complexity economics emerged from work at the Santa Fe Institute in the late 1980s.
How is complexity economics different from regular economics?
Complexity economics sees the economy — or the parts of it that interest us — as not necessarily in equilibrium, its decision makers (or agents) as not superrational, the problems they face as not necessarily well-defined and the economy not as a perfectly humming machine but as an ever-changing ecology of beliefs, organizing principles and behaviours.
How does a researcher do economics in this spirit? A common approach is to describe, in mathematical terms, a number of decision-making agents within a certain setting. The agents start off with a range of rules for how they will perceive the situation and how they will make decisions. The rules that any given agent uses can change over time: the agent might learn from experience, or might decide to copy another agent, or the decision-making rule might experience a random change. The researcher can then look at the path of decision-making and outcomes that emerges from this process, a path which will sometimes settle into a relatively stable outcome, but sometimes will not. Arthur writes:
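To make the recipe concrete, here is a minimal sketch of that kind of agent-based setup. This is my own toy illustration, not any published model: agents hold decision rules, earn payoffs that depend on what everyone else does, and occasionally imitate better-performing agents or mutate their rule at random. All parameters and the payoff function are invented for illustration.

```python
import random

N_AGENTS = 50
N_STEPS = 200
IMITATE_PROB = 0.1   # chance an agent compares itself to a random peer (assumed)
MUTATE_PROB = 0.05   # chance an agent's rule changes at random (assumed)

random.seed(0)

# Each agent's "rule" is just a number in [0, 1] governing its action.
rules = [random.random() for _ in range(N_AGENTS)]

def payoff(action, mean_action):
    # Toy payoff: agents do best near a target set by the crowd's average,
    # so each agent's best choice depends on what the others are doing.
    return -abs(action - 0.9 * mean_action)

for step in range(N_STEPS):
    mean_action = sum(rules) / N_AGENTS
    payoffs = [payoff(a, mean_action) for a in rules]
    for i in range(N_AGENTS):
        if random.random() < IMITATE_PROB:
            j = random.randrange(N_AGENTS)
            if payoffs[j] > payoffs[i]:
                rules[i] = rules[j]        # copy a better-performing rule
        if random.random() < MUTATE_PROB:
            rules[i] = random.random()     # random change to the rule

spread = max(rules) - min(rules)
print(f"final mean action: {sum(rules)/N_AGENTS:.3f}, spread: {spread:.3f}")
```

The researcher then watches the path of actions that emerges: whether the population converges on a shared rule, keeps churning, or cycles, depending on the imitation and mutation rates.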
Complexity, the overall subject, as I see it is not a science, rather it is a movement within science ... It studies how elements interacting in a system create overall patterns, and how these patterns, in turn, cause the elements to change or adapt in response. The elements might be cells in a cellular automaton, or cars in traffic, or biological cells in an immune system, and they may react to neighbouring cells’ states, or adjacent cars, or concentrations of B and T cells. Whichever the case, complexity asks how individual elements react to the current pattern they mutually create, and what patterns, in turn, result.
As Arthur points out, an increasingly digitized world is likely to offer a number of demonstrations of complexity theory at work.
Now, under rapid digitization, the economy’s character is changing again and parts of it are becoming autonomous or self-governing. Financial trading systems, logistical systems and online services are already largely autonomous: they may have overall human supervision, but their moment-to-moment actions are automatic, with no central controller. Similarly, the electricity grid is becoming autonomous (loading in one region can automatically self-adjust in response to loading in neighbouring ones); air-traffic control systems are becoming autonomous and independent of human control; and future driverless-traffic systems, in which driverless-traffic flows respond to other driverless-traffic flows, will likely be autonomous. ... Besides being autonomous, they are self-organizing, self-configuring, self-healing and self-correcting, so they show a form of artificial intelligence. One can think of these autonomous systems as miniature economies, highly interconnected and highly interactive, in which the agents are software elements ‘in conversation with’ and constantly reacting to the actions of other software elements.
To put it another way, if we want to understand when these kinds of systems are likely to work well, and how they might go off the rails or be gamed, complexity analysis is likely to offer some useful tools.
But what about using complexity theory for economics in particular? As Arthur writes: "A new theoretical framework in a science does not really prove itself unless it explains phenomena that the accepted framework cannot. Can complexity economics make this claim? I believe it can. Consider the Santa Fe artificial stock market model."
For example, there's a long-standing issue of why stock markets see short-run patterns of boom and bust. Another puzzle of stock markets is why there is so much trading of stocks. Sure, stock traders will disagree about the underlying value of stocks and about the meaning of recent news that affects perceptions of future value. Such disagreements will lead to a modest volume of stock trading, but it's hard to see how they lead to the extremely high volumes of trading seen in modern markets. John Cochrane phrased this point nicely in a recent interview with Tyler Cowen:
Why is there this immense volume of trading? When was the last time you bought or sold a stock? You don’t do it every 20 milliseconds, do you? I’ll highlight this. If I get my list of the 10 great unsolved puzzles that I hope our grandchildren will have figured out, why does getting the information into asset prices require that the stock be turned over a hundred times? That’s clearly what’s going on. There’s this vast amount of trading, which is based on information or opinion and so forth. I hate to discount it at all just as human folly, but that’s clearly what’s going on, but we don’t have a good model.
Here is Arthur's description of how complexity economics looks at these stock market puzzles:
We set up an ‘artificial’ stock market inside the computer and our ‘investors’ were small, intelligent programs that could differ from one another. Rather than share a self-fulfilling forecasting method, they were required to somehow learn or discover forecasts that work. We allowed our investors to randomly generate their own individual forecasting methods, try out promising ones, discard methods that did not work and periodically generate new methods to replace them. They made bids or offers for a stock based on their currently most accurate methods and the stock price forms from these — ultimately, from our investors’ collective forecasts. We included an adjustable rate-of-exploration parameter to govern how often our artificial investors could explore new methods.
When we ran this computer experiment, we found two regimes, or phases. At low rates of investors trying out new forecasts, the market behaviour collapsed into the standard neoclassical equilibrium (in which forecasts converge to ones that yield price changes that, on average, validate those forecasts). Investors became alike and trading faded away. In this case, the neoclassical outcome holds, with a cloud of random variation around it. But if our investors try out new forecasting methods at a faster and more realistic rate, the system goes through a phase transition. The market develops a rich psychology of different beliefs that change and do not converge over time; a healthy volume of trade emerges; small price bubbles and temporary crashes appear; technical trading emerges; and random periods of volatile trading and quiescence emerge. Phenomena we see in real markets emerge. ...
I want to emphasize something here: such phenomena as random volatility, technical trading or bubbles and crashes are not ‘departures from rationality’. Outside of equilibrium, ‘rational’ behaviour is not well-defined. These phenomena are the result of economic agents discovering behaviour that works temporarily in situations caused by other agents discovering behaviour that works temporarily. This is neither rational nor irrational, it merely emerges.
Other studies find similar regime transitions from equilibrium to complex behaviour in nonequilibrium models. It could be objected that the emergent phenomena we find are small in size: price outcomes in our artificial market diverge from the standard equilibrium outcomes by only 2% or 3%. But — and this is important — the interesting things in real markets happen not with equilibrium behaviour but with departures from equilibrium. In real markets, after all, that is where the money is made.
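The two-regime result can be illustrated with a drastically simplified toy model. This is my own sketch, not the actual Santa Fe artificial stock market code: each investor forecasts the next price as a multiple of the current one, the price forms from the investors' collective (average) forecast, and an exploration-rate parameter governs how often an investor tries a new forecasting rule. All the numbers below are invented for illustration.

```python
import random

def run_market(explore_rate, n_investors=30, n_steps=500, seed=1):
    rng = random.Random(seed)
    # Each rule forecasts next price as a multiple of the current price.
    rules = [rng.uniform(0.9, 1.1) for _ in range(n_investors)]
    price, changes = 1.0, []
    for _ in range(n_steps):
        # The price forms from the investors' collective (average) forecast.
        new_price = price * sum(rules) / n_investors
        changes.append(abs(new_price - price) / price)
        price = new_price
        for i in range(n_investors):
            if rng.random() < explore_rate:
                rules[i] = rng.uniform(0.9, 1.1)    # try a new forecasting rule
            else:
                rules[i] += 0.1 * (1.0 - rules[i])  # drift toward consensus
    return sum(changes) / len(changes)  # average size of price moves

low = run_market(explore_rate=0.01)
high = run_market(explore_rate=0.5)
print(f"avg price change, low exploration:  {low:.4f}")
print(f"avg price change, high exploration: {high:.4f}")
```

With a low exploration rate the rules converge, the price settles down and activity fades, echoing the neoclassical regime; with a high exploration rate the ever-changing mix of forecasting rules keeps generating price movement. The toy omits everything that makes the real model interesting (wealth, bids and offers, rule fitness, technical trading), but it shows how a single rate-of-exploration parameter can separate a quiescent regime from a persistently restless one.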
In other words, the key to understanding the dynamics of stock markets may reside in the idea that investors are continually exploring new methods of investing, which in turn leads to high volumes of trading and in some cases to dysfunctional outcomes. Of course, Arthur offers a variety of other examples, as well.
For those who would like more background on complexity economics, one starting point would be the footnotes in Arthur's article. Another place to start is the essay by J. Barkley Rosser, "On the Complexities of Complex Economic Dynamics," in the Fall 1999 issue of the Journal of Economic Perspectives (13:4, 169-192). The abstract reads:
Complex economic nonlinear dynamics endogenously do not converge to a point, a limit cycle, or an explosion. Their study developed out of earlier studies of cybernetic, catastrophic, and chaotic systems. Complexity analysis stresses interactions among dispersed agents without a global controller, tangled hierarchies, adaptive learning, evolution, and novelty, and out-of-equilibrium dynamics. Complexity methods include interacting particle systems, self-organized criticality, and evolutionary game theory, to simulate artificial stock markets and other phenomena. Theoretically, bounded rationality replaces rational expectations. Complexity theory influences empirical methods and restructures policy debates.
Timothy Taylor is an American economist. He is managing editor of the Journal of Economic Perspectives, a quarterly academic journal produced at Macalester College and published by the American Economic Association. Taylor received his Bachelor of Arts degree from Haverford College and a master's degree in economics from Stanford University. At Stanford, he won the award for excellent teaching in a large class (more than 30 students) given by the Associated Students of Stanford University. At the University of Minnesota, he was named a Distinguished Lecturer by the Department of Economics and voted Teacher of the Year by the master's degree students at the Hubert H. Humphrey Institute of Public Affairs. Taylor has been a guest speaker for groups of teachers of high school economics, visiting diplomats from eastern Europe, talk-radio shows, and community groups. From 1989 to 1997, Professor Taylor wrote an economics opinion column for the San Jose Mercury-News. He has published multiple lectures on economics through The Teaching Company. With Rudolph Penner and Isabel Sawhill, he is co-author of Updating America's Social Contract (2000), whose first chapter provided an early radical centrist perspective, "An Agenda for the Radical Middle". Taylor is also the author of The Instant Economist: Everything You Need to Know About How the Economy Works, published by the Penguin Group in 2012. The fourth edition of Taylor's Principles of Economics textbook was published by Textbook Media in 2017.