The last post tried to relate the discussion of road traffic and other traffic networks to the current debate in macroeconomics. It found that the emphasis on the local origins of systematic traffic congestion is matched by a shift in economists' attention from aggregate shocks to idiosyncratic sectoral shocks when trying to explain the volatility and the sudden starts and stops of the business cycle.
I have argued that some of the insights yielded by the discussion of road traffic inefficiencies might support, and perhaps even enhance, the position that academics like Acemoglu or Gabaix take. Local congestions, i.e. idiosyncratic sectoral shocks, are inherent to the road traffic system because of our imperfect anticipation of other drivers' behaviour. Likewise, most papers arguing that sectoral shocks can lead to aggregate fluctuations assume that these idiosyncratic shocks arise with equal probability in every sector. What I want to argue is that such sectoral shocks only spread through the input-output supply network if other sectors are unable to substitute away from the sector that experienced the shock (i.e. change routes as they see a congestion coming up), or if firms that are sufficiently closely connected to that sector behave sufficiently uncooperatively. In other words, countries whose sectors depend heavily on smooth supply schedules from closely connected sectors should experience larger fluctuations in the business cycle, i.e. higher GDP volatility.
To support this, it would be interesting to see whether there is empirical evidence that the flexibility of sectors' production functions across the economy (i.e. how sensitive the economy as a whole is to sectoral supply shocks) and intersectoral cooperation are related to GDP volatility.
To do so, one needs measures that allow this asymmetry in input-output networks and intersectoral cooperation to be compared across countries.
The previous post already hinted at the importance of the notion of network centrality when assessing a sector's potential ability to substitute away from certain inputs. In their paper "Vertex Centralities in Input-Output Networks Reveal the Structure of Modern Economies" (2011), Florian Bloechl, Fabian Theis, Fernando Vega-Redondo and Eric Fisher suggest two possible ways of measuring vertex centralities in input-output networks. I will focus only on random walk centrality, as this seems to be the more relevant measure for the purposes of this project.
Starting from Freeman's closeness centrality, which is defined as the inverse of the mean geodesic distance from all nodes to a particular one, the authors define random walk centrality as a generalization of this measure that allows its application to input-output tables.
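A minimal sketch of one common mean-first-passage-time formulation of random walk centrality is given below; the authors' actual implementation may differ in details, and the strongly connected network is an assumption of this sketch.

```python
import numpy as np

def random_walk_centrality(W):
    """Random-walk closeness centrality for a weighted, directed
    input-output network. W[i, j] is the flow from sector i to sector j.
    C[j] is the inverse of the mean first passage time (MFPT) of a random
    walk to sector j. Assumes the network is strongly connected."""
    n = W.shape[0]
    P = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    C = np.empty(n)
    for j in range(n):
        idx = [k for k in range(n) if k != j]
        Q = P[np.ix_(idx, idx)]            # walk restricted to non-target nodes
        # MFPTs h solve h = 1 + Q h, i.e. (I - Q) h = 1
        h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
        C[j] = n / h.sum()                 # inverse of the mean hitting time
    return C
```

On a fully symmetric network every sector should come out equally central, which gives a quick sanity check of the implementation.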
The idiosyncratic shocks are assumed to be supply shocks that closely connected sectors experience. These supply shocks flow through the network of intermediate inputs, and the pattern of this flow is modelled as a random walk. A high random walk centrality therefore corresponds to the idea that a sector is sensitive to supply conditions anywhere in the economy, i.e. that it is vulnerable to idiosyncratic shocks in many other sectors of the economy.
The authors provide code to obtain the sectoral random walk centralities for a given input-output network. Data on these networks is provided by the OECD, and the results are comparable across countries. I would like to argue that the simple mean across a country's sectoral random walk centralities might provide a decent measure of the asymmetry in a country's input-output network: the higher the average random walk centrality of a given network, the more sensitive the economy is to idiosyncratic sectoral shocks.
The OECD input-output tables are most reliably available for the year 2000. I therefore suggest considering the standard deviation from trend for a given country from 1995 to 2005 as a measure of that country's GDP volatility. I obtained this data from the OECD quarterly national accounts, using the standard procedure for obtaining the standard deviation from trend (i.e. applying the HP filter to the logged series and taking the standard deviation of the cyclical component).
I still have some issues finding an appropriate measure for quantifying levels of intersectoral cooperation across countries. The World Bank's corruption index might be an option, yet it does not fully capture the concept of intersectoral cooperation. The hypothesis may need to be modified in order to allow for an empirical test of its predictions.
For now, I have carried out a simple OLS on the model:
volatility = constant + a * asymmetry + error
The first results are promising, yet far from ready for publication; more data will need to be obtained to allow for a more sophisticated model.
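The regression above is a one-line least-squares fit. As a sketch of the procedure (the numbers below are made up purely for illustration, not the actual estimates):

```python
import numpy as np

# Hypothetical illustrative data: mean random-walk centrality ("asymmetry")
# and GDP-cycle standard deviation ("volatility") for a handful of countries.
# These values are invented for illustration only.
asymmetry  = np.array([0.21, 0.25, 0.30, 0.34, 0.40])
volatility = np.array([0.010, 0.013, 0.014, 0.018, 0.021])

# OLS: volatility = a0 + a1 * asymmetry + error
X = np.column_stack([np.ones_like(asymmetry), asymmetry])
coef, residuals, rank, _ = np.linalg.lstsq(X, volatility, rcond=None)
a0, a1 = coef
print(f"intercept = {a0:.4f}, slope = {a1:.4f}")
```

The hypothesis in this post corresponds to a positive estimated slope a1.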
Learning from Ants and Birds: Hot and Cold Flushes in Business Cycles
A Complexity view on Business Cycles
Saturday, February 18, 2012
Saturday, February 4, 2012
Some first results
This post will try to analyse how the previous findings could enhance the most recent discussion on the origins of sudden starts and stops in aggregate economic activity. To do so, it will establish an understanding of where research currently stands by looking at some recent papers. It will conclude that complexity science supports the current direction of research and might actually be able to provide an alternative point of view on the current state of business cycle research.
The analysis of this project so far has highlighted the role that individual incentive-structures and local occurrences of congestion seem to play in disentangling the reasons for systematic inefficiencies in road traffic. Consequently, when applying our results to the research question, our analysis will focus on the role of shocks at the micro level and on how these could potentially spread to become systematic, i.e. to cause aggregate fluctuations. Yet most of the research that macroeconomics has devoted to the business cycle focuses on the role of aggregate shocks to output (such as changes in monetary or economic policy, wars, or natural disasters) and argues that idiosyncratic shocks to individual firms and sectors average out in the aggregate due to a diversification effect. But more recently, scholars like Gabaix (2009) and Acemoglu (2011) have argued that such idiosyncratic shocks to individual firms or sectors might in fact be responsible for most aggregate fluctuations.
In their paper "The Network Origins of Aggregate Fluctuations" Acemoglu et al (2011) argue that "local" microeconomic shocks may lead to aggregate fluctuations due to inter-sectoral input-output linkages.
Acemoglu et al model the aggregate economy as an inter-sectoral network in which different sectors interact through input-output linkages. They present a static variant of the model that Long and Plosser first presented in their 1983 paper "Real Business Cycles". In this economy, households supply labor and choose consumption bundles to maximize Cobb-Douglas utility. There are n goods, each produced by a distinct sector, and each good can either be consumed or used as an input for production by other sectors. The resulting inter-sectoral input-output relations form the inter-sectoral network of this economy.
The authors then prove mathematically that, given the presence of such input-output linkages, the diversification argument that has been brought forward to dismiss the role of micro shocks in aggregate fluctuations may not hold. If the roles that sectors play as direct or indirect suppliers to others are significantly asymmetric, then "sizable aggregate fluctuations may originate from microeconomic shocks".
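Acemoglu et al summarize each sector's direct and indirect supplier role in an "influence vector" built from the Leontief inverse of the input-output matrix. A rough numerical sketch, in which the labor share and the star-shaped network are assumed purely for illustration, shows how asymmetry concentrates influence in one sector:

```python
import numpy as np

alpha = 0.3   # labor share in the Cobb-Douglas technologies (assumed value)
n = 4

# W[i, j]: share of sector j's good among sector i's intermediate inputs.
# A "star" economy: every sector buys all of its inputs from sector 0.
W = np.zeros((n, n))
W[:, 0] = 1.0

# Influence vector: v' = (alpha/n) * 1' [I - (1 - alpha) W]^{-1}
v = (alpha / n) * np.linalg.solve(np.eye(n) - (1 - alpha) * W.T, np.ones(n))

print(v)  # sector 0 carries a disproportionate share of aggregate influence
```

With a symmetric network every entry of v would be 1/n and micro shocks would wash out at rate 1/sqrt(n); in the star economy the hub's entry dominates, which is exactly the asymmetry the quoted result is about.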
These findings correspond to the results of Xavier Gabaix's paper "The Granular Origins of Aggregate Fluctuations" (2009). Gabaix argues that if the distribution of firm size is fat-tailed, then the diversification argument breaks down. A fat-tailed distribution of firm size and a significant asymmetry in the roles of sectors as output suppliers might here be interpreted to be mirror-images of the same phenomenon.
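Gabaix's point can be illustrated with the square root of the sales Herfindahl, which scales aggregate volatility in his framework: for equal-sized firms it decays like 1/sqrt(N), while for fat-tailed (here Pareto-distributed, an illustrative assumption) firm sizes it stays much larger.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Equal-sized firms: all shares are 1/N, so sqrt(Herfindahl) = 1/sqrt(N).
equal_h = np.sqrt(N * (1.0 / N) ** 2)

# Fat-tailed firm sizes: the largest firms dominate the Herfindahl,
# so the diversification of idiosyncratic shocks is far weaker.
sizes = rng.pareto(1.1, size=N) + 1.0
shares = sizes / sizes.sum()
fat_h = np.sqrt((shares ** 2).sum())

print(equal_h, fat_h)  # fat_h is substantially larger than equal_h
```

This is the "granular" breakdown of the diversification argument: a handful of very large firms keep firm-level shocks visible in the aggregate.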
So current research seems to be moving away from focusing on the role of aggregate shocks (note that these are mostly exogenous) and towards underlining and further examining the importance of shocks to individual firms or sectors. I find this very pleasing, as this tendency is in line with the sentiment of this project: the entire discussion of road traffic was centered exclusively on endogenous causes of inefficiency and highlighted the role of "shocks" that happened at the link level.
It will now be interesting to see how, based on the model that Acemoglu et al and Gabaix use, one could apply some of our previous results to the occurrence of hot and cold flushes in business cycles and relate them to the current standing of macroeconomic research.
Compare the inter-sectoral network to the model of road traffic in which traffic is represented as a network of junctions and links. The following interpretation of the parallels between the two models seems reasonable. A particular route might represent a particular sector of the economy. The cars on this route at any given point in time could represent the firms that operate in this sector. Firms that produce several products in different sectors can be represented by cars that pass through different roads. The input-output linkages between sectors in the inter-sectoral network model could be compared to the junctions that connect different roads and routes. All of this is of course simplistic, yet given an awareness of that simplicity, it might provide a ground on which we can transfer some of our results.
In line with Acemoglu et al, our earlier considerations yield the observation that systematic congestions, i.e. aggregate fluctuations across the wider network, can be rooted in local congestions, i.e. shocks at the micro level. The mechanisms through which these local shocks spread seem to be quite well understood in both systems. In road traffic, velocity fluctuations can create backward-spreading waves of increasing fluctuations which then break down the free flow of traffic. These fluctuations can spread over the junctions to the system level and cause aggregate breakdowns in wider areas. In the economy, an idiosyncratic shock leads to low output levels in the affected sector, which leads to low input levels in first-order connected sectors, which in turn affects second-order connected sectors, and so on. Acemoglu et al call this a cascade effect, an effect that becomes more relevant as shocks hit sectors with more linkages.
Before proceeding, note how all of this already yields important results for our initial research question. It shows how, even if the general trend is positive, an idiosyncratic shock, i.e. a local congestion, can cause a negative aggregate response. It therefore already hints at explanations for the sudden occurrence of starts and stops in business cycles.
So far we could interpret our results to support the findings in the literature. But how can they add something to the discussion? The literature so far has been able to identify a mechanism through which shocks that hit individual central sectors in the network may give rise to aggregate fluctuations. But it has so far been unable to determine whether or not the network-centrality of a sector is a sufficient or a necessary condition for the occurrence of aggregate fluctuations. Can our analysis add something to that discussion?
Note that, by definition, the network centrality of a sector in the inter-sectoral economic model is crucially determined by the number of firms that operate in the sector, the number of firms that use the respective good as an input, and the number of goods that firms in the sector use as production inputs. Transferred to our road traffic model, the centrality of a route is affected by both the number of cars that pass through it and the number of junctions that it crosses. But remember: complexity research on road traffic has concluded that a certain level of vehicle density is a necessary, but not a sufficient, condition for the occurrence of traffic congestion at the system level. Complexity science hence seems to suggest that sufficient network centrality alone is not enough to trigger aggregate fluctuations when a sector experiences a shock.
What else mattered? In road traffic, the initial shocks themselves are brought about by drivers' imperfect anticipation of the actions of other drivers. But whether or not they spread to cause breakdowns of the free traffic flow at the aggregate level depended heavily upon the incentives and actions that drivers displayed at the junction level. Uncooperative behavior at junctions and/or failures to change route in order to avoid congested links were the key factors that allowed local inefficiencies to spread to the system level. How can we use this result to enrich the existing discussion?
Complexity science seems to suggest that, given certain requirements, shocks to an individual sector can cause aggregate fluctuations. This holds if the sector is sufficiently connected in the inter-sectoral network (in line with the existing research) and if either closely connected sectors cannot, or fail to, substitute away from the input that the troubled sector produces (i.e. drivers fail to change their route at junctions), or firms that operate in closely connected sectors display excessively uncooperative behavior (i.e. drivers fail to cooperate at junctions).
A next step will be to test these results empirically, i.e. to test whether the failure or inability to substitute away from goods produced by sectors that experience shocks, and/or uncooperative behavior of firms in closely connected sectors, can provide further necessary conditions under which the diversification argument does not hold. To do so, it will be key to find real-world data that either supports or refutes the above results.
Saturday, January 21, 2012
A wrap up on road traffic and a perspective on ants and birds
The aim of this post is to comprehensively summarize the key factors complexity science seems to identify when looking at road traffic inefficiencies. I will then discuss how looking at alternative systems of collective motion, such as the behavior of ants and birds, might help us to further enhance our understanding of these inefficiencies. This post will also hint at possible ways in which one might apply these findings to the research question.
Having looked at both choice-based and action-based approaches to road traffic, the following findings appear to be most relevant.
1) Over time, we might observe a decrease in system efficiency as individual agents optimize their behavior according to their respective key parameters
2) External efforts to reduce these inefficiencies, e.g. through the provision of supplementary roads, might actually further increase the inefficiency of the system
3) Not the density of the vehicles on a particular stretch of a road, but the interaction between these vehicles represents the origin of endogenous breakdowns of the free flow of traffic (through backward spreading velocity fluctuations)
4) The spread of a congestion from a local level to the system level crucially depends on the level of cooperation that drivers choose at the junction level.
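Point 2 is Braess's paradox. The standard textbook instance (4000 drivers, two symmetric routes, a free shortcut added between them) makes the counter-intuitive arithmetic concrete; the specific link costs below are the usual illustrative ones, not data from the earlier posts:

```python
# Braess's paradox: 4000 drivers travel from Start to End.
# Route via A: link Start->A costs T/100 minutes (T = traffic on the link),
# link A->End costs a flat 45 minutes. Route via B is the mirror image.
drivers = 4000.0

# Without a shortcut: by symmetry, drivers split evenly at equilibrium.
t_without = (drivers / 2) / 100 + 45          # 20 + 45 = 65 minutes

# With a zero-cost shortcut A->B, taking Start->A->B->End dominates
# (T/100 <= 40 < 45 on both variable links), so all drivers crowd onto
# both variable links.
t_with = drivers / 100 + 0 + drivers / 100    # 40 + 0 + 40 = 80 minutes

print(t_without, t_with)  # adding the road raises everyone's travel time
```

The new road is individually rational to use at every step, yet the resulting equilibrium is worse for every driver, which is exactly the sense in which supplementary roads can increase system inefficiency.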
Looking at these factors, one might already be able to think of ways to apply these results to the research question. For now though, I would like to look at a group of other systems that also exhibit complex traffic behavior. Contrasting the efficiency and the inherent incentive structure of both classes of systems will hopefully further enhance our understanding by adding yet another perspective on the conclusions so far.
Ants and birds are famous for exhibiting sophisticated forms of collective motion. Colonies of the New World army ant Eciton burchelli, for example, consist of up to half a million members. The ants form traffic systems made up of up to 200,000 virtually blind individuals that transport up to 30,000 items in one run (Franks et al, 1991) and display minimal congestion. As with birds, the interaction of ants can give rise to self-organized structures that seem vastly superior to road traffic in terms of efficiency. These structures display a swarm intelligence that vastly exceeds the intelligence of any individual member and that is hence beneficial to every participant. In fact, scholars have been able to simulate the behavior of ants without having to assume that the ants possess any form of memory at all (e.g. Millonas, "Swarms, Phase Transitions, and Collective Intelligence").
For the purposes of this project it will be sufficient to look at the simplest models that aim to simulate the collective behavior of ants and birds, and at the assumptions they make about behavioral incentive-structures. In his paper "Flocks, Herds and Schools: A Distributed Behavioral Model" (1987), Craig Reynolds introduced an agent-based model for the aggregate motion of flocks, herds and swarms that simulated birds as independent individual actors navigating according to their local perception of the environment and a set of behavioral patterns. The behaviors that lead to the simulated collective motion are as follows:
1) Decision-makers seek to avoid collisions with other agents (Collision avoidance)
2) Decision-makers attempt to match their speed with other nearby agents (Velocity matching)
3) Decision-makers want to stay as close as possible to nearby agents (Flock centering)
Given a certain density of interacting agents, these behavioral patterns are sufficient to bring about realistic forms of swarming behavior and hence the transition from chaotic to ordered behavior. The graphical results of Reynolds's simulations can be seen at http://www.red3d.com/cwr/boids/ . These principles also represent the cornerstones of other, more sophisticated self-propelled-particle models that aim to simulate the complex behavior displayed by New World ants or birds.
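The three rules above can be sketched as a single update step of a Reynolds-style model; the interaction radius and the rule weights here are illustrative guesses, not values from Reynolds's paper.

```python
import numpy as np

def boids_step(pos, vel, r=1.0, w_avoid=0.05, w_match=0.05,
               w_center=0.01, dt=1.0):
    """One update of Reynolds-style flocking. pos, vel: (N, 2) arrays.
    Each agent reacts only to neighbours within radius r; the three
    weights implement collision avoidance, velocity matching and
    flock centering respectively."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        d = pos - pos[i]
        dist = np.linalg.norm(d, axis=1)
        nb = (dist > 0) & (dist < r)            # neighbours of agent i
        if not nb.any():
            continue                             # isolated agents fly straight
        avoid = -d[nb].sum(axis=0)               # 1) steer away from neighbours
        match = vel[nb].mean(axis=0) - vel[i]    # 2) match neighbours' velocity
        center = pos[nb].mean(axis=0) - pos[i]   # 3) steer toward local centroid
        new_vel[i] += w_avoid * avoid + w_match * match + w_center * center
    return pos + dt * new_vel, new_vel
```

Note that every rule uses only local information (neighbours within r), which is what makes the resulting order genuinely self-organized rather than centrally coordinated.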
How can this analysis help us to further enhance our understanding of factors that might contribute to the occurrence of endogenous congestions in road traffic? Comparing the behavioral patterns and conditions that allow for the occurrence of the respective macro-structures, one finds that it seems to be the flock centering behavior that makes the crucial difference.
Both collision avoidance and velocity matching are to an extent inherent to the behavioral structure of drivers on the road, and have in fact been identified as causing the fluctuations in velocity that can potentially break down the free flow of traffic. Flock centering, on the other hand, reflects the inherently different incentive-structures at the microscopic level of individual agents in the two systems. Whilst an ant or a bird greatly benefits from staying close to its flockmates, a driver faced with a choice between two routes of equal length will always pick the less crowded one. For ants and birds, a sufficiently high density of interacting agents is necessary for collective motion to occur in the first place, whilst for drivers the density of vehicles on the road is a necessary condition for the endogenous occurrence of traffic jams.
Cooperation and mutual interests at the level of individuals hence seem to crucially affect the efficiency of a system. This further underlines the implied conclusion of our previous analysis that incentive-structures might, to an extent, turn out to matter more for the efficiency of road traffic than macro factors such as the number of available roads. The introduction of a second class of reference models therefore allows us to further strengthen the results of our analysis.
The next post will make use of the results obtained so far by applying them to the observation of sudden starts and stops in business cycles. To do so it will, based on the previous discussion of how road traffic could represent a reference model for aggregate economic behavior, carefully construct analogies between road traffic and the economy that help reveal the relevance of these findings for the research question. The goal is to formulate a comprehensive hypothesis that can then be tested against the available empirical evidence.
Having looked at both, choice and action-based approaches to road traffic, the following findings appear to be most relevant.
1) Over time, we might observe a decrease in system efficiency as individual agents optimize their behavior according to their respective key parameters
2) External efforts to reduce these inefficiencies, i.e. through the provision supplementary roads, might actually further increase the inefficiency of the system
3) Not the density of the vehicles on a particular stretch of a road, but the interaction between these vehicles represents the origin of endogenous breakdowns of the free flow of traffic (through backward spreading velocity fluctuations)
4) The spread of a congestion from a local level to the system level crucially depends on the level of cooperation that drivers choose at the junction level.
Looking at these factors, one might already be able to think of ways of how to apply those results to the research question. For now though, I would like to look at a group of other systems that exhibit both complex and traffic behavior as well. Contrasting the efficiency and the inherent incentive structure of both systems will hopefully further enhance our understanding by gaining yet another perspective on the conclusions so far.
Ants and birds are famous for exhibiting sophisticated forms of collective motion. Colonies of the New World army ant Eciton burchelli for example consist of up to a half million members. The ants form traffic systems made up by up to 200,000 virtually blind individuals that transport up to 30,000 items in one run (Franks et al, 1991) and display minimal congestions. Likewise birds, the interaction of ants hence can give rise to self-organized structures that seem to be vastly superior to road traffic in terms of efficiency. These structures display a swarm intelligence that vastly exceeds the intelligence of every individual member and that hence is beneficial to every participant. In fact, scholars were capable of simulating the behavior of ants without having to assume that the ants possess any form of memory at all (e.g. Millonas: Swarms, Phase Transitions, and Collective Intelligence").
For the purposes of this project it will be sufficient to look at the simplest forms of models that aim to simulate the collective behavior of ants and birds and at the assumptions with regards to the behavioral incentive-structures that they make. Craig Reynolds introduced an agent-based model for the aggregate motion of flocks, herds or swarms in his paper "Flocks, Herds and Schools: A Distributed Behavioral Model (1987)" that simulated birds as independent individual actors that navigate according to their local perceptions of the environment and a set of of behavioral patterns. The behaviors that lead to the simulated collective motion are as following:
1) Decision-makers seek to avoid collisions with other agents (Collision avoidance)
2) Decision-makers attempt to match their speed with other nearby agents (Velocity matching)
3) Decision-makers want to stay as close as possible to nearby agents (Flock centering)
Given a certain density of interacting agents, these behavioral patterns are sufficient to bring about realistic forms of swarming behavior and hence the transition from chaotic to ordered behavior. The graphical results of Reynolds's simulations can be seen on the following page: http://www.red3d.com/cwr/boids/ . These principles also represent the cornerstones of other, more sophisticated self-propelled particle models that aim to simulate the complex behavior displayed by New World ants or birds.
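These three rules are simple enough to sketch in a few lines. The weights and the neighbourhood radius below are arbitrary choices of mine for illustration, not Reynolds's parameters:

```python
def step(pos, vel, radius=5.0, dt=0.1):
    """One update of Reynolds's three steering rules for every agent.
    pos, vel: lists of (x, y) tuples. Distances are Manhattan for simplicity."""
    new_vel = []
    for i, (p, v) in enumerate(zip(pos, vel)):
        nb = [j for j in range(len(pos)) if j != i
              and abs(pos[j][0] - p[0]) + abs(pos[j][1] - p[1]) < radius]
        if not nb:
            new_vel.append(v)
            continue
        # 1) collision avoidance: steer away from neighbours that are very close
        sep = [sum(p[k] - pos[j][k] for j in nb
                   if abs(pos[j][0] - p[0]) + abs(pos[j][1] - p[1]) < 1.0)
               for k in (0, 1)]
        # 2) velocity matching: steer towards the mean neighbour velocity
        align = [sum(vel[j][k] for j in nb) / len(nb) - v[k] for k in (0, 1)]
        # 3) flock centering: steer towards the neighbours' centre of mass
        cent = [sum(pos[j][k] for j in nb) / len(nb) - p[k] for k in (0, 1)]
        new_vel.append(tuple(v[k] + 0.05 * sep[k] + 0.1 * align[k] + 0.01 * cent[k]
                             for k in (0, 1)))
    new_pos = [tuple(p[k] + dt * nv[k] for k in (0, 1))
               for p, nv in zip(pos, new_vel)]
    return new_pos, new_vel
```

Applied repeatedly, velocity matching pulls the agents onto a common heading while flock centering keeps them together, which is exactly the transition from chaotic to ordered motion described above.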
How can this analysis help us to further enhance our understanding of factors that might contribute to the occurrence of endogenous congestions in road traffic? Comparing the behavioral patterns and conditions that allow for the occurrence of the respective macro-structures, one finds that it seems to be the flock centering behavior that makes the crucial difference.
Both collision avoidance and velocity matching are to an extent inherent to the behavioral structure of drivers on the road and have in fact been identified as causes of the fluctuations in velocity that can potentially break down the free traffic flow. Flock centering, on the other hand, reflects the inherently different incentive-structures on the microscopic level of individual agents in the two systems. Whilst an ant or a bird greatly benefits from staying close to its flockmates, a driver faced with a choice between two routes of equal length will always pick the less crowded one. For ants and birds, a sufficiently high density of interacting agents is necessary for collective motion to occur in the first place, whilst on the road a sufficiently high density of vehicles is a necessary condition for the endogenous occurrence of a traffic jam.
Cooperation and mutual interests at the level of individuals hence seem to crucially affect the efficiency of a system. This further underlines the implied conclusion of our previous analysis that incentive-structures might to an extent turn out to be more important for the efficiency of road traffic than macro factors, such as the number of available roads. The introduction of a second class of reference models therefore allows us to further strengthen the results of our analysis.
The next post will make use of the results obtained so far by applying them to the observation of sudden starts and stops in business cycles. In order to do so it will, based on the previous discussion of how road traffic could represent a reference model for aggregate economic behavior, carefully construct analogies between road traffic and the economy that will help to reveal the relevance of these findings for the research question. The goal has to be to formulate a comprehensive hypothesis that can then be tested against the available empirical evidence.
Wednesday, January 11, 2012
Further lessons about road traffic
The last post introduced two fundamental approaches to road traffic modeling. It mainly focused on the choice-based approach, meaning that it discussed papers that study the way in which people choose a particular route and how this affects the performance of the system as a whole. This post focuses on the action-based approach, which is centered around the question of how driving behavior affects the efficiency of road traffic.
Sugiyama et al. argue that factors that are related to the density of cars on the road, such as route-choices or bottlenecks, are "only a trigger and not the essential origin of a traffic jam" (Sugiyama et al.: "Traffic jams without bottlenecks—experimental evidence for the physical mechanism of the formation of a jam", 2008). Clearly, this fits well into our interpretation of action-based approaches to road traffic.
Sugiyama et al. argue that the development of traffic flow, once the average vehicle density surpasses a certain critical point, is crucially dependent on the interaction of vehicles. They back up this claim both theoretically and experimentally. In their paper, road traffic is modeled "as a non-equilibrium physical system consisting of moving particles with asymmetric interaction of exclusive effect".
Their model is based on the observation that there are always fluctuations in the movement of vehicles, as drivers adjust their speed when they see other vehicles in order to avoid collisions. If the vehicle density is sufficiently small, these fluctuations dissipate and the free flow of traffic is ensured. If on the other hand the vehicle density is beyond a certain critical value, the fluctuations can grow steadily and eventually cause a breakdown of the free flow that manifests itself in the formation of a jam. Hence once a critical vehicle density is surpassed, the system mathematically exhibits two solutions: a free flow solution, where all vehicles move at roughly the same velocity, and a jam flow solution, where vehicles are stuck in a cluster. These solutions are essentially not stable over time and the system will alternate between both in irregular patterns.
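The two regimes can be illustrated with the optimal-velocity model of Bando, Sugiyama and co-workers, the model family this line of work builds on. The sketch below uses standard textbook parameter choices, not the paper's exact values:

```python
import math

def ov_spread(n_cars, length, kappa=1.0, dt=0.1, steps=5000):
    """Optimal-velocity model on a circular road (forward-Euler integration).
    Each car accelerates towards a preferred speed V that depends only on the
    gap to the car in front.  Returns the final spread of speeds: close to 0
    in free flow, large once a jam has formed."""
    def V(gap):
        return math.tanh(gap - 2.0) + math.tanh(2.0)
    h0 = length / n_cars              # uniform headway of the unperturbed flow
    x = [i * h0 for i in range(n_cars)]
    v = [V(h0)] * n_cars
    x[0] += 0.1                       # small local perturbation of one car
    for _ in range(steps):
        acc = [kappa * (V((x[(i + 1) % n_cars] - x[i]) % length) - v[i])
               for i in range(n_cars)]
        v = [vi + dt * ai for vi, ai in zip(v, acc)]
        x = [(xi + dt * vi) % length for xi, vi in zip(x, v)]
    return max(v) - min(v)
```

With 30 cars on a ring of length 60 the average headway lies in the unstable regime, and the tiny displacement of a single car grows into a jam; on a ring of length 120 the same perturbation simply dies out. The vehicle density alone decides which of the two solutions the system settles into.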
The authors conducted an experiment on a circular road in order to verify these theoretical results. As predicted, a traffic jam did occur once a certain density level was reached and the velocities of the cars were controlled by drivers (as opposed to automatic velocity control). The following video of the experiment illustrates how a traffic jam emerges as a result of steadily growing fluctuations in velocity, which in turn are caused by individual agents' reactions to other agents.
Building on this result, Manley and Cheng develop a model that describes the development of congestion as an emergent property of road traffic that is rooted in the microcosm of individual drivers' behavior, and they try to identify the circumstances under which a local congestion might lead to a network congestion (Manley, Cheng: "Understanding Road Congestion as an Emergent Property of Traffic Networks", 2008).
In order to do so, they distinguish three different levels, namely the link, junction and network level. The link level refers to roads that are relatively unaffected by junctions or intersections (like the circular road in Sugiyama's experiment). Here, congestion may start through the mechanism that Sugiyama et al. have described, i.e. growing fluctuations that cause delays which in turn spread backwards. But as soon as congestion reaches the junction level, this relative linearity fades, as individuals make choices that may or may not contribute to the spread of the congestion to the network level. Such choices include whether or not to cooperate or to change the route at the junction. These choices determine whether a congestion at the link (i.e. local) level will spread to affect a wider area of the road network, i.e. whether a congestion will spread over many junctions and links to affect the macro network.
Using the Blackwall Tunnel in London as a case study, the authors find that it is especially illegal or uncooperative behavior that contributes to the spread of a congestion from the local to the network level. In other words, it is especially the selfishness of individual drivers that can lead to temporary or even permanent delay.
Summarising what we have learned from these two papers, one can say that action-based approaches to road traffic provide us with another perspective on the emergence of inefficiencies in traffic flows. We went from developing an idea of how traffic jams emerge at the individual level to trying to understand how a local congestion spreads to become a network congestion. This gave us an idea of how traffic jams may depend more on the interactions and choices of drivers whilst on the road than on the actual vehicle density. Both papers might prove highly relevant for the purposes of this project, as they provide a framework for discussing why some local congestions, caused by asymmetric actions and the resulting fluctuations, do actually lead to a congestion at the network level, whilst others do not. Relating this to business cycles, this might be helpful in explaining how the sudden starts and stops of economic activity come about and why some of them do actually reverse the business cycle, whilst others do not.
The next post will bring together the results of this and the previous post. It appears essential to comprehensively identify the factors that contribute to and affect the performance of the system road traffic in order to proceed with our analysis. The next post will also study how the behavior of ants and birds essentially differs from the behavior of road drivers and how this contributes to the efficiency of the respective system.
Friday, December 30, 2011
Different perspectives on road traffic
This post aims to introduce different ways in which road traffic can be modeled. I will focus on outlining the conceptual differences between what I call choice-based and action-based models of traffic and look at two particularly interesting papers in detail.
Most publications primarily focus on the "choose-a-route" problem, meaning that road traffic and congestion as a macroscopic emergence of the system are primarily influenced by the route choices of individuals. This assumes that it is mainly the density of traffic (i.e. the ratio of the number of travelers to the capacity of a particular route) that gives rise to congestion and traffic jams. In other words, once you choose a particular road, there is not much left to be done in terms of decision-making (i.e. driving behavior) in order to avoid getting stuck in a jam. This appears sensible, as it matches the intuition that every individual driver is intrinsically dependent on the behavior of other drivers and that his influence on the overall system should be very limited.
Yet, it is also interesting to observe how, given a certain density level, different driving styles can yield very different outcomes in terms of the efficiency of the traffic system. Kai Nagel has done some interesting work on the life-times of simulated traffic jams that might become particularly relevant when we try to draw conclusions about business cycles, i.e. which policies might be most effective when trying to get the economy out of a recession, say.
For now, I want to introduce two papers. The first, "Individual Adaption in a Path-Based Simulation of the Freeway Network of Northrhine-Westfalia" by Kai Nagel, examines how individuals learn and adapt their route-choices based on the past performance of their respective choices, and how the system evolves accordingly. The second, "How Individuals Take Turns: Emergence of Alternating Cooperation in a Congestion Game and the Prisoner's Dilemma" by Dirk Helbing, Martin Schoenhof, Hans-Ulrich Stark and Janusz A. Holyst, takes a game-theoretical approach to observe how people's choices in a two- and higher-dimensional set-up might or might not yield socially efficient equilibria. Both papers give a nice intuition about how we might want to model the choose-a-route problems associated with road traffic.
Here is the basic set-up of the choice-based models. All roads can be thought of as having a certain comfort limit L. If the number of cars on the road exceeds this comfort limit L, the road gets uncomfortable to be on, i.e. a jam or at the very least traffic delays occur. Drivers choose particular routes, essentially trying to outguess everyone else. The essential bit is that they won't know what the correct decision, i.e. the route that minimizes the travel time from A to B, is until it is too late.
Nagel's paper introduces a similar set-up. Using the freeway network of the German Land Northrhine-Westfalia, a simulation is built where there are many travelers with different origin-destination pairs. Each traveler has a choice between 10 different paths (or routes) that connect his origin with his destination. In the course of the simulation, each driver tries every single route and from then on chooses, on a daily basis, the route that has performed best in the past (i.e. the route that minimized the time to get from A to B). The simulation runs 6000 iterations, where each iteration consists of a preparation phase, in which each traveler chooses his path for the day according to past performances, and a traffic microsimulation phase, in which the daily traffic dynamics are simulated and macroscopic phenomena like jams are recorded.
The results of these simulations are somewhat surprising. Nagel finds that the network performance decreases as drivers optimize their individual routes and settle down on the route which is most convenient for themselves. In other words, as individuals learn and adopt strategies that optimize their own performance, the aggregate system performs worse. This is partly due to the fact that jams are "equilibrated", meaning that the fastest ways around congested areas tend to vanish over time.
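A drastically simplified toy version of this adaptation scheme (my own caricature, not Nagel's simulation: a linear congestion cost, identical travelers, and memory of the last experienced travel time only) already shows the direction of the result:

```python
def nagel_toy(n=100, routes=5, days=60):
    """Toy flavour of Nagel's adaptation scheme: each traveler remembers the
    last travel time experienced on each route; after trying every route once,
    she always takes the route with the best memory (ties broken by route
    index).  Travel time on a route is 1 + load/10, so the balanced split is
    the system optimum."""
    memory = [[0.0] * routes for _ in range(n)]
    daily_avg = []
    for day in range(days):
        if day < routes:                        # exploration: balanced round-robin
            choice = [(day + t) % routes for t in range(n)]
        else:                                   # exploitation: best remembered route
            choice = [min(range(routes), key=lambda r: memory[t][r])
                      for t in range(n)]
        load = [choice.count(r) for r in range(routes)]
        time = [1.0 + load[r] / 10.0 for r in range(routes)]
        for t in range(n):
            memory[t][choice[t]] = time[choice[t]]
        daily_avg.append(sum(time[c] for c in choice) / n)
    return daily_avg
```

During the balanced exploration phase every traveler needs 3.0 time units, which is the system optimum; once everyone exploits his memory, the travelers herd onto whichever route is remembered as best and the average travel time jumps to 11.0. Nagel's heterogeneous agents behave far less pathologically, but the direction is the same: individually rational adaptation degrades the network.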
Dirk Helbing et al. take a different approach and yet draw somewhat similar conclusions.
Helbing et al. conducted experiments in which the test persons were instructed to choose between route 1, which corresponded to a freeway, and route 2, which represented a side road. Participants were told that if more than half of them chose route 1, everyone would receive 0 points. If on the other hand exactly half chose route 1, they would receive the maximum average of 100 points, but 1-choosers would profit at the cost of 2-choosers. The paper introduces the route-choice game as a "multi-stage symmetrical N-person single commodity congestion game". What exactly does this mean?
"Multi-stage": The game is played several times.
"Symmetrical": The payoffs associated with a certain strategy depend only on the other strategies that are played, not on who plays them.
"Single commodity": The single available commodity is space on the road.
"Congestion game": The payoff for each player depends on the resource that he chooses and on the number of players choosing the same resource.
Previous research on congestion games has shown that there always exists a Wardrop equilibrium. It is characterized by the property that no driver can decrease her travel time by choosing a different route and that travel times on all used routes are roughly the same. Yet, the Wardrop equilibrium does not generally reach the system optimum, i.e. minimize the overall travel time. According to the Braess paradox, additional streets might even increase the overall travel time. Note that this can be seen to correspond to Nagel's findings.
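The Braess paradox can be checked by hand on the standard textbook network: 4000 drivers travel from A to B, either via C or via D; the links A-C and D-B take x/100 minutes when x drivers use them, while the links C-B and A-D take a fixed 45 minutes.

```python
def braess_times(n=4000):
    """Travel time per driver in the classic Braess network, before and after
    a zero-cost shortcut C-D is added."""
    # Without the shortcut the two routes A-C-B and A-D-B are symmetric, so
    # the Wardrop equilibrium splits the drivers evenly: no one can do better.
    before = (n / 2) / 100 + 45
    # With the free shortcut, A-C-D-B dominates both old routes for every
    # driver (x/100 <= 40 < 45 for any split), so all n drivers end up on it.
    after = n / 100 + 0 + n / 100
    return before, after
```

The function returns (65.0, 80.0): after the free shortcut is added, every driver needs 80 minutes instead of 65, even though no single driver can gain by deviating (either old route would now take 85 minutes). The new street makes everyone worse off.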
Nevertheless, Helbing et al. find that oscillatory cooperation (the notion of taking turns in order to maximize the system outcome) "can still emerge in route choice games with more than 2 players after a long period (rarely within 300 iterations)". They also find, though, that "emergent cooperation is unlikely to appear in real traffic systems" under current conditions. The paper suggests that an automated route guidance system, where drivers get individual and on average fair route choice recommendations based on the current traffic situation, might help to reach the system optimum.
So looking at two papers that model road traffic from complexity science's point of view, we learn that the system might actually become less efficient as travelers adopt strategies that optimize their individual performance. Game theory tells us that the introduction of more road space might actually further worsen the system's performance. All this might have very interesting consequences for our study of aggregate economic behavior. The next step is to look at models that try to measure how the behavior of individuals, once they have chosen a particular road, might affect the efficiency of the overall traffic system. Bringing both perspectives together will hopefully give us an idea of the key factors that yield traffic inefficiencies, which in turn will help us when comparing road traffic to other systems that exhibit complex behavior as well, such as the behavior of ants and birds.
Friday, December 16, 2011
Road traffic as a reference model for aggregate economic behavior
Can we assume road traffic to represent a reference model for aggregate economic behavior? I will not focus very much on justifying that road traffic exhibits complex behavior as well. I will instead focus on justifying the step of introducing it as a reference model. In order to establish this step, I will be looking at incentives and structures that might be similar in both systems.
Dirk Helbing argued in his paper "Modeling and Optimization of Production Processes: Lessons from Traffic Dynamics" (2003) that we can draw parallels between road traffic and production processes. Helbing particularly mentions "the presence of moving entities (persons or objects) which interact in a non-linear way" and the presence of a "competition for limited resources" (such as capacity, time or space) to justify this approximation. I want to argue that we can extend this approximation by assuming road traffic to represent a reference model for the whole economy, of course bearing in mind the simplifications that have to be made in order to justify this step.
Optimization is a key incentive present in both systems. Consumers, firms and governments aim to optimize their respective key parameters, i.e. their utility, profits or social welfare. In order to do so, they try to gain competitive advantages by making use of past experiences. Participants in road traffic usually aim to optimize the time it takes them to get from A to B (which can be seen as trying to minimize the cost of getting from A to B). This may involve picking a particular time of day for the journey, but assuming that most of the working population does not have the freedom to make this choice, this optimization process is mainly about picking the right route. Neil Johnson puts it this way: road traffic can be seen "as a collection of decision-making objects repeatedly competing...to find the least crowded route from A to B" (Johnson, Simply Complexity). Hence it seems as if the incentive-structures are similar in both systems.
To further see this, imagine the situation of an individual firm in the economy when it has to decide whether or not to enter a particular market. This is similar to an individual driver's decision of whether or not to enter a particular road. In a similar manner, consumers have to decide how much to consume and how much to save for future consumption. In the economy, individuals are constantly faced with "choose-a-route" problems; the difference is simply that in the case of road traffic the number of problems narrows down to one. All these decisions have to do with an optimal allocation of the resources that are available to individual agents.
The character of possible interventions in the system is also similar. Central banks try to influence the economy through monetary policy (such as interest rates), governments try to stimulate activity or slow it down through fiscal policy and regulation. In the same manner, road planners try to optimize traffic flow through speed limits, lights and the general traffic structure.
How do the emerging macroscopic phenomena compare? In the aggregate economy they are visible in the form of sudden starts and stops in economic activity. In road traffic we can observe the sudden appearance and disappearance of traffic jams. A traffic jam as such is an inherent inefficiency of road traffic (just like stops of economic activity or recessions), and much of the following will look at different ways in which complexity science models traffic jams and thereby explains the occurrence of these inefficiencies. Here it might be important to note that this UROP is particularly interested in inefficiencies that arise within the system, meaning that they arise without external influence. It is especially the appearance and disappearance of traffic jams, or the sudden starts and stops, that do not have an external cause (such as construction sites, accidents or a change in monetary policy) that are of particular interest.
So the main parallels that seem to justify the introduction of road traffic as reference model for the aggregate economic behavior seem to be the following:
At this point it might be important to note that this approximation is of course limited. It does not seem to be clear how added value, a key component of economic activity, could arise within road traffic. I do not really have an answer for that.
Even though there might be more points that will make the approximation less powerful, I still believe that it is sufficient for the purposes of this project. I am aiming to introduce various ways in which road traffic can be modeled and extract some key factors that these models lists for the appearance of traffic jam. I will then look at a different complex system, the behavior of ants, birds or a fungus, and see what we can learn from their behavior about inefficiencies in road traffic and in the aggregate economy.
Dirk Helbing argued in his paper "Modeling and Optimization of Production Processes: Lessons from Traffic Dynamics" (2003) that we can draw parallels between road traffic and production processes. To justify this approximation, Helbing particularly mentions "the presence of moving entities (persons or objects) which interact in a non-linear way" and the presence of a "competition for limited resources" (such as capacity, time or space). I want to argue that we can extend this approximation and treat road traffic as a reference model for the whole economy, bearing in mind, of course, the simplifications that have to be made in order to justify this step.
Optimization is a key incentive present in both systems. Consumers, firms and governments aim to optimize their respective key parameters, i.e. their utility, profits or social welfare. In order to do so, they try to gain competitive advantages by making use of past experiences. Participants in road traffic usually aim to minimize the time it takes them to get from A to B (which can be seen as minimizing the cost of getting from A to B). This may involve picking a particular time of day for the journey, but assuming that most of the working population does not have the freedom to make this choice, the optimization process is mainly about picking the right route. Neil Johnson puts it this way: road traffic can be seen "as a collection of decision-making objects repeatedly competing...to find the least crowded route from A to B" (Johnson, Simply Complexity). Hence the incentive structures seem similar in both systems.
To see this further, imagine an individual firm deciding whether or not to enter a particular market. This is similar to an individual driver's decision of whether or not to enter a particular road. In a similar manner, consumers have to decide how much to consume and how much to save for future consumption. In the economy, individuals are constantly faced with "choose-a-route" problems; the difference is simply that in road traffic the number of such problems narrows down to one. All these decisions concern the optimal allocation of the resources available to individual agents.
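The "choose-a-route" problem is simple enough to simulate directly. The sketch below is my own toy illustration, not a model taken from the works cited here: the number of drivers, the congestion cost and the learning rate are all invented for the example. Drivers repeatedly pick whichever of two routes they remember as faster, and the resulting congestion feeds back into those memories.

```python
import random

random.seed(42)
N_AGENTS, N_DAYS = 100, 200
LEARNING = 0.1   # weight placed on the latest observed travel time
EXPLORE = 0.1    # small chance of trying a route at random

# Each driver remembers an estimated travel time for each route
# (random starting beliefs, purely for illustration).
estimates = [[random.uniform(15.0, 25.0), random.uniform(15.0, 25.0)]
             for _ in range(N_AGENTS)]

for day in range(N_DAYS):
    # Pick the route believed to be faster, with occasional exploration.
    choices = [random.randrange(2) if random.random() < EXPLORE
               else (0 if est[0] <= est[1] else 1)
               for est in estimates]
    loads = [choices.count(0), choices.count(1)]
    # Stylized congestion: travel time grows with the number of users.
    times = [10.0 + 0.2 * loads[0], 10.0 + 0.2 * loads[1]]
    # Feedback: each driver updates the memory of the route actually taken.
    for est, c in zip(estimates, choices):
        est[c] += LEARNING * (times[c] - est[c])

print("final loads on the two routes:", loads)
```

With these made-up numbers the population tends to spread itself across both routes: whichever route gets crowded is remembered as slow and shed the next day. The balancing comes from decentralized optimization by individual agents, not from any central planner.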
The character of possible interventions in the system is also similar. Central banks try to influence the economy through monetary policy (such as interest rates), while governments try to stimulate or slow down activity through fiscal policy and regulation. In the same manner, road planners try to optimize traffic flow through speed limits, traffic lights and the general road layout.
How do the emerging macroscopic phenomena compare? In the aggregate economy they are visible in the form of sudden starts and stops of economic activity. In road traffic we can observe the sudden appearance and disappearance of traffic jams. A traffic jam as such is an inherent inefficiency of road traffic (just like stops in economic activity, i.e. recessions), and much of what follows will look at different ways in which complexity science models traffic jams and thereby explains the occurrence of these inefficiencies. Here it is important to note that this UROP is particularly interested in inefficiencies that arise within the system, meaning those that arise without external influence. It is especially the appearance and disappearance of traffic jams, and the sudden starts and stops, that have no external cause (such as construction sites, accidents or a change in monetary policy) that are of particular interest.
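A standard way in which complexity science generates such endogenous jams is the Nagel-Schreckenberg cellular automaton. The sketch below uses parameter values I picked for illustration: cars circulate on a closed ring with no accidents or roadworks, and the only disturbance is a small random hesitation, yet stop-and-go waves still form.

```python
import random

random.seed(1)
ROAD, N_CARS, V_MAX, P_SLOW = 100, 30, 5, 0.3  # circular road of 100 cells

pos = sorted(random.sample(range(ROAD), N_CARS))  # distinct starting cells
vel = [0] * N_CARS

def step(pos, vel):
    """One parallel update of all cars on the ring."""
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % ROAD  # free cells ahead
        v = min(vel[i] + 1, V_MAX, gap)               # accelerate, then brake
        if v > 0 and random.random() < P_SLOW:        # random hesitation:
            v -= 1                                    # the seed of phantom jams
        new_vel.append(v)
    new_pos = [(p + v) % ROAD for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

for _ in range(200):
    pos, vel = step(pos, vel)

# Cars standing still mark a jam that no external event caused.
print("stopped cars after 200 steps:", vel.count(0))
```

At this density the random hesitations amplify into waves of stopped cars that drift backwards along the ring, which is exactly the kind of inefficiency that arises within the system rather than being imposed from outside.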
So the main parallels that seem to justify introducing road traffic as a reference model for aggregate economic behavior are the following:
- The inherent incentive structure: optimization as a key component of both systems
- The nature of the problems that agents face: "choose-a-route" decisions
- The nature and means of institutional interventions are similar
- Emergent phenomena are of similar nature and can arise within the system without any external influence
At this point it is important to note that this approximation is of course limited. For example, it is not clear how added value, a key component of economic activity, could arise within road traffic. I do not really have an answer for that.
Even though there may be more points that weaken the approximation, I still believe it is sufficient for the purposes of this project. I am aiming to introduce various ways in which road traffic can be modeled and to extract some key factors that these models list for the appearance of traffic jams. I will then look at a different complex system, the behavior of ants, birds or a fungus, and see what we can learn from their behavior about inefficiencies in road traffic and in the aggregate economy.
The Economy as a Complex System
This post will use the criteria that I have outlined earlier in order to establish how we can view the economy as a complex system.
- The system contains a collection of many interacting objects, whose behavior is affected by memory or "feedback" (the objects have some capacity for adaptation and learning)
- The system is open, meaning that it can be influenced by its environment
- The system evolves in a highly non-trivial way and is generally far from equilibrium, meaning that in principle anything could happen and, provided we observe the system long enough, it probably will
- The emergent phenomena are not brought about by some central controller
- The system exhibits a mix of ordered and disordered behavior
Summing up, it seems that we can justify describing the economy as a complex system. The next posts will introduce the reference models and look at how we can use the fact that both the economy and the reference models exhibit complex behavior to learn more about hot and cold flushes in business cycles.
Thursday, December 8, 2011
Complexity? Complexity!
What is complexity all about, and why might it help us get a little closer to understanding why the observed hot and cold flushes come about? Some consider Complexity Science to be the science of all sciences. I want to explain why I share this view by briefly introducing Complexity and the phenomenon of emergence, and by hinting at some of the most fascinating aspects of this relatively young discipline.
Complexity science is concerned with the study of phenomena which emerge in a system that consists of a collection of interacting objects. Mostly, these phenomena are not explainable by simply looking at the behaviour of individual objects. What makes complexity so exciting is that we can observe complex, higher order structures emerging from the interaction of objects that are equipped with fairly simple decision rules. Some of the most popular examples of complex systems include the financial markets, our immune system, ecological systems or even the world of quantum physics. In fact, the range of systems that could potentially be subject to study appears endless.
The following set of necessary properties that a complex system should exhibit is proposed by Neil Johnson (Neil Johnson, Simply Complexity). This will be useful for the purposes of this project when trying to identify how both the economy and the introduced reference models fit these criteria.
- The system contains a collection of many interacting objects, whose behaviour is affected by memory or "feedback" (the objects have some capacity for adaptation and learning)
- The system is open, meaning that it can be influenced by its environment
- The system evolves in a highly non-trivial way and is generally far from equilibrium, meaning that in principle anything could happen and, provided we observe the system long enough, it probably will
- The emergent phenomena are not brought about by some central controller
- The system exhibits a mix of ordered and disordered behaviour
Looking at this set, it seems fairly intuitive to restrict study to systems “where we have useful descriptions in terms of rules and laws” (Holland, Emergence), at least for the purposes of this project.
The goal of complexity science is to model, describe and possibly predict the behavior of such systems. Here it is interesting to note that we might not need to fully understand the constituent objects of a system in order to describe how the aggregate behaves, as simple bits interacting in a simple way can lead to a rich variety of higher-order structures and outcomes. This is why complexity science might be so relevant. A lot of academic research has focused on understanding every aspect of a particular problem, but it seems likely that no level of understanding of individual objects will help us explain certain phenomena (for example, understanding individual brain cells might not help to explain the occurrence of Alzheimer's disease).
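A classic non-economic illustration of this point is Conway's Game of Life, where each cell lives or dies purely by counting its live neighbours, yet coherent "objects" such as the glider emerge and travel across the grid. A minimal sketch of the standard rules:

```python
from collections import Counter

def step(live):
    """One Game of Life step; `live` is a set of (x, y) cell coordinates."""
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next step if it has 3 live neighbours,
    # or 2 live neighbours and is alive now.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# The "glider": five cells whose pattern reappears, shifted one cell
# diagonally, every four steps.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

print(state == {(x + 1, y + 1) for x, y in glider})  # → True
```

The glider is a moving higher-order structure that none of the purely local rules mentions, which is exactly the sense in which simple bits interacting in a simple way produce rich aggregate behavior.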
What I find particularly exciting is that a lot of research suggests that partially understanding one system from, say, physics might actually help us increase our understanding of another system from a totally unrelated discipline, say economics. It is this interdisciplinarity that this project will make use of. The work done at the Santa Fe Institute is a prime example of the possible relevance of this approach.
The next step will be to look at the aggregate economy from a complexity point of view. I will apply the above set of criteria in order to justify the claim that the economy can be seen as a complex system. I then hope to justify that both road traffic and the behavior of ants are complex systems that exhibit behavior parallel or contrasting to the observed hot and cold flushes in business cycles. The challenge will be to show that both models can serve as reference models for the economy. This will involve formulating the possible dangers and limitations of such an approximation. Building on that, I will look at the occurrence of inefficient phenomena in both reference systems and hopefully be able to draw some conclusions about the subject of interest.
(If complexity science is a fairly new subject to you, then looking at Mitchel Resnick's active essay "Exploring Emergence" (a link is provided in the sidebar) might be an entertaining way to familiarize yourself with the subject matter. Also check out the homepage of the Santa Fe Institute if you are interested in the whole range of possible applications of the study of emergence.)
Tuesday, November 29, 2011
Hot and Cold Flushes in Business Cycles?
All modern industrial economies experience significant swings in economic activity. The now standard definition of business cycles was provided by Arthur Burns and Wesley Mitchell in "Measuring Business Cycles", where a cycle consists of a time of expansion with increased general economic activity, followed by a time of recession.
Here it is important to note that the term "cycle" is somewhat misleading. In fact, looking at the data, one gets the impression that there are hardly any patterns that would allow us to speak of regularity in either the timing or the duration of upswings and downswings. The figure below shows the business cycle of the US economy from 1955 to 2005. Recessions in the figure are negative deviations from trend. Clearly, both recessions and expansions vary significantly in duration and intensity.
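To make "deviation from trend" concrete, here is a small sketch of one simple way to compute it. The GDP series below is entirely invented (it is not the US data behind the figure): I fit a linear trend to log GDP by ordinary least squares and read off percentage deviations, which is only one of several detrending methods used in practice.

```python
import math

# Hypothetical quarterly GDP levels: steady trend growth plus an
# invented cyclical wobble (illustration only, not real data).
quarters = list(range(80))
gdp = [100.0 * math.exp(0.005 * t + 0.02 * math.sin(t / 4.0))
       for t in quarters]

# Fit log(GDP) = intercept + slope * t by ordinary least squares.
logs = [math.log(y) for y in gdp]
n = len(quarters)
t_bar = sum(quarters) / n
y_bar = sum(logs) / n
slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(quarters, logs))
         / sum((t - t_bar) ** 2 for t in quarters))
intercept = y_bar - slope * t_bar

# Percentage deviation from trend: the "cycle" the figure plots.
deviations = [100.0 * (y - (intercept + slope * t))
              for t, y in zip(quarters, logs)]
print("largest deviation from trend: %.2f%%" % max(deviations))
```

The turning points of this deviation series are the "hot and cold flushes" discussed below; with real data the wobble is of course far less regular than the sine term used here.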
This UROP will focus on the varying magnitude and the volatility of this curve. Starts and stops of economic activity seem to come about very suddenly and seem to vary in their impact. We can see hot and cold flushes, represented by the many turning points of the curve. Some of them eventually turn the economy around (the points of largest deviation in a particular period), but most seem to disappear more or less immediately after they occur.
This UROP will take a complexity point of view on the occurrence of these hot and cold flushes and ask what it can teach us about how we might interpret, and improve our handling of, such events. I want to discuss how we can view the economy as a complex system and how introducing road traffic and the behavior of ants and birds as reference models can help us improve our understanding of aggregate economic behavior.
[Figure: Deviation from long-term growth in the US GDP]