Advancing Economic Forecasting and Risk Analysis Models to Meet Demands of the 4IR

Managing dynamic complexity will pave the way for a more prosperous and efficient economy that operates with less risk and better predictability.

Economic systems that operate through a constantly expanding number of dynamic interactions have become too complex to fully represent using popular econometrics and economic modeling methods. When multiple functions of the economy are connected to each other explicitly or through complex topology, the mathematics of these methods grows too complex to deliver reliable predictions. History has shown that an economic crisis, mathematically identified as a singularity, may occur even when classic estimations fail to expose a risk.

From our experience, it is clear that new tools and methodologies are needed to augment traditional risk analysis practices in order to take back control of financial systems. The intended goal is to guide market fluctuations toward desired outcomes rather than merely react to them, including the ability to execute a complete system disruption if and when the dynamics of financial markets no longer meet the goals of society.

Lack of predictability and hyper-risk within financial systems

Today’s economists borrow their risk modeling methods from the core tenets of physical science, with a heavy reliance on the concept of entropy to deal with the volatility of financial systems. While these methods once made it easier to represent the approximate behavior of financial systems, they have become meaningless in today’s complex environment.

The current lack of predictability and hyper-risk within financial systems is a direct result of dynamic complexity. Dynamic complexity should be considered the enemy of the digital age because its presence erodes stability, hides risk and creates waste within any system that is influenced by the external environment in which it operates. Like an undiagnosed cancer, dynamic complexity consumes valuable resources as it grows without providing any useful return. Traditional forecasting methods are unable to quantify the impact of dynamic complexity, so surprises are inevitable: no one can predict how one small change will produce a ripple effect of unintended consequences.

In and of itself, the expansion of subprime loans in the early-to-mid 2000s was considered a manageable risk by economists, policymakers and financial practitioners who took advantage of the Fed’s interest-rate-reduction monetary policy to promote new financial instruments, e.g. subprime loans and the trading of mortgage-backed securities (MBS). Even as the collapse of the US economy was already underway, the Fed’s main economic model saw a less-than-5-percent chance that the unemployment rate would rise above 6 percent within two years. The rate actually hit 10 percent, an event that the model said was close to impossible, and therefore one that policymakers never considered a plausible risk.

Crises move faster than decisions

As the pace of innovation accelerates in the Fourth Industrial Revolution (4IR), the cause and effect of economic crises will become nearly instantaneous. Governments and businesses can no longer afford to wait for the warning signs of economic turmoil before taking action. To regenerate prosperity for the betterment of humanity, economists, policymakers and financial practitioners must employ new problem-solving approaches. Eliminating the waste and managing the risk caused by dynamic complexity will deliver new opportunities for growth and sustainability. But first, economic stakeholders must tame dynamic complexity and simplify decision cycles.

In the research paper entitled “Advancing Economic Forecasting and Risk Analysis Models to Meet the Speed, Risk and Sustainability Demands of the 4IR,” URM Forum proposes a new approach to economic forecasting and predictive analysis that allows users to uncover a wide scope of unknown risks. Armed with this knowledge, economic stakeholders will be able to quickly take action with confidence in the outcomes, before imbalances proliferate through financial markets and interdependent subsystems. We believe the suggested use of tensors provides the right formulation to overcome the misapplication of entropy and scalar quantities within economics, which neglects the direction of market changes. The use of aggregated vectors allows users to more accurately reproduce both the magnitude and the direction of market behaviors, which in turn allows economists, policymakers and practitioners to predict a wider range of risks and vet which corrective actions will yield the best results.


Learn more about our Economic Forecasting and Risk Analysis Modeling research

Read the “Advancing Economic Forecasting and Risk Analysis Models” paper to learn how economists, policymakers and practitioners can use our proposed UDE method to predict a wider range of risks and vet which corrective actions will yield the best results.

Modeling Economic Dynamics

Examining the Problems with Traditional Risk Modeling Methods

Traditional financial risk management methods were formulated by analogy with the early foundational principles of thermodynamics. However, traditional economic models are incomplete models of reality because economic systems, much like meteorological and most nuclear or gravitational systems, are not inclined to attain equilibrium states except over very short windows of time.

Problems with risk modeling methods based on the laws of thermodynamics:

  • Predictability is limited to short windows in which the initial conditions vary with small amplitudes and at small frequencies (see the sketch after this list)
  • Complexities are dealt with once recognized, rather than as the result of structural evolution and the systemic behavior of multi-level interactions
  • Only closed systems that reach equilibrium are dealt with; no ability to adapt to an external or internal modification is allowed
  • Complex systems do not systematically exhibit equilibrium
  • Stochastic models that deal with randomness have difficulty capturing small resonances and therefore do not tend toward a long-term representation
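To make the first limitation concrete, below is a minimal, hypothetical Python sketch; the logistic map stands in for any nonlinear dynamic and is not one of the economic models discussed here. Two runs whose initial conditions differ by one part in a billion agree only for a short window before diverging beyond any useful forecast tolerance:

```python
# Toy illustration (not an economic model): sensitivity to initial
# conditions limits the window over which prediction is reliable.
def logistic(x, r=3.9):
    """One step of a simple non-linear map in its chaotic regime."""
    return r * x * (1.0 - x)

a, b = 0.200000000, 0.200000001   # nearly identical initial conditions
for step in range(60):
    if abs(a - b) > 0.01:         # divergence beyond forecast tolerance
        print(f"trajectories diverge after {step} steps")
        break
    a, b = logistic(a), logistic(b)
```

Beyond that short horizon the two trajectories are effectively unrelated, which is exactly the regime in which equilibrium-based extrapolation fails.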

A New Way to Look at the Economy and Risk

Financial systems are not wholly physical. They do not always behave as predicted from their patterns of past behavior. They are immature. They can sometimes exhibit unexpected and unknown behavior because we do not understand their complexity or how it changes.

To avoid future crises of the proportions of 2008, we must identify new methods of economic risk analysis that more accurately model the dynamic reality of financial systems. To this end, we promote determinism, which is the view that every event, including human cognition, behavior, decision, and action, is causally determined by an unbroken sequence of prior occurrences.

Determinists believe the universe is fully governed by causal laws resulting in only one possible state at any point in time. Pierre-Simon Laplace’s theory is generally referred to as “scientific determinism” and is predicated on the supposition that all events have a cause and effect, and that the precise combination of events at a particular time engenders a particular outcome.

How the impact of dynamic complexity leads to economic non-equilibrium:

  • Different instruments within a portfolio have different dynamic patterns and evolution speeds, producing different impacts on risk
  • The instruments also influence each other: sharing, affecting, and operating in terms of both the frequency and the amplitude of the behavior of discriminant factors (econometrics, the relation between economy and finance, long-term repercussions, etc.)
  • In addition, each will react and interact differently toward an external or internal event (a toy illustration of this coupling follows the list)
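As a concrete illustration of this coupling, here is a minimal, hypothetical Python sketch; the coupling matrix and decay factor are invented for illustration and do not come from the paper. A shock applied to one instrument propagates to the others through their mutual dependencies, so no instrument settles on its own:

```python
import numpy as np

# Hypothetical toy: three instruments whose changes feed back into each
# other through a coupling matrix C, so a shock to one contaminates all.
C = np.array([[0.0, 0.3, 0.1],     # row i: how much instrument i is
              [0.4, 0.0, 0.2],     # driven by the others' changes
              [0.2, 0.5, 0.0]])
decay = 0.7                        # each instrument's own mean reversion

x = np.array([1.0, 0.0, 0.0])      # initial shock to instrument 0 only
for t in range(5):
    x = decay * x + C @ x          # own dynamics plus cross-influence
    print(f"t={t+1}: {np.round(x, 3)}")
```

Even though each instrument is individually mean-reverting, the cross-influence terms keep redistributing the shock, which is the mechanism behind the non-equilibrium behavior described above.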

Consequently, modeling economic dynamics is the right foundation to ensure the predictability of such self-organized evolutionary systems, which may evolve toward several singularity points and involve far more degrees of freedom than the small number assumed by traditional methods.

Using this method, we will be able to address most of the drawbacks of the traditional methods:

  • Both the need for predictable determinism and the intensive presence of a high level of dynamic complexity justify the use of Perturbation Theory
  • The condition for approaching an exact solution at any moment in time relies on deconstruction theory, which separates the constituents and finds the proper mathematical expression of each prior to the deployment of the perturbed expression (i.e. a two-level solution, sketched after this list)
  • The evolutionary process guarantees a wider window of representativeness and adaptability for dynamic complexity economics
  • The solution tends toward an exact solution
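The two-level idea can be made concrete with a minimal, hypothetical Python sketch; the exponential base dynamics, coupling matrix and perturbation strength are illustrative assumptions, not the paper's formulation. Level one solves each constituent in isolation; level two adds a first-order correction for the influence of the other constituents:

```python
import numpy as np

def base_solution(x0, rate, t):
    """Level 1: each constituent's exact, isolated dynamics."""
    return x0 * np.exp(rate * t)

def perturbed_solution(x0s, rates, eps, coupling, t):
    """Level 2: isolated solutions plus a first-order coupling correction."""
    base = np.array([base_solution(x0, r, t) for x0, r in zip(x0s, rates)])
    return base + eps * (coupling @ base)    # perturbation term

x0s, rates = [1.0, 2.0], [0.05, -0.02]       # two constituents (assumed)
coupling = np.array([[0.0, 0.5],             # mutual influence (assumed)
                     [0.3, 0.0]])
print(perturbed_solution(x0s, rates, eps=0.1, coupling=coupling, t=1.0))
```

The design choice mirrors the deconstruction step: get each constituent's expression right first, then perturb, rather than fitting the aggregate in one pass.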

Table: Dynamic Complexity versus Traditional Economics

  • Dynamic complexity economics is open, dynamic, non-linear and out of equilibrium; traditional economics is closed, static, linear and in equilibrium
  • In dynamic complexity economics, each constituent of the system is modeled individually and then aggregated through Perturbation Theory; in traditional economics, the system is modeled collectively in one step
  • Dynamic complexity economics makes no separation between micro- and macro-level behaviors; traditional economics separates them
  • The evolutionary process gives dynamic complexity economics a wider window of representativeness and adaptability; traditional economics is unstable over wider windows of time
  • Dynamic complexity economics allows for continuous interactions of external and internal agents; traditional economics does not
  • Optimal control is possible as a by-product of dynamic complexity modeling; it is not possible in traditional economics

Conclusion

From a scientific standpoint, the subject of financial dynamics and the best risk analysis method remains open; further mathematical, physical and engineering developments, as well as advances in economic risk analysis, are necessary. A great body of contributions already exists, covering a wide spectrum of perspectives and expertise, from the deeply theoretical to the profoundly pragmatic. All show the interest in, and the urgency of, finding a solution that can help us avoid singularities like the one that occurred in 2008. To progress, we must continuously seek to recognize the failures of past methods and strive to find solutions.

Reconstructing the 2007-2008 Financial Crisis

Using X-Act® OBC Platform, we were able to reconstruct the global financial singularity of 2007 and 2008. Obviously the constructed solution came too late to spare the world from economic turmoil. It is not our intent to join the after-event agents of prophecy. Rather our goal is to use a scientific approach to reverse engineer what happened and in doing so prove the usefulness of mathematical emulation as a preventative solution.

Financial Dynamics in 2007

To analyze the root cause of the 2007-2008 financial crisis, we built a mathematical emulator that represented the financial market dynamics prior to the crash. This included the financial engines, the dynamic flows among them and, explicitly, the dependencies on the internal structure and on the external influencers that impact market performance.

In building the mathematical emulator, we put particular emphasis on the first category of direct dependencies (in couples: edges, vertices or a mix), as well as on the indirect dependencies, since each and every part of the first category could itself be influenced by the already-impacted participating components.

A perturbed structural model can be mathematically expressed in the form of participating inequalities. Each inequality contributes, through a different amplitude and frequency, to the overall solution. Mathematically based solutions of this class are generally validated through three criteria (a toy scoring sketch follows the list):

  1. The solution should be representative of the process dynamics
  2. The solution should be accurate with respect to a real environment outcome with identical initial conditions
  3. The solution should allow a predictable precision that provides sufficient confidence in decision making and execution under different initial conditions
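The three criteria can be scored mechanically. Below is a minimal, hypothetical Python sketch; the series, the metric choices (correlation for representativeness, mean relative error for accuracy, error dispersion for precision) and all figures are illustrative assumptions rather than the validation procedure actually used:

```python
import numpy as np

# Hypothetical check of an emulator against observed outcomes produced
# under identical initial conditions (all numbers invented).
observed = np.array([2.0, 2.1, 2.4, 3.0, 4.1])   # e.g. foreclosure rate, %
emulated = np.array([2.0, 2.2, 2.5, 2.9, 3.9])

representativeness = np.corrcoef(observed, emulated)[0, 1]        # criterion 1
accuracy = 1.0 - np.mean(np.abs(emulated - observed) / observed)  # criterion 2
precision = np.std(emulated - observed)                           # criterion 3

print(f"representativeness={representativeness:.3f}, "
      f"accuracy={accuracy:.3f}, precision=+/-{precision:.3f}")
```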

Mathematically speaking, we can consider that the coupled business dynamics and initial conditions will express and produce speeds, while the influencers will provoke accelerations. If we project these principles onto the financial meltdown of 2007, we find that the system was stable (at least apparently) until the foreclosure rate went from 2 to 3 percent, a relative increase of more than 50% concentrated in subprime mortgages, which represented 10.5% of the US housing market and were supposedly a low-risk financial instrument. As is the case with mortgages, this loss was not distributed; the full amount represented a direct loss to the financial institutions.
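The arithmetic behind that trigger is worth making explicit, using only the figures quoted above:

```latex
\text{relative jump in foreclosures} \;=\; \frac{3\% - 2\%}{2\%} \;=\; 50\%,
\qquad
\text{subprime share of the housing market} \;=\; 10.5\%
```

In other words, an absolute move of a single percentage point, concentrated in roughly a tenth of the market, was enough to destabilize the whole system once the dependencies amplified it.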

The Singularity is Precipitated by a Heating of the Market

Then-Treasury Secretary Henry Paulson said, “The housing bubble was driven by a big increase in loans to less creditworthy, or subprime, borrowers that lifted homeownership rates to historic levels.” But this explanation alone is not sufficient to explain the collapse of the whole financial system based solely on foreclosure propagation.

Our discovery of dynamic complexity through mathematical emulation allowed us to point clearly to the real cause: market dynamics produced a singularity, and dynamic complexity drove the crisis. This fact was not identifiable by the risk management and mitigation methods used pre-crisis. In short, the housing crisis was quickly overshadowed by a much bigger crisis caused by the dependencies of intertwined financial structures (connected through financial instruments such as mortgage-backed securities) that, by design or by accumulation, caused the market collapse.

In other words, if someone wanted to design a system that would lead to a devastating crisis, the financial system of 2006 through 2008 was the perfect example of doing just that. Using a similar structure with the same level of dynamic complexity, you could replace housing with credit cards as the predominant financial instrument, back it with securities (along the lines of mortgage-backed securities), and you would have the same recipe for disaster.

Using X-Act® OBC Platform service, we were able to model how the crisis propagated from mortgages through home equity origination to contaminate the whole financial market with a much larger amplitude than the variance in home mortgage interest rates over 20 years. Mortgage and housing market conditions create cycles of economic crises that repeat approximately every six years. However, the singularity that surprised even the U.S. Federal Reserve Chairman in 2007 held much deeper and wider causes for the collapse, namely the effects of dynamic complexity, than was the case in previous cycles.

The Singularity Hits When the System Becomes Out of Control

We used the causal emulation step of our methodology to measure the impact of the system contamination agent, which had been identified as mortgage-backed securities (MBS). After MBSs hit the financial markets, they were reshaped into a wide variety of financial instruments with varying levels of risk. This bundling of activities blurred the traceability of the original collateral assets. Interest-only derivatives divided the interest payments made on a mortgage among investors. When interest rates rose, the return was good. If rates fell and homeowners refinanced, then the security lost value. Other derivatives repaid investors at a fixed interest rate, so investors lost out when interest rates rose since they weren’t making any money from the increase. Subprime mortgage-backed securities were created from pools of loans made to subprime borrowers. These were even riskier investments, but they also offered higher dividends based on a higher interest rate to make the investment more attractive to investors.
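To illustrate why falling rates hurt the interest-only holders mentioned above, here is a minimal, hypothetical Python sketch; the balance, coupon, discount rate and refinancing date are invented figures, not data from the crisis. An interest-only strip is worth the discounted stream of interest payments, and that stream is cut short when a refinancing wave prepays the underlying mortgage:

```python
# Toy valuation of an interest-only (IO) strip (all figures invented).
def io_strip_value(balance, coupon, years, discount, refinance_year=None):
    """Present value of annual interest payments, truncated at refinancing."""
    horizon = years if refinance_year is None else min(years, refinance_year)
    return sum(balance * coupon / (1 + discount) ** t
               for t in range(1, horizon + 1))

full = io_strip_value(100_000, 0.06, 30, 0.05)          # rates stay high
cut_short = io_strip_value(100_000, 0.06, 30, 0.05, 5)  # refinanced in year 5
print(f"no refinancing: {full:,.0f}  after a refinancing wave: {cut_short:,.0f}")
```

The same truncation logic, applied across millions of loans, is what turned a rate move into a portfolio-wide loss of value.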

In August 2008, one out of every 416 households in the United States had a new foreclosure filed against it. When borrowers stopped making payments on their mortgages, MBSs began to perform poorly. The average collateralized debt obligation (CDO) lost about half of its value between 2006 and 2008. And since the riskiest (and highest-returning) CDOs were composed of subprime mortgages, they became worthless when nationwide loan defaults began to increase. This was the first domino in a series that fell throughout the U.S. economy.

How MBSs Brought Down the Economy

When the foreclosure rate began to increase late in 2006, it released more homes on the market. New home construction had already outpaced demand, and when a large number of foreclosures became available (representing up to 50% of the subprime mortgages) at deeply discounted prices, builders found that they couldn’t sell the homes they’d built. A homebuilder can’t afford to compete with foreclosures at 40 percent to 50 percent off their expected sales price. The presence of more homes on the market brought down housing prices. Some homeowners owed more than their homes were worth. Simply walking away from the houses they couldn’t afford became an increasingly attractive option, and foreclosures increased even more.

Had a situation like this taken place before the advent of mortgage-backed securities, a rise in mortgage defaults would still have created a ripple effect on the national economy, but possibly without reaching a singularity. It was the presence of MBSs that created an even more pronounced effect on the U.S. economy and made escaping the singularity impossible.

Since MBSs were purchased and sold as investments, mortgage defaults impacted all dimensions of the financial system. Huge investment banks with large, predominant MBS positions saw their net worth sink as the MBSs began to lose value. This was the case with Bear Stearns. The giant investment bank’s worth sank enough that competitor JPMorgan purchased it in March 2008 for $2 per share. Seven days before the buyout, Bear Stearns shares traded at $70[2].

Because MBSs were so prevalent in the market, it wasn’t immediately clear how widespread the damage from the subprime mortgage fallout would be. Throughout 2008, new write-downs of billions of dollars on one institution or another’s balance sheet made headlines daily and weekly. Fannie Mae and Freddie Mac, the government-chartered corporations that fund mortgages by guaranteeing them or purchasing them outright, sought help from the federal government in August 2008. Combined, the two institutions owned about $3 trillion in mortgage investments[3]. Both are so entrenched in the U.S. economy that the federal government seized control of the corporations in September 2008 amid sliding values; Freddie Mac posted a $38 billion loss from July to August of 2008[4].

When Fannie Mae and Freddie Mac won’t lend money or purchase loans, direct lenders become less likely to lend money to consumers. If consumers can’t borrow money, they can’t spend it. When consumers can’t spend money, companies can’t sell products; low sales mean lessened value, and so the company’s stock price per share declines. So, on one side, the capital market tightens because of the MBSs and CDOs; on the other, corporations suffer as consumers lessen their consumption and as money and credit gradually tighten. Businesses then trim costs by laying off workers, so unemployment increases and consumers spend even less. When enough companies (not only banks and other financial institutions but also corporations and, finally, investors) lose value at once, the stock market crashes. A crash can lead to a recession. A bad enough crash can lead to a depression; in other words, an economy brought to its knees[5].

Preparing to Avoid the Next Financial Singularity

The predictive emulation shows us that an excessive integration of financial domains, undertaken without understanding how dynamic complexity will be generated, is the equivalent of creating a major hidden risk that will always surprise everyone involved, either because the predictive tools can’t easily identify the role dynamic complexity plays, or because of imprudent constructs that seem to be acceptable options, as if the world were flat and the dynamics linear. Both assumptions are obviously wrong. Science teaches us the concept of determinism (all events, including human action, are ultimately determined by causes external to the will).

Undoubtedly, financial markets will continue to pose grave risks to the welfare of the global economy as long as economic stakeholders are unable to accurately measure dynamic complexity or understand all the steps required to protect their assets. We must test and expand upon the presented universal risk management methods to define a better way forward, one that allows economic stakeholders to predict a potential singularity with sufficient time to act to avoid crises of this magnitude in the future.

Dynamic Complexity’s Role in the 2007-2008 Financial Crisis

After the economic events of 2007 and 2008, many economic experts claimed that they had predicted that such a disaster would occur, but none were able to preemptively pinpoint the answers to key questions that would have helped us prepare for such an event or even lessen its impacts, including: When would it occur? What would be the cause? How would it spread? And how widely would its impacts be felt?

The then-U.S. Treasury Secretary, Henry Paulson, recognized that the credit market boom obscured the real danger to the economy. Despite all the claims of knowing the true triggers of the economic recession, we believe the importance of dynamic complexity has been overlooked in everyone’s analysis. The real cause of the economic meltdown can be traced to intertwined financial domains, which generated considerable dynamic complexity that in turn made it difficult to determine the possible outcomes. There is no doubt that the subprime foreclosure rate started the domino effect, but had that degree of inter-domain dependency not pre-existed, the effect on the market would have been much less severe.

While some seasoned experts have alluded to the same conclusion, most have considered that the market complexity (in a broad and immeasurable sense) played a significant role in creating the risk, which ultimately caused a global recession. But most conclude that the aggregate risk of complexity was not necessarily something that the market should be able to predict, control and mitigate at the right time to avoid the disaster.

While dynamic complexity can be identified after the fact as the origin of many unknowns that ultimately lead to disaster, most financial services and economic risk management models accept market fluctuations as something that is only quantifiable based on past experience or historical data.  However, the next economic shock will come from a never-seen-before risk. And the distance between economic shocks will continue to shrink as banks add more technology and more products/services, further compounding the inherent risk of dynamic complexity.

A Better Path Forward

Revealing the unknowns through the joint power of deconstruction theory and mathematical perturbation theory allows for the determination of potential cause origins (allowing the evolution to happen as reactions to the influencers’ changes) and helps to predict the singularity/chaos point and the distance to such a point in time. As we privilege determinism, we consider that any observation points to a cause and that such a cause should be discoverable by the tools we possess. Openly, we are trying to convince you that, “If we know the cause, then we will be able to predict when it will occur, the severity of the risk and what the amplitude of a possible singularity may be.” This will then afford us the time needed to mitigate the risk.


Figure 1. Spread between 3-month LIBOR and 3-month Expected Federal Funds Rate (January 2007 – May 2008 daily)

By reviewing graphs of the financial market from 2007 to 2008, we discovered that market changes happened at the vertices as well as at the edges, as we would normally expect. The example in Figure 1 illustrates part of the story.

According to Stephen G. Cecchetti[1], the divergence between the two rates is typically less than 10 basis points. This small gap arises from an arbitrage that allows a bank to borrow at LIBOR (the London Interbank Offered Rate), lend for three months, and hedge the risk that the comparable overnight index swap (OIS) rates will move in the federal funds futures market, leaving only a small residual level of credit and liquidity risk that accounts for the usually small spread. But on August 9, 2007, the divergence between these two interest rates jumped to 40 basis points.
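The stress signal is easy to compute once the two series are in hand. Below is a minimal, hypothetical Python sketch; the daily quotes are invented for illustration, with only the 10-basis-point normal band and the 40-basis-point jump on August 9, 2007 taken from the account above:

```python
# Hypothetical sketch: LIBOR-OIS spread in basis points, flagging days
# that break out of the normal band (sample quotes are invented).
libor = {"2007-08-07": 5.36, "2007-08-08": 5.37, "2007-08-09": 5.50}
ois   = {"2007-08-07": 5.27, "2007-08-08": 5.28, "2007-08-09": 5.10}

NORMAL_BAND_BP = 10                             # typical spread per Cecchetti
for day in libor:
    spread_bp = (libor[day] - ois[day]) * 100   # percentage points -> bp
    flag = "  <-- stress signal" if spread_bp > NORMAL_BAND_BP else ""
    print(f"{day}: spread = {spread_bp:.0f} bp{flag}")
```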

The problem lies in the worst case. Each vertex and each edge directly connects to every other vertex and every other edge, and therefore represents the direct effects covered by perturbation theory, as presented in Figure 2. But, because each one is perturbed, the analysis will not be sufficient to understand the full picture if we do not add the indirect effects on the direct ones. This points precisely to the difference between Paulson’s analysis and ours.


Figure 2. Schematic Representation of Financial Market Dependencies and Crisis Points

In short, Paulson attributed the crisis to the housing bubble, whereas we attribute it to dynamic complexity, which includes multiple dependencies within the whole market: the housing market, equity, the capital market, corporate health, and banking solvency, which in turn impacted the money supply, causing massive unemployment and a severe recession.

A major result of our analysis was still not obvious or entirely elucidated when Henry Paulson expressed his own analysis. Indeed, the housing bubble precipitated the crisis, but the real cause was a large proportion of dynamic complexity hidden in the overall construct of the financial system. This means that any regulation, organization and, consequently, surveillance of the system should measure the impact of dynamic complexity if we hope to adequately predict and mitigate its risk.

[1] Cecchetti, Stephen G. “Crisis and Responses: The Federal Reserve in the Early Stages of the Financial Crisis.” Journal of Economic Perspectives, vol. 23, no. 1, Winter 2009, pp. 51-75.