Risk Management in the Age of Digital Disruption

Today, as the repercussions of the Fourth Industrial Revolution take hold, many corporations are battling hidden risks engendered by obsolete cost structures, outdated business platforms and practices, market evolution, shifting client perceptions and pricing pressures that demand some form of action. But change is often a difficult choice, as it implies a degree of unknown risk.

Current risk management practices, which deal mostly with the risk of recurring historical events, cannot help business, government or economic leaders deal with the uncertainty and rate of change driven by the Fourth Industrial Revolution. As new innovations threaten to disrupt, business leaders lack the means to measure the risks and rewards of adopting new technologies and business models. Established companies are faltering as leaner, more agile start-ups bring to market the new products and services that customers of the on-demand or sharing economy desire, with better quality, faster speeds and/or lower costs than established companies can match.

Due to the continuous adaptations driven by the Third Industrial Revolution, most organizations are now burdened by inefficient and unpredictable systems. Even as the inherent risks of current systems are recognized, many businesses are unable to confidently identify a successful path forward.

PwC’s 2015 Risk in Review survey of 1,229 senior executives and board members reports that 73% of respondents agreed that risks to their companies are increasing. However, the survey shows that companies are largely not responding to increasing threats with improved risk management programs. While executives are eager to confront business risks, boost management effectiveness, prevent costly misjudgments, drive efficiency and generate higher profit margin growth, only an elite group of companies (12% of the total surveyed) have put in place the processes and structures that qualify them as true risk management leaders per PwC’s criteria.

The shortcomings of traditional risk management technologies and probabilistic methodologies are largely to blame. According to Deloitte’s 2015 Global Risk Management Survey, 62% of respondents said that risk information systems and technology infrastructure were extremely or very challenging. Thirty-five percent of respondents considered identifying and managing new and emerging risks to be extremely or very challenging.

To remain competitive, companies must pursue two parallel strategies:

  1. Building agile and flexible risk management frameworks that can anticipate and prepare for the shifts that bring long-term success.
  2. Building the resiliency that will enable those frameworks to mitigate risk events and keep the business moving toward its goals.

While many risk management and business management experts agree on the need for better risk management methods and technologies, their proposed use of probabilistic methods of risk measurement and reliance on big data cannot fulfill the risk management requirements of the twenty-first century.

Many popular analytic methods are supported by nothing more than big data hype, which promises that the more data users have, the better the conclusions they will be able to draw. However, if the dynamics of a system are continuously changing, any analysis based on big data is only valid for the window of time during which the data was captured. Outside this window, an alignment with reality is unlikely.
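The validity-window problem can be illustrated with a toy, fully invented dataset: a series whose underlying dynamics shift partway through. A trend fitted on the first window looks accurate in-sample, yet extrapolates badly once the regime changes, no matter how much in-window data was collected.

```python
import random

# Toy illustration (hypothetical data): a system whose dynamics shift
# partway through the observation period. A least-squares trend fitted
# on the first window extrapolates poorly once the regime changes.

random.seed(0)

# Regime 1: output grows ~2 units per step; Regime 2: it declines.
series = [2.0 * t + random.uniform(-1, 1) for t in range(50)]
peak = series[-1]
series += [peak - 3.0 * t + random.uniform(-1, 1) for t in range(1, 51)]

# Fit a simple least-squares line on the first window only.
window = series[:50]
n = len(window)
xbar = (n - 1) / 2
ybar = sum(window) / n
slope = sum((x - xbar) * (y - ybar) for x, y in enumerate(window)) / \
        sum((x - xbar) ** 2 for x in range(n))
intercept = ybar - slope * xbar

def predict(t):
    return intercept + slope * t

# In-window error stays small; out-of-window error explodes.
err_in = sum(abs(predict(t) - series[t]) for t in range(50)) / 50
err_out = sum(abs(predict(t) - series[t]) for t in range(50, 100)) / 50
print(f"mean error inside window:  {err_in:.1f}")
print(f"mean error outside window: {err_out:.1f}")
```

More data from the first regime would only sharpen a model of dynamics that no longer exist; it cannot reveal the regime change itself.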

To respond to the rate of change engendered by the Fourth Industrial Revolution, the practice of risk management must mature and become a scientific discipline. Our work with clients and research exposes the failures of traditional risk management practices. But many are not aware there is a better way to discover and treat risk. Ultimately, we will need more collaborators and partners who wish to teach business leaders and risk practitioners how new universal risk management approaches and mathematical-based emulation technologies can be used to identify new and emerging risks and prescriptively control business outcomes. If you are interested in joining our cause, please contact us as we are always looking for new research, technology and service partners.

How Dynamic Complexity Disrupts Business Operations

Dynamic complexity always produces a negative effect in the form of loss, time elongation or shortage, causing inefficiencies and side effects similar to friction, collision or drag. Dynamic complexity cannot be observed directly; only its effects can be measured. It is also impossible to predict from historical data, no matter the amount, because the number of possible system states tends to be far too large for any given set of samples. Trend analysis alone therefore cannot represent all possible, and yet to be discovered, system dynamics.
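A back-of-the-envelope calculation (with invented but modest numbers) shows why sampling cannot keep pace with the state space: the number of joint states of interacting components grows exponentially with the number of components, so even an enormous historical dataset covers a vanishing fraction of them.

```python
# Back-of-the-envelope illustration (hypothetical numbers): even a modest
# system of interacting components has far more joint states than any
# realistic historical dataset can sample.

components = 30        # interacting parts of the system
states_per_part = 4    # distinct operating states per component

joint_states = states_per_part ** components   # 4^30 ≈ 1.15e18
samples = 10 ** 9                              # a generous billion observations

coverage = samples / joint_states
print(f"joint states:     {joint_states:.3e}")
print(f"fraction sampled: {coverage:.3e}")
```

Even a billion observations would sample less than one joint state in a billion, which is why the text argues that trend analysis over historical data cannot enumerate the dynamics still waiting to be discovered.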

In its early stages, dynamic complexity is like a hidden cancer. Finding a single cancer cell in the human body is like looking for a specific grain of sand in a sandbox, and as with cancer, the disease is often discovered only once unexplained symptoms appear. To proactively treat dynamic complexity before it negatively impacts operations, we need diagnostics that can reliably reveal its future impact. System modeling and mathematical emulation allow us to expose dynamic complexity through two hidden system properties: the degree of interdependency among system components, and the multiple perturbations exerted by internal and external influences on both the components and the edges connecting them, directly or indirectly.
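As a toy sketch of those two properties (our own simplification, not the emulation technology the text describes), a system can be represented as a dependency graph over which a perturbation propagates, attenuating at each hop. The component names, damping factor and threshold below are all hypothetical.

```python
from collections import deque

# Toy sketch (our own simplification): a hypothetical component
# dependency graph, mapping each component to its downstream dependents.
deps = {
    "db":      ["app1", "app2"],
    "app1":    ["web", "batch"],
    "app2":    ["web"],
    "web":     ["clients"],
    "batch":   ["reports"],
    "clients": [],
    "reports": [],
}

def propagate(start, impact=1.0, damping=0.5, floor=0.01):
    """Breadth-first spread of a perturbation, attenuated per hop."""
    felt = {start: impact}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for child in deps.get(node, []):
            passed = felt[node] * damping
            # Keep the strongest perturbation reaching each component.
            if passed >= floor and passed > felt.get(child, 0.0):
                felt[child] = passed
                queue.append(child)
    return felt

impact = propagate("db")
for node, level in sorted(impact.items(), key=lambda kv: -kv[1]):
    print(f"{node:8s} {level:.2f}")
```

The denser the interdependencies, the further a single disturbance spreads, which is the intuition behind treating interdependency degree and perturbation as the two measurable signatures of dynamic complexity.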

Successful risk determination and mitigation depends on how well we understand and account for dynamic complexity, its evolution, and the time remaining before the system reaches a singularity through the intensification of stress on the dependencies and intertwined structures that form the system.

Knowing what conditions will cause singularities allows us to understand how the system can be stressed to the point at which it will no longer meet business objectives, and to proactively put risk management practices in place to avoid these unwanted situations.

Below we provide an example of a client case where dynamic complexity played a key role in resource consumption, time to deliver, and volume delivered. The scenario presented in the graph represents a trading and settlement implementation subjected to a continuously increasing volume of business; the curves show the system's reaction.

In the graph above, production efficiency increases until it hits a plateau, after which the business is increasingly impacted by slowing productivity and rising costs. The loss is proportional to the increase in dynamic complexity, which gradually absorbs resources (i.e., cost) while delivering little in return. The singularity occurs when the two curves (productivity/revenue and cost) meet, which in turn translates into lost margin, cost overruns and overall instability.

In client cases such as this one, we have successfully used predictive emulation to isolate the evolving impact of dynamic complexity and calculate risk as an impact on system performance, cost, scalability and dependability. This allows us to measure changes in system health provoked by shifts in dynamic complexity's behavior under different operational dynamics, and to identify the component(s) that cause the problem.

But knowing the how and what isn’t sufficient. We also need to know when, so we measure dynamic complexity itself, which then allows us to monitor its evolution and apply the right mitigation strategy at the right time.
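The "when" question can be sketched as a simple monitoring rule. Everything here is hypothetical, including the idea of a normalized complexity index and both thresholds; the point is only that watching the measure's trajectory, not just its current value, is what buys early warning.

```python
# Hypothetical monitoring sketch (the index, names and thresholds are
# our own): track a dynamic-complexity index over time and flag when
# its level or its growth rate says mitigation is due, rather than
# waiting for symptoms to appear.

def mitigation_due(index_history, threshold=0.8, growth_limit=0.05):
    """Flag when the index nears its threshold or is climbing fast."""
    current = index_history[-1]
    if current >= threshold:
        return True
    if len(index_history) >= 2:
        # Fast growth warns early, before the threshold is breached.
        growth = current - index_history[-2]
        return growth >= growth_limit
    return False

# Simulated weekly readings of a normalized complexity index.
readings = [0.42, 0.44, 0.45, 0.47, 0.55]

print("act now" if mitigation_due(readings) else "keep monitoring")
```

Here the final reading is still well below the level threshold, but the jump in growth rate triggers the flag, illustrating mitigation timed to the evolution of dynamic complexity rather than to its symptoms.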