Quantum Risk Analysis


... Calculating probabilities with probabilistic machines ...

Systemic risk of financial networks

The financial system is highly interconnected. For example, banks lend to each other via the interbank market or invest directly in other banks. The default of one institution can therefore affect connected banks and, in extreme cases, lead to a failure of the whole system. The default of a bank also hits retail customers and companies when accounts can no longer be accessed. A recent example is the default of Silicon Valley Bank: due to rising interest rates, long-term investments lose value, so financing long-term investments with short-term credit carries more risk than during a zero-interest-rate regime, and more capital is needed to satisfy regulatory capital requirements. Similar problems occurred at Credit Suisse (in March 2023), and devalued long-term investments are an issue for all long-term investors, e.g. insurance companies that need to manage capital over long time horizons.

To evaluate default probabilities in such complex systems, networks can be simulated over time using Monte Carlo methods. The financial system is exactly such a complex system: each bank, company or even individual can be represented as a network node that can fail and recover with certain probabilities. Contagion is the effect that the default of one node triggers the default of the next connected node and so forth, also called an avalanche effect. This is called systemic risk, and it adds requirements to capital buffers, alongside credit risk and other risk types.

To evaluate credit and systemic risk in financial markets, probabilistic networks can be used to analyse default probabilities of specific institutions over time. The system is then evolved over a number of time steps to estimate the probability of reaching certain states.

A financial network model can also be encoded as a quantum program and evaluated using Monte Carlo methods on quantum computers. Here we show a very first idea of how to do risk management with quantum computers. Remember: quantum computers are probabilistic machines and are therefore a natural choice for dealing with random processes.

Probabilistic networks

Here we present a new risk model that can be used to describe the time evolution of probabilistic networks. It can be used to model default risks of financial institutions and companies (nodes). Arbitrary network topologies are allowed, including cycles, since each time step depends only on the states of the previous time steps. Furthermore, each node comes with a recovery probability, so a node can recover from a failure with a certain probability. For example, a bank might have an intrinsic probability of default of 1%, connections to 3 other banks, and a recovery probability of 10% in the next time step.
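To make the ingredients concrete, here is a minimal classical sketch of such a node; the class name, parameters and the example network are made up for illustration and are not the exact model used in our prototype.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A bank or company in a probabilistic network (illustrative only)."""
    name: str
    p_default: float           # intrinsic probability of default per time step
    p_recovery: float          # probability to recover in the step after a failure
    p_contagion: float         # probability that a failed neighbour triggers a default
    neighbours: list = field(default_factory=list)  # indices of connected nodes
    failed: bool = False

# Hypothetical example: a bank with 1% default probability, 10% recovery
# probability and links to three other banks (indices 1, 2, 3).
bank_a = Node(name="Bank A", p_default=0.01, p_recovery=0.10,
              p_contagion=0.05, neighbours=[1, 2, 3])
```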

Monte Carlo methods on quantum computers

Quantum computing is a heavily hyped topic, with intense research activity and many well-funded companies trying to develop the technology needed for future quantum computers. The technology is capable of disrupting all industries (security, optimization, simulation, artificial intelligence) because it enables effects that are impossible or inefficient with classical technology. Quantum mechanical properties open a new chapter of information processing, algorithms and information exchange. Although the potential for breaking technological barriers is huge, there are very relevant problems to solve on the hardware level. One example is the quality of qubits and scaling them to hundreds of qubits (see our blog post about the necessary hardware and how algorithm-specific development can help to reduce hardware requirements: https://jos-quantum.de/errorresilient/).

We are confident that these problems can be solved and that new computational paradigms will open up new possibilities.

Potential applications of quantum computers range from breaking encryption protocols to more efficient optimization methods and faster Monte Carlo simulation.

To cut through the hype and separate bullshit from realistic expectations, we build scalable quantum prototypes for our clients. Prototypes provide the possibility to evaluate business impact, assess hardware requirements and run first benchmarks.

Here are some questions a risk controller, economist, supervisor or regulator might ask:

  • What is the implied probability of default for an institution in a certain time step considering contagion effects and recovery?

  • What is the probability of default for an institution over all time steps?

  • What is the probability that the first node does not fail in the last time step, given that its neighbour fails in the first time step?

Answering these questions requires Monte Carlo simulation: each parameter is sampled randomly, and a large number of runs is needed to cover the possible states of the system (see the classical sketch below).
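As a baseline illustration, the following classical Monte Carlo sketch estimates the probability that a given node defaults in at least one time step. It builds on the hypothetical Node class above; the way contagion probabilities are combined is an assumption for this sketch, not necessarily the exact model used in our quantum prototype.

```python
import random

def simulate_run(nodes, n_steps):
    """One Monte Carlo run: evolve the network and return the failure history."""
    failed = [False] * len(nodes)
    history = []
    for _ in range(n_steps):
        new_failed = failed.copy()
        for i, node in enumerate(nodes):
            if failed[i]:
                # A failed node may recover in the next step.
                if random.random() < node.p_recovery:
                    new_failed[i] = False
            else:
                # Intrinsic default probability ...
                p = node.p_default
                # ... combined independently with contagion from failed neighbours.
                for j in node.neighbours:
                    if failed[j]:
                        p = 1 - (1 - p) * (1 - node.p_contagion)
                if random.random() < p:
                    new_failed[i] = True
        failed = new_failed
        history.append(failed.copy())
    return history

def prob_default_over_all_steps(nodes, n_steps, node_index, n_runs=100_000):
    """Estimate the probability that a node fails in at least one time step."""
    hits = sum(
        any(step[node_index] for step in simulate_run(nodes, n_steps))
        for _ in range(n_runs)
    )
    return hits / n_runs

# Hypothetical usage with a 4-bank network over 10 time steps:
# nodes = [bank_a, bank_b, bank_c, bank_d]
# print(prob_default_over_all_steps(nodes, n_steps=10, node_index=0))
```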

For instance, the statistical error of classical Monte Carlo methods decreases as 1/√N for N samples. Calculating a result with 10 times higher accuracy (e.g. reducing the error from 1% to 0.1%) therefore requires 100 times the computational effort.
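One quick way to see this scaling is to estimate a known probability with increasing sample counts. The snippet below is a self-contained illustration of the 1/√N behaviour and is unrelated to any particular financial model.

```python
import random

def mc_error(p_true, n_samples, n_trials=30):
    """Empirical root-mean-square error of a Monte Carlo estimate of p_true."""
    errs = []
    for _ in range(n_trials):
        hits = sum(random.random() < p_true for _ in range(n_samples))
        errs.append((hits / n_samples - p_true) ** 2)
    return (sum(errs) / n_trials) ** 0.5

for n in (100, 10_000, 1_000_000):
    # The error shrinks roughly by a factor of 10 for every 100x more samples.
    print(n, mc_error(0.3, n))
```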

For example, a network of 20 banks over 10 time steps has a state space of 2^200 (about 1.6 × 10^60) possible outcome combinations. Sampling rare events, in other words evaluating tail risks, would require a very large number of Monte Carlo runs. Increasing the precision from, for example, a 1% to a 0.1% error would require 100 times more runs.

A quantum algorithm called amplitude estimation would only require 10 times more runs, offering a quadratic speedup over brute-force Monte Carlo methods. Although the problem outlined is of exponential complexity and a quadratic speedup will not make arbitrary problem classes tractable, there is a sweet spot where the speedup makes simulations on quantum computers useful while classical Monte Carlo simulations are not.
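As a toy illustration of the amplitude estimation primitive, the sketch below estimates a single encoded probability with Qiskit's IterativeAmplitudeEstimation, assuming the qiskit (1.x) and qiskit-algorithms packages are installed. Encoding a full contagion network into the state-preparation circuit is considerably more involved and not shown here; the 1% default probability is a made-up example value.

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.primitives import Sampler
from qiskit_algorithms import EstimationProblem, IterativeAmplitudeEstimation

# Encode a hypothetical 1% default probability as the amplitude of |1>.
p_default = 0.01
state_prep = QuantumCircuit(1)
state_prep.ry(2 * np.arcsin(np.sqrt(p_default)), 0)

problem = EstimationProblem(
    state_preparation=state_prep,
    objective_qubits=[0],   # measuring |1> on this qubit counts as "default"
)

# Target error of 0.1%; the query count scales roughly as 1/error
# instead of the classical 1/error^2.
iae = IterativeAmplitudeEstimation(
    epsilon_target=0.001,
    alpha=0.05,             # the reported interval holds with probability 1 - alpha
    sampler=Sampler(),
)
result = iae.estimate(problem)
print(f"Estimated probability of default: {result.estimation:.4f}")
```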

Future perspective

Current state-of-the-art quantum computers have up to a few dozen qubits and are prone to errors: errors while initialising qubit states, executing operations and measuring qubits. Errors on qubit states have drastic consequences and usually destroy all relevant information in the computation.

But the quantum algorithms for quantum error correction are actually well known and well studied. We think that once quantum computers gain one order of magnitude in qubit quality (at least 99.99% success rates), useful applications will provide advantages over their classical counterparts.

Therefore it is very useful to develop scalable quantum prototypes now, to be prepared for the technological advancements to come.

Contact us at JoS QUANTUM to get ready for the quantum future:

contact@jos-quantum.de or visit us at www.jos-quantum.de