What kinds of statistical problems are likely to benefit from quantum computing?

We are at the advent of quantum computing, with quantum programming languages, anticipating hardware quantum computers, now available at both high and low levels for simulated quantum computers. Quantum computing brings new elementary operations such as entanglement and teleportation of qubits, measurement of qubits, and the imposition of superposition on qubits.
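As a taste of what such a high-level quantum language looks like, here is a minimal sketch using Qiskit (assuming the `qiskit` package is installed; the circuit is a standard Bell-state construction, not tied to any particular answer below). It superposes one qubit, entangles a second with it, and samples measurement outcomes:

```python
# Minimal sketch, assuming the 'qiskit' package is installed.
# Builds a 2-qubit Bell state: H puts qubit 0 into superposition,
# CNOT entangles qubit 1 with it.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # superposition: (|0> + |1>)/sqrt(2) on qubit 0
qc.cx(0, 1)    # entanglement: qubit 1 now mirrors qubit 0

state = Statevector.from_instruction(qc)
print(state)                      # amplitudes of |00>, |01>, |10>, |11>
print(state.sample_counts(1000))  # measurement collapses: only '00' and '11' appear
```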

What kinds of statistical problems are likely to benefit from quantum computation?

For example, will quantum computers provide more ubiquitous true random number generation? What about computationally cheap pseudorandom number generation? Will quantum computing help accelerate MCMC convergence, or ensure upper bounds on convergence time? Will there be quantum algorithms for other sampling-based estimators?

This is a broad question, and acceptable answers will also be broad, but kudos if they differentiate quantum from classical computation. (If this question is too broad, please help me make it a better one.)

Answer

About quantum computing

Quantum computing makes use of superposition and entanglement.

  • Superposition means that the state of a system is a combination of multiple states superposed on one another. Intuitively, you can think of quantum computing as being based on wave mechanics, where the state of the system is a wave; for waves, we can imagine multiple waves lying on top of each other.

    Superposition allows multiple (superposed) waves to be manipulated at once, in a single operation; see the sketch after this list.

  • Entanglement means that the states of multiple components in the system are linked by a single wave equation/function.

    For instance, when two electrons are created in a single event, their spins might be linked due to conservation rules (similar to conservation of energy or momentum): if one is up, the other must be down.

    The big philosophical deal with entanglement is wave function collapse. It occurs when a measurement of the state of the wave function is made, which breaks down the superposed state into a single state.

    • The philosophical “problem” is that the event of wave function collapse instantaneously collapses the state of all entangled components, even over a large distance.
    • In addition, there is a problem with the interpretation of this wave function collapse (and this is more related to the superposition). The collapse does not follow from other basic principles or mechanisms from which we could get an intuitive grasp; it is more like an axiom or base principle. For quantum computing this does not matter: it just works. But the human mind feels like there should be something more behind this collapse. It feels a bit flimsy. The wave function collapse is our current-day ‘atom’ that we try to split into parts.

    Entanglement allows an exponential increase in the number of states that are modeled by the superposed wave functions.
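To make both ideas concrete, here is a small numpy sketch (a classical simulation, so only illustrative): the state of two qubits is a vector of four amplitudes, superposition is created by a Hadamard gate, and entanglement by a CNOT gate. Note that each gate is a single matrix product acting on all amplitudes at once.

```python
import numpy as np

# Two-qubit state as 4 amplitudes for |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 0], dtype=complex)   # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard: creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # entangles the two qubits

state = np.kron(H, I) @ state   # superpose qubit 0: (|00> + |10>)/sqrt(2)
state = CNOT @ state            # entangle: (|00> + |11>)/sqrt(2), a Bell state

# Measurement collapses the wave: outcome k is drawn with probability |amplitude_k|^2.
probs = np.abs(state) ** 2
outcome = np.random.choice(4, p=probs)
print(state, format(outcome, '02b'))
```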

The power of quantum computing

The power of quantum computing lies in the combination of superposition and entanglement. With n entangled quantum bits, the system has 2^n different states superposed on each other, and the superposition allows us to manipulate (make computations with) all of those 2^n states simultaneously. The contrast with non-quantum computing is that you would have to repeat the computations 2^n times, once for each state.
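A toy numpy illustration of that accounting (the qubit count is an arbitrary choice): for n qubits the state vector has 2^n amplitudes, and a single operation touches all of them. A classical simulation has to pay that 2^n cost explicitly, which is exactly the contrast drawn above.

```python
import numpy as np

n = 10                                   # number of qubits (illustrative choice)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# One Hadamard per qubit builds a uniform superposition over all 2^n states.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)

state = np.zeros(2 ** n, dtype=complex)
state[0] = 1                             # start in |00...0>
state = H_all @ state                    # a single "operation" on all 2^n amplitudes

print(len(state))          # 1024 = 2^10 superposed states
print(state[:4])           # every amplitude equals 1/sqrt(2^n)
```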

No free lunch

Note that this is not a free lunch. A quantum computer is not deterministic but instead will give a different answer each time according to some distribution.

We can manipulate multiple superposed waves made out of the entangled bits at the same time, but we cannot read the state/amplitude of all those superposed waves. Once we read out the state of the system, we only get a single state out of all the superposed states in the system. This is the wave function collapse, when the multiple superposed waves turn into the wave of a single state.

The behavior of this collapse is probabilistic: the probability of observing/collapsing into a particular state depends on the wave function. Quantum algorithms are designed to start with a wave that encodes the 2^n states with more or less equal probability and, through the manipulations/computations, to end up with the states close to the solution being the ones most likely to be observed.
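A toy illustration of this shaping, loosely modeled on Grover's search (a numpy sketch, not the real algorithm on hardware; the marked index is an arbitrary choice): start with eight equally likely states, then twice apply a sign-flip "oracle" plus an inversion-about-the-mean step, after which the marked state dominates the readout distribution.

```python
import numpy as np

N = 8                     # 2^3 states encoded in 3 qubits
marked = 5                # the "solution" state (illustrative choice)

# Start with all N states superposed with equal probability.
state = np.full(N, 1 / np.sqrt(N))

for _ in range(2):        # ~ (pi/4) * sqrt(N) Grover iterations
    state[marked] *= -1                  # oracle: flip the sign of the solution
    state = 2 * state.mean() - state     # invert every amplitude about the mean

probs = np.abs(state) ** 2
print(probs.round(3))                    # marked state now carries ~94% probability
print(np.random.choice(N, p=probs))      # readout: almost always returns 5
```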

What sort of benefits?

Basic computations

There are several basic algebraic manipulations that quantum computing can speed up, and all statistical methods that rely on such manipulations will benefit.

Basically, quantum computing reduces the number of computations by performing them simultaneously. For instance, to compute a discrete Fourier transform with 2^n components, a quantum algorithm can use n qubits that encode those 2^n components; by manipulating those n qubits, you are effectively manipulating all 2^n components at once.
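For concreteness, here is a numpy sketch of that DFT example on n = 3 qubits (again a classical simulation, using the common convention that the quantum Fourier transform carries a +2πi/N phase): up to normalization and sign convention, the QFT is the same map as the classical FFT, but it acts on the 2^n amplitudes carried by only n qubits.

```python
import numpy as np

n = 3
N = 2 ** n                                    # 8 components encoded in 3 qubits

# Explicit QFT matrix: F[j, k] = exp(2*pi*i*j*k/N) / sqrt(N)
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
F = np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

rng = np.random.default_rng(0)
state = rng.normal(size=N) + 1j * rng.normal(size=N)
state /= np.linalg.norm(state)                # a normalized 3-qubit state

# The QFT agrees with the (inverse) classical FFT up to normalization.
qft_state = F @ state
print(np.allclose(qft_state, np.sqrt(N) * np.fft.ifft(state)))   # True
```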

Which problems?

The problems that will benefit most from this simultaneous computation are large-scale problems that are currently limited by computational power (and that can be parallelized): for instance, machine learning problems, Monte Carlo simulations, or problems with large dimensionality. The first two of those three are also nondeterministic (which reduces the weak point of a quantum computer) and, like the quantum computer, already make approximations by using a chance-based estimate.
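One concrete hook for the Monte Carlo case: classically, the error of a Monte Carlo estimate shrinks like 1/sqrt(M) in the number of samples M, while quantum amplitude estimation (a known quantum algorithm, not something from the original answer) improves this to roughly 1/M. The numpy sketch below shows only the classical side of that comparison.

```python
import numpy as np

# Classical Monte Carlo estimate of pi: error shrinks like 1/sqrt(M).
# Quantum amplitude estimation would reach a comparable error with ~sqrt(M) samples.
rng = np.random.default_rng(1)
for M in (10**3, 10**5, 10**7):
    x, y = rng.random(M), rng.random(M)
    pi_hat = 4 * np.mean(x**2 + y**2 < 1)
    print(M, pi_hat, abs(pi_hat - np.pi))   # error drops ~10x per 100x samples
```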

Maybe you could better ask which kinds of (computation-heavy) statistical problems are not going to benefit from quantum computing. There probably won't be many.

Attribution
Source: Link, Question Author: Community, Answer Author: 5 revs, 2 users 95%
