Von Neumann Normalisation of a Quantum Random Number Generator
Abstract
In this paper we study von Neumann un-biasing normalisation for ideal and real quantum random number generators, operating on finite strings or infinite bit sequences. In the ideal cases one can obtain the desired un-biasing. This relies critically on the independence of the source, a notion we rigorously define for our model. In real cases, affected by imperfections in measurement and hardware, one cannot achieve a true un-biasing, but, if the bias “drifts sufficiently slowly”, the result can be arbitrarily close to un-biasing. For infinite sequences, normalisation can either increase or decrease the (algorithmic) randomness of the generated sequences.
A successful application of von Neumann normalisation (indeed, of any un-biasing transformation) achieves exactly what it promises: un-biasing, which is only one among infinitely many symptoms of randomness; it will not produce “true” randomness.
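For concreteness, von Neumann's classic normalisation reads the source two bits at a time, outputs 0 for the pair 01, outputs 1 for 10, and discards 00 and 11; for an independent source with fixed bias the surviving bits are exactly unbiased. A minimal Python sketch (function name and bias parameter are illustrative, not from the paper):

```python
import random

def von_neumann_unbias(bits):
    """Von Neumann normalisation: 01 -> 0, 10 -> 1, discard 00 and 11."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)  # pair 01 yields 0, pair 10 yields 1
    return out

# Simulated independent source with bias p = 0.8 towards 1.
random.seed(0)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100000)]
normalised = von_neumann_unbias(biased)
# For an i.i.d. source each surviving pair is 01 or 10 with equal
# probability p*(1-p), so the output frequency of 1 is close to 1/2,
# while only about 2*p*(1-p) of the pairs survive.
```

Note that the correctness argument uses independence of successive bits, which is precisely the property the paper formalises; for a real generator whose bias drifts, the output is only approximately unbiased.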