Abstract
We discuss a stochastic method for simulating quantum computing algorithms on a classical computer, based on the Hubbard-Stratonovich decomposition of n-qubit gates into one-qubit gates integrated over auxiliary fields. The problem reduces to finding the fixed points of the associated system of Langevin differential equations. We show that the method can be applied to Grover's algorithm.