Qubit neural network and its learning efficiency

Noriaki Kouda, Nobuyuki Matsui, Haruhiko Nishimura, Ferdinand Peper
Abstract
Neural networks have attracted much interest over the last two decades for their potential to realistically describe brain functions, but so far they have failed to provide models that can be simulated in a reasonable time on computers; rather, they have been limited to toy models. Quantum computing is a possible candidate for improving the computational efficiency of neural networks. Within this framework, the Qubit neuron model proposed by Matsui and Nishimura has shown high efficiency in solving problems such as data compression. Simulations have shown that the Qubit model solves learning problems with significantly improved efficiency compared to the classical model. In this paper, we confirm our previous results in further detail and investigate what contributes to the efficiency of our model through 4-bit and 6-bit parity check problems, which are known as basic benchmark tests. Our simulations suggest that the improved performance is due to the use of superposition of neural states and of a probability interpretation in the observation of the model's output states.
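
As a brief illustration of the probability interpretation referred to above (a generic single-qubit sketch, not necessarily the exact parametrization used in the Qubit neuron model), a neural state in superposition of the basis states $|0\rangle$ and $|1\rangle$ can be written as
\[
  |\psi\rangle = \cos\theta\,|0\rangle + e^{i\varphi}\sin\theta\,|1\rangle,
  \qquad
  P(|1\rangle) = \bigl|\langle 1|\psi\rangle\bigr|^{2} = \sin^{2}\theta ,
\]
so that observing the output state yields $|1\rangle$ with probability $\sin^{2}\theta$; a probability interpretation reads out this kind of quantity, rather than a deterministic activation value, as the neuron's output.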