The history of quantum computers begins with the work of Richard Feynman, who in 1982 speculated that quantum systems might be able to perform certain tasks more efficiently than is possible in classical systems (Feynman, 1982). Feynman was the first to propose a direct application of the laws of quantum mechanics to the realization of quantum algorithms. The fundamentals of quantum computing were introduced and developed by several authors after Feynman’s idea. A model and description of a quantum computer as a quantum Turing machine were developed by Deutsch (1985). In 1994, Shor introduced a quantum algorithm for integer factorization, and in 1997, Grover proposed a fast quantum search algorithm (Grover, 1997). Wootters and Zurek proved the no-cloning theorem, which places definite limits on quantum computation, but Shor showed that these limits can be overcome with quantum error-correcting codes (Shor, 1995). In recent years, quantum computing has grown into an interdisciplinary field of enormous practical importance, linking elements of physics, mathematics, and computer science. Currently, various physical models of quantum computers are under intensive study. Several types of elementary quantum computing devices have been developed based on atomic, molecular, optical, and semiconductor physics and technologies.