### Theory

Quantum computing is based on a new type of computer that gets its processing power from quantum effects. A normal computer is based on the model of a Universal Turing Machine. Richard P. Feynman once made an offhand remark that Turing machines, which are really supposed to be able to model everything that is an algorithm (i.e., everything that can be modeled), cannot seem to model quantum behavior decently, because they need an exponential amount of memory to do the calculation. (For instance, simulating a system of 5 quantum bits with 4 interactions would require on the order of 5^(5^4) = 5^625, or about 7.18 * 10^436, bits of memory.) This was clearly impractical, and he suggested that a computer based on quantum effects might be able to model these effects far more efficiently.
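The memory blow-up Feynman pointed to is usually stated in terms of state vectors: a classical simulation must track one complex amplitude per basis state, so n qubits need 2^n amplitudes. A minimal sketch of that counting argument (the 16-bytes-per-amplitude figure is an illustrative assumption, corresponding to a double-precision complex number):

```python
# Sketch: memory needed to store the full state vector of an n-qubit
# system on a classical machine. Each of the 2**n amplitudes is taken
# to be a 16-byte double-precision complex number. Illustrative only.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes required to hold all 2**n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (5, 30, 50):
    # 5 qubits: 512 bytes; 30 qubits: 16 GiB; 50 qubits: ~18 petabytes
    print(n, state_vector_bytes(n))
```

The doubling per added qubit, not any detail of the byte count, is the point: each extra qubit doubles the classical memory required.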

Of course, hypothesizing a new type of computer and working out a realistic theory of its operation are two completely separate things, and it took several years before theoreticians had models for what quantum computing should look like. There were two main models, which have since been shown to be equivalent: any problem solvable in one system is solvable in the other. Essentially, this means that any program written for one machine can be written for the other, though actually porting it is not necessarily trivial. The two types of logical quantum machines can also be implemented on the same quantum hardware. In addition, it has been shown that the set of operations a quantum computer can perform includes the set of operations any Turing machine can perform, so any standard computer function, and possibly more, can be carried out. Non-computable (i.e., non-algorithmic) computations clearly remain out of reach, but speed increases (in terms of theoretical maximum efficiency) seem very likely in a number of very practical areas.
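The claim that quantum operations subsume classical ones can be made concrete with the Toffoli gate: it is a valid quantum gate (a unitary; here, a permutation matrix on basis states) and it reproduces NAND, which is universal for classical logic. A small NumPy sketch (the encoding of three bits as an integer index is an illustrative choice):

```python
import numpy as np

# Sketch: classical reversible logic is a special case of quantum logic.
# The Toffoli gate is a unitary acting on the 8 basis states |a b c> of
# three bits, flipping c exactly when a and b are both 1. With the target
# bit c prepared as 1, reading c back out gives NAND(a, b).

def toffoli() -> np.ndarray:
    """8x8 permutation matrix: flips bit c iff bits a and b are both 1."""
    u = np.eye(8)
    u[[6, 7]] = u[[7, 6]]  # swap the |110> and |111> rows
    return u

def nand_via_toffoli(a: int, b: int) -> int:
    state = np.zeros(8)
    state[(a << 2) | (b << 1) | 1] = 1.0  # prepare basis state |a, b, 1>
    out = toffoli() @ state
    return int(np.argmax(out)) & 1        # read the target bit c

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand_via_toffoli(a, b))  # NAND truth table
```

Because the matrix is unitary, the same gate is legal inside a quantum circuit, which is the sense in which quantum machines can run any standard computer function.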

### Engineering

Several working models of very small-scale quantum computers currently exist. Most of these, including the largest, a tiny 7-qubit machine owned by IBM, are based on the principle of nuclear magnetic resonance (NMR), which is essentially the method used to write data into the qubits. Its effective limit, however, as far as anyone can tell, is around 15 qubits, significantly smaller than would be truly useful for most applications.

Other types of quantum computers are smaller but have much higher theoretical size limits. Several 3- and 4-qubit computers, owned by universities and research departments, are based on these models. The hope is that at least one of these models can be scaled up far enough to become practical.

Among the other types of quantum computers in the works are several that rely on complex chemical structures to bind (in one case) seven qubits into a unit, so that they can be manipulated en masse. These would allow much larger quantum computers: using existing techniques for manipulating 3- and 4-qubit machines (such as linear ion arrays), they could produce machines 20 to 30 qubits in size, a giant leap that could enable quantum computers to do their first practical work.

The biggest single practical obstacle in quantum computing, however, grows with size: decoherence. Quantum computation relies on coherent interference between the particles in the system. This is also an immense liability, because particles are not choosy about what they interact with. Over time, the system interacts with the environment around it and becomes incoherent: the particles you want to read become entangled with particles that are not part of the system, and the affected qubits are no longer correct. Decoherence has been shown to be correctable, but quantum error-correction algorithms must be vastly improved before large-scale error correction becomes practical for partially observed quantum systems.
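The redundancy idea behind quantum error correction can be sketched with the syndrome logic of the simplest code, the 3-qubit bit-flip code. What follows is only the classical skeleton of that code: a real quantum code measures the parities without ever observing the data itself, which is what lets it protect superpositions. Assuming at most one flip per block:

```python
# Sketch of the syndrome idea behind the 3-qubit bit-flip code: encode one
# logical bit as three physical bits, and locate a single flip by checking
# parities (the "syndrome") rather than reading the data directly.
# Illustrative assumption: at most one bit flips per block.

def encode(bit: int) -> list[int]:
    """Repetition encoding: one logical bit -> three physical bits."""
    return [bit, bit, bit]

def syndrome(block: list[int]) -> tuple[int, int]:
    """Parities (b0 xor b1, b1 xor b2) identify which bit, if any, flipped."""
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block: list[int]) -> list[int]:
    """Use the syndrome to undo a single bit flip."""
    flip = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(block)]
    fixed = list(block)
    if flip is not None:
        fixed[flip] ^= 1
    return fixed

block = encode(1)
block[2] ^= 1               # the environment flips one bit
print(correct(block))       # recovers [1, 1, 1]
```

The quantum version uses the same syndrome table, implemented with extra "ancilla" qubits, and needs a second code of the same shape to handle phase errors as well.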

### Possible Downside

Quantum computers, however, are only disputably worthwhile. Many computer science theorists believe that while quantum computers will offer appreciable gains over classical Turing machines in limited cases, in most cases the gain will be at most polynomial; in that case, quantum computers would have to reach a level of complexity and speed at least approaching that of classical computers (gigahertz ranges) to be useful. Others argue that since certain applications (factoring especially) allow an exponential decrease in algorithm time, the effort is worthwhile. In addition, many avenues have only begun to be explored and may prove to be the most useful of all. Quantum communication seems to be a very interesting, if young, field, as does quantum dense coding (which includes lossless image compression and data transmission with errors eliminated to arbitrary accuracy).
It is also quite possible that certain classes of NP-complete problems are either equivalent to the already investigated factoring problem or are similarly much easier under quantum computation. In that case, it would be immensely practical to develop these quantum computers.
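The factoring speedup mentioned above rests on a classical reduction: if you know the period r of the sequence a^x mod N, then (when r is even) gcd(a^(r/2) - 1, N) is usually a nontrivial factor of N. Shor's algorithm finds r quickly on a quantum computer; the brute-force search below is only a sketch of the reduction itself, exponentially slow on its own:

```python
from math import gcd

# Sketch of why period-finding factors integers (the classical reduction
# behind Shor's algorithm). The quantum speedup lies entirely in finding
# the period r; here r is found by slow brute force for illustration.

def period(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force, exponential time)."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_period(n: int, a: int) -> int:
    """Extract a factor of n from the period of a mod n, when possible."""
    r = period(a, n)
    if r % 2:
        raise ValueError("odd period; retry with another base a")
    f = gcd(pow(a, r // 2) - 1, n)
    if f in (1, n):
        raise ValueError("trivial factor; retry with another base a")
    return f

print(factor_via_period(15, 7))
```

For N = 15 and a = 7, the period is 4, so the factor returned is gcd(7^2 - 1, 15) = 3. A quantum computer that finds r for RSA-sized N in polynomial time is exactly the threat to RSA discussed below.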

In all, quantum computing is a field still essentially in formation. It is unclear whether it will be possible in the near future to build larger-scale computers of the size that would endanger RSA encryption. Either way, companies like HP and IBM (to forestall competition), as well as universities like MIT and Caltech (to forestall graduate students), will continue to do research in this exciting field.
