# Both at Once

Image Source: darpa.mil

**Marc Randy, Science & Tech Editor**

Computers are arguably the most defining technology of our era. Since Alan Turing described the Turing machine in the 1930s, computers have grown so sophisticated that we now carry more computing power in our smartphones than was aboard the Apollo 11 Moon lander. Needless to say, computing power has been increasing exponentially, following Moore’s Law.

Moore’s Law observes that the number of transistors on a chip, and with it processing power, doubles roughly every two years. But as computers become more and more powerful, the limitations of classical computers become evident. Only so many transistors can fit on a chip, even as they shrink toward the atomic scale.

However, a new computing method has emerged out of the obscure depths of experimental science. Quantum computing has been around since the 1980s, when physicists Paul Benioff, Richard Feynman, and Yuri Manin proposed quantum models of computation, including a quantum analogue of the Turing machine. The advantage of quantum computing is that it allows certain operations to be performed that classical computers cannot do efficiently.

The basic premise of quantum computing is this: classical computers operate on zeroes and ones, where a zero represents the absence of current and a one its presence. All computing is built from those two values. Quantum computers instead use ‘qubits’, which can exist in a superposition of zero and one at the same time, collapsing to a definite value only when measured. With a system like this, certain large mathematical problems become far easier to tackle.
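The idea of a superposition can be sketched in a few lines of ordinary code. Below is a minimal, simplified simulation (not how a real quantum computer works internally): a qubit is modelled as a pair of amplitudes for the states 0 and 1, and a Hadamard gate puts a qubit that starts at 0 into an equal superposition of both. The function names here are illustrative, not from any particular quantum library.

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for the values 0 and 1.
# The squares of the amplitudes give the measurement probabilities,
# so they must satisfy |a|^2 + |b|^2 = 1.

def hadamard(state):
    """Apply a Hadamard gate: sends the 0 state into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Probability of measuring 0 and of measuring 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1.0, 0.0)              # a qubit definitely holding 0
superposed = hadamard(zero)    # now "both at once"
p0, p1 = probabilities(superposed)
print(p0, p1)                  # each is 0.5: equal chance of reading 0 or 1
```

Until it is measured, the qubit genuinely carries both possibilities, which is what lets quantum algorithms explore many values in parallel.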

Google is actively working on quantum computing, as are many research groups around the world. This technology is slated to transform computing as a whole, since quantum computers would far outclass the classical computers we currently use on certain classes of problems.

*Originally published in Bandersnatch Vol. 49 Issue 08 on January 29th, 2020*