The digital devices that we rely on so heavily in our day-to-day and professional lives, such as smartphones, tablets, laptops, and fitness trackers, use traditional computational technology. Traditional computers use electrical signals to encode information in a binary system of 1s and 0s. Each unit of this information is called a “bit,” and a bit can hold only one of those two values at a time.
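As a quick illustration (a hedged Python sketch; the specific values are arbitrary), every value a traditional computer handles is ultimately stored as a fixed-width pattern of bits:

```python
# Classical binary encoding: each integer below is shown as the
# 8-bit (one-byte) pattern of 1s and 0s a traditional computer stores.
for value in (0, 5, 42, 255):
    print(f"{value:>3} -> {value:08b}")
```

Each of those eight positions is a bit, and each bit is definitively either a 1 or a 0.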
Unlike traditional computing, quantum computing relies on the principles of quantum theory, which describes the behavior of matter and energy at atomic and subatomic scales. With quantum computing, information is no longer limited to a 1 or a 0; a quantum particle can exist in a superposition of both states, the 1 and the 0, at the same time.
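In standard quantum-mechanics notation (a worked formula for illustration; the symbols α, β, and |ψ⟩ are conventional, not from the original text), a qubit's state is a weighted combination of the two basis states:

```latex
% A qubit state as a superposition of the basis states |0> and |1>.
% alpha and beta are complex amplitudes; measurement yields 0 with
% probability |alpha|^2 and 1 with probability |beta|^2.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^2 + \lvert \beta \rvert^2 = 1
\]
```

Only when the qubit is measured does it collapse to a definite 0 or 1, with the probabilities given by the squared amplitudes.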
Quantum computers store and manipulate information in subatomic particles such as electrons or photons. These quantum systems serve as quantum bits, or “qubits.” Each additional qubit doubles the number of states a computation can represent at once, so computational power grows exponentially with the number of qubits. Quantum computing has the potential to solve certain problems in a matter of minutes that would take traditional computers tens of thousands of years to work out.
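That exponential scaling can be made concrete with a minimal classical simulation (a hedged sketch in Python with NumPy; the function name uniform_superposition is illustrative, not a real library API): an n-qubit register corresponds to a vector of 2^n complex amplitudes, so every added qubit doubles the state space the computation works with.

```python
# Minimal sketch: classically simulating an n-qubit register requires
# tracking 2**n complex amplitudes, one per basis state, which is why
# computational capacity grows exponentially with the number of qubits.
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector of n qubits in equal superposition, i.e. the result
    of applying a Hadamard gate to every qubit of |00...0>."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# Three qubits span 2**3 = 8 basis states, all equally weighted.
state = uniform_superposition(3)
print(len(state), abs(state[0]) ** 2)  # 8 amplitudes, each with probability 1/8

# Each added qubit doubles the number of amplitudes to track.
for n in (1, 2, 10, 30):
    print(f"{n:>2} qubits -> {2 ** n:,} simultaneous basis states")
```

At 30 qubits the vector already holds over a billion amplitudes (roughly 17 GB stored as complex numbers), which gives some intuition for why quantum computers can tackle problems that overwhelm classical machines.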