IBM, Microsoft and Alphabet working towards the dawn of quantum computing

There are many effects caused by the rules of classical physics that we humans take for granted. Hit a pane of glass with enough force and it will shatter. Hold a piece of paper to an open flame and it will burn to ash. These properties are easy for most high school seniors to understand, and classical physics adequately explains the vast majority of what we encounter on a day-to-day basis.

As anyone familiar with 20th century scientific discoveries will know, classical physics does not account for everything we observe about our world, especially at the atomic level. In the early part of the century, the field of quantum mechanics began to grow as scientists like Max Planck and Albert Einstein unveiled the secrets behind unusual atomic-level phenomena, such as how light can behave as a wave in some situations and as a stream of particles in others. With classical computing set to hit a technological wall in the coming years, some big names in technology are taking steps into quantum computer development. If they succeed, they could establish an entirely new field of computing that would change everything we know about the technology.

Unlike classical computing, which relies on bits that take on values of either 1 or 0 to process information, quantum computing relies on qubits. A qubit can take the distinct 1 or 0 value, but unlike a classical bit it is governed by quantum mechanics, which makes it far more useful in certain applications. One unique property of qubits is their ability to exist in a superposition, meaning that a single qubit can be in multiple states at the same moment, much as light can behave as either a wave or a particle. Entanglement, the state in which two qubits remain inextricably linked even when separated by great distances, is another quantum mechanical effect with implications for computing. Superposition and entanglement would allow a quantum computer to rapidly perform calculations that are practically impossible for a classical computer, such as finding the factors of a number with more than 500 digits, unlocking a new world in data encryption and analysis.
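
To make superposition and entanglement concrete, here is a minimal sketch, assuming nothing more than NumPy, that represents a qubit as a two-element state vector: a Hadamard gate puts it into an equal superposition, and a CNOT gate then entangles it with a second qubit into a Bell state. This is an illustrative toy model, not any vendor's actual quantum platform.

```python
# A minimal state-vector sketch of superposition and entanglement.
# Illustrative toy model only, not any vendor's actual quantum API.
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector a|0> + b|1>.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
qubit = H @ ket0
print("superposed amplitudes:", qubit)           # ~[0.707, 0.707]
print("measurement probabilities:", qubit ** 2)  # 50% |0>, 50% |1>

# Entanglement: apply H to the first of two qubits, then a CNOT gate.
# The result is the Bell state (|00> + |11>)/sqrt(2): measuring one qubit
# instantly fixes the outcome of the other, however far apart they are.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
two_qubits = np.kron(qubit, ket0)      # state |+0>
bell = CNOT @ two_qubits
print("Bell state amplitudes:", bell)  # ~[0.707, 0, 0, 0.707]
```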

The pursuit of quantum computing dates back to the early 1980s, and some American tech giants have recently been making forays into the sector. In September, Google (now Alphabet, Inc.) (NASDAQ:GOOG) completed an upgrade of its D-Wave quantum computer that doubled the number of qubits in the machine to more than 1,000. In early December, the company released data indicating that, when solving a problem involving about 1,000 binary variables, quantum annealing on the quantum computer was 100 million times faster than simulated annealing running on a single core. Informed observers have shown only muted optimism, as these experiments represent more a test of quantum computing theories using a problem specifically formulated for the quantum platform. Remarks made after the report’s release by Hartmut Neven, who leads Alphabet’s D-Wave project, likened the event to the Wright brothers’ first flight in 1903 at Kitty Hawk. Besides Canadian-based D-Wave Systems, a firm dedicated to quantum computing research, other entities involved in the project include NASA’s Quantum Artificial Intelligence Laboratory (QuAIL) and the Universities Space Research Association (USRA).
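
For readers curious what the classical baseline in that benchmark looks like, below is a minimal single-core simulated annealing sketch over binary variables. The cost function, cooling schedule and problem size are illustrative assumptions, not the actual benchmark Google ran.

```python
# A minimal single-core simulated annealing sketch for a problem over
# binary (+1/-1) variables. All parameters here are illustrative.
import math
import random

random.seed(0)
N = 100                                   # number of binary variables (toy size)
# Random Ising-style couplings between neighboring variables.
J = [random.choice((-1.0, 1.0)) for _ in range(N - 1)]

def energy(s):
    """Cost of a configuration: sum of J[i] * s[i] * s[i+1]."""
    return sum(J[i] * s[i] * s[i + 1] for i in range(N - 1))

s = [random.choice((-1, 1)) for _ in range(N)]
current = energy(s)
T = 5.0                                   # starting "temperature"
for step in range(20000):
    i = random.randrange(N)
    s[i] = -s[i]                          # propose flipping one variable
    proposed = energy(s)
    # Accept downhill moves always; uphill moves with probability e^(-dE/T).
    if proposed <= current or random.random() < math.exp(-(proposed - current) / T):
        current = proposed
    else:
        s[i] = -s[i]                      # reject: undo the flip
    T *= 0.9995                           # slowly cool toward T ~ 0
print("best energy found:", current)      # optimum for this chain is -(N-1)
```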

Another giant of American computing, IBM (NYSE:IBM), is getting a federally funded boost to help kickstart U.S. development in quantum computing. The government’s Intelligence Advanced Research Projects Activity (IARPA) has entered into a multi-year contract with IBM for the development of a logical qubit, which uses a number of imperfect physical qubits working together in a circuit to produce accurate calculations. Quantum computing chips produced by IBM are made from materials that become superconducting at temperatures near absolute zero. IBM’s two-dimensional chip array also differs from the D-Wave in that it is designed to detect both bit-flip and phase errors affecting a computation at the same time, something the D-Wave is not designed to do.
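
The idea of a logical qubit can be illustrated with a classical analogy: encode one bit redundantly across several unreliable bits and take a majority vote. The sketch below, with an assumed 5% error rate, shows the error suppression this buys. Real quantum codes of the kind IBM is pursuing must additionally catch phase errors, which no classical repetition scheme can capture.

```python
# A classical analogy for building one reliable "logical" bit out of
# several unreliable "physical" bits. The error rate is an assumption.
import random

random.seed(1)
ERROR_RATE = 0.05     # assumed chance that each physical bit flips

def noisy(bit):
    """Flip the bit with probability ERROR_RATE."""
    return bit ^ 1 if random.random() < ERROR_RATE else bit

def send_logical(bit, copies=3):
    """Encode one logical bit as `copies` physical bits; majority-vote."""
    received = [noisy(bit) for _ in range(copies)]
    return 1 if sum(received) > copies // 2 else 0

trials = 100_000
raw_errors = sum(noisy(0) != 0 for _ in range(trials))
logical_errors = sum(send_logical(0) != 0 for _ in range(trials))
print(f"physical error rate: {raw_errors / trials:.4f}")      # ~0.05
print(f"logical  error rate: {logical_errors / trials:.4f}")  # ~0.007
```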

Those with an interest in seeing what it’s like to program on quantum computing platforms recently got a nice gift from Microsoft Corporation (NASDAQ:MSFT), which decided to make its LIQUi|> quantum computing simulator available to the public. Although quantum linear algebra, Shor’s factoring algorithm and calculating the ground-state energy of a molecule are not topics for the casual observer, anyone with a good handle on them can explore these subjects in LIQUi|>.
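
LIQUi|> itself is scripted in F#, but the core trick behind Shor’s factoring algorithm can be sketched in a few lines of Python. On a real quantum computer, the machine’s only job is to find the period r of a^x mod N; the sketch below brute-forces the period classically for an assumed tiny N of 15, just to show the final, purely classical step of turning a period into factors with a gcd.

```python
# Shor's algorithm leaves the hard part -- finding the period r of
# a^x mod N -- to the quantum computer. Here the period is brute-forced
# classically for a toy N, to illustrate the classical post-processing.
from math import gcd

N = 15          # toy semiprime; real targets have hundreds of digits
a = 7           # a base coprime to N

# Classical stand-in for the quantum period-finding step.
r, x = 1, a % N
while x != 1:
    x = (x * a) % N
    r += 1
print("period r =", r)          # 7^4 = 2401 = 1 mod 15, so r = 4

# Shor's classical post-processing: if r is even and a^(r/2) != -1 mod N,
# then gcd(a^(r/2) +/- 1, N) yields nontrivial factors of N.
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print("factors:", p, q)     # 3 and 5
```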

The computing world doesn’t just want to pursue development of computing technologies at the quantum level; it needs quantum computing, or the pace of classical computing innovation will slow to a halt within a few decades. For decades now, there’s been an oft-cited observation known as Moore’s Law, which holds, somewhat loosely, that the number of transistors that can fit on a processor doubles every two years, increasing computing power while shrinking components. State-of-the-art transistors now measure 14 nanometers, a tiny fraction of the width of a red blood cell. If transistors are ever reduced to the atomic scale, which is a distinct possibility, the rules of quantum mechanics begin to create huge problems for classical computing. Transistors are supposed to control the flow of electrons, including stopping them, but at atomic scales electrons behave like waves and can tunnel straight through a barrier, squeezing past the transistor’s control.
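
A back-of-the-envelope calculation shows why that wall is only decades away. Assuming a 14 nm process as the starting point, density doubling every two years (so linear feature size shrinks by a factor of √2 each doubling), and roughly 0.2 nm spacing between silicon atoms, all assumptions of this sketch rather than figures from the article:

```python
# Rough sketch of the scaling wall: how long until Moore's-Law shrinking
# would push a 14 nm transistor down to the scale of individual atoms?
# Starting point, cadence, and atomic spacing are all assumptions.
feature_nm = 14.0      # roughly state of the art at the time of writing
year = 2015
ATOM_NM = 0.2          # approximate spacing between silicon atoms

while feature_nm > ATOM_NM:
    year += 2
    feature_nm /= 2 ** 0.5   # density doubles => linear size / sqrt(2)
print(f"atomic scale reached around {year} (~{feature_nm:.2f} nm)")
# Prints roughly 2041: on this rough model, only a few decades separate
# today's transistors from the atomic scale, where quantum tunneling
# lets electrons leak past a switched-off transistor.
```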

The international community is also ramping up its pursuit of quantum computing technologies, especially in Australia. Researchers at that country’s University of New South Wales (UNSW) have demonstrated that quantum computing chips can be fabricated from silicon, paving the way for much more cost-effective solutions than vacuum-suspended charged atoms or light particles. Australia’s federal government has invested $26 million in the UNSW project in the hope that Australia can be the birthplace of the first commercially successful quantum computer.

There are numerous ways that quantum computing will impact our world, only some of which are known today. One major application that keeps popping up is enhanced encryption technology. The fact that it is computationally infeasible to factor very large numbers is a major foundation of current encryption techniques. Quantum computers could factor such numbers swiftly, which would make current encryption impractical as a safeguard once quantum computing is widely commercialized. However, quantum mechanics also enables encryption on an entirely new level. Because of entanglement, and because qubits “decohere,” or lose their superposition, when they are observed, quantum systems could generate truly randomized keys and reveal exactly when an unauthorized party tries to catch a glimpse of the data. It is also theorized that quantum computers could digitally simulate incredibly complex systems, such as unusual atomic or particle interactions, that today can only be studied experimentally at facilities like CERN’s Large Hadron Collider.
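
That eavesdropper-detection idea is well illustrated by the BB84 key-distribution protocol, the standard textbook example (the article does not name it). In the toy simulation below, measuring a qubit in the wrong basis randomizes it, so an eavesdropper who intercepts and re-sends qubits raises the error rate on the bits the two parties compare from roughly 0% to roughly 25%. All parameters are illustrative.

```python
# Toy simulation in the spirit of the BB84 quantum key distribution
# protocol: an intercept-and-resend eavesdropper leaves a statistical
# fingerprint. Illustrative parameters throughout.
import random

random.seed(2)
N = 2000
EVE_PRESENT = True     # rerun with False to see the ~0% baseline

alice_bits = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # two bases

def measure(bit, sent_basis, measure_basis):
    """Same basis: read the bit faithfully. Wrong basis: random outcome."""
    return bit if sent_basis == measure_basis else random.randint(0, 1)

received = []
for bit, basis in zip(alice_bits, alice_bases):
    if EVE_PRESENT:    # Eve measures in a guessed basis, disturbing the state
        eve_basis = random.choice("+x")
        bit = measure(bit, basis, eve_basis)
        basis = eve_basis          # qubit is now prepared in Eve's basis
    bob_basis = random.choice("+x")
    received.append((bob_basis, measure(bit, basis, bob_basis)))

# Alice and Bob keep only rounds where their bases happened to match,
# then sacrifice those bits to estimate the error rate.
kept = [(a_bit, b_bit)
        for a_bit, a_basis, (b_basis, b_bit)
        in zip(alice_bits, alice_bases, received) if a_basis == b_basis]
errors = sum(a != b for a, b in kept) / len(kept)
print(f"error rate on matching bases: {errors:.1%}")
# ~25% with an eavesdropper, ~0% without: the intrusion is detectable.
```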

It might be fair to compare the current state of quantum computing to the early days of supercomputers, just before Seymour Cray revolutionized that industry with the CDC 6600 in 1964. One of the major hurdles in quantum computer development is decoherence: quantum operations cannot be observed mid-computation without collapsing the quantum state and throwing off the entire calculation. Most quantum computing implementations also require supercooling to temperatures roughly 250 times colder than outer space, which takes a great deal of energy. Whoever unlocks the answer to making quantum computing commercially successful stands a great chance of becoming this generation’s Seymour Cray.
