The tech world has been down this hype alley before. Google instigated the last round with the 2019 introduction of its Sycamore processor, which completed a benchmark calculation that led the company to claim quantum supremacy over classical computers. News articles exhausted every angle in describing how these magical machines were going to transform each industry. That enthusiasm fizzled because, although the machines worked as demonstrations, they were not ready for widespread use. Besides, Nvidia keeps making superpowerful chips that are pretty good at simulating quantum computing without its technical drawbacks.
This is where it is helpful to understand the difference between classical and quantum computing. Classical computers use digital bits that process information as 1s and 0s, made possible by electric current switching tiny transistors on and off. Quantum computers use, well, quantum bits, or qubits, which can represent a blend of 1 and 0 at the same time, a property known as superposition, using natural or human-made particles. In a paint analogy, the bits on a personal computer chip are the equivalent of black or white. Qubits, though, can come in all shades on the color wheel. That potential array of values is what makes quantum computing so powerful, and it underpins why the machines should be able to do far more complex calculations than their black-and-white classical cousins. The problem is that qubits are extremely delicate, which can cause quantum computers to lose information, making them prone to error. Google’s latest achievement is that it can correct those errors at greater scale.
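The superposition idea above can be made concrete with a few lines of code. The sketch below is a hypothetical toy simulation, not how a real quantum computer is programmed: it models a single qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard operation for putting a qubit into an equal mix of 0 and 1, to produce one of the "shades" from the paint analogy.

```python
import numpy as np

# A classical bit is simply 0 or 1.
classical_bit = 0

# A qubit can be modeled as two complex amplitudes over the states
# |0> and |1>; the squared magnitudes are the measurement
# probabilities and must sum to 1.
qubit = np.array([1.0, 0.0], dtype=complex)  # starts as a definite 0

# The Hadamard gate rotates the qubit into an equal superposition --
# a "shade" between the black and white of classical bits.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
qubit = H @ qubit

probs = np.abs(qubit) ** 2
print(probs)  # roughly [0.5, 0.5]: equal odds of reading out 0 or 1
```

Measuring the qubit collapses it to a plain 0 or 1, which is also where the fragility mentioned above comes in: any stray interaction with the environment acts like an unwanted measurement and destroys the in-between state.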