Image source: Intel
To handle the massive amounts of parallel data analysis required for IoT and advanced research projects, designers keep pushing the limitations of current computing architectures. In that vein, the recent announcement of a 17-qubit superconducting chip from Intel puts a spotlight on the notoriously ethereal concept of quantum computing.
Quantum computing is an odd beast in that it's both here and not here at the same time, and it has been that way since work on it began in 1980 within the field of quantum mechanics. Without digging too deeply into the physics, the end game is that we can theoretically take advantage of the ability of quantum "bits," or qubits, to be in a superposition of two states – 0 and 1 – at the same time (Figure 1).
Current digital bits are either 0 or 1. The ability of qubits to be in both states at once, in theory, allows much more information to be encoded in a given set of qubits. Quantum computing combines superposition with entanglement, the phenomenon whereby multiple, physically separated qubits remain linked, so that measurements on them yield precisely correlated results.
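As a rough illustration only (real quantum hardware is not programmed this way), the superposition and entanglement ideas above can be sketched with a classical state-vector simulation. The variable names and the 1,000-sample count below are illustrative choices, not anything from Intel's chip:

```python
import numpy as np

# A qubit state is a vector of complex amplitudes; measurement probabilities
# are the squared magnitudes of those amplitudes.

# Single qubit in an equal superposition: (|0> + |1>) / sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)
probs = np.abs(plus) ** 2          # 50/50 chance of measuring 0 or 1

# Two-qubit Bell state (|00> + |11>) / sqrt(2): a maximally entangled pair
bell = np.zeros(4)
bell[0b00] = bell[0b11] = 1.0 / np.sqrt(2)

# Simulated measurements: the two qubits' outcomes are always correlated --
# only |00> (index 0) or |11> (index 3) is ever observed, never |01> or |10>.
rng = np.random.default_rng(0)
outcomes = rng.choice(4, size=1000, p=np.abs(bell) ** 2)
```

The point of the sketch is the correlation: even though each individual qubit measures 0 or 1 at random, the entangled pair never disagrees.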
Like neuromorphic computing, quantum computing falls into the category of “someday, soon.” Intel recently made some more tangible announcements in the area of quantum computing that make it more real than, well, before the announcements.
Figure 1: Quantum computing relies upon the ability of a qubit to be in two states, simultaneously. (Image source: Intel)
The first was that it is investing $50 million in QuTech, the quantum research institute of Delft University of Technology and the Dutch Organization of Applied Research (TNO), to kick off a 10-year collaboration aimed at accelerating advancements in the field. It subsequently announced delivery of a 17-qubit superconducting test chip to QuTech.
Mike Mayberry, Intel vice president and managing director of Intel Labs, said Intel's interest in quantum computing lies in its potential to aid in solving complex problems, including data simulations. With that said, he's also realistic, projecting a fully functioning quantum computer to be at least 12 years out.
The difficulties of implementing a quantum computer are many, starting with the physical requirements for keeping a qubit stable: any noise, or even unintended observation (a quantum mechanical effect), can cause it to lose its state. Keeping qubits stable requires operating at 20 millikelvin. For perspective, that's 250 times colder than deep space.
This is where Intel stepped up with input from its Components Research Group and Assembly Test and Technology Development teams in Arizona. Together they developed the new chip and package. The chip measures about the size of a quarter, while the package is about the size of a half dollar coin.
The package's key performance improvements are its reliability and its greater RF isolation between qubits. Its interconnect also carries 10 to 100 times more signals into and out of the chip compared with conventional wirebonded chips.
The qubits actually reside on an IC that is bonded to the package using a ball-grid array. The package's gold connectors let QuTech access the IC to test qubit operation and run workloads. Specifically, the chip will allow the team to focus on connecting, controlling, and measuring multiple entangled qubits, working toward an error-correction scheme and, ultimately, a logical qubit.
Whether quantum computing is here or not, the point is that making practical use of IoT data, not just acquiring it, may require moving beyond current thinking on how data is processed and analyzed. The other point is that it's going to take collaboration along the way, across disciplines.
Ironically, advancing the state of the art in quantum computing opens the door to faster hacking algorithms that may undermine IoT’s already tenuous security, but thankfully SecureRF is already working on that problem.