Intel Unveils Cryogenic Chip To Aid Climb To Quantum Computing

Intel Labs principal engineer Stefano Pellerano holds Horse Ridge, a new cryogenic control chip meant to speed the development of quantum computers.

At the IEEE International Electron Devices Meeting in San Francisco this week, Intel is unveiling a cryogenic chip designed to accelerate the development of the quantum computers it is building with Delft University’s QuTech research group. The chip, named Horse Ridge after one of the coldest spots in Oregon, uses specially designed transistors to provide microwave control signals to Intel’s quantum computing chips.

The quantum computer chips in development at IBM, Google, Intel, and other firms today operate at fractions of a degree above absolute zero and must be kept inside a dilution refrigerator. However, as companies have managed to increase the number of quantum bits (qubits) in the chips, and therefore the chips’ capacity to compute, they’ve begun to run into a problem. Each qubit needs its own set of wires leading to control and readout systems outside of the cryogenic container. It’s already getting crowded, and as quantum computers continue to scale—Intel’s is up to 49 qubits now—there soon won’t be enough room for the wires.
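To see why dedicated wiring doesn't scale, a back-of-envelope calculation helps. The numbers below are illustrative assumptions, not Intel's figures; the actual wires-per-qubit count varies by qubit technology and lab setup.

```python
# Sketch of the wiring problem: with dedicated lines, wire count grows
# linearly with qubit count. WIRES_PER_QUBIT is an assumed figure
# (e.g. microwave drive, bias, and readout lines per qubit).
WIRES_PER_QUBIT = 4

def control_wires(n_qubits: int, wires_per_qubit: int = WIRES_PER_QUBIT) -> int:
    """Total wires needed when every qubit gets its own dedicated lines."""
    return n_qubits * wires_per_qubit

for n in (49, 1_000, 1_000_000):
    print(f"{n:>9,} qubits -> {control_wires(n):>9,} wires into the fridge")
```

Even at a modest wires-per-qubit figure, a million-qubit machine would need millions of lines threaded into a dilution refrigerator, which is why moving control electronics inside the fridge matters.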

“Ultimately the goal is to minimize the number of wires into the fridge,” says Jim Clarke, director of quantum hardware at Intel. “Intel recognized that quantum controls were an essential piece of the puzzle we needed to solve in order to develop a large-scale commercial quantum system.” The solution is to bring as much of the control and readout electronics as possible into the fridge, perhaps even integrating them onto the qubit chip itself.

Horse Ridge integrates the control electronics onto a chip designed to operate inside the fridge alongside the qubit chip. It is programmed with instructions that correspond to basic qubit operations, and it translates those instructions into microwave pulses that can manipulate the state of the qubits.
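In software terms, that translation step can be sketched as a lookup from gate instructions to pulse parameters. Everything below is hypothetical for illustration: the gate names, drive frequencies, and durations are assumptions, and Horse Ridge implements the equivalent in cryogenic CMOS hardware, not Python.

```python
# Minimal sketch of translating qubit instructions into microwave pulse
# parameters. All numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Pulse:
    qubit: int
    freq_ghz: float    # carrier frequency addressing this qubit
    duration_ns: int   # pulse length sets the rotation angle
    phase_deg: float   # pulse phase selects the rotation axis

# Assumed per-qubit drive frequencies: each qubit responds to its own resonance.
QUBIT_FREQ_GHZ = {0: 5.1, 1: 5.3}

def translate(instruction: str, qubit: int) -> Pulse:
    """Map a basic gate instruction to the microwave pulse that enacts it."""
    freq = QUBIT_FREQ_GHZ[qubit]
    if instruction == "X":      # half-turn rotation about the X axis
        return Pulse(qubit, freq, duration_ns=20, phase_deg=0.0)
    if instruction == "Y":      # half-turn rotation about the Y axis
        return Pulse(qubit, freq, duration_ns=20, phase_deg=90.0)
    if instruction == "X/2":    # quarter turn: half the pulse duration
        return Pulse(qubit, freq, duration_ns=10, phase_deg=0.0)
    raise ValueError(f"unknown instruction: {instruction}")

print(translate("X", 0))
```

The point of the sketch is the division of labor: the classical side stores a small instruction set and emits precisely timed, precisely phased pulses, which is exactly the job being moved into the fridge.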

The chip is designed to work at 4 kelvins, a slightly higher temperature than the qubit chip itself. The company used its 22-nanometer FinFET process to build the chip, though the transistors that make up the control circuitry needed substantial reengineering.

“If you take a transistor and cool it to 4 K, it’s not a foregone conclusion that it will work,” says Clarke. “There are a lot of fundamental characteristics of devices that are temperature dependent.”

The Intel team had to characterize how devices at that temperature switch on and off. And they had to use the model they developed to re-optimize the devices’ speed, performance, and power consumption under cryogenic conditions. The devices themselves had to be designed to throw off so little heat that they won’t disturb the delicate condition of the qubits. “Any little bit of heat and you scramble the information,” he says.

According to Clarke, Horse Ridge is a particularly important development because it helps pave the way for the company’s next-generation qubit technology, called silicon spin qubits. These qubits resemble transistors in structure and have the potential to operate at 1 kelvin instead of the millikelvins needed for the superconducting qubits Intel and others have been using. That difference could mean that more control and readout electronics can be placed inside the refrigerator with the qubit chip, because they can throw off more heat without disturbing the qubits.

As Intel and Delft further develop the technology, they hope to integrate more capabilities onto Horse Ridge and, eventually, onto the qubit chip itself.

Other companies with big quantum computing efforts are working on the same problem, of course. Google described a cryogenic control circuit earlier this year for its machine.


Read More: Intel’s New Path To Quantum Computing

Read More: Google Claims Huge Breakthrough In ‘Quantum Supremacy’

Read More: The Brain Of The Beast; IBM Puts A Quantum Processor In The Cloud


