
Intel will leverage its chip-making expertise for quantum research

Intel has detailed plans to forge its own path toward a chip that can facilitate quantum computing. The company will apparently eschew the strategies being pursued by other organizations in this space, instead attempting to adapt the silicon transistors commonly used in traditional computers to the task.

This represents a significant departure from the approaches of other groups working to advance quantum computing. At present, superconducting qubits seem to be the frontrunner in terms of popularity, while an implementation based on trapped ions has also demonstrated promising results.

Quantum computing diverges from traditional computing because qubits aren’t confined to the “on” and “off” states that restrict a standard bit; a qubit can occupy a superposition of both states at once. Intel’s silicon qubits would represent data via electrons trapped inside modified versions of the transistors used in the company’s commercial chips, according to a report from Technology Review.
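
To make that distinction concrete, here is a minimal Python sketch of the qubit concept. It is our illustration, not a model of Intel’s silicon hardware: a qubit is represented as two complex amplitudes, and measurement collapses it to a classical bit with probability given by the squared amplitudes.

```python
import numpy as np

# Toy model of a single qubit: two complex amplitudes over the |0> and |1>
# basis states. An illustration of the concept, not Intel's hardware design.
rng = np.random.default_rng(seed=42)

# An equal superposition of |0> and |1>.
state = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

def measure(state: np.ndarray) -> int:
    """Collapse the qubit to a classical bit with probability |amplitude|^2."""
    probabilities = np.abs(state) ** 2
    return int(rng.choice(2, p=probabilities))

# Many measurements of freshly prepared qubits approximate a 50/50 split.
samples = [measure(state) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ~0.5
```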

Intel hopes that its silicon qubits will prove more reliable than the superconducting qubits being used elsewhere. Moreover, because the company can fabricate its silicon qubits on standard chip wafers, the research and development process should be faster.

Intel’s advantage over its competitors — assuming that its silicon qubits can stand up to the competition — is that the company is already very familiar with the process of manufacturing chips on an industrial scale. “The hope is that if we make the best transistors, then with a few material and design changes we can make the best qubits,” said its director of quantum hardware, Jim Clarke.

We’ll see how Intel’s project progresses compared to the many other groups currently working on quantum computing hardware. At present, the company is pursuing its silicon-based implementation, but it’s keeping its options open — research into superconducting qubits is also being conducted in-house.

Brad Jones
Intel XeSS vs. Nvidia DLSS vs. AMD Super Resolution: supersampling showdown

Dynamic upscaling is a major component of modern games and the latest graphics cards, but there are different modes and models to pick from. Intel's Xe Super Sampling (XeSS), Nvidia's Deep Learning Super Sampling (DLSS), and AMD's FidelityFX Super Resolution (FSR) each work in their own way, and they differ in performance, visual quality, game support, and hardware support.

Although there's an argument to be made for just turning on whatever your hardware and games support, if you have the choice between them or are considering different graphics cards based on their XeSS, DLSS, and FSR support, it's important to know the differences between them. Here's a key breakdown of these supersampling algorithms and which one might be the best fit for you.
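
As a rough illustration of the trade-off these technologies make, the sketch below computes the internal resolution a game renders at before upscaling, using the per-axis scale factors commonly published for DLSS and FSR quality presets. Treat the numbers as approximate, and note that XeSS uses its own, broadly similar preset ladder.

```python
# Approximate per-axis render scale for common upscaler quality presets.
# These ratios are illustrative; exact values vary by vendor and version.
SCALE_FACTORS = {
    "quality": 1 / 1.5,            # render at ~67% of output resolution
    "balanced": 1 / 1.7,           # ~59%
    "performance": 1 / 2.0,        # 50%
    "ultra_performance": 1 / 3.0,  # ~33%
}

def render_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution the GPU renders before upscaling."""
    scale = SCALE_FACTORS[mode]
    return round(output_w * scale), round(output_h * scale)

# Example: a 4K output in performance mode renders internally at 1080p.
print(render_resolution(3840, 2160, "performance"))  # (1920, 1080)
```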

Read more
What is Intel XeSS, and how does it compare to Nvidia DLSS?

Intel XeSS is an exciting development in the world of upscaling PC games. This technology is finally out, and we've tested it extensively to give you a full overview of how it can improve your gaming experience.

While Intel XeSS is similar to the tech we already know well from AMD and Nvidia, it comes with its own pros and cons. We've rounded up everything you need to know about Intel XeSS, how it works, what games it supports, and how it compares to its rivals from AMD and Nvidia.

Read more
Arc GPU drivers are getting better, but Intel says it’s challenging

Intel Arc A770 and A750 graphics cards will be available to order on October 12, but Intel admitted it’s still struggling with drivers for DirectX games. Raja Koduri, Intel’s head of Accelerated Computing Systems and Graphics Group (AXG), discussed the challenges in a recent interview.

With reviews expected to start arriving today, the pressure is on Intel either to impress us out of the gate or to solve issues quickly. Intel has been creating graphics drivers for decades, but until this year that work focused on integrated graphics built into its CPUs, which carry much lower expectations. Koduri explained that the first generation of Arc GPUs was the most difficult because programmers had to start with a completely new architecture, and pandemic challenges slowed development as well. Heading into the second generation, things should improve.

Read more