
Securing Hardware For The Quantum Era

Quantum computers may become a security threat as early as next year, and that threat will continue to grow over the next several years.

February 5th, 2026 - By: Ann Mutschler

Key Takeaways:

  • Quantum threats to security are already real. Adversaries are harvesting encrypted data today to decrypt later with quantum computers.
  • Quantum computers capable of breaking RSA and ECC may arrive as early as next year.
  • Asymmetric encryption algorithms like RSA and ECC will become inadequate against quantum threats, while symmetric encryption (such as AES) is less vulnerable.

Quantum computers that can break current encryption are expected to begin showing up sometime between next year and 2030, but the threat to data security is already here.

By planning now, chip architects and designers can ensure that chips designed for long-lasting products, such as those in automotive or aerospace, remain secure against “harvest now, decrypt later” attacks, meet regulatory compliance, and support the transition to new quantum-resistant algorithms. At the same time, because cryptographic algorithms are advancing rapidly, any post-quantum cryptography (PQC) approach must be updateable over time.

Harvest now, decrypt later means that adversaries are currently collecting encrypted sensitive data, such as national security secrets, health care records, and intellectual property. That data will be decrypted sometime in the near future when quantum computers become powerful enough. Experts predict that “Q-Day,” when quantum computers are capable of breaking widely used encryption methods like RSA and ECC, could arrive as early as next year, and that today’s secure systems may be completely vulnerable by 2034.

“According to what we heard at the last GSA roundtable on quantum computing, which included people from IBM and from the quantum computing world, they are indeed more optimistic about when that would come — likely before 2030,” said Sylvain Guilley, co-founder and CTO of Secure-IC, a Cadence company. “But it will not be in the form of a classical computer. It’s not classical computing. It’s expected to look like computing accelerated by quantum phenomena, and accelerating certain operations like in HPC, but with a quantum function. In terms of timeline, what we heard was something that might risk the cryptography functions used today would likely arrive between 2027 and 2030.”

The timeline is a bit fuzzy. The first quantum computer probably won’t be large enough to be what NIST calls a cryptographically relevant quantum computer, meaning one capable of breaking today’s cryptographic algorithms. “The first quantum computers will solve small problems that are very important, likely in chemistry, likely in some optimization problem,” Guilley said. “Some are even talking about doing some AI with those, and that computation will speed up. But if you look at quantum computers not as an opportunity, but as a threat, they need to have thousands of qubits. Quantum supremacy is beyond 100 qubits, and cryptography today is already using a minimum of 128 bits.”

Translation: Some common cryptographic ciphers will be almost useless soon. “Asymmetric encryption algorithms used today, such as RSA or ECC, will no longer be considered appropriately secure anymore,” said Robert Bach, product marketing manager at Infineon. “The security of RSA relies on the difficulty of factoring the product of two large prime numbers (prime factorization). With an adequate key length used, prime factorization would be impractical on a conventional computer, while a suitable quantum computer could solve it in minutes, making unauthorized extraction effortless. Symmetric encryption, such as AES, is considered less threatened by quantum computers. However, for higher security levels, it is recommended to use longer keys, such as AES-256.”
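To make the point concrete, here is a minimal sketch (ours, not from the article) of why RSA's security rests on factoring: with a toy modulus, anyone who recovers the two primes can reconstruct the private key and decrypt. A cryptographically relevant quantum computer running Shor's algorithm would do the same to real keys of 2,048 bits or more.

```python
# Toy RSA with tiny primes; illustrative only. Real keys use primes hundreds of
# digits long, which classical computers cannot factor in practical time,
# but a large quantum computer running Shor's algorithm could.

p, q = 61, 53                      # "secret" primes (trivially small here)
n = p * q                          # public modulus
e = 17                             # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                # private exponent, derivable once p and q are known

msg = 42
ciphertext = pow(msg, e, n)        # encrypt with the public key
recovered = pow(ciphertext, d, n)  # factoring n is all an attacker needs to decrypt
assert recovered == msg
```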


Fig. 1: Assets with a long service life are particularly at risk. Source: Infineon

Given these rapidly approaching timelines and the evolving nature of both quantum computing and cryptographic standards, the urgency to address PQC readiness is real, and calls for a careful examination of how today’s hardware security strategies must adapt to meet future challenges — especially as new threats and complexities emerge with the transition to quantum-resistant technologies.

“Both secure communication and firmware/software/AI-model authentication rely on public-key cryptographic algorithms,” said Scott Best, senior technical director, Security IP, at Rambus. “The potential capabilities of future quantum computers have introduced present-day challenges to the security of legacy public-key algorithms. That is, the potential of future cryptographically relevant quantum computers (CRQCs) has already sent threat actors into action as they leverage a harvest now, decrypt later strategy, where private and secure communication data stored to disk today might become plaintext recoverable at a future date.”

Transitioning to PQC across global IT infrastructure is a complex undertaking that will take years, making it vital to start now to avoid rushed or incomplete migrations. Regulatory bodies are also ramping up pressure, as illustrated by mandates like the U.S. National Security Memorandum (NSM-10) and NIST’s finalized PQC standards in August 2024, which set clear expectations for compliance. Additionally, organizations must adopt cryptographic agility, enabling them to update cryptographic algorithms efficiently as technology evolves. This is especially critical for sectors like automotive and critical infrastructure, where systems may remain operational for 15 to 20 years. Deploying assets today without PQC protections can leave them exposed throughout their entire lifespan.
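What cryptographic agility can look like in practice is sketched below. This is an assumed design, not a pattern attributed to any of the companies quoted: application code depends on an abstract signing interface, so moving from a classical scheme to a NIST PQC scheme such as ML-DSA becomes a configuration change rather than an application rewrite.

```python
# Minimal crypto-agility sketch (assumed design): callers depend on an abstract
# signer, and the concrete algorithm is chosen by configuration, so a later swap
# from classical ECDSA to a PQC scheme requires no call-site changes.

from abc import ABC, abstractmethod

class Signer(ABC):
    @abstractmethod
    def sign(self, data: bytes) -> bytes: ...
    @abstractmethod
    def verify(self, data: bytes, signature: bytes) -> bool: ...

class EcdsaP256Signer(Signer):
    """Classical scheme deployed today (implementation omitted)."""
    def sign(self, data): raise NotImplementedError
    def verify(self, data, signature): raise NotImplementedError

class MlDsaSigner(Signer):
    """ML-DSA (FIPS 204), a finalized NIST PQC signature scheme (implementation omitted)."""
    def sign(self, data): raise NotImplementedError
    def verify(self, data, signature): raise NotImplementedError

# The algorithm identifier lives in device configuration or firmware metadata,
# so PQC migration becomes a registry/config update rather than an API change.
REGISTRY = {"ecdsa-p256": EcdsaP256Signer, "ml-dsa-65": MlDsaSigner}

def get_signer(algorithm_id: str) -> Signer:
    return REGISTRY[algorithm_id]()
```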

“For the chip designer, this is a bit complex,” said Panasayya Yalla, principal security analyst at Keysight EDA. “Traditional cryptography, compared to PQC, is a bit easier. Fewer things can go wrong. PQC is a whole piece by itself, and there are complexities in it. On top of that, since we deal mostly with hardware attacks where you have a physical device — which you don’t have if it’s in the field — the attacker would have full control over it. So we look at it from that aspect. Here, hardware attacks fall into two buckets. One is fault injection, where the operating conditions of the device are changed. It could be temperature, voltage, EM, laser, or whatnot, and you push the device into an undefined state, then exploit the security. There is another aspect that is mostly on the passive side, where we measure any leakage coming out of it, timing, sound, EM, power, whatever it is. If you have a hardware device, then any algorithm or any software needs to run on that hardware, so it becomes vulnerable to these kinds of attacks.”
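The passive, side-channel class of attack Yalla describes can be illustrated with a toy example (ours, not Keysight's): a comparison routine that returns on the first mismatching byte leaks, through its timing, how much of a secret an attacker has guessed correctly, which is why constant-time primitives are standard practice in cryptographic implementations.

```python
# Toy timing side channel: the early-exit comparison's running time depends on
# how many leading bytes of the secret are correct, turning a brute-force search
# into a byte-by-byte one. hmac.compare_digest avoids this leak.

import hmac

SECRET_TAG = bytes.fromhex("a3f1c2d4e5a6b7c8")

def leaky_compare(candidate: bytes) -> bool:
    if len(candidate) != len(SECRET_TAG):
        return False
    for a, b in zip(SECRET_TAG, candidate):
        if a != b:
            return False          # early exit leaks timing information
    return True

def constant_time_compare(candidate: bytes) -> bool:
    return hmac.compare_digest(SECRET_TAG, candidate)  # examines every byte
```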

PQC is not immune to attacks. “People say it’s much more secure, and it is secure in terms of preventing quantum attacks,” Yalla said. “But it is still vulnerable to the same kinds of attacks that traditional cryptography is vulnerable to. The problem is that protecting against hardware attacks is hard, and it comes at a cost. So you cannot say, ‘Can we make it fully secure?’ No. That doesn’t happen because the cost would be exponentially high. If you look at a comparison of the size it takes to implement traditional cryptography versus PQC cryptography, it is 102,000-fold the size itself, and the execution time would be 10 nanoseconds to process for traditional cryptography, where PQC would be 10,000 or 100,000 times more. So you could imagine for a developer how complex it is to design something much more secure.”

PQC in automotive, IoT
As the PQC landscape evolves, it’s important to recognize that the approach and urgency surrounding PQC implementation can differ significantly, depending on the application. This is particularly true for automotive systems versus IoT devices, and it impacts both the technical requirements and the strategies needed to provide robust, future-proof security.

“Commercial-grade post-quantum cryptography solutions will be different than the ones in automotive, in large part due to the safety mechanisms, the work products, and the cybersecurity process alignment that needs to be followed,” said Dana Neustadter, senior director of product management for Security IP Solutions at Synopsys. “If you have a small IoT device that has a two-year lifecycle, not everyone has the pressure of migrating to quantum safety now. But it’s totally different when you talk about a car or an infrastructure system or a mission-critical system or something with a longer lifecycle. Here, it’s fundamental that you build solutions that are quantum safe today, and the pressure is particularly high for applications such as those in cars.”

However, PQC algorithms are still being developed, which makes it difficult to come up with a long-term plan. “There are standards available today that you can leverage, but in an ideal world, you want to be flexible to further update and have some agility built in,” Neustadter said. “With the car, it’s more challenging. How much can you afford to build in that agility, and what will be the implications for being safety and security compliant? If you just update the firmware, do you need to redo all the analysis for the work products and things like that? It’s going to be more challenging for automotive as to how agile you can be, and how much agility you can afford, given that it’s so complex in terms of having all those work products and safety mechanisms embedded into the design.”

Security in any safety-critical market is important, but it’s particularly important at high speed and where various systems are easily accessible. “The new thing in automotive is going to be regulation for cybersecurity,” said Pallavi Sharma, director of product management at Imagination Technologies. “As people realize these vehicles are going to be connected, especially with autonomous driving, they realize there is a need for making sure that it’s protected from cyberattacks and hacks. We’re going to have these edge devices that are connected, and then they’re processing data and trying to make decisions on their own. We’re going to have to add those kinds of similar cybersecurity features to the SoC.”

Adding quantum to the picture makes this significantly more challenging. So how can engineering teams make sure they are ready to battle quantum computers?

“To get ready for a future with adversarial CRQCs, hardware designers need to start building silicon that incorporates standardized quantum-resistant algorithms,” Rambus’ Best said. “This is needed not only for secure communications, but also for firmware verification algorithms, especially for systems whose fielded lifetime might overlap with a post-quantum future. Software-based solutions are sufficient for some types of physically secure systems, but hardware-based verification accelerators offer maximum tamper resistance to the new algorithms required to protect against quantum computer attacks.”
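A hedged sketch of the boot-time check Best describes follows: the loader hashes the firmware image and verifies an ML-DSA signature before executing it. Here, `ml_dsa_verify` is a hypothetical placeholder for a hardware verification accelerator or a vetted PQC library, not a real API.

```python
# Sketch of PQC-based firmware verification at boot (assumptions noted above).

import hashlib

def ml_dsa_verify(public_key: bytes, message: bytes, signature: bytes) -> bool:
    """Hypothetical stand-in for an ML-DSA (FIPS 204) verifier, implemented in
    a hardware accelerator or a certified software library."""
    raise NotImplementedError

def secure_boot(firmware_image: bytes, signature: bytes, vendor_pubkey: bytes) -> bool:
    digest = hashlib.sha3_256(firmware_image).digest()
    if not ml_dsa_verify(vendor_pubkey, digest, signature):
        return False              # refuse to run unauthenticated firmware
    # ...hand off execution to the verified image...
    return True
```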

Further, the implementation of PQC algorithms will look different based on the approach taken. “We implement post-quantum cryptography algorithms in hardware as IP blocks,” said Yan-Taro Clochard, chief marketing officer at Secure-IC, a Cadence company. “It’s a hybrid implementation, down to the hardware. These algorithms are then leveraged in complex security systems and subsystems, including how to use those post-quantum cryptography algorithms in security policies in the life cycle of the devices.”

Why implement in hardware? “Of course, it’s about performance,” Secure-IC’s Guilley said. “It’s about keeping the CPU doing what it’s supposed to do. This is one aspect. But most importantly, it’s also a question of chicken and egg. In the past, with quantum, you needed to ensure some services that are extremely low-level, typically ensuring the boot. Also, in the second boot, the tenet is to check before executing. If we were to check software with software PQC, it doesn’t work. We need to bootstrap in hardware. It’s the same for other functions, such as when you debug. When you debug, the premise is that something is wrong. If you need the support of software cryptography to enable you to debug, you will be locked. For this reason, we need post-quantum to be really engraved into the silicon.”

Still, earlier lessons and best practices apply, even in the quantum age.

“Our experience is that with traditional cryptography like AES, RSA, and ECC, it took time to mature,” said Durga Ramachandran, innovation director at Keysight EDA. “The same will apply to post-quantum algorithms. In our approach, we are not challenging the mathematical model of it, which is based on the resistance of quantum computers. What we are challenging is how it is implemented in, for example, IoT or chips, or FPGAs, because the physical device has some restrictions, in terms of timing, power, and memory. And because of those restrictions, certain assumptions are taken by the designers, which can be exploited in different ways. In this case, we use fault injection and side channels mainly to exploit these vulnerabilities.”

Others agree. “Current cryptographic means will still apply in the world of post-quantum cryptography,” said Clochard. “Side channels and other attacks remain 100% valid with cryptographic modules, even post-quantum. They may be more resilient, mathematically speaking, against high-performance computing attacks, but they will not be resilient against side-channel attacks by default, so it has to be included in the physical implementation, as well.”

Considerations for chip architects
Given these challenges, it’s essential for chip architects to carefully evaluate their security strategies in light of the evolving landscape of post-quantum cryptography.

“When we do a security analysis, we look at what assets are in place,” Yalla said. “If they have some secure, confidential keys, the first question is where they are stored. That doesn’t change with PQC. How long have they been used, and how many times have they been used? All of these issues factor into how hard you need to protect it. If the key is a one-time-use key, you don’t need to add all these things because even if someone recovers it, it’s of no use. The key is already changed. But it’s more concerning for a long-term case. For example, if there is a key that signs something, and that key is valid for the lifetime of the device, then the developer needs to add all the things that could make it more protective, depending on the cost, of course. First, a designer needs to do this exercise on paper, before designing it, to see if it is right. How is this being used in the field? How many protections can be added to it? Based on all the things we observed with traditional cryptography for all these attacks, they are still relevant for PQC. There is nothing new. The only thing that’s going to change is when and where you need to apply the method. The attack doesn’t change much, so you can take all those lessons and incorporate them. It’s not like, ‘Okay, now we have a PQC. We need to rethink everything.'”

It all starts with really understanding your system. “Whether it’s a small implementation, a dashboard of a car, or the IT system of a company, it’s important to understand what kind of cryptography you are using,” Infineon’s Bach said. “The cryptographic inventory is needed because there are aspects that are more important to protect against the attacks of a quantum computer than others. For example, we have a company badge to enter the doors. That’s probably not the most urgent topic to migrate to post-quantum cryptography, but if it’s some very important manufacturing data in the IT system of Infineon that needs to be secured against attacks from the outside, cryptographic inventory is the first step in understanding the system. For a single design engineer, that’s difficult. It’s more of a management task at a company, because the management has to say, ‘Go for that, you get the budget, and you get the time and the resources.’ It takes time. It’s not done in a week or so. Depending on how big the system is, it can take a couple of months. Then you have to reflect on how to upgrade the products, how to upgrade the infrastructure, and how to upgrade the complete system. Either you directly upgrade the infrastructure of the system once you roll out products with post-quantum cryptography, or you start rolling out products that are prepared for post-quantum cryptography. And then, in two or three years, your infrastructure and the system are ready. After that, you do a switch, do an update, and migrate to quantum-safe topics. Basically, you need a migration plan.”
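What a first-pass cryptographic inventory might look like in data terms is sketched below. The structure and entries are illustrative, not Infineon's process: record each asset, the algorithm protecting it, and how long its data must stay secret, then prioritize long-lived, quantum-vulnerable assets for migration.

```python
# Illustrative cryptographic-inventory sketch; asset names and lifetimes are made up.

from dataclasses import dataclass

QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "ECDH-P256"}   # asymmetric schemes at risk

@dataclass
class CryptoAsset:
    name: str
    algorithm: str
    data_lifetime_years: int      # how long the protected data must remain confidential

    @property
    def at_risk(self) -> bool:
        return self.algorithm in QUANTUM_VULNERABLE

inventory = [
    CryptoAsset("company door badges", "ECDSA-P256", 2),
    CryptoAsset("manufacturing data link", "RSA-2048", 15),
    CryptoAsset("firmware signing key", "ECDSA-P256", 20),
]

# Migrate the longest-lived, quantum-vulnerable assets first.
backlog = sorted((a for a in inventory if a.at_risk),
                 key=lambda a: a.data_lifetime_years, reverse=True)
for asset in backlog:
    print(f"{asset.name}: {asset.algorithm}, {asset.data_lifetime_years}-year exposure")
```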

Given that PQC algorithms are changing and evolving constantly, all of this must be taken into consideration at the hardware level. “The degree of challenge between ‘changing’ protocols and ‘evolving’ protocols is significant,” Rambus’ Best noted. “From an evolutionary perspective, the key to providing future-proof hardware is to design accelerators that can segment PQC operations into algorithmic portions, and ‘control-plane’ portions. The algorithmic portions, for example, can be built with reusable math engines that accept programmable control from a software-based control plane. In this way, new variants of existing protocols can be supported through relatively minor updates. As for changing protocols entirely, that is substantially more difficult when the algorithm has been committed to hardware, especially when the hardware features tamper-resistant countermeasures.” For these updates, embedded FPGA macros offer a potential path towards long-term crypto agility.
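One way to read the split Best describes is sketched below: a fixed math engine (standing in for a hardware NTT/modular-arithmetic block) exposes generic lattice operations, while an updatable software control plane supplies the per-algorithm parameters, so a new protocol variant needs a control-plane update rather than new silicon. The class and parameter names are illustrative assumptions, not a Rambus design.

```python
# Hedged sketch of a reusable math engine driven by a software control plane.

class LatticeMathEngine:
    """Stand-in for a fixed hardware accelerator (NTT and modular arithmetic)."""
    def ntt(self, poly, modulus): ...
    def pointwise_mul(self, a, b, modulus): ...
    def inverse_ntt(self, poly, modulus): ...

# Control plane: per-algorithm parameter sets live in updatable firmware/software.
PARAM_SETS = {
    "ml-kem-768": {"n": 256, "q": 3329},      # FIPS 203 key encapsulation
    "ml-dsa-65":  {"n": 256, "q": 8380417},   # FIPS 204 signatures
}

def poly_multiply(engine: LatticeMathEngine, algorithm: str, a, b):
    """Sequence the fixed engine according to the selected parameter set."""
    q = PARAM_SETS[algorithm]["q"]
    fa, fb = engine.ntt(a, q), engine.ntt(b, q)
    return engine.inverse_ntt(engine.pointwise_mul(fa, fb, q), q)
```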

As such, PQC must be a long-term strategy, not a single implementation. “PQC should absolutely be a long-term mindset, especially for hardware that is in the field operationally for a decade or more,” Best said. “A static one-and-done approach is unrealistic when threat models, standards, and attack techniques continue to evolve. Software updates are, of course, a reasonable interim solution, though they are generally insufficient for tamper-resistant implementations that are critical in edge-based devices. It is important for such products to consider a roadmap that includes migration paths, algorithm agility, and the ability to roll in new primitives without breaking deployed systems.”

Finally, it’s important to remember that a secure algorithm, post-quantum or otherwise, does not guarantee a secure implementation. “And similar to traditional ciphers, post-quantum cryptographic implementations have been proven vulnerable to side channel and fault injection,” Keysight’s Ramachandran said. “So whenever any developer, architect, or vendor is considering implementation or transition to post-quantum, the security aspects need to be taken into consideration in the early stages of the design.”

When it comes down to it, PQC is a contest between makers and testers or evaluators. “It’s a game of cat and mouse,” Ramachandran said. “Sometimes the developers get the upper hand, and the testers or evaluators go behind. Sometimes the attack methodologies are advanced, and the developers are a bit behind.”

Ann Mutschler is senior executive editor at Semiconductor Engineering.