
From HPC To QC: More Performance And New Challenges

High-Performance Computing (HPC) is the measure of all things for computationally intensive applications, models, and simulations. But costs and energy requirements set limits. Quantum computers can solve specific tasks exponentially faster while requiring fewer resources. Although the technology is still in its infancy, HPC experts see quantum computing as a helpful addition in the medium term.

Quantum technology is a topic in which the operators and managers of HPC data centers already see great potential. According to a survey of 110 HPC centers conducted by IDC analysts in cooperation with the quantum computer manufacturer IQM and Atos, one of the leading European HPC technology providers, 27 percent of the respondents are already working on concrete quantum computing projects. A further 49 percent plan to engage intensively with the topic within the next two years, and another 20 percent have a time horizon of around five years.

That means quantum computing is on the agenda, even though many experts believe it may be years before the technology is ready for the market. So what do those responsible for HPC centers already expect from it? And why is now the right time to get involved with quantum computing?


High-Performance Computing – Parallelization And Its Limits

Research centers, government agencies, and industrial companies with computing-intensive applications either operate an HPC data center themselves or obtain computing resources from the cloud of an IT service provider. To achieve computing power of several teraflops (10 to the power of 12 floating-point operations per second) or even petaflops (10 to the power of 15), the IT capacities are parallelized, and the computing power of several systems is thus aggregated.
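As an illustration of that divide-and-aggregate principle (a minimal sketch with processes on one machine, not HPC-grade code; all names are hypothetical): one large computation is split into chunks, the chunks run in parallel, and the partial results are aggregated, just as an HPC cluster aggregates the work of many nodes.

```python
# Sketch: parallelizing one large computation across worker processes
# and aggregating the partial results (stand-in for cluster nodes).
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    # Stand-in for a compute-intensive kernel.
    return sum(i * i for i in range(lo, hi))

def parallel_sum(n, workers=4):
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with Pool(workers) as pool:
        # Each worker computes its chunk; the results are aggregated.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same result as the serial computation, but spread over workers.
    assert parallel_sum(100_000) == sum(i * i for i in range(100_000))
```

The coordination overhead hinted at in the next paragraph is visible even here: chunking, scheduling, and result collection are all extra software logic that grows with the number of computing units.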

Calculation models with many variables, complex simulations, machine-learning algorithms, modern cryptography, and real-time calculations: because the technology is available, it is also used. The application scenarios have long since gone beyond well-known, computationally intensive applications such as weather forecasting or traffic optimization. For example, HPC is used in the chemical and pharmaceutical industries to calculate the molecular composition of substances and to research the effects of certain substances on the systems in which they are used. Because supercomputers are now able to handle them, increasingly complex dependencies can be calculated and displayed: the beginning of the exascale age.

At the same time, classic HPC is reaching its limits. Even if, in theory, every performance or capacity problem can be solved by adding new, parallel resources, this is only possible to a limited extent in practice: HPC servers are not cheap, they consume energy and space, and parallel processing, in particular, has to be orchestrated by software. The more computing units there are, the more complicated this becomes. To overcome these limits and solve tasks on a scale that exceeds the computing capacities available today, quantum computing is coming into focus.

Big Promises – What Quantum Computing Can Achieve

A quantum computer works differently than a traditional computer. In simplified terms, the principle can be explained as follows: while a bit can only assume one of two states, either 0 or 1, a qubit can assume any superposition of the states 0 and 1. At the same time, several qubits can be entangled (correlated) with one another.
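The superposition idea can be made concrete with a few lines of plain NumPy (a minimal statevector sketch, not a quantum SDK): a qubit is a 2-component complex vector, and a Hadamard gate turns the definite state 0 into an equal mix of 0 and 1.

```python
# Minimal statevector sketch of a single qubit (plain NumPy).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # definite state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

psi = H @ ket0              # superposition of |0> and |1>
probs = np.abs(psi) ** 2    # Born rule: measurement probabilities
print(probs)                # roughly [0.5, 0.5]: 0 and 1 equally likely
```

Unlike a classical bit, the state carries both outcomes at once; only a measurement collapses it to 0 or 1, each with probability 0.5 here.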

This makes quantum computers exponentially faster for specific tasks, provided a suitable quantum algorithm exists. Calculations that have to take place in hardware-intensive data centers today could then be carried out in far less time, given the appropriate infrastructure. Some complex problems, such as the analysis of interactions in entire ecosystems, would only come within computational reach at all.

The possible applications are by no means limited to abstract research scenarios. On the contrary: quantum computing could contribute significantly to optimization in everyday business. For example, if it were possible to react in real time to changes in complex production systems with many dependencies, enormous resources would be saved. In other words, quantum computing is preparing to provide valuable support in solving major societal problems and addressing economically driven challenges. In terms of costs and energy requirements, it is possibly the only option with which complex tasks, ranging from climate research to materials research and drug development to production optimization, can be addressed and resolved in real time in the future. The promises and expectations are correspondingly high.

However, the state of the art does not yet allow this. Qubits are highly unstable, and their interactions with each other and with the outside world generate considerable noise. The more qubits are supposed to work together, the more significant the disruption, to the point that the quantum advantage can be lost entirely. This has so far prevented its use in practice. In addition, quantum algorithms have a different runtime behavior than classic algorithms, which must be reflected in the programming. In short: quantum computing is by no means simply faster high-performance computing. How complex tasks are calculated must now be rethought.
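A back-of-the-envelope model shows why the disruption grows so quickly with qubit count (a simplified independence assumption, not a real error model): if each qubit independently stays error-free with probability p, the chance that all n qubits do so is roughly p to the power of n, which shrinks exponentially as qubits are added.

```python
# Toy model: probability that an n-qubit register stays error-free,
# assuming each qubit independently survives with probability p.
def circuit_fidelity(p, n_qubits):
    return p ** n_qubits

# Even with 99.9% per-qubit reliability, scale erodes the advantage:
for n in (5, 50, 500):
    print(n, circuit_fidelity(0.999, n))
```

At 500 qubits the toy fidelity has already dropped below two-thirds, which is the intuition behind the statement that noise can swallow the quantum advantage entirely.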

The Future Of HPC Data Centers – Quantum Computing As A Supplement

But if the technology is not yet ready for the market, why should those responsible for HPC data centers concern themselves with quantum computing at all? Because now is the right time. The IDC survey mentioned above, commissioned by IQM and Atos, suggests that quantum computing will be one of the technologies that take HPC centers to their next important step.

The authors of the study see the HPC centers in an innovation dilemma: the demand for computing capacity is growing, but at the same time, energy requirements and costs are increasing, so the pressure to fundamentally modernize the infrastructure is growing as well. Quantum computing could be the strategic step that complements HPC in the medium term and makes it more powerful.

48 percent of those surveyed from the 110 HPC centers stated that they are engaging with quantum computing to secure a long-term competitive advantage. 41 percent respond to inquiries from their customers with their quantum commitment, and 39 percent each stated that they act on the instructions of investors or want to expand their technological knowledge.

The question about the deployment model also provided interesting results and an indication of the long-term strategic direction of the quantum computing initiatives: at the time of the survey, the vast majority (70 percent) used only cloud-based quantum computing solutions, while 68 percent stated that they would instead rely on on-premises solutions by 2026.

The study’s authors recommend a three-stage strategy: in the first phase, those responsible for the HPC data centers should analyze where their current infrastructure reaches its limits in complex optimization projects or basic research, and which challenges are too significant. Cooperation with manufacturers and providers of quantum computers is already recommended here.

This creates a first picture of the later quantum solution and the goals to be achieved. The second, medium-term phase is about the design and integration of the quantum solution. Since quantum systems and algorithms work differently than traditional ones, the approaches must be reconsidered. This is where Quantum Learning Machines (QLM) can help: with them, quantum algorithms can already be developed and emulated today, regardless of the later hardware. Only when the interaction between the two technologies works smoothly can the specific use cases be implemented (phase 3).

Conclusion: Build Knowledge Today And Simulate Quantum Computing

Development is in full swing: quantum computing works, albeit only to a limited extent. Research and industry are already working on minimizing the problems, such as effectively controlling growing numbers of qubits. Funding projects from the federal government bring additional impetus and make the research field even more attractive. Because HPC data centers can only be expanded to a limited extent, mainly due to high costs and immense energy requirements, quantum computing promises to be a helpful addition.

As the survey results show, most HPC experts are already engaging with the topic. Interesting trial offers support the build-up of know-how. For example, Atos provides a Quantum Learning Machine (QLM) service. The platform, consisting of hardware and software, simulates a quantum environment. Users can develop and test their quantum algorithms here, regardless of what the hardware will look like later. Once quantum computers are available, the algorithms developed on the QLM can be ported. A practical way to enter the race for quantum technology right now.
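What such an emulator does internally can be sketched in plain NumPy (this is an illustration of statevector emulation in general, not the Atos QLM API): a Hadamard gate followed by a CNOT builds a two-qubit Bell state, and the measurement statistics show perfectly correlated, i.e. entangled, outcomes, all without any quantum hardware.

```python
# Sketch of statevector emulation: preparing a two-qubit Bell state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                # flips qubit 1 iff qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.zeros(4)
state[0] = 1                            # start in |00>
state = CNOT @ np.kron(H, I) @ state    # H on qubit 0, then CNOT
probs = np.abs(state) ** 2              # outcomes 00, 01, 10, 11
print(probs)                            # only 00 and 11 remain possible
```

Measuring one qubit fixes the other: the outcomes 01 and 10 have probability zero. Algorithms developed against such an emulated environment are, as the article notes, portable to real hardware later.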
