
Supercomputers in research

The next big step

The JUWELS supercomputer at the Jülich Supercomputing Centre is the fastest supercomputer in Europe. Image: Forschungszentrum Jülich / Wilhelm-Peter Schneider

Modern science is almost inconceivable without supercomputers. Two new computers in Jülich and Karlsruhe are now setting new standards - and are already prepared for technologies that do not even exist yet.

At Forschungszentrum Jülich, Thomas Lippert heads the Jülich Supercomputing Centre. Image: Forschungszentrum Jülich / Ralf-Uwe Limbach

The development of vaccines against new viruses such as SARS-CoV-2; the increasingly accurate simulation and thus prediction of climate change; the development of our future CO2-neutral energy supply: Such research questions are highly complex and require tools to capture and process gigantic amounts of data. "All areas of the natural sciences are now making great progress with the help of powerful computers," says Thomas Lippert, who heads the Supercomputing Centre (JSC) at Forschungszentrum Jülich. "This also applies to the social sciences, which are working with ever larger amounts of data." Researchers' tools are thus themselves becoming the result of scientific innovation.

Europe's fastest supercomputer

The research center spent two years developing just such a tool: the "Jülich Wizard for European Leadership Science" (JUWELS), which went into full operation in 2020. Now the fastest supercomputer in Europe and among the top 10 worldwide, it delivers 85 petaflops, i.e. it can perform 85 quadrillion computing operations per second. This corresponds to the capacity of more than 300,000 of today's standard PCs. JUWELS has already supported drug development against COVID-19 and makes it possible to compute particularly complex climate models. "This is made possible by its highly flexible modular architecture," explains Lippert. "In JUWELS, cluster and booster modules handle a very efficient division of labor."

JUWELS can perform 85 quadrillion computing operations per second. This corresponds to the capacity of more than 300,000 PCs commonly used today. Image: Forschungszentrum Jülich / Wilhelm-Peter Schneider

In conventional supercomputers, thousands of identical computing units are connected together. However, some applications only make partial use of these. The modular architecture of the Jülich supercomputer complements the existing cluster module, which works with CPU processors, with a super-fast booster module. With its powerful, highly efficient graphics processors, the booster is specially designed for computationally intensive applications. Both modules are very tightly interconnected. "To efficiently solve many different computational problems in a scientific environment, more than one computer architecture must be used," says Lippert. "One of the strengths of JUWELS is that it can be configured appropriately for specific tasks." Other modules, such as data storage or analysis tools, can be integrated as needed.

Operating such supercomputers usually requires a great deal of energy, not least because the machines have to be cooled. Here, JUWELS proves particularly efficient with its hot-water cooling system: the liquid requires no additional chillers, as it is cooled directly by the outside air, which saves energy.

Taking supercomputers to a new level

Speaking of climate and research: the distribution of the various computing tasks among JUWELS's modules looks like this, for example: atmospheric transport on the global scale is calculated in the cluster, chemical reactions in the booster. Division of labor and networking were also the recipe for success in the development of the supercomputer itself. Forschungszentrum Jülich collaborated with the French-German company Atos, the Munich-based supercomputing specialist ParTec, and the US manufacturer NVIDIA. Work is soon to continue at the European level, with plans to launch an exascale computer from 2023. "JUWELS is a milestone on the way there," emphasizes Lippert. The construction and operation of such a supercomputer is considered the next big step in supercomputing worldwide. "With a computing power of at least one exaflop, it would still be at least twelve times faster than JUWELS."
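As a quick sanity check on the figures in this article, the arithmetic can be written out directly. Note that the per-PC throughput of roughly 280 gigaflops is our own assumption, chosen so that the article's "more than 300,000 PCs" comparison works out:

```python
PFLOPS = 10**15           # one petaflop: 10^15 floating-point operations per second
EFLOPS = 10**18           # one exaflop: the exascale threshold

juwels = 85 * PFLOPS      # JUWELS peak performance: 85 petaflops

# Assumed throughput of a typical desktop PC (~280 GFLOPS) -- a rough
# illustrative figure, not stated in the article itself.
pc = 280 * 10**9
print(round(juwels / pc))         # -> 303571 PCs, i.e. "more than 300,000"

# Ratio of a one-exaflop machine to JUWELS's 85-petaflop peak.
print(round(EFLOPS / juwels, 1))  # -> 11.8, on the order of twelve times faster
```

The second ratio shows why an exascale machine marks "the next big step": a full order of magnitude over today's European leader.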

Jennifer Buchmüller is head of the Scientific Computing and Simulation department at KIT's Steinbuch Centre for Computing (SCC). Image: KIT

Managing Large Volumes of Data with High-Performance Computers

The Karlsruhe Institute of Technology (KIT) is also building a highly efficient supercomputer. In summer, the "Karlsruhe High Performance Computer" (HoreKa) will go into full operation as one of the ten most powerful computers in Europe. "In materials science as well as in models for weather forecasting and climate research, particle physics, and mobility research, we can perform detailed simulations in ever shorter times," explains Jennifer Buchmüller, who heads the Scientific Computing and Simulation department at KIT's Steinbuch Centre for Computing (SCC). HoreKa is designed to handle large amounts of data. The more powerful computers become, the more electricity they require and the more heat they generate during operation; conventional air-conditioning systems that use air or water as a coolant consume a great deal of energy to bring it down to low temperatures. For several years, KIT has therefore been doing pioneering development work on energy-efficient hot-water cooling of high-performance computers: HoreKa is to be cooled with water at 47°C.

In both Karlsruhe and Jülich, scientists from a wide range of institutions and scientific disciplines have access to the supercomputers. "Handling a data-processing powerhouse like HoreKa requires special expertise," Buchmüller emphasizes. "That's why we provide researchers with intensive advice. With our support, researchers can exploit the computer to its full capacity and use it effectively."

The Karlsruhe High Performance Computer (HoreKa) will be made available for National High Performance Computing (NHR) when it goes into operation. (Photo: Amadeus Bramsiepe, KIT)

Giving researchers access to supercomputers

The equipment, the knowledge, and the transfer services in Karlsruhe tipped the scales in November in favor of making KIT one of the eight centers for National High Performance Computing (NHR). This was decided by the Joint Science Conference, which coordinates science funding by the federal and state governments. Thus, scientists will be able to use even more powerful high-performance computers at KIT in the future. Existing strengths of high-performance computing centers will be further developed and bundled. The goal is to provide researchers from all over Germany with access to supercomputers. At the same time, the NHR centers are to strengthen user competencies, promote junior staff, and intensify education and training in this field.

The Helmholtz Association's two supercomputer sites are also already technically equipped for what is to come. "Soon we will be able to integrate modules that are still dreams of the future," says Buchmüller. "These may be neuromorphic modules, programmable digital devices, or other chip technologies still under development." The first module, installed and put into operation in early December, consists of nodes with the same novel ARM processors as those in Fugaku, currently the world's fastest supercomputer. And that is not all, says Jennifer Buchmüller: "In addition, we're ensuring that we advance scientific progress through high-performance computing with customized services, new interactive access paths, and cross-platform data access."



The future of supercomputers

Quantum computers depart from the laws of classical physics. The binary principle of "0 or 1" plays no role here; their functions are based instead on the laws of quantum mechanics. Analogous to the classical bit in conventional computers, the qubit serves as the smallest possible unit of memory and also defines a measure of quantum information. Quantum computers can search extremely large databases faster and factorize large numbers more efficiently than classical computers, which would make many mathematical problems easier to solve.
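To make the contrast with a classical bit concrete, here is a minimal sketch of a single qubit as a two-component state vector, in plain Python with no quantum libraries. The Hadamard gate used below is the standard operation for putting a basis state into an equal superposition:

```python
import math

# A qubit is a state vector [amp_0, amp_1]; the squared magnitude of each
# amplitude gives the probability of measuring 0 or 1.
zero = [1.0, 0.0]  # the classical-like state |0>

def hadamard(state):
    """Apply the Hadamard gate: maps a basis state to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

plus = hadamard(zero)
probs = [round(amp**2, 10) for amp in plus]
print(probs)  # [0.5, 0.5] -- unlike a bit, the qubit holds "0 and 1 at once"

# n qubits span 2**n amplitudes at once -- the source of the speedups
# mentioned above for database search and factoring.
for n in (1, 10, 50):
    print(n, 2**n)
```

The exponential growth in the last loop is the key point: fifty qubits already describe a state space of about 10^15 amplitudes.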

Neuromorphic modules are modeled after the human brain. They are considered especially promising for tasks in the field of artificial intelligence. When it comes to object recognition or predicting events in a natural environment that is constantly changing, the human brain is far superior to conventional computers. This also applies to energy efficiency: some supercomputers consume as much energy as a small town. Our brain, on the other hand, consumes only about as much as a light bulb.
