This area receives far less attention than technologies like artificial intelligence, 5G or even quantum computing, yet one of the most heated battles between China and the United States is being fought in the field of supercomputers. Firstly, what is a supercomputer? They are huge machines, weighing hundreds of tonnes and made up of many roughly refrigerator-sized computers, all connected to each other. These systems reach calculation speeds of tens or even hundreds of petaflops, a petaflop being a million billion operations per second.
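To make the unit concrete, here is a back-of-the-envelope sketch in Python. The 200-petaflop figure for Summit comes from the article; the 100-gigaflop laptop is an assumed round number for comparison, not a measured benchmark.

```python
# Rough arithmetic behind the "petaflop" unit used in the article.
PETA = 10**15   # one petaflop = one million billion operations per second
GIGA = 10**9

summit_pflops = 200      # peak speed cited in the article for Summit
laptop_gflops = 100      # assumption: a fast consumer laptop, ~1e11 ops/s

summit_ops = summit_pflops * PETA
laptop_ops = laptop_gflops * GIGA

# How many such laptops would it take to match Summit's quoted peak?
print(summit_ops // laptop_ops)  # 2000000
```

In other words, under these assumptions a single machine of this class does the work of roughly two million laptops running flat out.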
What are supercomputers used for?
Put in those terms, the sheer size of the numbers doesn't tell us much. To get an idea of their value, just consider that these supercomputers are used in the most strategic sectors of science, industrial research and security. For example, the United States has used them to simulate the explosion of a nuclear bomb, creating a virtual model of the event accurate to fractions of a second.
In other cases, supercomputers have been used to fight disease, as with the machine employed by a Kansas City hospital, able to analyse 120 billion DNA sequences to discover the genetic variant responsible for a particular liver disease. What is more, General Motors uses this technology to simulate crash tests, while other institutions use it to better understand the behaviour of earthquakes and hurricanes, to predict the consequences of climate change, to recreate the big bang or to try to digitally reproduce the human brain.
In short, supercomputers are used in all the key sectors. It is no surprise, then, that the US took a big hit during the five years in which the fastest supercomputer in the world belonged to China. In 2013, the top of the prestigious TOP500 ranking was taken by the Tianhe-2A, a 125-petaflop machine that held on to its leadership until 2016.
However, this supercomputer used US Intel processors, a fact that, from a certain point of view, made the Chinese conquest less surprising. Things changed with the arrival of a new made-in-China supercomputer, the Sunway TaihuLight (140 petaflops), which from 2016 to 2018 was a source of national pride, having been built entirely with Chinese technology.
The (temporary) American fight back
However, in June 2018, the United States took its resounding revenge, unveiling its new Summit system, built by IBM and equipped with Nvidia GPUs. This colossus, made up of 256 huge computers connected to each other, reaches the unprecedented speed of 200 petaflops. It was supplied to the US Department of Energy's Oak Ridge National Laboratory and is designed for workloads that include nuclear physics, seismology and climate science.
The second place in this ranking was also retaken by the United States with its Sierra system (125 petaflops), used by the National Nuclear Security Administration for the aforementioned simulation of nuclear warheads. However, all this does not mean that the United States is firmly at the top of the rankings. The opposite is true.
China is working hard to be the first nation to reach the new frontier in the field of supercomputers, the so-called exascale system: machines capable of reaching a speed of one exaflop (a billion billion calculations per second), of which the first two prototypes have already been unveiled. Their development should be completed in 2021, and they will then be placed at the service of scientific and technological research, allowing the People's Republic to regain its recently lost record.
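A quick calculation, using only the figures quoted in the article, shows what the jump to exascale means relative to today's leader:

```python
# Exaflop vs petaflop, using the article's numbers.
EXA, PETA = 10**18, 10**15

exascale_ops = 1 * EXA          # one exaflop: a billion billion ops/s
summit_peak = 200 * PETA        # Summit's quoted 200 petaflops

# An exascale machine would be this many times faster than Summit's peak:
print(exascale_ops / summit_peak)  # 5.0
```

So even the record-setting Summit would need to become five times faster to cross the exascale threshold.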
The supercomputer speaks Chinese
That's not all, though. Today, China already owns as many as 206 of the 500 most powerful supercomputers in the world (in 2001 it didn't own a single one), while the US share continues to decline, from 145 in 2017 to 124 today. In short, it appears that, despite the success of Summit and Sierra, China has nevertheless managed to consolidate a highly respected position in this sector. In this challenge, the supporting roles are still played by the same two players, Japan and Europe. At least in this case (and although Japan holds seventh place with its ABCI system), the Old Continent is not doing badly. This is thanks not only to the Swiss Piz Daint (fifth place), but also to some European Union nations such as Germany and, surprisingly, Italy.
Although it has now been overtaken by the German SuperMUC-NG (27 petaflops), for a long time the most powerful supercomputer in the entire European Union was the HPC4 in Eni's Green Data Center in the province of Pavia, a machine capable of reaching 22 petaflops that is used to locate oil and natural gas fields. Also in Italy, Eni is about to unveil the HPC5 which, with its 52 petaflops of peak computing power, will take a respectable position in the TOP500 ranking. It will also be used for molecular modelling of photoactive systems, for the fluid-dynamic simulations needed to optimise technology that produces energy from sea waves, and for research into magnetic fusion. Combined with the earlier HPC4, HPC5 will reach a peak power of 70 petaflops, meaning 70 million billion floating-point operations per second, maximising the efficiency and sustainable use of resources. In line with Eni's sustainability policy, the supercomputing systems (the result of in-house company expertise) and the Green Data Center that hosts them were designed to guarantee the highest level of energy efficiency, minimising CO2 emissions and operating costs.
The HPC4 is not Italy's only supercomputer: the country can also count on Marconi (19th position), owned by Cineca, an inter-university consortium dedicated to scientific and IT research. As if all this were not enough, the European Union as a whole is working to build four new supercomputers, two of which should be able to enter the top five of the ranking and the other two the top 25. What is more, these systems (which should be built by 2020) will for the first time use only European technology, being developed through the European Processor Initiative.
Overall, Brussels has allocated close to 1.4 billion dollars, shared equally between the European Union, the 25 member states participating in the initiative and numerous private partners. In the two-way fight between the United States and China, the European Union (at least in the field of supercomputers) seems to have no intention of merely watching from the sidelines.
An additional resource against cyber threats
The exascale information system can also help combat cyber threats to critical infrastructure.
In a world that increasingly relies on computers to manage almost all aspects of our daily lives, from banking activities to energy supply to online purchases, the complexity and frequency of cyber attacks is increasing.
In a report released by the Idaho National Laboratory, researchers found that the development of smart grids has rendered energy companies more exposed to cyber attacks. The report stressed that "there is nothing that can be 100% effective at mitigating this". A defence system that works today may not be effective tomorrow, since the methods and means of cyber attacks are constantly changing. It is essential that all energy sector players are aware of the changes in information security and continue to work to prevent potential vulnerabilities in the systems they manage.
Supercomputers combined with artificial intelligence make the machine learning process faster and more efficient. Through this accelerated learning, attack patterns and threats can be identified and prevented more accurately, ensuring greater security for the energy industry and its critical infrastructure.
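The core idea, learning what "normal" looks like and flagging deviations, can be sketched in a few lines. This is a deliberately minimal illustration using a simple z-score test, not the method of any specific system mentioned in the article; the baseline traffic figures are invented for the example.

```python
import statistics

# Assumed baseline: request rates (requests/sec) observed during normal
# operation of a monitored system. Real deployments would learn far
# richer models over far more data -- hence the supercomputers.
baseline = [100, 102, 98, 101, 99, 103, 97, 100]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(rate, threshold=3.0):
    """Flag a rate that lies more than `threshold` standard deviations
    from the learned baseline mean (a simple z-score test)."""
    return abs(rate - mean) / stdev > threshold

print(is_anomalous(101))   # False: within normal variation
print(is_anomalous(450))   # True: possible attack traffic
```

The design point carries over to the large-scale case: the model of normal behaviour must be continually relearned, since, as the Idaho report notes, attack methods are constantly changing.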
To unlock these capabilities, Europe has recently committed $1 billion to improving the computational power of its supercomputers, while the United States hopes to have a complete exascale system by 2021.