The battle of supercomputers

 By Andrea Signorelli

You don’t hear much about them compared to technologies like artificial intelligence, 5G or even quantum computing. Yet it is precisely over supercomputers that China and the United States are fighting one of their fiercest battles…

First of all, what are supercomputers? Well, they are massive installations weighing hundreds of tonnes, made up of many fridge-sized computers all connected together. These systems reach calculation speeds of tens, if not hundreds, of petaflops, where one petaflop is a million billion (10^15) operations per second.
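To keep these units straight, here is a minimal Python sketch of the FLOPS ladder. The prefixes are standard SI; the 200-petaflop figure used in the example is simply the Summit speed cited later in the article, not a new measurement:

```python
# Standard SI unit ladder for floating-point performance (FLOPS).
UNITS = {
    "gigaflops": 10**9,    # a billion operations per second
    "teraflops": 10**12,   # a trillion
    "petaflops": 10**15,   # a million billion
    "exaflops":  10**18,   # a billion billion
}

def to_flops(value, unit):
    """Convert a value in the given unit to raw operations per second."""
    return value * UNITS[unit]

# Example: a 200-petaflop machine (the scale cited for Summit)
# versus a 1-exaflop machine, the next target.
summit_scale = to_flops(200, "petaflops")   # 2.0e17 ops/s
exascale = to_flops(1, "exaflops")          # 1.0e18 ops/s
print(exascale / summit_scale)              # -> 5.0
```

In other words, the exascale machines discussed below would be roughly five times faster than today's fastest system.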

What are supercomputers for?

By themselves, these big numbers don’t mean much. To get a real sense of their value, consider that supercomputers are deployed in the most strategic fields of science, industrial research and security. The United States, for example, uses them to simulate nuclear bomb explosions, creating virtual models of how a detonation would unfold with a precision down to fractions of a second.
In other cases, supercomputers help to fight disease. The machine at one hospital in Kansas City can analyse 120 billion DNA sequences to identify the genetic variations responsible for certain liver diseases. General Motors, meanwhile, uses the technology to simulate crash tests. Other organisations use it to better understand earthquakes and hurricanes, predict the consequences of climate change, recreate the Big Bang or attempt digital reproductions of the human brain.
Suffice to say, supercomputers are employed in crucial sectors. It won’t come as a surprise, then, that the USA has smarted a little over the last five years, during which China has been home to the world’s fastest supercomputers. In 2013, the number-one spot in the prestigious TOP500 ranking was taken by Tianhe-2, a machine capable of almost 34 petaflops, which held the title until 2016.
However, this supercomputer contained American-made Intel processors, which arguably made China’s success less impressive. Things changed with the arrival of the Sunway TaihuLight (93 petaflops), a source of national pride from 2016 to 2018, having been built using purely Chinese technology.

The Sunway TaihuLight is equipped with 10,649,600 processing cores and can perform some 93 quadrillion calculations per second (Xinhua)

America gets its revenge – for now

However, in June 2018 the United States exacted emphatic revenge when it unveiled its new system, Summit, built by IBM with Nvidia graphics processors. This monster, made up of 256 enormous racks of servers connected to each other, reached an unprecedented speed of 200 petaflops. Summit is now in operation at the US Department of Energy’s Oak Ridge National Laboratory and is designed to carry out work in nuclear physics, seismology and climate science.
America also took back second place in the ranking with the Sierra system (125 petaflops), which the National Nuclear Security Administration uses for the nuclear-warhead explosion simulations described above. Even so, none of this means that the United States’ position at the top of the ranking is safe. Far from it.
China is working flat out to be the first country to cross the new frontier in supercomputing: exascale computing, in which machines reach speeds of one exaflop (a billion billion, or 10^18, calculations per second). The first two prototypes have already been unveiled. They should be fully developed by 2021, when they will be put to use in scientific and technological research and allow the People’s Republic to regain the dominance it recently lost.
And that’s not all. China already has no fewer than 206 of the world’s 500 most powerful supercomputers, while in 2001 it had none at all. Meanwhile, America’s share continues to fall, from 145 in 2017 to 124 today. So, despite the success of Summit and Sierra, it appears that the Chinese may well be the ones to achieve predominance.

The Summit supercomputer is also fifth in the world for energy efficiency, at 13.889 gigaflops per watt (Carlos Jones/ORNL, Flickr)

Europe’s position

The other players in this contest have always been Japan and Europe. Japan sits at number seven in the ranking with its ABCI system, but Europe is not doing badly either, thanks not only to Switzerland’s Piz Daint, in fifth place, but also to EU countries including Germany and, surprisingly, Italy.
It may since have been overtaken by Germany’s SuperMUC-NG (27 petaflops), but for a long time the most powerful supercomputer in the EU was HPC4 at Eni’s Green Data Center near Pavia. This machine can reach 22 petaflops and is used to locate oil and natural gas fields. It’s not the only one in Italy, either: there is also the Marconi system (19th in the ranking), owned by Cineca, a consortium of universities that carries out scientific and IT research.

Eni's HPC4 is among the most powerful industrial supercomputers in the world and the fourth most energy-efficient system overall

If that weren’t enough, the EU is also working to build four new supercomputers, two of which should rank in the top five, with the other two in the top 25. What’s more, these systems will be built using entirely European technology for the first time. They are currently being developed under the European Processor Initiative and should be completed by 2020.
Overall, Brussels has allocated about $1.4 billion to the initiative, shared equally between the EU, the 25 participating member states and various private partners. In the fight between China and the United States, it seems the EU (at least in the field of supercomputers) has no intention of simply standing back and watching.

READ MORE: Supercomputer vs Climate Change by Eniday Staff

About the author
Andrea Signorelli
Born in Milan in 1982, he writes about the interaction between new technologies, politics and society. He collaborates with La Stampa, Wired, Esquire, Il Tascabile and others. In 2017 he published “Rivoluzione Artificiale: l’uomo nell’epoca delle macchine intelligenti” for Informant Edizioni.