The Frontier supercomputer at Oak Ridge National Laboratory in the United States has produced the largest simulation of the universe ever achieved. The researchers’ work accounts for stars, the evolution of superclusters of galaxies over billions of years, their gravitational effects, and even the temperature of gases and plasma. The goal of this virtual reproduction is to provide a realistic testbed for investigating conventional (baryonic) matter and dark matter simultaneously.
“There are two components in the universe: baryonic matter [so called because it is made up of leptons and baryons, most notably protons and neutrons], which makes up everything we can see, and dark matter, which as far as we know interacts with baryonic matter only through gravity. If we want to know the universe, we need advanced models that take gravity and all the other physics into account, including hot gas and the formation of stars, black holes and galaxies,” explained Salman Habib, leader of the simulation project.
To understand the evolution of superstructures in space, such as galaxy clusters, researchers use cosmic hydrodynamics. The term may sound confusing because it evokes liquid water in space, but hydrodynamics simply studies the behavior of fluids. In space, interstellar gas and plasma behave like fluids, so understanding their influence on the architecture of the universe requires studying them from that perspective.
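To make the idea concrete, here is a minimal toy sketch of evolving a gas density field as a fluid. It is not HACC’s actual numerical scheme; the grid size, velocity and time step are illustrative assumptions only.

```python
import numpy as np

# Toy 1D advection of a gas density field: a minimal sketch of treating
# interstellar gas as a fluid. Grid size, velocity, and time step are
# illustrative choices, not parameters of the actual Frontier simulation.
nx, dx, dt = 200, 1.0, 0.4       # number of cells, cell width, time step
v = 1.0                          # constant bulk velocity of the gas
rho = np.ones(nx)                # uniform background density...
rho[90:110] = 2.0                # ...with an overdense clump of gas

for _ in range(100):
    # Upwind finite-difference update of d(rho)/dt = -v * d(rho)/dx
    rho[1:] = rho[1:] - v * dt / dx * (rho[1:] - rho[:-1])
    rho[0] = rho[-1]             # crude periodic boundary condition

print(f"peak gas density after the clump drifts downstream: {rho.max():.2f}")
```

A full cosmological code couples updates like this to gravity, gas temperature, star formation and feedback at every step, which is what makes the problem so demanding.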
Simulating cosmic hydrodynamics is much more complicated than simulating cosmic expansion with gravity alone, because far more calculations must run simultaneously across the simulated universe. Thanks to the capabilities of the Frontier supercomputer, it was finally possible to run a realistic reproduction of the behavior of the universe, with the different kinds of physics included, at the scale of one exaflop. This means that, every second, the computer performs 10^18 operations.
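To put that number in perspective, a rough back-of-the-envelope calculation is sketched below; the laptop figure is an assumed round value for comparison, not a number from the original report.

```python
# Back-of-the-envelope sense of one exaflop (10**18 operations per second).
frontier_ops_per_second = 10**18        # one exaflop, as cited above
laptop_ops_per_second = 10**11          # assumption: roughly 100 gigaflops

seconds_per_day = 86_400
ops_in_one_frontier_day = frontier_ops_per_second * seconds_per_day

# Time a hypothetical laptop would need to repeat one day of Frontier work.
laptop_years = ops_in_one_frontier_day / laptop_ops_per_second / seconds_per_day / 365
print(f"operations in one Frontier-day: {ops_in_one_frontier_day:.2e}")
print(f"equivalent laptop time: {laptop_years:,.0f} years")
```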
The achievement of a code that arrived ahead of its time
Raw power alone does not account for the simulation. This virtual mini-universe was made possible by supercomputer code that has been in development for the past decade. The Hardware/Hybrid Accelerated Cosmology Code (HACC) was first entered into programming contests and was eventually incorporated into exascale computing projects. HACC was adapted to power advanced scientific applications able to take advantage of the next generation of supercomputers, machines capable of performing a quintillion (10^18) calculations per second.
According to the Oak Ridge lab, the HACC code ran effectively on about 9,000 computing nodes of the Frontier supercomputer. The code gained recognition in 2012, but at that time even the most powerful computers could not take full advantage of its potential. Ten years later, the same code, with some optimizations, ran hundreds of times faster.
“A requirement of exascale computing was that codes run approximately 50 times faster than they could before on Titan, the fastest supercomputer at the time of HACC’s launch. Running on the exascale-class Frontier supercomputer, HACC was nearly 300 times faster than the reference run,” the laboratory stated.