Tuesday, March 26, 2013

Supercomputing reveals hidden earth

Italian seismologist Domenico Giardini, keynote speaker at the SARA Superdag last December, explains why supercomputers are for earth scientists what telescopes are for astronomers.

This article was published in I/O Magazine, March 2013.

Nobel Prize winner in physics Richard Feynman (1918-1988) once remarked: “It’s hard to believe, but we know a lot more about the distribution of matter in the interior of the sun than in the interior of the Earth.” Asked whether this statement still holds true, Domenico Giardini answers without a millisecond of doubt: “Oh yes, that’s still true.”

At the same time, the very fact that the earth’s interior is so inaccessible, that one cannot send a submarine, a spaceship or even a beam of light inside the earth, is what fascinates Giardini most in his job as a seismologist. And it is the main reason why supercomputing has become so important in revealing the hidden earth. Combining measurements of where, when and how the earth shakes and shivers with detailed computational models reveals a great deal about the geophysics of the earth’s interior. For earth scientists, the supercomputer is what the telescope is for astronomers.

[Figure: A computer simulation of wave propagation in the earth]
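
To give an idea of what such a computational model involves, here is a minimal sketch of seismic wave propagation: a finite-difference solver for the one-dimensional acoustic wave equation. It is purely illustrative; production codes of the kind Giardini describes solve the full three-dimensional elastic equations on thousands of cores, and all parameters below are invented.

```python
import numpy as np

# Illustrative 1D seismic wave propagation with finite differences.
# All parameters are hypothetical; real simulations are 3D and elastic.
nx, nt = 400, 800          # grid points in space, number of time steps
dx, dt = 10.0, 0.001       # grid spacing (m) and time step (s)
c = np.full(nx, 3000.0)    # wave speed (m/s); constant per layer here
c[nx // 2:] = 5000.0       # a single "layer boundary" halfway down
# Stability: c * dt / dx = 0.5 <= 1 (CFL condition satisfied)

u_prev = np.zeros(nx)      # wavefield at t - dt
u = np.zeros(nx)           # wavefield at t
u[50] = 1.0                # impulsive "source" near the surface

for _ in range(nt):
    # second-order central differences in space and time
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
    u_next = 2 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next

print("peak amplitude after propagation:", np.abs(u).max())
```

In a real inversion workflow, the simulated wavefield at the surface would be compared with recordings from seismic stations, and the model of the interior adjusted until the two match.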

Giardini is a professor of seismology and geodynamics at ETH Zürich in Switzerland. He was educated in Italy as a physicist, but he changed his career from modeling the physics of the atmosphere to modeling the physics of the earth, more or less “by accident”, as he says. “When I started in the field, computational power was just starting to increase rapidly. It was easy to see that this development would continually make new science possible. And as few people were doing the same type of work, it was relatively easy to be at the forefront.”

North pole becomes south pole
At the SARA Superdag on December 19 last year, Giardini gave a keynote lecture about high-performance computing in solid earth geophysics and seismology. A few decades of exponential growth in the performance of supercomputers now make it possible for earth scientists to investigate phenomena that were impossible to investigate ten years ago. “The reversal of the magnetic poles is one of them,” says Giardini in an interview in mid-January, just back from field research in Nepal.

From investigating rocks, scientists have known for decades that the earth’s magnetic north and south pole have swapped positions many times in geological history. Although there is no consensus yet about the underlying geophysical cause, today’s supercomputer simulations are beginning to unveil this mystery. Giardini: “To find the cause of the reversing poles, computer simulations need a certain resolution. Ten years ago the resolution was below that threshold. Nowadays our codes have crossed it. Suddenly we can run a type of physics that was impossible before.”
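
The link between resolution and machine power can be made concrete with a back-of-envelope scaling argument. Assuming an explicit 3D time-stepping code (a simplification; many geodynamo codes use spectral methods), refining the grid by a factor r multiplies the number of cells by r^3 and, via the stability limit on the time step, the number of steps by another factor r, so the total work grows roughly as r^4:

```python
# Rough scaling of simulation cost with grid resolution, assuming an
# explicit 3D time-stepping scheme (a simplification of real geodynamo
# codes, which are often spectral).
def relative_cost(r: float) -> float:
    """Work relative to the baseline when grid spacing shrinks r-fold."""
    return r ** 3 * r  # r**3 more cells times r more time steps

for r in (1, 2, 4, 8):
    print(f"{r}x finer grid -> {relative_cost(r):5.0f}x the work")
```

On this admittedly crude scaling, a thousandfold increase in computing power buys only a five- to sixfold finer grid, which is why a physical regime can stay out of reach for years and then suddenly become computable.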

Not only earth science itself benefits from the power of Moore’s law in supercomputing; so do its applications. Giardini: “The exploration of hydrocarbons like oil and gas, a field that is traditionally strong in the Netherlands, can be done with ever higher precision.” Another application is the development of more realistic evacuation scenarios based on better models of volcanic eruptions. In his home country Italy this is done for a possible eruption of Vesuvius, Giardini explains: “About half a million people live around Vesuvius. In the case of an eruption, it’s likely that about ten percent of the area around the volcano will be destroyed. Although we will never know exactly which ten percent that will be, running different eruption scenarios on a supercomputer can help to make more realistic evacuation plans. We model what happens in the entire column of Vesuvius and combine this with measurements at thousands of locations. What happens to the molten rock? What happens to the gas? What happens to the ash? On the basis of past eruptions, detailed measurements and state-of-the-art computational models we hope to help civil protection. Applying my scientific knowledge for the benefit of society is what I find most fulfilling in my work.”

[Figure: Computer simulation of an eruption of Vesuvius]

Giardini cooperates with a number of Dutch institutions: Utrecht University, KNMI and TNO. “International cooperation is needed,” he says. “Not so much for sharing the costs of supercomputing, but primarily to share the tools that are needed to run and analyze simulations. Tools such as mathematical techniques that speed up codes; visualization techniques that present the outcomes of the calculations in an easily understandable way; and, last but not least, better computer architectures. Nowadays the most demanding chips are developed for games, not for science. Research is needed to develop computer architectures that are optimal for our type of simulations.”

Real-time simulations of earthquakes
The exponential growth in the power of supercomputers is expected to continue over the next decade. What new problems can be expected to be solved in that time? Giardini: “Let me mention two examples. The earth consists of a hard crust, a deformable mantle, a liquid outer core and a solid inner core. Within five to ten years we will get much better models of the convection in the mantle. That will lead to a better understanding of how tectonic plates move on the mantle, and therefore of how earthquakes arise and how mountain ranges are formed. Our models of the mantle describe processes on the scale of millions of years. But earthquakes start on a scale of seconds. We need to bridge a huge gap in scales.”

“My second example is the real-time simulation of earthquakes in areas like California, where we can combine a lot of measurements with computer simulations. In California, but also in Italy, thousands of seismic stations measure the strain in the earth’s crust. We would like to run real-time simulations of earthquake scenarios based on these input data. Nowadays it takes three years to calculate fifty different scenarios of what might happen next. That’s far too long. Real-time simulations need a lot more computing power. I expect that we will get there in the next ten years.”
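
A back-of-envelope calculation shows the size of that gap. Only the “three years for fifty scenarios” figure comes from Giardini; the one-hour target for “real time” is an assumption for illustration:

```python
# Speedup needed to go from batch to (near) real-time scenario computation.
# Only the "3 years for 50 scenarios" figure is from the article; the
# one-hour target turnaround is an assumed illustration.
SECONDS_PER_YEAR = 365 * 24 * 3600
batch_total = 3 * SECONDS_PER_YEAR      # three years for the whole batch
per_scenario = batch_total / 50         # roughly 22 days per scenario
target = 3600.0                         # assumed "real time": one hour

print(f"per scenario today: {per_scenario / 86400:.0f} days")
print(f"required speedup:   {per_scenario / target:.0f}x")
```

A speedup of roughly five hundred times is well within the approximately thousandfold growth that supercomputer performance has historically shown per decade, which is consistent with Giardini’s ten-year estimate.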

[Sidebar:]
SURFsara and the new national supercomputer Cartesius
On January 1, 2013, SARA and SURF merged into SURFsara. SURFsara now provides the Dutch national ICT infrastructure for research, which consists of networks, supercomputers, grids and data facilities. SARA was founded in 1984 as a national center for supercomputing for scientific research, and it hosts the national supercomputer. From 2008 to 2013 this was the IBM supercomputer Huygens. In the first half of 2013 Huygens will be replaced by a new national supercomputer from the French company Bull. The new supercomputer is named Cartesius, after the French philosopher René Descartes. Cartesius is expected to break the petaflop barrier of 10^15 floating-point operations per second in 2014.

SURF is the second partner in the newly formed SURFsara. SURF is a powerful partnership for higher education and research in which Dutch universities, colleges and research institutes jointly invest in ICT innovation. It consists of a number of organizations, each with its own focus: SURFnet, SURFmarket, SURFshare and, as of this year, SURFsara. In 2008 ICT Regie advised that a merger between the two foundations would create synergy in forming a world-class e-infrastructure in the Netherlands. The then government decided to follow that advice. Within the SURF organisation, SURFsara is now responsible for delivering services in high-performance computing (HPC), data storage and visualization.

Internet
www.surfsara.nl