Flanders’ supercomputer makes “unthinkable” research a reality


The supercomputer Tier-1 has been helping scientists in Flanders achieve the unimaginable, from simulating the weather on distant planets to solving equations with billions of unknowns, but the powerful machine is underused by industry, say analysts

Supercomputer for hire

For much of human history, the laboratory experiment has been the building block of science. Scientists would perform experiments in their labs to explore the laws of the universe and test their hypotheses. But there was only so much they could do with limited resources and time.

In the 21st century, another way to investigate scientific phenomena has emerged: the computer simulation, run on powerful processing machines called supercomputers.

Supercomputers have allowed scientists to take their research to the next level, enabling them to carry out projects that used to be unaffordable because of the huge computing power and data storage capacity they required.

The machines have been around since the 1960s, but, for a long time, Flanders neglected its computing infrastructure. The region had to wait until 2013 to see its first supercomputer installed: Tier-1, housed at Ghent University (UGent).

At the time of its inauguration, Tier-1 ranked 118th in the global top-500 list of supercomputers. The computer, which, unlike a PC, combines many processing cores working in parallel, has a peak performance of 175 teraflops – that is, 175 trillion calculations per second.
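For a sense of scale, here is a back-of-the-envelope sketch of what peak performance means for elapsed time. The desktop figure and workload size are illustrative assumptions, not VSC benchmarks, and real workloads rarely sustain peak speed:

```python
# Back-of-the-envelope: how long would a large floating-point workload
# take at a given peak rate? Illustrative only; real jobs rarely run
# at peak performance.

def seconds_at_peak(total_flops: float, peak_flops_per_sec: float) -> float:
    """Ideal elapsed time, assuming the machine runs at peak throughout."""
    return total_flops / peak_flops_per_sec

TIER1_PEAK = 175e12    # 175 teraflops = 1.75e14 operations per second
DESKTOP_PEAK = 100e9   # assumed ~100 gigaflops for a typical desktop CPU

work = 1e18            # a hypothetical job of a billion billion operations
print(seconds_at_peak(work, TIER1_PEAK))    # ~5714 s, about 1.6 hours
print(seconds_at_peak(work, DESKTOP_PEAK))  # 1e7 s, roughly 116 days
```

The same job that would tie up a desktop for months finishes on a 175-teraflop machine in an afternoon, under these idealised assumptions.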

One year after that installation, the region’s computing capacity was increased with the launch of Tier-2, a network of local computing clusters from Antwerp University, the Free University of Brussels (VUB), the University of Leuven (KU Leuven) and UGent. Now Flanders can boast a total computing capacity of 613 teraflops.

Three billion unknowns

All of that computing power is managed by the Flemish Supercomputer Centre (VSC), which allocates “computer time” to researchers working at the five universities. “Researchers can submit applications which are then evaluated by a panel of experts,” says Marc Luwel, director of the Hercules Foundation, which co-ordinates VSC.

If their own computing infrastructure is not sufficient, lecturers and postgraduate students from the associated universities can request access to the network. But researchers working for Flemish public institutions, businesses and non-profits are also encouraged to apply.

It’s unthinkable to perform such simulations on one machine. But we did it in 48 hours

- Researcher Jan Fostier

Jan Fostier is with the IBCN research group at UGent. Together with his colleagues, he used Tier-1’s computing power to study the interaction of electromagnetic waves, such as GSM, Wi-Fi and Bluetooth signals, with large objects like airplanes, antennas and telescopes.

“We essentially solved Maxwell’s equations for these objects,” Fostier explains, referring to the set of equations that describe how energy and information can be transmitted through the air. “The bigger the object those waves interact with, the more unknown variables there are.”

To calculate these unknowns, the team needed processing power. A lot of it. “We once did a simulation with more than three billion unknowns, which means you would need at least 30 terabytes of RAM and more than a year to run it,” Fostier says. “It’s unthinkable to perform such simulations on one machine.”

One machine, however, proved to be just enough, provided that machine was Tier-1. Thanks to its many processing cores working in parallel, Fostier’s team was able to run the simulation after all. “And we did it within 48 hours.”
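A back-of-the-envelope calculation shows why three billion unknowns is so demanding. The byte counts below are standard assumptions (16 bytes per double-precision complex number); the article does not detail the actual solver, which certainly avoids the naive dense approach sketched here:

```python
# Why three billion unknowns is enormous: a naive solver would store an
# N x N matrix of complex numbers, which is hopeless at this scale.
# Assumed 16 bytes per double-precision complex value; the real
# algorithm's memory layout is not described in the article.

N = 3_000_000_000            # unknowns in the simulation
BYTES_PER_COMPLEX = 16       # one double-precision complex number

dense_matrix_bytes = N * N * BYTES_PER_COMPLEX
print(dense_matrix_bytes / 1e18)   # 144.0 -- i.e. 144 exabytes, impossible

# Even just the solution and right-hand-side vectors are sizeable:
vectors_bytes = 2 * N * BYTES_PER_COMPLEX
print(vectors_bytes / 1e9)         # 96.0 -- i.e. 96 GB for two vectors alone
```

The gap between 144 exabytes for the naive approach and the 30 terabytes Fostier cites hints at why, as he notes, algorithm design has become a discipline in its own right.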

However fast the supercomputer may be, it still lacks intuition and human intelligence. Before running any kind of simulation, the researcher needs to write an algorithm that the machine can read and execute.

“The final code was a result of 10 years’ work,” he says. “The development and implementation of these algorithms is so essential to the process that it has become its own discipline within computer science.”

Weather in space

Supercomputers can also be used for more everyday applications. Consider weather forecasting, which by any standard isn’t easy, even here on Earth. Now try to predict if it is going to be windy tomorrow on a planet in a different solar system, many light years away.

Postdoctoral researcher Ludmila Carone works at the astrophysics department at KU Leuven. She used Tier-1 to simulate weather on exoplanets that haven’t even been discovered yet. Exoplanets are planets that orbit stars other than the sun.

“They are hot these days,” Carone says, referring not to their weather but to their popularity within the scientific community. “The current search for exoplanets is mostly focused on finding ones that resemble Earth, in the hope of discovering that one twin capable of sustaining life.”

These Earth-twins, however, are not easy to detect and observe because, as Carone explains, they are relatively far away from their stars and thus don’t reflect much stellar light. Instead, in her research, Carone focuses on Earth-like planets that revolve in close proximity to smaller and cooler stars. “Some of them receive just enough stellar energy to contain water,” she explains. “And because they are so close to a star, they’re much easier to observe.”

So what kind of calculations did Carone run on Tier-1? “I built more than 700 virtual Earth-like planets and put them on a very close orbit around their star,” she says. “I then simulated how the weather patterns changed in relation to the orbit and the size of the planet.” This enabled her to identify climate features that could make a planet hospitable, or not.

For every one of her 700 virtual worlds, Carone built advanced 3D models, each with over 1,000 years of climate history – just long enough to identify average weather patterns. “This resulted in simulations that were time-consuming and required a lot of processing power,” she says. “Luckily, I didn’t have to write the algorithms myself, as they are freely available.”

Could she have done it without the help of a supercomputer? “I could have built maybe a dozen virtual planets with limited climate patterns,” she says. “But nowhere near the 700 I have now.”
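The survey Carone describes is, in essence, a parameter sweep: define a grid of orbits and planet sizes, then run a climate model for every combination. The sketch below illustrates the pattern with a stub model; the function and all its numbers are invented for illustration, and the real work uses full 3D climate codes:

```python
# A toy parameter sweep in the spirit described above: build many
# virtual planets, each defined by an orbit and a size, and run a
# (stub) climate model on each. The model function is invented for
# illustration only.
from itertools import product

def stub_climate_model(orbit_days: float, radius_earths: float) -> float:
    """Placeholder returning a made-up 'mean temperature' score."""
    return 300.0 / orbit_days ** 0.5 * radius_earths ** 0.1

orbits = [1, 2, 5, 10, 20, 40, 80]    # orbital periods, in days (assumed)
radii = [0.5, 0.8, 1.0, 1.5, 2.0]     # planet radii, in Earth radii (assumed)

planets = list(product(orbits, radii))            # every orbit/size combination
results = {p: stub_climate_model(*p) for p in planets}
print(len(planets))                               # 35 virtual planets in this toy run
```

Each combination is an independent simulation, which is exactly the kind of workload that spreads naturally across a supercomputer’s many cores.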

Learning machines

Breakthrough scientific research is not the only domain of computer-run simulations. Large-scale computing capacity can also act as an innovation trigger for mainstream companies looking to develop new products and services.

The government of Flanders understands this and actively encourages private companies to apply for computing time at VSC. “Supercomputers are currently used in highly innovative environments and industries,” says Marc Luwel from the Hercules Foundation. “But we still see hesitation within a lot of companies.”

We still witness hesitation with a lot of companies when it comes to supercomputers

- Marc Luwel

At VSC, he says, “our role is to convince firms that supercomputing can strengthen their competitiveness. We’re also here to support them in overcoming the many hurdles that might prevent them from using the available processing power more effectively.”

Some businesses feel no such hesitation and fully embrace the technology. Pharmaceutical giant Janssen Pharmaceutica, based in Beerse, Antwerp province, is one of the first companies to benefit from the Tier-1 infrastructure. At the moment, the company’s three supercomputing projects are focused on the analysis of large data sets with the hope of improving the development of new medicines.

“As in any other scientific environment, our R&D is facing exponential data growth,” says Jörg Wegner, senior scientist at the company. “It may be easy to count within a file with 10,000 lines, but it is really difficult to do so in a file with 100 million entries.”

The challenge faced by the company, and others like it, is that as the amount of data increases, so does the need for computational power to process it. One of Janssen’s ongoing projects concerns large-scale machine learning: the idea that algorithms can not only summarise data but also learn from it and make predictions.

“These machine-learning approaches help us understand and support novel experimental designs and, crucially, estimate potential risks,” Wegner says. “Consider the failure rate of experimental drugs in the second phase of a clinical trial. Many potential drugs don’t survive it. By incorporating sequencing data of the genomes of the patients, we can significantly lower the failure rate during this crucial phase.”
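The learning-and-predicting idea Wegner describes can be illustrated with a minimal, generic sketch: a nearest-centroid classifier trained on invented numbers. This shows the general technique only, not Janssen’s actual pipeline, methods or data:

```python
# A minimal learn-then-predict sketch: a nearest-centroid classifier
# on invented numbers. Illustrates the general idea of learning from
# labelled historical data, not any specific pharmaceutical pipeline.

def centroid(rows):
    """Column-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def predict(x, centroids):
    """Return the label whose centroid is closest to x (squared Euclidean)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Invented training data: two features per past case, labelled by outcome.
passed = [[0.9, 0.8], [0.8, 0.9], [0.95, 0.7]]
failed = [[0.2, 0.3], [0.3, 0.1], [0.1, 0.25]]
centroids = {"pass": centroid(passed), "fail": centroid(failed)}

print(predict([0.85, 0.75], centroids))  # pass
print(predict([0.15, 0.2], centroids))   # fail
```

The “learning” step here is simply averaging past examples; production systems at the scale Wegner describes use far richer models, but the train-on-history, predict-on-new-cases shape is the same.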

Out with the old, in with the new

Another firm to collaborate with the VSC is 3E. Based in Brussels, the company delivers software solutions for sustainable energy projects worldwide, including wind, solar and hydropower.

The company used Tier-1 to simulate atmospheric conditions over large geographical areas, including the Belgian portion of the North Sea. “We’ve been doing this kind of work for decades,” says Rory Donelly, who is responsible for wind simulations at 3E. “Thanks to the simulations, we can inform our clients about the long-term availability of wind resources in areas where they plan to build turbines.”

As a result, their clients can predict how much power each turbine will generate, before it is even set up.
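The power a turbine can extract from the wind follows the standard textbook formula P = 0.5 ρAv³Cp, which is why long-term wind-speed forecasts matter so much: power grows with the cube of wind speed. A rough sketch with generic assumed values (not 3E’s models or data):

```python
# Back-of-the-envelope wind-turbine power estimate using the standard
# formula P = 0.5 * rho * A * v**3 * Cp. All numbers are generic
# textbook values, not 3E's data.
import math

def turbine_power_watts(wind_speed: float, rotor_diameter: float,
                        air_density: float = 1.225, cp: float = 0.4) -> float:
    """Power (W) captured from wind at a given speed (m/s) by a rotor (m)."""
    swept_area = math.pi * (rotor_diameter / 2) ** 2   # rotor disc area, m^2
    return 0.5 * air_density * swept_area * wind_speed ** 3 * cp

# An assumed 100 m rotor in a steady 8 m/s wind:
print(round(turbine_power_watts(8.0, 100.0) / 1e6, 2), "MW")  # 0.99 MW
```

Because of the cubic term, a simulation that misjudges the long-term wind speed by even 10% misstates the energy yield by roughly a third, which is what makes large-scale atmospheric modelling commercially valuable.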

But in the realm of computing technology, three years is an eternity. And so, for all its breakthroughs and innovations, the first Flemish Tier-1 is slowly becoming obsolete. In the latest top-500 list, published two months ago, the Flemish supercomputer doesn’t even get a mention.

If it is to keep up with the rest of the world, VSC knows it needs to step up its game and update its supercomputing capacity. “That’s why the Hercules Foundation is making additional funding available for a new supercomputer,” Luwel says. “KU Leuven is already designing a sophisticated computer room to house the next generation Tier-1.”

The new machine should become operational in the second half of 2016.