Supercomputers help gravitational wave discovery


Ashley Washburn, February 15, 2016


Detecting gravitational waves that emerged from a collision of black holes about 1.3 billion years ago took the efforts of 1,000-plus scientists over more than two decades, culminating with a Feb. 11 announcement of the historic feat.

As it turns out, crunching the galaxies of data spawned by the discovery has also demanded collaboration – this time among a series of global supercomputer networks guided in part by computer scientists at UNL’s Holland Computing Center.

The first-ever observation of gravitational waves, which were predicted by Albert Einstein’s general theory of relativity, resulted from the work of the U.S.-based Laser Interferometer Gravitational-Wave Observatory, or LIGO.

Though LIGO has its own supercomputing network, it also received assistance from the Open Science Grid – a global consortium of more than 125 institutions, including UNL, that offers its collective large-scale computing power to scientific projects big and small.

Since September, when LIGO detectors recorded a signal representing the first direct evidence of gravitational waves, physicists have been busy seeking answers to their many questions about the phenomenon.

“That actually requires quite a bit of computational power,” said Brian Bockelman, UNL research assistant professor of computer science and engineering. “It’s like [LIGO] heard this big, wild foghorn, and they’re now looking for whispers in the data, too. The Open Science Grid provides a lot of computational resources for that, and it’s really focusing on what’s called distributed high-throughput computing, where you’re trying to get lots and lots of these computational jobs to run at as many places as possible.”
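The pattern Bockelman describes, lots of small and independent jobs running wherever capacity happens to be free, can be sketched in a few lines of Python. The sketch is purely illustrative: the function and the segment count are hypothetical stand-ins, and a real grid workload is dispatched to remote clusters by a batch scheduler rather than a local process pool.

    # Illustrative only: the essence of high-throughput computing is carving an
    # analysis into many small, independent jobs. A grid scheduler farms these
    # out to whatever clusters have spare capacity; a local process pool stands
    # in for that here. The function and segment count are hypothetical.
    from concurrent.futures import ProcessPoolExecutor

    def analyze_segment(segment_id):
        """One independent unit of work, e.g. searching one stretch of
        detector data for candidate signals."""
        # ... load the segment, run the search, return any candidates ...
        return segment_id, []

    if __name__ == "__main__":
        segments = range(10_000)  # thousands of independent jobs
        with ProcessPoolExecutor() as pool:
            for segment_id, candidates in pool.map(analyze_segment, segments):
                if candidates:
                    print(segment_id, candidates)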

Bockelman heads up a division of the Open Science Grid that assembles its software to best serve the needs of the physicists, biologists, chemists and other scientists who regularly use it. In this case, Bockelman and his colleagues facilitated the porting of code from LIGO’s data facilities to the Open Science Grid, helping the observatory perform its data analyses as “opportunistically” as possible.

“If you have a cluster of computers, you often [allocate] it to meet your peak needs,” Bockelman said. “But if a grad student has gone to bed or it’s Christmas vacation, for example, you have these gaps where maybe you don’t use it at 100 percent – it’s maybe closer to 80 percent.

“Where LIGO was benefiting was that they could take this 10 percent here, that 15 percent there, and put them all together into one larger resource pool to get their computations done.”
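A toy calculation shows why those scraps add up. The site names, core counts and idle fractions below are invented for illustration; the point is simply that modest idle slices at many institutions pool into a substantial resource.

    # Hypothetical numbers: small idle slices at several sites pool into a
    # sizable opportunistic resource.
    idle_fraction = {"site_a": 0.10, "site_b": 0.15, "site_c": 0.20}
    total_cores = {"site_a": 4000, "site_b": 6000, "site_c": 2500}

    opportunistic = sum(total_cores[s] * idle_fraction[s] for s in total_cores)
    print(f"~{opportunistic:.0f} cores available opportunistically")  # ~1,800 cores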

UNL’s Holland Computing Center served as a hub for distributing LIGO’s data to about 15 computing clusters throughout the United States, Bockelman said, temporarily housing a partial copy of the observatory’s dataset. In total, UNL’s facilities served out about one petabyte of LIGO data – roughly 1,000 times more than the storage capacity of a typical desktop hard drive – from late October through mid-December.

The university’s contribution was possible only because of a 2014 network upgrade that pushed its data transfer rate from 10 gigabits to 100 gigabits per second.

“This project, all by itself, required 10 gigabits,” Bockelman said. “We couldn’t have done this before the upgrade. We worked really hard on that development, so it was nice to see it pay off.”
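A rough check of those figures, assuming the petabyte was served fairly evenly over the roughly seven weeks mentioned above (an assumption; real traffic would have been burstier), shows the scale of the demand:

    # Back-of-envelope estimate from the figures quoted in the article.
    # Assumes ~1 PB served evenly over ~7 weeks; actual transfers were
    # surely burstier, with peaks well above the average.
    bits = 1e15 * 8                  # one petabyte, in bits
    seconds = 7 * 7 * 24 * 3600      # late October through mid-December, ~49 days

    average_gbps = bits / seconds / 1e9
    print(f"Sustained average: ~{average_gbps:.1f} Gb/s")  # roughly 2 Gb/s

Even the sustained average is a sizable fraction of a 10-gigabit link; the peak demand Bockelman cites would have consumed the university’s entire pre-upgrade connection on its own.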

UNL’s involvement in the LIGO project also stems from its long-term work on another global scientific endeavor: the particle smasher known as the Large Hadron Collider. As part of the Worldwide LHC Computing Grid, the university houses a Tier-2 site that stores and computes data for specific analyses of particle collisions.

Ken Bloom, associate professor of physics and astronomy, oversaw the U.S. segment of the Tier-2 grid from 2005 until 2015. He’s since been appointed manager of software and computing for a U.S. operations program dedicated to one of the LHC’s massive particle detectors.

“This was what we imagined all along when we started the Nebraska Tier-2 center and put it on the Open Science Grid – that our computers would be available for other scientists when we weren’t using them heavily, and vice versa,” Bloom said. “It actually works in real life; sharing resources benefits everyone. I’m glad that we had a chance to help out and excited to have this very small connection to a historic scientific result.”

Bloom also credited Bockelman with advancing the state of the high-throughput computing that has become a cornerstone of contemporary scientific collaboration.

“Brian is one of the key people who has made grid computing actually work for scientists,” Bloom said. “He is a veritable Swiss army knife in his versatility and our go-to guy for all kinds of problems. Between the LIGO result and the observation of the Higgs boson in 2012, he’s made contributions to two Nobel-level scientific results in the past four years.”

