
The Parallel Path to Computing Present and Future

By Rob Mitchum // May 17, 2013

For the last few decades, parallelism has been the secret weapon of computing. Based on the idea that large problems can be solved faster if they are chopped up into smaller problems performed simultaneously, parallel computing has driven supercomputers to their current petascale power. Recently, the concept has spread to consumer computers as well, as the clock-speed limitations of single processors led manufacturers to switch to multi-core chips combining 2, 4 or 8 CPUs. But in the early 1980's, when Argonne National Laboratory created its Advanced Computing Research Facility (ACRF), the path of parallelism was not so clear.

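The article stays at the conceptual level, but the divide-and-combine idea it describes can be sketched in a few lines of Python. This is only an illustration of the general principle, not anything presented at the symposium; the data size, worker count, and function names are arbitrary choices.

```python
# Minimal sketch of parallelism: chop a large problem into smaller pieces,
# work on the pieces simultaneously, then combine the partial results.
from multiprocessing import Pool

def partial_sum(chunk):
    """Solve one 'smaller problem': sum a slice of the data."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))            # the 'large problem'
    n_workers = 4                             # e.g. one worker per CPU core
    chunk_size = len(data) // n_workers
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    with Pool(n_workers) as pool:
        total = sum(pool.map(partial_sum, chunks))   # pieces run in parallel

    print(total == sum(data))  # same answer, computed in pieces
```
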
The origin, subsequent impact and future role of this technology were the topics of discussion at the Thirty Years of Parallel Computing at Argonne symposium, held over two days earlier this week. Luminaries of the computer industry and research community — many of them Argonne alumni or collaborators — met on the Argonne campus to share stories of the laboratory’s instrumental role in nurturing parallel computers and the software they use, and how the approach helped to create the computational science of today and tomorrow.

From a modern perspective, it was hard to spot the world-changing potential in Jack Dongarra’s pictures and descriptions of the earliest Argonne parallel computers, which more resembled washer-dryers than today’s sleek, gargantuan supercomputers. The diversity of parallel machines purchased by the ACRF — 13 in its first 8 years, Dongarra said — reflected the excitement and uncertainty about parallel computing in those early days.

“We knew that parallel computing was the way HPC was going to be done in the future,” said Paul Messina, director of science at what is now known as the Argonne Leadership Computing Facility. “But there was no clear winner in terms of parallel architectures.”

Though the machines themselves were designed elsewhere, Argonne scientists made essential contributions to parallel computing’s evolution through programming and outreach. ANL theorists and researchers figured out how to assemble programs that squeezed the most speed and power out of (at the time) dozens of computing cores working simultaneously. Annual summer institutes and training courses brought eager students to the Lemont campus…and then sent them back into the world preaching the gospel of parallelism.

Today, the scientific impact of those efforts can be felt in disciplines from chemistry and physics to biology and meteorology. Thom Dunning, director of the National Center for Supercomputing Applications at the University of Illinois, explained how concurrent projects at Argonne in the 1980’s in parallel computing and computational chemistry sparked the creation of NWChem, simulation software used to this day by chemists around the world. CI Senior Fellow Paul Fischer discussed how parallel computing transformed the field of fluid dynamics, allowing physicists to model the multiple scales and dimensions needed for research and commercial applications.

“Sometimes we take it for granted as practitioners that we have made these advances, but when you talk to others, you need a way to explain why we are always asking for more and more computational power,” Dunning said. “The reason is that it allows us to achieve much higher fidelity, to look at more complex systems than we could with lower power.”

Meanwhile, the impact of parallel computing in the commercial sphere could be divined just from the companies represented by speakers at the symposium: Intel, IBM, HP and Microsoft all sent high-ranking scientists to discuss the technology. Many of those industry representatives acknowledged the past briefly before offering glimpses of computing’s future, including new chips built with silicon photonics, data-centric computer architecture, and multi-level cell memory technologies. But many also warned that without a dramatic leap in technology, the computing industry may soon reach the end of Moore’s law — causing an audience member to quip “The number of people who say Moore’s Law is dead doubles every 18 months.”

A panel at the end of the symposium’s first day combined all of these threads, with moderator Irving Wladawsky-Berger asking participants very plainly to name the biggest impact of parallel computing over the last 30 years. CI Senior Fellow Rick Stevens named six — genome analysis, aircraft radar, weather simulation, computer chip simulation, web indexing and internet routing — that together encompass a substantial portion of our modern lives. But Justin Rattner of Intel went even further, saying that the benefits from parallel computing have enabled an entire third branch of science to rise up alongside theory and experimentation.

“I think parallel computing gave rise to the notion of computational science,” Rattner said. “Scientific field after field has changed as a result of the availability of prodigious amounts of computation, whether we’re talking what you can get on your desk or what the big labs have available. The shockwave won’t be fully understood for decades to come.”

Plenty of opinions were shared about what the next thirty years of parallel computing will bring, from Stevens’ predictions of smart cities, personalized medicine and synthetic food to Rattner’s forecast of brain-machine interfaces and self-assembling machines. Realizing those visions will require just as much ingenuity and investment as the pioneers of parallel computing brought to bear in the early 1980’s, said William Harrod of the Department of Energy Office of Science.

“Thirty years ago we were at a crossroads of computing, and organizations such as Argonne stood up and took on the challenge,” Harrod said.  “The next 30 years will have an equally impressive set of challenges, and I hope that organizations such as Argonne and the other national laboratories again step up.”