Profile: Exascale Computing

Supercomputers are extremely fast, powerful computers with advanced processing capabilities. Current supercomputers operate at the scale of a quadrillion (10 to the 15th), meaning the computer performs 10,000,000,000,000,000 arithmetic operations on real numbers per second*. To put this in context, that is more than one hundred thousand times the speed of your laptop.
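
A quick back-of-the-envelope check of that comparison, sketched in Python; the laptop figure of roughly 10 billion operations per second is an assumed ballpark for illustration, not a measured benchmark:

```python
# Rough scale comparison: a petascale supercomputer vs. a typical laptop.
# The laptop figure (~10 billion operations per second) is an assumed
# ballpark for illustration, not a measured benchmark.

petascale_ops = 1e15   # a quadrillion arithmetic operations per second
laptop_ops = 1e10      # assumed laptop throughput

ratio = petascale_ops / laptop_ops
print(f"Roughly {ratio:,.0f} times faster than the assumed laptop.")
# -> roughly 100,000 times faster
```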

With this power, supercomputers are used extensively in the advancement of modern science, modeling and simulating complex problems. This field is known as High Performance Computing (HPC), and its applications are wide-ranging, from medical research to the demystification of elements generated in the big bang to fluid dynamics.

Supercomputers run on silicon chips. For many years, innovation meant these chips kept getting faster and faster, and supercomputers kept getting faster along with them. Recently, however, the technology in these silicon chips has plateaued and innovation has been minimal*. Additionally, the heat generated and the related high energy costs are a major limiting factor in supercomputer innovation. These computers run so hot that advanced cooling techniques are required, and hosting a supercomputer requires a steady power supply, as an outage will cause the processors to melt.

A joint program called the “Exascale Computing Project” (ECP), led by the Department of Energy (DOE), was launched earlier this year to develop the next generation of exascale supercomputers, which will bring processing up to a quintillion (10 to the 18th) calculations per second. The ECP is a 7-year project estimated to cost between $3.5B and $5.7B and is a joint effort between several US governmental agencies, including the Department of Defense, NASA, the FBI, the National Science Foundation, and six National Laboratories such as the local Argonne National Laboratory. Recent innovations in technology, particularly in cell phone tech and graphics cards (which can process a lot of data with a much lower heat signature than a traditional silicon chip), opened up the potential for exascale computing. In addition to advancing the hardware, the ECP also needs to innovate the software used to run these machines and increase their energy efficiency, because the current consumption rate is prohibitively expensive. This increase in processing power will generate more accurate simulations and lead to unprecedented scientific breakthroughs. The ECP identifies six pillar areas of application: national security, energy security, economic security, scientific discovery, healthcare, and earth systems.
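
To put the petascale-to-exascale jump in perspective, here is a minimal Python sketch; it again assumes the same rough laptop figure of about 10 billion operations per second, which is an illustration rather than a benchmark:

```python
# Petascale (10 to the 15th) vs. exascale (10 to the 18th) operations per second.
# The laptop estimate (~10 billion operations per second) is an assumption
# used only for illustration.

petascale_ops = 1e15
exascale_ops = 1e18
laptop_ops = 1e10

print(f"Exascale is {exascale_ops / petascale_ops:,.0f}x the petascale rate.")
# -> 1,000x

# How long would the assumed laptop need to match one second of exascale work?
seconds = exascale_ops / laptop_ops        # 10^8 seconds
years = seconds / (60 * 60 * 24 * 365)
print(f"About {years:.1f} years for one exascale-second.")
# -> about 3.2 years
```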

Examples include nuclear stockpile stewardship, the development of energy-efficient engines that run on biofuels, seismic hazard risk assessment, accelerated cancer research, and dark matter studies*. In addition to scientific applications, the development of exascale computing has geopolitical implications. The US has historically been the leader in HPC; however, China has had the fastest supercomputer globally since 2013 and the top two since 2016. Furthermore, China has been outspending the US on exascale development, and international competition continues to intensify. The ECP is in part a response to this and has the explicit goals of keeping the U.S. at the forefront of technological innovation and thus sustaining its economic competitiveness.

*Primary Source: Interview with John Bell, Director of Center for Computational Science and Engineering at Lawrence Berkeley National Laboratory

By: Gilad Andorn, Renee Bell, John Brennan, and Lauren Kramer (Group Name: Computers)