The Case for an Edge-Driven Future for High-Performance Computing


“Exascale only gains value when it creates and uses data that matters to us,” Pete Beckman, co-director of the Northwestern-Argonne Institute of Science and Engineering (NAISE), said at the latest HPC User Forum. Beckman, who heads an edge computing project at Argonne National Laboratory called Waggle, argued that edge computing is a crucial part of delivering that value for exascale.

Beckman began with a quote from computer architect Ken Batcher: “A supercomputer is a device for turning compute-bound problems into I/O-bound problems.” “In a lot of ways, that’s still true today,” Beckman said. “What we expect from supercomputers is that they are so fast that the real bottleneck is reading from an input or writing to an output.”

“If we take that concept, however, and turn it around,” he added, “then we end up with this idea that edge computing, therefore, is a device for turning an I/O problem into a computational problem.”

Beckman described what he sees as the new paradigm of high-performance computing: one defined by massive detectors and instruments, such as the Large Hadron Collider and radio telescopes, that produce extreme volumes of data, far more than could ever be efficiently transferred to supercomputers. This paradigm, he said, gives rise to a class of research problems where it is more efficient to examine data at the edge and send only the important or interesting data to supercomputers for heavy analysis.

There were a number of reasons, Beckman explained, why edge processing might be better: more data than available bandwidth, of course, but also the need for low latency and fast actuation, as in autonomous cars; confidentiality or security requirements that prevent the transfer of sensitive or personalized data; a desire for additional resilience through distributed processing; and energy efficiency.
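A minimal sketch of this filter-at-the-edge pattern is shown below. The model, threshold, and upload function are invented for illustration; none of these names come from the Waggle or Sage software stacks.

```python
# Sketch of filter-at-the-edge: run inference on the node and forward only
# "interesting" results upstream. All names here are hypothetical.
import json
import random
import time

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for what counts as "interesting"

def run_detector(frame):
    # Stand-in for an on-device model (e.g., one running on the node's GPU).
    # Here we fabricate (label, confidence) pairs purely for illustration.
    return [("pedestrian", random.random()), ("vehicle", random.random())]

def forward_upstream(record):
    # Stand-in for the upload path toward the central repository / HPC side.
    print(json.dumps(record))

def edge_loop(frames):
    for frame in frames:
        detections = run_detector(frame)
        interesting = [d for d in detections if d[1] >= CONFIDENCE_THRESHOLD]
        if interesting:  # only the filtered subset ever leaves the node
            forward_upstream({"t": time.time(), "detections": interesting})
        # raw frames and low-confidence detections stay (or are discarded) at the edge

if __name__ == "__main__":
    edge_loop(range(10))  # ten dummy "frames"
```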

Beckman, for his part, is advancing this new paradigm – which he says “has been made possible in large part by AI” – through Waggle, which began as a wireless sensor system aimed at enabling smarter urban and environmental research. With Waggle, Beckman said, “the idea was to understand the dynamics of a city” through pedestrian monitoring, air quality analysis and more. The first generation of sensors had been installed all around Chicago, generating data that was then shared with scientists.

The latest version of the Waggle sensor, Beckman said, has just been developed and is much more robust: an AI-enabled edge computing platform that processes incoming data using an Nvidia Xavier NX GPU. The platform is equipped with sky- and ground-facing cameras, atmospheric sensors, rain sensors and mounting points for additional sensors. Beckman added that Lawrence Berkeley National Laboratory is working on its own Waggle node configuration for one of its projects.

The latest version of the Waggle sensor platform. Image courtesy of Pete Beckman.

These Waggle sensors, Beckman explained, point to an even bigger vision, one embodied by Northwestern University’s NSF-funded Sage project (also led by Beckman). Through Sage, he said, the goal was to “take these kinds of edge sensors and use them in networks across the United States to build what we call software-defined sensors”: flexible edge computers that are then specialized for a purpose.

“The architecture of Sage… is pretty straightforward,” Beckman said. “At the edge, we are processing data. … Data pulled from an edge AI goes into the repository, the repository can then share that data with HPC applications, which can then process that data.” Sage-enabled Waggle networks, Beckman said, were simple and secure, with no open ports. “You cannot connect to a Waggle node,” he said. “The nodes only phone home.”
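A rough sketch of that phone-home pattern, under stated assumptions: the node opens only outbound HTTPS connections to push data to a repository endpoint and never listens on a port. The URL, token, and payload shape below are invented for illustration and are not Sage’s actual API.

```python
# Sketch of a "phone home only" node: outbound pushes, no listening sockets.
# Repository URL, token, and payload shape are hypothetical.
import time
from collections import deque

import requests

REPOSITORY_URL = "https://repository.example.org/api/measurements"  # hypothetical
NODE_TOKEN = "node-credential-provisioned-out-of-band"              # hypothetical

local_buffer = deque()  # holds results produced by the edge inference step

def phone_home(batch):
    resp = requests.post(
        REPOSITORY_URL,
        json={"node_id": "node-001", "records": batch},
        headers={"Authorization": f"Bearer {NODE_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()

def upload_loop(interval_s=60):
    while True:
        batch = [local_buffer.popleft() for _ in range(len(local_buffer))]
        if batch:
            try:
                phone_home(batch)  # outbound only; nothing ever connects in
            except requests.RequestException:
                local_buffer.extendleft(reversed(batch))  # keep data locally, retry later
        time.sleep(interval_s)
```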

The architecture of the Sage project. Image courtesy of Pete Beckman.

Through Sage and Waggle, Beckman outlined a number of current and potential use cases. Sage-equipped nodes, he said, had already been installed alongside ecological monitoring equipment for the NSF-run, 81-site NEON (National Ecological Observatory Network) project, which dates back to 2000. Various other partnerships, including one between Sage and ALERTWildfire, are using edge computing technology like Waggle to advance low-latency wildfire detection and data reporting. Other projects have ranged from identifying pedestrian flows and classifying snowflakes to assessing the effects of social distancing and mask-wearing policies during the pandemic.

“In reality, most of HPC has focused on an input deck: some data you feed in, then you compute and you do a visualization,” Beckman said. “It’s clear that the future of large HPC systems is this loop processing, where data will come in, be processed, and it’s a live stream from the edge that executes the first layer of that HPC code.”
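One way to picture that loop on the HPC side, sketched below with invented names: a lightweight front-end process polls the repository for new edge records and hands batches to the heavier analysis code. This is an assumed polling design, not a description of Sage’s actual implementation.

```python
# Sketch of the HPC-side "first layer": poll the repository for new edge
# records and dispatch batches to heavier analysis. Names are hypothetical.
import time

import requests

REPOSITORY_URL = "https://repository.example.org/api/measurements"  # hypothetical

def fetch_since(cursor):
    resp = requests.get(REPOSITORY_URL, params={"since": cursor}, timeout=30)
    resp.raise_for_status()
    return resp.json()  # assumed shape: {"records": [...], "cursor": "..."}

def heavy_analysis(records):
    # Stand-in for the real HPC workload (reprocessing, simulation coupling, etc.)
    print(f"analyzing {len(records)} records")

def processing_loop(poll_s=30):
    cursor = None
    while True:
        page = fetch_since(cursor)
        if page["records"]:
            heavy_analysis(page["records"])
        cursor = page["cursor"]
        time.sleep(poll_s)
```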

“Each group we talk to about edge computing has a different idea. That’s what has been so much fun with the concept of a software-defined sensor,” he added. “Being able to run this software stack on edge devices and report back by doing AI at the edge is a very new area, and we’re interested in seeing new use cases and what problems you might have: in what ways you can connect your supercomputer with the edge.”

Header image: Waggle nodes. Image courtesy of Argonne National Laboratory.

