Edge computing enables NOAA to push workloads closer to public consumers, not just field researchers

NOAA operates in mission spaces that extend from the ocean floor to beyond Earth’s atmosphere. But at its core, it is a data organization. Sensors deployed across these diverse environments collect data that can then be analyzed and packaged into products that help protect citizens and industries from the vagaries of the weather. Now, edge computing makes it possible not only to do more analysis of that data in the field, yielding more immediate insights, but also to better engage the stakeholder community across academia, industry and the general public, who are the primary consumers of NOAA data.

Most agencies whose primary mission revolves around data collection, such as the intelligence community or certain healthcare agencies, retain data for internal government use because of privacy or national security considerations, and publish only analysis and guidance. NOAA is the opposite: all of the data it collects goes into public products, and as much of the analysis takes place outside the agency as within it. In effect, NOAA has two centers of data gravity: one at the sensors, where the data is collected, and one in the public domain, where the data is consumed.

And it’s looking to edge computing to drive workloads in both directions.

That’s why the agency is looking to move to a community modeling approach. Conversations are underway about how NOAA packages its climate models so the models themselves can be pushed out for the public to understand, build on and adjust, said Frank Indiviglio, NOAA’s deputy director for high-performance computing and communications.

“These toolkits and software environments will drive innovation,” he said. “Hardware enables innovation, but software, I think, is really the layer that we’re going to need to focus on and push for, not only to modernize, but also to get these tools out into the public domain so we can all do better together, for lack of a better term.”

For example, Indiviglio said, as hardware platforms get smaller, edge computing is putting what are essentially supercomputers in places that wouldn’t even have been imagined ten years ago. Artificial intelligence lets NOAA Fisheries perform genetic sequencing in the field, identifying fish species in the water without sending down a diver. AI also improves the accuracy of predictions, leading to better products and freeing people to do more science. And autonomous platforms increase the amount of data NOAA can collect from inside hurricanes while reducing the need for people to put themselves in harm’s way by flying into them.
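
As a loose illustration of that pattern, and not NOAA’s actual tooling, here is a minimal sketch of edge inference: a model is shipped to the device, samples are classified locally, and only the labels travel back over a constrained link. The weights and species names below are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical stand-in for a trained classifier; in a real deployment this
# would be a model file shipped to the edge device, not random weights.
rng = np.random.default_rng(0)
weights = rng.normal(size=(128, 4))   # 128 input features -> 4 species classes
SPECIES = ["cod", "haddock", "pollock", "other"]

def classify(features: np.ndarray) -> str:
    """Score a sample locally on the device; only the label needs to travel back."""
    logits = features @ weights
    return SPECIES[int(np.argmax(logits))]

sample = rng.normal(size=128)         # stand-in for one field measurement
print(classify(sample))               # the raw sample never leaves the device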

All of this requires a heavy focus on the data itself, and on how it is transmitted and disseminated.

“It’s integrity,” Indiviglio said. “We want to make sure that the data that we release into the public domain is real data that has been produced by our science and that people can trust. What you don’t want is data either being ingested or released to the public that has some kind of question mark about its integrity.”

One challenge is how many system boundaries NOAA data has to cross. Indiviglio said data moves from the observing system, where it’s collected by a sensor, to the HPC system, where it’s processed, analyzed and becomes part of a prediction, and then on to data dissemination. Moving data across all of these systems while maintaining integrity is a challenge he said has required updating the platforms. But he also said NOAA benefits from a long history of prioritizing data fundamentals and interoperability, which has positioned the agency not only to streamline those processes, but also to start layering artificial intelligence on top of them.
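
To make that concrete, here is a minimal sketch of one common way to preserve integrity across system boundaries, assuming a simple checksum hand-off; the file name and functions are hypothetical, not NOAA’s pipeline. A digest is computed where the data is produced and re-verified at each boundary, so corruption in transit is caught before the data becomes part of a prediction or a public product.

```python
import hashlib
from pathlib import Path

def sha256_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large granules never need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_handoff(path: Path, expected: str) -> bool:
    """Re-hash after a transfer and compare to the digest recorded at the previous boundary."""
    return sha256_digest(path) == expected

if __name__ == "__main__":
    obs = Path("observation.nc")                # hypothetical sensor output
    obs.write_bytes(b"example sensor payload")  # stand-in data for the demo
    recorded = sha256_digest(obs)               # computed where the data is produced
    # ... transfer from the observing system to HPC, then on to dissemination ...
    assert verify_handoff(obs, recorded), "integrity check failed at hand-off"
    print("digest verified:", recorded[:16], "...")
```

In practice the recorded digest would travel with the data as metadata, so each downstream system can verify what it received against what was produced.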

That’s why the cloud was a perfect fit for NOAA. The agency has always had a mobile workforce, with end users on ships in the Arctic and at stations in the Antarctic, and ensuring that data could be transferred safely from those locations was critical. Now the agency is leveraging the cloud to provide more flexibility around its workloads. Demand on NOAA’s supercomputing programs has continued to grow, far beyond what the agency could meet through its own hardware alone. The cloud lets it provide that flexible capacity to remote workstations and shave off the weeks and months of response time that were once common. Workloads can snap to wherever the data lives.
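
As a back-of-the-envelope illustration of why workloads snap to the data, with assumed numbers rather than NOAA’s actual figures: shipping a small containerized job to where the data sits is orders of magnitude cheaper than hauling the dataset back over a field link.

```python
# Toy numbers (assumed, not NOAA's) comparing the two directions of movement.
DATASET_GB = 50_000   # e.g., a campaign's worth of observations and model output
JOB_IMAGE_GB = 2      # containerized analysis environment
LINK_GBPS = 1.0       # field-to-datacenter link speed

def transfer_hours(size_gb: float, link_gbps: float) -> float:
    """Hours to move size_gb over a link of link_gbps (8 bits per byte)."""
    return size_gb * 8 / (link_gbps * 3600)

print(f"ship data to compute: {transfer_hours(DATASET_GB, LINK_GBPS):.0f} h")
print(f"ship compute to data: {transfer_hours(JOB_IMAGE_GB, LINK_GBPS):.3f} h")
```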

“In the past, you could get computing to people far away, but it would be kind of limited. We can build environments that work for them,” Indiviglio said. “And we can now, with technology, get a lot of data and get a lot of people out of harm’s way in getting that data. So it’s kind of a win-win on both sides, right? You can do more analysis in this area, you can certainly process more in this area, but you can also get more [data]. And that’s a good thing.”
