
SSA meeting recap: Harnessing the power of cloud computing

Tags: cloud platform

Setup time at the “Data Mining on the Cloud 101” workshop. (Photo: Tammy Bravo/EarthScope)

The recent Seismological Society of America (SSA) annual meeting in Anchorage, Alaska, brought together researchers to explore how cloud computing is shaping the future of seismology. A workshop, a poster session, and an oral session showcased how the cloud is enabling efficient data analysis, fostering collaboration, and driving innovative research. Events like this help us take the pulse of our community, showing, for example, how adoption of cloud computing is progressing and why. If you weren’t able to attend the meeting, here’s a short summary of what we heard.

A foundational “Data Mining on the Cloud 101” workshop, taught by researchers from the University of Washington and the Lamont-Doherty Earth Observatory at Columbia University, introduced the basics and best practices of cloud computing, focusing on correlation seismology and machine learning. Attendees learned how to scale their workflows using both cloud-native and high-performance computing (HPC) resources.
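To give a flavor of the kind of workflow scaling discussed (this is an illustrative sketch, not material from the workshop), the Python snippet below cross-correlates synthetic waveform pairs in parallel. The window lengths, pairing scheme, and use of local processes are placeholders; the same embarrassingly parallel pattern is what spreads naturally across cloud or HPC workers.

    # Illustrative sketch only: parallel cross-correlation of synthetic waveform pairs.
    from multiprocessing import Pool

    import numpy as np
    from scipy.signal import correlate

    RNG = np.random.default_rng(0)

    def correlate_pair(pair):
        """Return the lag (in samples) of the peak cross-correlation for one pair."""
        a, b = pair
        cc = correlate(a, b, mode="full")
        return int(np.argmax(cc)) - (len(b) - 1)

    if __name__ == "__main__":
        # Stand-in for waveform windows that would be pulled from cloud object storage.
        pairs = [(RNG.standard_normal(3000), RNG.standard_normal(3000)) for _ in range(100)]
        # Each pair is independent, so the work spreads across local cores here,
        # or across many cloud/HPC workers in a real deployment.
        with Pool() as pool:
            lags = pool.map(correlate_pair, pairs)
        print(f"computed {len(lags)} correlation lags")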

An oral session focused on how the U.S. Geological Survey (USGS) is using cloud computing to accelerate scientific modeling, improve seismic hazard products, and enhance research collaboration. EarthScope’s ongoing migration to commercial cloud systems highlighted the potential of scalable processing workflows and resources to unlock geophysical insights using the NSF SAGE/GAGE facilities, while MsPASS showcased how its parallel processing framework can handle seismic data at any scale. Another presentation examined the downstream science impacts of lossy compression as a practical approach to managing large-scale seismic data. Finally, PhaseNet+, a multitask deep learning model, was shown to improve the efficiency and accuracy of earthquake characterization by combining models that each excel at one of the discrete tasks making up holistic, end-to-end event characterization.

In the poster session, the SCOPED platform demonstrated how a flexible infrastructure can support both data-driven and model-based research, with tools like Docker containers and GitHub enabling seamless collaboration across HPC and cloud platforms. EarthScope, a partner in the ShakeAlert® Earthquake Early Warning System, highlighted how normalized real-time position streams in a cloud-native event streaming platform enable robust, secure, low-latency message distribution for accurate downstream alerts. The USGS also presented updates to its ComCat database, reinforcing the importance of scalable cloud systems for real-time earthquake monitoring.

Several themes emerged across these sessions:

  • Collaboration: Community-driven solutions are crucial for the shared development and use of seismic data.
  • Scalability: Cloud computing scales readily, a useful characteristic for everything from real-time alerts to comprehensive earthquake catalogs, and from voluminous data types like distributed acoustic sensing (DAS) to massively parallel seismic processing.
  • Accessibility: Educational resources, tutorials, and workshops are helping democratize access to cloud computing in seismology.

Overall, the strong turnout across the sessions at the SSA meeting reflects community interest in the cloud and underscores the need to build the resources that will support a cloud-ready seismology community. EarthScope aims to do its part by developing a cloud platform to support transformative research using NSF SAGE and GAGE data. Through knowledge sharing and training, we will continue to support researchers interested in adopting this technology to realize the potential of cloud computing for seismology research and education.


The EarthScope-operated data systems of the NSF GAGE and SAGE Facilities are migrating to cloud services. To learn more about this effort and find resources, visit earthscope.org/data/cloud and check out this short video on our roadmap.