Machine Learning for Climate and Weather Working Group Announcement

Hi everyone,

This month’s meeting of the Machine Learning for Climate and Weather Working Group will take place Friday, 3 October at 2pm.

Zoom details: Launch Meeting - Zoom
Zoom Meeting ID: 83187307759 (password: 123456)

Here is the agenda:

Facilitator: Micael Oliveira
Co-chairs: Sanaa Hobeichi, Ryan Holmes, Tennessee Leeuwenburg, Micael Oliveira (ACCESS-NRI liaison), Yue Sun (NCI liaison)

  • Acknowledgement of Country
  • Updates from ACCESS-NRI and NCI
    • ACCESS-NRI 2025-26 Workplan for Machine Learning
    • New DL models ready to run at NCI
  • Updates from the Chairs
    • Highlights from the first year of the community
    • ML session at AMOS2026
  • Updates from the community & new community member introductions
  • Presentation by David Fuchs (Details provided below).
  • Discussion and follow-up from NRI community workshop (ongoing):
    • Proposals for interacting/supporting other working groups
    • Plans for using project nm47 (100TB gdata storage, 700 kSU/quarter) on NCI

See you on Friday!

Details of this month’s presentation:

Presenter: David Fuchs | Senior Scientist, DCCEEW
Title: The benefits of lateral connections: towards a stable neural network surrogate for climate model parametrization
Abstract: This study compares four neural network (NN) architectures as surrogates for moist convection in a global atmospheric model. The four architectures share a common backbone but differ in implementation, and are tested both offline and online in the CAM atmosphere model. A network architecture that enforces a mesh of short and long pathways proves superior to increasing depth alone or to replacing ResNet shortcuts with DenseNet-style connections. An online hybrid climate model run based on this architecture runs stably for 25 years to completion of an atmosphere-only scenario, and for 12 years to completion of a coupled ocean-atmosphere scenario. This shows that the learning process can lose online numerical stability while reducing online error. Earlier offline learning epochs mastered cloud liquid water, while later epochs reduced the error in cloud ice, potentially due to a naive sampling approach. In the coupled ocean-atmosphere scenario, these differences translated into steep surface temperature increases when earlier training epochs were used as surrogates, which flattened when later training epochs were used. These results emphasize the need to explore further NN architectures as surrogates for existing parameterizations.