CM3 Dev-Eval Working group: Meeting minutes 2025

First CM3 Dev-Eval Working group

Date: Tuesday 11 November at 10:30 am. Meeting link.

Participants: @cbull, @jemmajeffree, @wghuneke, @spencerwong, @rbeucher, @tiloz, @MartinDix, @ezhilsabareesh8, @paulleopardi, @CharlesTurner, @dgwyther, @aekiss, @heidi, @ZhiLiUNSW, @RachelLaw, Sienna Blanckensee, @Harun_Rashid, @lishx, @ongqingyee, @HIMADRI_SAINI, @peterdobro, @DeepashreeDutta

Coordination: @cbull

Chair: @cbull

Minutes: @spencerwong

Apologies: @nicolamaher, @ctychung, @dave

Agenda:

Scheduling:

  • See this post for the schedule of ESM WG and CM3 dev-eval meetings for the rest of the year. There will be an ESM WG meeting with a science presentation on Thursday 20 November at 1pm AEDT, and a CM3 dev-eval meeting on Tuesday 9 December at 10:30 am AEDT.

CM3 technical developments

  • @MartinDix provided an update on the latest CM3 technical developments. Changes being tested include bathymetry changes to address Baltic salinity drift, and UM GC5 parameter changes. The parameter changes led to a worsening of the radiative imbalance, and tests using newer UM code will be carried out.
  • Full minutes from the last technical meeting are available here, and anyone is welcome to get involved in the meetings.

Python stats functions

  • @Harun_Rashid has added a collection of Python functions for statistical analysis to the repository here. These may be useful for anyone doing analysis; feel free to reach out with any questions.

Datastores for CM3 evaluation

  • @CharlesTurner provided an update on creating datastores from CM2 and CM3 output. Most of the data from the latest CM3 run has been placed in a virtualised datastore, which significantly speeds up access and analysis.
  • Virtualisation may not currently be viable for the CM2 data, so more targeted workarounds will be required to optimise access speed.
  • @MartinDix noted plans to split CM3 atmospheric output into single variable files which may also help.

Figure discussions:

Discussion highlighted changes in variability in the maximum salinity timeseries, though these could reflect changes at individual grid points. Adding observations to the comparisons was noted as a useful next step.

GitHub links to OM3 figures for CM3 (please include OM3 comparison links where practical) – @ezhilsabareesh8 :

  • Drake Passage Transport
    Drake Passage transport is too high in CM3. OM3 matches observations well, while CM3 remains too high and is still increasing. @ezhilsabareesh8 will update the figures to use the 0.25 degree CM2 simulation.

    This could point to a problem in the atmosphere: either too-strong westerlies or too-strong coupling. It was noted that Southern Ocean wind stresses appear similar between the 1 degree CM2 and 0.25 degree CM3 runs, so similar wind stresses may be producing different ocean responses. Plots of SSH vs latitude across the Drake Passage were raised as a possible way to investigate this further.

  • Global Timeseries. These diagnostics are not quite ready yet.

  • SSS and SSS difference from WOA23
    Large SSS biases were noted in the Arctic.

  • SST and SST difference from WOA23
    Large warm biases in the Southern Ocean were noted.
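For context on the Drake Passage transport figure, the underlying number is a depth-and-latitude integral of zonal velocity across the passage, usually quoted in Sverdrups. A minimal sketch of that integration (the grid, velocity profile, and all values below are illustrative placeholders, not CM3 output):

```python
import numpy as np

# Hypothetical zonal velocity section u(depth, lat) across the Drake Passage.
# Values are illustrative only: ~5 cm/s at the surface, decaying with depth.
lat = np.linspace(-65.0, -55.0, 41)      # degrees latitude across the passage
depth = np.linspace(0.0, 4000.0, 81)     # metres
u = 0.05 * np.exp(-depth[:, None] / 1500.0) * np.ones_like(lat)  # m/s, shape (81, 41)

# Grid spacings in metres (~111 km per degree of latitude).
dy = np.gradient(lat) * 111e3
dz = np.gradient(depth)

# Volume transport: integrate u over the section area, convert to Sverdrups
# (1 Sv = 1e6 m^3/s). Observed ACC transport is of order 130-170 Sv.
transport_sv = np.sum(u * dz[:, None] * dy[None, :]) / 1e6
print(f"Transport: {transport_sv:.1f} Sv")
```

A "too high and still increasing" bias like the one discussed would show up directly in a timeseries of this quantity; the same integral applied to SSH-derived geostrophic velocities is one way to connect it to the proposed SSH-vs-latitude plots.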

Community figures

Rolled over to next meeting.
@Harun_Rashid:

Today I’d like to briefly talk about the ENSO power spectrum (slide 4) and the ENSO diagnostics table (slide 5). The table compares various ENSO diagnostics computed from the latest CM3 run with those calculated from observations, ACCESS-CM2, and other CMIP6 model simulations.

  • The run length is 37 years, with seven years of data missing.
  • The simulation was run in August by Kieran at ACCESS-NRI.
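An ENSO power spectrum like the one on slide 4 is typically computed from a Nino3.4-style SST index. A minimal sketch using Welch's method on synthetic data (the index, seed, and 4-year oscillation are illustrative assumptions, not the CM3 series; 37 years matches the run length noted above):

```python
import numpy as np
from scipy.signal import welch

# Synthetic monthly Nino3.4-like index: a weak ~4-year oscillation plus noise.
rng = np.random.default_rng(0)
n_months = 37 * 12                  # 37-year run, monthly means
t = np.arange(n_months)
index = 0.8 * np.sin(2 * np.pi * t / 48.0) + rng.standard_normal(n_months)

# Welch power spectral density; frequencies are in cycles per month.
freqs, power = welch(index, fs=1.0, nperseg=128)

# Convert the non-zero-frequency spectral peak to a period in years.
# Observed ENSO typically peaks in the 2-7 year band.
peak_period_years = 1.0 / freqs[np.argmax(power[1:]) + 1] / 12.0
print(f"Dominant period: ~{peak_period_years:.1f} years")
```

With only 37 years (and seven missing), the spectral estimate is noisy, so comparing the broad 2-7 year band against observations is usually more robust than the exact peak location.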

Others

Please paste here and include a caption!