The road to regional coupled modelling (rCM3)

Hi All.

One of 21stCenturyWeather’s aims is to build a regional coupled model, combining high-resolution atmospheric and ocean models.

I posted this thread on the 21stCenturyWeather Cumulus website, but I thought I’d cross-post it here to get some extra attention (it was a little love-starved over there).

"Following on from Monday’s Modelling Science meeting, I thought I’d take a high-level look at rAM3 and CM3 and see how they work.

Here is a high-level simplified view of rAM3 from a rose/cylc view.

Note rAM3 uses a pre-built executable.

The running of CM3 is, in contrast, rather opaque from a rose/cylc viewpoint. It has only four major tasks.

It’s easy to understand the Recon and Atmos tasks from a UM point of view, but I’m unsure how the coupling is done. This suggests the Atmosphere task is configured via the UM namelists and other MOM6/CICE/NUOPC control files, and that the UM executable directly handles the execution of MOM6, CICE and the NUOPC coupling.

Is this correct?

Or does the compiled NUOPC executable control the UM, MOM6, and CICE?

BTW I can’t find any reference to CABLE in the cm3-suite, but I see plenty of JULES. Does the NUOPC version of CM3 use JULES?"

(Updated to include the correct rAM3 diagram – thanks @bethanwhite)

3 Likes

Hi @Paul.Gregory, exciting to hear about the plans for the regional coupling! Just jumping in with a few details re CM3. @kieranricardo will be able to clarify in a bit more detail when he is back!

As CM3 is still in development there is a bit of clean up that could still be done to make the suite a bit clearer.

Or does the compiled NUOPC executable control the UM, MOM6, and CICE?

This is correct: in the NUOPC setup, a single executable is built from a NUOPC driver which loads each of the component models as libraries. This driver then controls all the components and the mediator during the run.
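To make that structure concrete, here is a minimal conceptual sketch in plain Python (not the actual ESMF/NUOPC Fortran code; all class and method names are illustrative only): a single driver owns the component models, linked in as libraries, and alone decides when each one steps.

```python
class Component:
    """Stand-in for a model component (UM, MOM6, CICE) linked in as a library."""

    def __init__(self, name):
        self.name = name
        self.steps_run = 0

    def initialize(self):
        print(f"{self.name}: initialize")

    def run(self):
        # Advance this component by one coupling timestep
        self.steps_run += 1

    def finalize(self):
        print(f"{self.name}: finalize")


class Driver:
    """Stand-in for the NUOPC driver: it controls the components and the mediator."""

    def __init__(self, components):
        self.components = components

    def run(self, n_steps):
        for c in self.components:
            c.initialize()
        for _ in range(n_steps):
            for c in self.components:
                c.run()
            # The mediator would regrid and exchange fluxes between
            # components here, also under the driver's control.
        for c in self.components:
            c.finalize()


driver = Driver([Component("UM"), Component("MOM6"), Component("CICE")])
driver.run(n_steps=4)
```

The key point the sketch illustrates is that no component executable runs the others: the driver is the only entry point.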

BTW I can’t find any reference to CABLE in the cm3-suite, but I see plenty of JULES. Does the NUOPC version of CM3 use JULES?

CM3 currently still uses JULES, but the plan is to swap to CABLE.

2 Likes

Hi all.

For rCM3 development I’d like to build a NUOPC executable that can run inside a debugger.

I need some advice as to the best way to achieve this:

Currently, the CM3 config (latest branch is https://github.com/ACCESS-NRI/access-cm3-configs/tree/cm3-O100km-vn13.8) does the following (I think?):

  1. Uses UM sources from https://github.com/ACCESS-NRI/UM/tree/vn13.8_nuopc (which contains the required files in src/control/top_level)

  2. Pulls a pre-built Spack MOM6 executable (access-om3/pr79-12, according to https://github.com/ACCESS-NRI/access-cm3-configs/blob/cm3-O100km-vn13.8/rose-suite.conf)

  3. Builds the NUOPC executable locally

So, in order to build the NUOPC executable with debug symbols, I should:

a) Create a new OM3 pull request to compile with debug symbols, similar to Claire’s pr120 (MOM6 symmetric with updated MOM6 version by claireyung · Pull Request #120 · ACCESS-NRI/ACCESS-OM3 · GitHub), which was built with -g flags on branch cy-pananopt according to ACCESS-OM3/spack.yaml at cy-pananopt · ACCESS-NRI/ACCESS-OM3 · GitHub

b) Change the value of OM3_MODULE in the rose-suite.conf

Then compile the NUOPC executable locally (including compiling the UM locally with debug options).

Any thoughts @cbull, @kieranricardo, @harshula ?

Note – last time I tried to load access-om3/pr120-19 into Linaro it couldn’t find the debug symbols, see this thread:

An end of year update.

Things we have learnt recently.

You can’t run an ancil suite task on a domain without any land points. (This is mentioned in the ACCESS rAM3 release notes, but the link to the work-around docs is broken.) The ancil_lct task will run, but the downstream task ancil_lct_postprocess_c4 will fail if there are no land points. So our initial test case has moved from ocean / atmosphere only to ocean / atmosphere+land.

We can use the ESMF conservative regridder (via the Python wrapper esmpy) to correctly interpolate a regional grid and remove any edge effects. Hence we can create a MOM6 land/sea mask, regridded to the target UM resolution, that will replicate the ‘CAPS’ used by NUOPC to transfer fluxes between MOM6 and the UM inside the NUOPC executable.
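The real regridding is done by esmpy’s ESMF conservative regridder, which handles curvilinear grids and partial cell overlaps. Purely to illustrate the idea of first-order conservative (area-weighted) regridding, here is a numpy sketch for the special case where each target cell exactly tiles a block of equal-area source cells; the grid sizes and land-fraction values are hypothetical:

```python
import numpy as np


def block_average(src, factor):
    """First-order conservative regridding for the special case where each
    target cell exactly covers a factor x factor block of equal-area source
    cells: the target value is the area-weighted (here, plain) mean of the
    covered source cells, so the total land area is conserved."""
    ny, nx = src.shape
    return src.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))


# Hypothetical 4x4 'MOM6' land fraction, coarsened onto a 2x2 'UM' grid
src = np.array([
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
])
tgt = block_average(src, 2)
print(tgt)  # target land fractions: [[1.0, 0.0], [0.25, 0.0]]
```

Note the mean of the target equals the mean of the source (0.3125 here), which is the conservation property that nearest-neighbour or bilinear schemes do not guarantee.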


The target land sea mask must have a specified co-ordinate system; otherwise ancil_lct.py will fail at this step:

tgt_crs = target_x.coord_system.as_ants_crs()

It expects the target land sea mask to have a co-ordinate system specified in iris as

iris.coord_systems.GeogCS(6371229.0)

This can be achieved in xarray by
a) specifying a grid mapping variable

# Create a GeogCS grid mapping variable
grid_mapping_var = xr.Variable(
    dims=(),
    data=0,
    attrs={
        'grid_mapping_name': 'latitude_longitude',
        'earth_radius': 6371229.0,
    },
)

b) referencing this variable in the attributes of the data array

da = xr.DataArray(
    data=subset.data,
    dims=["latitude", "longitude"],
    attrs=dict(
        um_stash_source='m01s00i505',
        grid_mapping='geog_cs',
        earth_radius=6371229.0,
    ),
)

c) Including both the data array and the grid mapping variable in the output dataset

ds = xr.Dataset(
    data_vars=dict(
        land_binary_mask=da,
        geog_cs=grid_mapping_var,
    ),
    coords=dict(
        latitude=subset.latitude,
        longitude=subset.longitude,
    ),
    attrs=dict(Conventions='CF-1.7'),
)

Implementing the above creates the correct vegetation cover for our regridded test domain.

Note the ancil_lct.py code converts the input float values into booleans:

lbm = lbm.copy(lbm.data.astype("bool", copy=False))

And by default the code sets

min_frac = 0.0
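Taken together, these two defaults mean that any target cell with a nonzero land fraction becomes a land point. A small numpy sketch (with hypothetical land fractions) of the same cast:

```python
import numpy as np

# Hypothetical land fractions on a 2x2 target grid
land_frac = np.array([[0.0, 0.01],
                      [0.5, 1.0]])

# astype("bool") maps any nonzero fraction to True (land),
# mirroring the cast in ancil_lct.py; with min_frac = 0.0
# only exactly-zero cells remain ocean
lbm = land_frac.astype("bool", copy=False)
print(lbm)  # [[False, True], [True, True]]
```

So a cell that is 99% ocean still ends up as land in the binary mask, which is worth keeping in mind when comparing the UM and MOM6 masks along the coastline.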


To do:

  1. Build a NUOPC executable with both MOM6 and UM source compiled with debug flags (see above post)
  2. Modify ancil_lct.py to output the actual land-sea masks used within the code, to be used by downstream ancil tasks.
  3. Build a task which creates a grid namelist ASCII file which replicates the land-sea mask initially defined by rMOM6 (regridded to the desired UM resolution). There are downstream tasks (e.g. ancil-cap-orog) which use the ASCII namelist file.
  4. Build a task and suite logic that takes the definition of the MOM6 grid and expands the domain to provide the UM ‘driving model’ domain, where the ERA5/BARRA/CMIP data is reconfigured to provide the initial and boundary conditions for the regional UM grid.
2 Likes

@Paul.Gregory, FYI the link to the workaround has now been fixed in the ACCESS rAM3 release notes.

1 Like

Some updates.

@ashjbarnes has been working on rCM3 now for over a month. He has got both the UM and MOM6 to run in a regional configuration for one timestep, but the NUOPC mediator fails at that point.

Ashley is working closely with Kieran and Martin to resolve this issue, which is caused by the current NUOPC mediator being hard-coded to look for CICE variables that don’t exist in our regional configuration.

He is also working with them to ensure his executables stay in line with CM3 developments.

Question : Should we start building the rCM3 rose/cylc suite in cylc8?

1 Like