ACCESS-ROM3 setup instructions

Introduction

These are the steps needed to convert an ACCESS-OM3 model configuration to a regional model configuration running on NUOPC.
If you would like to view an example setup, there is one here. To download it, run:

git clone --branch dev-regional_jra55do_ryf https://github.com/ACCESS-NRI/access-om3-configs/

If you want to see some of the steps described in the instructions below, run:

cd access-om3-configs 
git log -16

This will provide a log of the changes performed to convert the original ACCESS-OM3 run to an ACCESS-ROM3 run. The output displays a long alphanumeric commit hash for each step.

To see the changes made to the code between two different steps, use the first few digits of the commit hashes for each of those steps, e.g.

git diff ba62 a6b7

The following instructions are the steps needed to generate a regional ocean model (ACCESS-ROM3). Keep an eye on these instructions as they will be modified over the next few weeks for bug fixes and changes in the underlying code.

Compile

The first step is optional as there is a version of the executable already available to be used.
I suggest skipping this step for now and waiting a couple of weeks before trying as there are some changes coming through that will make it easier.

To skip this step, go to the next step: Generate your input files

Get a Spack build using mom-only symmetric on Gadi by first installing Spack (instructions here). Note that you might get an error during the spack concretize step; it is not an issue here, so ignore it.

Next, you need to build with Spack using a modification of the instructions here.

The modifications are when you reach this step:

git clone https://github.com/ACCESS-NRI/ACCESS-ESM1.5.git
spack env create mom5_dev ACCESS-ESM1.5/spack.yaml

you need to change to this:

git clone https://github.com/ACCESS-NRI/ACCESS-OM3.git
spack env create mom6_dev ACCESS-OM3/spack.yaml

If you wait a couple of weeks, these next changes will not be necessary.

Then, in spack.yaml you need to change

   access-om3-nuopc:
      require:
        - '@git.0.3.1'

to

   access-om3-nuopc:
      require:
        - '@git.0.3.1'
        - configurations=MOM6
        - +mom_symmetric 

Generate your input files

Use the regional-mom6 notebook from cosima-recipes to get a non-NUOPC regional MOM6 configuration.

You need to make some modifications: change the date_range to ["2013-01-01", "2013-01-05"] and change the dataset used.

Change

experiment = catalog["01deg_jra55v13_ryf9091"]

to

experiment = catalog["01deg_jra55v150_iaf_cycle1"]

This is good for getting something up and running – but when doing production runs, you should carefully consider which model run is best suited for boundary conditions.

There is an example of the notebook with the modifications here.
Note that this example uses the crocodile version of regional-mom6. It should work with the COSIMA version in a few weeks, after a pull request has gone through.

Remember which folder you put these files in, as you will need to change some files later to point to it. i.e. in the notebook we have:

input_dir = f"{scratch}/regional_mom6_configs/{expt_name}/"
run_dir = f"{home}/mom6_rundirs/{expt_name}/"

These folders are going to be called “input_dir” and “run_dir” throughout the rest of these instructions.

This generates some of the input files that we need, but the nuopc files need to come from a global configuration.

Download your other configuration files from an ACCESS-OM3 run

Download a global ACCESS-OM3 configuration onto Gadi. Do this in a different folder than the input_dir that you used in the notebook in the previous step:

mkdir -p ~/access-om3
cd ~/access-om3
module use /g/data/vk83/modules 
module load payu/1.1.6
payu clone -b expt -B dev-1deg_jra55do_iaf https://github.com/ACCESS-NRI/access-om3-configs/ access-rom3
cd access-rom3

We now need to merge our regional MOM6 simulation, which has our input files (input_dir), with the global configuration that runs on NUOPC.

Copy the MOM_input, MOM_override, MOM_layout and data_table files from run_dir to ~/access-om3/access-rom3/

Modify MOM_input to point to different locations

Quite a few changes need to be made to the MOM_input file.
The first changes point to different locations for the input. Note that the changes in this step will not be needed in future versions of the regional-mom6 toolbox. If you cannot find these lines to change in MOM_input, then hopefully you can skip this step (but still do the last atmospheric forcing change in the next section) and watch for errors about missing boundary files.

Change

TEMP_SALT_Z_INIT_FILE = "forcing/init_tracers.nc" ! default = "temp_salt_z.nc"

to

TEMP_SALT_Z_INIT_FILE = "init_tracers.nc" ! default = "temp_salt_z.nc"

Change

SURFACE_HEIGHT_IC_FILE = "forcing/init_eta.nc" !

to

SURFACE_HEIGHT_IC_FILE = "init_eta.nc" !

Change

VELOCITY_FILE = "forcing/init_vel.nc" !

to

VELOCITY_FILE = "init_vel.nc" !

Change

OBC_SEGMENT_001_DATA = "U=file:forcing/forcing_obc_segment_001.nc(u),V=file:forcing/forcing_obc_segment_001.nc(v),SSH=file:forcing/forcing_obc_segment_001.nc(eta),TEMP=file:forcing/forcing_obc_segment_001.nc(temp),SALT=file:forcing/forcing_obc_segment_001.nc(salt)" !
                                ! OBC segment docs
OBC_SEGMENT_002_DATA = "U=file:forcing/forcing_obc_segment_002.nc(u),V=file:forcing/forcing_obc_segment_002.nc(v),SSH=file:forcing/forcing_obc_segment_002.nc(eta),TEMP=file:forcing/forcing_obc_segment_002.nc(temp),SALT=file:forcing/forcing_obc_segment_002.nc(salt)" !
                                ! OBC segment docs
OBC_SEGMENT_003_DATA = "U=file:forcing/forcing_obc_segment_003.nc(u),V=file:forcing/forcing_obc_segment_003.nc(v),SSH=file:forcing/forcing_obc_segment_003.nc(eta),TEMP=file:forcing/forcing_obc_segment_003.nc(temp),SALT=file:forcing/forcing_obc_segment_003.nc(salt)" !
                                ! OBC segment docs
OBC_SEGMENT_004_DATA = "U=file:forcing/forcing_obc_segment_004.nc(u),V=file:forcing/forcing_obc_segment_004.nc(v),SSH=file:forcing/forcing_obc_segment_004.nc(eta),TEMP=file:forcing/forcing_obc_segment_004.nc(temp),SALT=file:forcing/forcing_obc_segment_004.nc(salt)" !
                                ! OBC segment docs

to

OBC_SEGMENT_001_DATA = "U=file:forcing_obc_segment_001.nc(u),V=file:forcing_obc_segment_001.nc(v),SSH=file:forcing_obc_segment_001.nc(eta),TEMP=file:forcing_obc_segment_001.nc(temp),SALT=file:forcing_obc_segment_001.nc(salt)" !
                                ! OBC segment docs
OBC_SEGMENT_002_DATA = "U=file:forcing_obc_segment_002.nc(u),V=file:forcing_obc_segment_002.nc(v),SSH=file:forcing_obc_segment_002.nc(eta),TEMP=file:forcing_obc_segment_002.nc(temp),SALT=file:forcing_obc_segment_002.nc(salt)" !
                                ! OBC segment docs
OBC_SEGMENT_003_DATA = "U=file:forcing_obc_segment_003.nc(u),V=file:forcing_obc_segment_003.nc(v),SSH=file:forcing_obc_segment_003.nc(eta),TEMP=file:forcing_obc_segment_003.nc(temp),SALT=file:forcing_obc_segment_003.nc(salt)" !
                                ! OBC segment docs
OBC_SEGMENT_004_DATA = "U=file:forcing_obc_segment_004.nc(u),V=file:forcing_obc_segment_004.nc(v),SSH=file:forcing_obc_segment_004.nc(eta),TEMP=file:forcing_obc_segment_004.nc(temp),SALT=file:forcing_obc_segment_004.nc(salt)" !
                                ! OBC segment docs
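
All of these path edits just drop the forcing/ prefix, so you can apply them in one pass with sed. This is only a sketch, demonstrated on a throwaway copy so it is safe to try; run the sed line against your real MOM_input (it keeps a .bak backup):

```shell
# Demo on a throwaway copy; run the sed line on your real MOM_input.
cd "$(mktemp -d)"
cat > MOM_input <<'EOF'
TEMP_SALT_Z_INIT_FILE = "forcing/init_tracers.nc" ! default = "temp_salt_z.nc"
OBC_SEGMENT_001_DATA = "U=file:forcing/forcing_obc_segment_001.nc(u)" !
EOF
# Strip the forcing/ prefix everywhere it appears, whether quoted or after file:
sed -i.bak -e 's|"forcing/|"|g' -e 's|file:forcing/|file:|g' MOM_input
cat MOM_input
```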

Changes to MOM_input to add NUOPC atmospheric forcing info

This change to MOM_input will still need to be done with future versions of the regional-mom6 toolbox. We need to add some lines for the atmospheric forcing coming in via the NUOPC coupler.
Add this to the bottom of MOM_input:

! === module MOM_surface_forcing_nuopc ===
ENTHALPY_FROM_COUPLER = True
                                ! "[Boolean] default = False
                                ! If True, the heat (enthalpy) associated with mass entering/leaving
                                ! the ocean is provided via coupler."

LATENT_HEAT_FUSION = 3.337E+05  !   [J/kg] default = 3.337E+05

LATENT_HEAT_VAPORIZATION = 2.501E+06 !   [J/kg] default = 2.501E+06

Changes to config.yaml to point to input netcdf

You also need to make some changes in the config.yaml file under "input:" to add these lines. Make sure you change "input_dir" and "run_dir" to the input and run directories created in the regional-mom6 notebook:

 - input_dir/hgrid.nc
 - input_dir/vcoord.nc
 - input_dir/bathymetry.nc
 - input_dir/init_tracers.nc
 - input_dir/init_eta.nc
 - input_dir/init_vel.nc
 - input_dir/forcing_obc_segment_001.nc
 - input_dir/forcing_obc_segment_002.nc
 - input_dir/forcing_obc_segment_003.nc
 - input_dir/forcing_obc_segment_004.nc
 - input_dir/grid_spec.nc
 - input_dir/ocean_mosaic.nc 
 - input_dir/access-rom3-ESMFmesh.nc
 - input_dir/access-rom3-nomask-ESMFmesh.nc
 - input_dir/land_mask.nc

Note that the *ESMFmesh.nc files do not exist yet; they are created in the "Creating mesh files" step below.

I would suggest double-checking that the paths above (except for the *ESMFmesh.nc files) are correct, as the regional-mom6 toolbox is under active development and the location of these files will change depending on which version you are using. In particular, some of the files may be under input_dir/forcing.
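
One way to do that double check is a quick existence test on every entry under input:. A sketch, assuming the plain "- path" list form shown above (demonstrated here on a throwaway config.yaml; run the sed/while pipeline against your real one):

```shell
cd "$(mktemp -d)"
touch hgrid.nc                 # pretend one input already exists
cat > config.yaml <<'EOF'
input:
 - hgrid.nc
 - vcoord.nc
EOF
# List every "- path" entry and report any that are missing on disk.
sed -n 's/^[[:space:]]*-[[:space:]]*//p' config.yaml | while read -r f; do
    [ -e "$f" ] || echo "missing: $f"
done
```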

You will need to remove these lines:

 - /g/data/vk83/configurations/inputs/access-om3/mom/grids/mosaic/global.1deg/2020.05.30/ocean_hgrid.nc
 - /g/data/vk83/configurations/inputs/access-om3/mom/grids/vertical/global.1deg/2023.07.28/ocean_vgrid.nc
 - /g/data/vk83/configurations/inputs/access-om3/mom/initial_conditions/global.1deg/2020.10.22/ocean_temp_salt.res.nc
 - /g/data/vk83/configurations/inputs/access-om3/share/meshes/global.1deg/2024.01.25/access-om2-1deg-ESMFmesh.nc
 - /g/data/vk83/configurations/inputs/access-om3/share/meshes/global.1deg/2024.01.25/access-om2-1deg-nomask-ESMFmesh.nc

If you are using an executable compiled by ACCESS-NRI, then you also need to change this:

load:
        - access-om3/2025.01.1

to this:

load:
        - access-om3/2025.01.0

Changes to config.yaml to point to executable

Note: if you compiled your own executable, you need to skip this step and go to "Alternative changes to config.yaml". These changes won't be needed in a few weeks' time.

Change

exe: access-om3-MOM6-CICE6

to

exe: access-om3-MOM6

Alternative changes to config.yaml

If you have compiled your own executable, then you need to point payu to the executable you created using Spack. You do not need to do this step if you did the previous one.

Change this

exe:   access-om3-MOM6-CICE6

to something that looks like this – but your path will vary

exe: /g/data/***/****/spack/0.22/release/linux-rocky8-x86_64/intel-2021.10.0/access-om3-nuopc-git.0.3.1_0.3.1-6ry27bdy6uoet34nwk6yunmkrkomvads/bin/access-om3-MOM6

Changes to config.yaml job specifications

You also need to modify these lines in config.yaml:

  1. ncpu: 240
    This is the number of CPUs you will need. It depends on the layout that you specified when calling FRE_tools. If you didn't change the default (10x10) in the regional-mom6 notebook, then you want
    ncpu: 100

  2. Change runlog: false to runlog: true

  3. mem: 960GB
    The number you need here is domain dependent. For the Tasmanian example I use
    mem: 100GB
    You may need more when using a larger domain.

  4. Comment out these lines for MOM-standalone runs:

    # userscripts:
    #     setup: /usr/bin/bash /g/data/vk83/apps/om3-scripts/payu_config/setup.sh
    #     archive: /usr/bin/bash /g/data/vk83/apps/om3-scripts/payu_config/archive.sh

These scripts set up CICE, so keep them in for simulations with ice.

Creating mesh files

You will need to use generate_mesh.py from om3-scripts.

To do this you need to download a copy of the toolbox. Note that in the following commands you need to change “path_to_where_you_store_libraries” to wherever you want to download this to.

cd path_to_where_you_store_libraries
git clone https://github.com/COSIMA/om3-scripts.git
conda activate analysis3

The following commands will create your mesh files. Note that you will need to change “input_dir” to your input_dir and “path_to_where_you_store_libraries” to where you just downloaded the om3-scripts to.

cd input_dir

python3 path_to_where_you_store_libraries/om3-scripts/mesh_generation/generate_mesh.py --grid-type=mom --grid-filename=hgrid.nc --mesh-filename=access-rom3-ESMFmesh.nc --mask-filename=ocean_mask.nc --wrap-lons
python3 path_to_where_you_store_libraries/om3-scripts/mesh_generation/generate_mesh.py --grid-type=mom --grid-filename=hgrid.nc --mesh-filename=access-rom3-nomask-ESMFmesh.nc --wrap-lons

Run these commands in input_dir so the files are stored with your other NetCDF files. If you store them somewhere else, note that you will need to change the paths to them in config.yaml.

You need to modify a few configuration files to point to these. Note that the path we are putting in is different from their current location, but payu will fix this up at run time.

Change back into your access-rom3 directory

cd ~/access-om3/access-rom3

In datm_in change these lines:

model_maskfile = "./INPUT/access-om2-1deg-nomask-ESMFmesh.nc"
model_meshfile = "./INPUT/access-om2-1deg-nomask-ESMFmesh.nc"
nx_global = 360
ny_global = 300

to

model_maskfile = "./INPUT/access-rom3-nomask-ESMFmesh.nc"
model_meshfile = "./INPUT/access-rom3-nomask-ESMFmesh.nc"
nx_global = 140
ny_global = 249

Note that nx_global and ny_global are the domain size. The numbers used here are from the Tasmanian example; if you have a different-sized domain, you can check these numbers in MOM_layout (NIGLOBAL corresponds to nx_global and NJGLOBAL to ny_global).
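
If you are unsure of your domain size, you can read it straight out of MOM_layout. A sketch (the sample values below are the Tasmanian ones; it assumes the plain "NIGLOBAL = n" line form):

```shell
cd "$(mktemp -d)"
cat > MOM_layout <<'EOF'
NIGLOBAL = 140
NJGLOBAL = 249
EOF
# NIGLOBAL -> nx_global, NJGLOBAL -> ny_global
awk '/^NIGLOBAL/ {print "nx_global =", $3}
     /^NJGLOBAL/ {print "ny_global =", $3}' MOM_layout
```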

Similarly, you need to modify drof_in to point to the new mesh files.

Change

model_maskfile = "./INPUT/access-om2-1deg-nomask-ESMFmesh.nc"
model_meshfile = "./INPUT/access-om2-1deg-nomask-ESMFmesh.nc"
nx_global = 360
ny_global = 300

to

model_maskfile = "./INPUT/access-rom3-nomask-ESMFmesh.nc"
model_meshfile = "./INPUT/access-rom3-nomask-ESMFmesh.nc"
nx_global = 140
ny_global = 249
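
Since datm_in and drof_in need identical edits, one loop can cover both. A sketch on throwaway copies (run the second loop against the real files in your access-rom3 directory; the domain sizes here are the Tasmanian ones, so substitute your own):

```shell
cd "$(mktemp -d)"
# Create throwaway copies with the original global settings.
for f in datm_in drof_in; do
    cat > "$f" <<'EOF'
model_maskfile = "./INPUT/access-om2-1deg-nomask-ESMFmesh.nc"
model_meshfile = "./INPUT/access-om2-1deg-nomask-ESMFmesh.nc"
nx_global = 360
ny_global = 300
EOF
done
# Point both components at the regional mesh and the regional domain size.
for f in datm_in drof_in; do
    sed -i.bak \
        -e 's/access-om2-1deg-nomask-ESMFmesh.nc/access-rom3-nomask-ESMFmesh.nc/' \
        -e 's/nx_global = 360/nx_global = 140/' \
        -e 's/ny_global = 300/ny_global = 249/' \
        "$f"
done
grep -H 'global' datm_in drof_in
```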

Changes to input.nml

In input.nml you need to include an extra file.

Change

parameter_filename = 'MOM_input', 'MOM_override'

to

parameter_filename = 'MOM_input', 'MOM_layout', 'MOM_override'

Changes to nuopc.runconfig for new layout

We need to change the distribution of tasks (assuming the default 10x10 layout and 140x249 grid size):

Change

ocn_ntasks = 216

to

ocn_ntasks = 100

Change

ocn_rootpe = 24

to

ocn_rootpe = 0

Change

ocn_nx = 360
ocn_ny = 300

to

ocn_nx = 140
ocn_ny = 249

If you are not using a 140x249 grid size (the size of the Tasmanian example), you can find ocn_nx and ocn_ny in MOM_layout (NIGLOBAL and NJGLOBAL respectively).
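
The three layout edits above can be sketched as a single sed pass (demonstrated on a throwaway copy; the values assume the default 10x10 layout and 140x249 grid, and real files may have different spacing, so check the result):

```shell
cd "$(mktemp -d)"
cat > nuopc.runconfig <<'EOF'
ocn_ntasks = 216
ocn_rootpe = 24
ocn_nx = 360
ocn_ny = 300
EOF
# 10x10 layout -> 100 ocean tasks starting at PE 0, regional grid size.
sed -i.bak \
    -e 's/ocn_ntasks = 216/ocn_ntasks = 100/' \
    -e 's/ocn_rootpe = 24/ocn_rootpe = 0/' \
    -e 's/ocn_nx = 360/ocn_nx = 140/' \
    -e 's/ocn_ny = 300/ocn_ny = 249/' \
    nuopc.runconfig
cat nuopc.runconfig
```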

We need to set the start date to match the date used in the notebook, and change the length of the simulation to just a couple of days.

Change

start_ymd = 19000101
stop_n = 1
stop_option = nmonths

to

start_ymd = 20130101
stop_n = 2
stop_option = ndays

As we are running for 2 days we need to change the frequency of the restart files:

Change

restart_n = 1
restart_option = nmonths

to

restart_n = 2
restart_option = ndays
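
The date and restart edits can likewise be sketched as one sed pass (throwaway copy again; real files may have different spacing, so check the result by eye):

```shell
cd "$(mktemp -d)"
cat > nuopc.runconfig <<'EOF'
start_ymd = 19000101
stop_n = 1
stop_option = nmonths
restart_n = 1
restart_option = nmonths
EOF
# 2-day test run starting 2013-01-01, with a restart written at the end.
sed -i.bak \
    -e 's/start_ymd = 19000101/start_ymd = 20130101/' \
    -e 's/stop_n = 1/stop_n = 2/' \
    -e 's/stop_option = nmonths/stop_option = ndays/' \
    -e 's/restart_n = 1/restart_n = 2/' \
    -e 's/restart_option = nmonths/restart_option = ndays/' \
    nuopc.runconfig
cat nuopc.runconfig
```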

We need to update the meshes in here as well.
Change

mesh_mask = ./INPUT/access-om2-1deg-ESMFmesh.nc
mesh_ocn = ./INPUT/access-om2-1deg-ESMFmesh.nc

to

mesh_mask = ./INPUT/access-rom3-ESMFmesh.nc
mesh_ocn = ./INPUT/access-rom3-ESMFmesh.nc

For MOM-standalone runs, remove the ice component.

Change

component_list: MED ATM ICE OCN ROF

to

component_list: MED ATM OCN ROF

Change

ICE_model = cice

to

ICE_model = sice

Changes to nuopc.runseq to remove CICE

For MOM-only runs, remove these lines from nuopc.runseq so the ice model is not run (essentially, remove any line that has "ice" in it):

MED med_phases_prep_ice
MED -> ICE :remapMethod=redist
ICE
MED med_phases_diag_ice_ice2med
ICE -> MED :remapMethod=redist
MED med_phases_post_ice
MED med_phases_diag_ice_med2ice
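
Since the rule is essentially "remove any line mentioning ice", this can be sketched with grep (demonstrated on a shortened runseq; review the result by eye, as a case-insensitive match could also hit unrelated words containing "ice"):

```shell
cd "$(mktemp -d)"
cat > nuopc.runseq <<'EOF'
MED med_phases_prep_ocn_avg
MED med_phases_prep_ice
MED -> ICE :remapMethod=redist
ICE
MED -> OCN :remapMethod=redist
OCN
ICE -> MED :remapMethod=redist
MED med_phases_post_ice
EOF
cp nuopc.runseq nuopc.runseq.bak
# Drop every line that mentions ice (case-insensitive), keeping a backup.
grep -iv 'ice' nuopc.runseq.bak > nuopc.runseq
cat nuopc.runseq
```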

Changes to diag_table to output more frequently

The default option was to output monthly. As the test run does not run for a month, we are swapping to daily output.
In diag_table change

"access-om3.mom6.h.rho2%4yr-%2mo",                      1,  "months", 1, "days",   "time", 1, "months"
"access-om3.mom6.h.native%4yr-%2mo",                    1,  "months", 1, "days",   "time", 1, "months"
"access-om3.mom6.h.z%4yr-%2mo",                         1,  "months", 1, "days",   "time", 1, "months"
"access-om3.mom6.h.sfc%4yr-%2mo",                        1,  "months",   1, "days",   "time", 1, "months"

To

"access-om3.mom6.h.rho2%4yr-%2mo",                       1,  "days", 1, "days",   "time", 1, "months"
"access-om3.mom6.h.native%4yr-%2mo",                     1,  "days", 1, "days",   "time", 1, "months"
"access-om3.mom6.h.z%4yr-%2mo",                          1,  "days", 1, "days",   "time", 1, "months"
"access-om3.mom6.h.sfc%4yr-%2mo",                        1,  "days",   1, "days",   "time", 1, "months"

Run the model

To run you need to use payu:

module use /g/data/vk83/modules
module load payu/1.1.6
payu setup
payu sweep
payu run

Note: there is still a fix to come for a known issue with land runoff, so don't do any production runs just yet.


@helen Thanks for putting this together. We are excited to get rolling with the regional config. I am already failing at your first steps:

i) git clone --branch dev-regional_jra55do_ryf https://github.com/ACCESS-NRI/access-om3-configs/ … this works
ii) cd dev-regional_jra55do_ryf … this fails. No dev-* directory. But git log -16 works on the access-om3* dir
iii) RE: “The first step is optional as there is a version of the executable already available to be used. I suggest skipping this step for now and waiting a couple of weeks before trying as there are some changes coming through that will make it easier.” It is unclear which steps we should skip. For example, do we still need to edit the spack.yaml file?

Thanks for getting feedback to me so quickly @PSpence!

i) To confirm – does this step work for you?
ii) Good catch! The instructions were incorrect, and you changed into the correct directory. I have updated them.
iii) Skip everything after the heading "Compile" and before the heading "Generate your input files". I have updated this to hopefully make it clearer by adding:
“To skip this step, go to the next step: Generate your input files”


For the next step (generate your input files), the regional-mom6 code is in flux as a big pull request is coming through. This adds a temporary complication, as the example notebooks for generating your own domain may break. This issue will hopefully go away in a couple of weeks when the pull request is completed.

For now, I think the easiest way to complete this step is to use the COSIMA regional-mom6 notebook with the regional-mom6 code in the analysis3 conda environment, as these won't change until the pull request goes through.

Alternatively, if @mmr0 has a notebook working then you could modify this.

My example notebook can be used if you want to view the changes I did to the dates and the catalogue data, but I am expecting this notebook to break more frequently.

Can I comment that the name ROM3 is very very similar to ROMS.

May I suggest we name the package/module ACCESS regional-OM3?

cc @PSpence, @mmr0

Thanks for the detailed instructions!!!

(tagging also @Paul.Gregory in case they missed this)


Is ACCESS-ROM3 still very very similar to ROMS? I think they’re not so much if the full name is used.

Hi @navidcy – thanks for bringing up this discussion. The current name was chosen for its shortness and consistency with branding of other ACCESS-models.

We had a quick discussion at ACCESS-NRI, which I would summarise as follows.

Pros of changing:

  1. ROM3 is similar to ROMS, which can be confusing, especially as the ACCESS-NRI regional and coastal ocean modelling team (i.e. me) is also working with ROMS users.

Cons of changing:

  1. Inconsistent branding.
  2. Adding "regional" makes it very long: "ACCESS-regional-OM3".
  3. "Ocean" and "Modelling" form the basis of a lot of acronyms, so similar acronyms are already present in the ocean modelling community without being a big issue.
  4. The full name is ACCESS-ROM3, so if we stick with this, the confusion will be less.

Alternative suggestion

  1. We change the big R to a small r – "ACCESS-rOM3" – so it is more easily identifiable as part of the ACCESS-OM3 suite.

No decision has been made yet, so I'm keen to hear ideas.

@ashjbarnes has done a lot of development work here so I would like to also hear if he has input into the naming convention.

It's OK, it was just a suggestion. You've put thought into it; let's go with ACCESS-ROM3.


Well the other reason is that we have so many acronyms that I feel we don’t need more, so regional-OM3 was my attempt to reduce the number of acronyms in the community.

People like ROM3 then ROM3 it is!


Hi Helen, this looks great! Thanks for putting it all together in one place. I have a couple of questions & comments:

  1. Regarding changing config.yaml, is there a reason that in this case we need to specify every NetCDF input explicitly? Normally just adding a single absolute path to the location containing all of the forcing files (e.g. /g/data/nm03/mom6_inputs/ashley_example/) works. Perhaps this is an update to payu I'm not aware of, though, in which case we ought to update the rmom6 package to reflect this.

  2. Is a mask table no longer a thing with NUOPC? So if I have 10x10 layout but my domain is 70% land, do I still use 100 cpus? This is what’s suggested under the “Changes to config.yaml” step. I have no idea about this, but flagging it in case mask table does still need handling here

  3. As you’ve mentioned already, these steps will be made much simpler with the new version of regional mom6! If the Australian users are more likely to switch to using NUOPC rather than the FMS coupler in the future, I’d advocate for putting the input.nml and config.yaml files from this ROM3 directory into a set of default input files included with the package. This way the setup_rundir step will automatically populate everything you need for this NUOPC setup. If we want to keep maintaining the old FMS coupler setup, then maybe we just have a boolean using_NUOPC flag which specifies which set of input & config files the package grabs.

Regarding the naming: I do agree with @navidcy that this is very similar to ROMS! Also, the current naming convention is usually ACCESS followed by the model, so ROM3 reads to me as a different model rather than a variation on an existing one.

My suggestion: adding a -regional or -R at the end of the current naming system could work too. For example, we already have ACCESS-OM2-01 for the 10th degree. Is this convention continuing for OM3 too? Since regional models will probably have variable resolutions, maybe we could replace the third entry with a -R, so the suite would eventually be something like

ACCESS-OM3-01
ACCESS-OM3-025
ACCESS-OM3-R

I don’t mind though, it’s very much an ACCESS-NRI decision as to their naming conventions of course!


Hi Helen, just to let you know that I got this running following your instructions :raised_hands:

Regarding Ashley's (1) above, I actually couldn't run the code without also adding my top-level input directory
- /scratch/jk72/mxr581/regional_mom6_configs/ROM3-tassie-test/
under input: in config.yaml

I think this is because most of the input files are given as ./INPUT/file_name.nc

I haven’t tested what happens if I delete all the specific file paths and just keep my top level input directory.


I am moderately confident both approaches are basically equivalent. Most ACCESS-NRI configs list the files individually so that each file can be versioned and updated individually (following the folder structure in /g/data/vk83/configurations/). Obviously, if the folder has lots of other stuff in it, this can lead to messy manifest files and possibly a lack of clarity about which files are actually used.


You should be able to confirm which files are actually put into the work directory for a run using manifests/input.yaml.


I’d assume this is correct for this config. I think we are planning to use the AUTO_MASKTABLE option in MOM6, but haven’t investigated it in detail yet.

Thanks @anton. Can’t pretend I currently understand the contents of manifests/input.yaml but I will attempt to!

The error that I was getting (which was fixed by adding the top level directory to config.yaml) was

FATAL from PE     0: init_extern_file: file INPUT/forcing/forcing_obc_segment_001.nc could not be opened.

I’ll recreate it so that I can compare manifests/input.yaml

Thanks @ashjbarnes @mmr0 and @anton – a few good suggestions!
I am about to head off for a few days so may not be very responsive. Please feel free to edit the wiki if you agree that an improvement is needed, and I will try to address some of the other suggestions when I return.


@ashjbarnes very good idea! We are still in discussions about how best to manage this - but we do need a simpler way than these rather long step-by-step instructions. The two suggestions we were considering were

  1. via saving the default input files as you suggested.
  2. making an official release like access-rom3/access-om3-r, so the relevant files can be downloaded via payu clone.

Yep.

As @anton said, we moved to listing all the files individually so it was clear what was being used and what wasn't, and so that a changed file path shows up as a change in config.yaml, not just in a manifest file.

There are exceptions. The full JRA-55-do forcing dataset is just referenced by the directory in which it resides because it isn’t practical to list all the files individually.

I like it. Nice suggestion @ashjbarnes.

4 Likes