It makes interesting reading. Initially I thought it was a super-impressive document/strategy, but the more I read, the more it seemed to err on the side of being too bureaucratic. Perhaps that is just how it reads, since they say they're trying to document how things currently work and then build on that.
Things that caught my eye:
Code needs to be well-maintained and readable - following coding rules that conform with good practice, modularity, verification (unit testing, robustness, state-of-the-art software development, continuous integration)
- Within 2 years: To have the system team fully proficient with all aspects of a GitLab-based development environment. Including: the use of GitLab runners for automated tasks and continuous integration and regular updating of Wiki and web-based support material
- Within 5 years: To have Unit-testing capabilities in most code areas. Frequent, automated testing of the code base. Full support for exascale and heterogeneous computing environments (e.g. GPU co-processors) via Domain Specific Language pre-processing tools.
- Within 10 years: To have code testing carried out by AI-enabled agents
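The unit-testing and continuous-integration goals above are the same kind of practice we'd pursue here. As a minimal sketch of what "unit-testing capabilities in most code areas" looks like in practice (the function and names are hypothetical illustrations, not from the strategy document), a small numerical helper plus a `unittest` suite that a CI runner would execute on every merge:

```python
import unittest


def relative_error(approx, exact):
    """Relative error |approx - exact| / |exact| -- a typical target for
    unit tests in scientific code (hypothetical example)."""
    if exact == 0.0:
        raise ValueError("exact value must be non-zero")
    return abs(approx - exact) / abs(exact)


class TestRelativeError(unittest.TestCase):
    def test_exact_match(self):
        # Identical values should give zero error
        self.assertEqual(relative_error(2.0, 2.0), 0.0)

    def test_ten_percent(self):
        # 1.1 vs 1.0 is a 10% relative error
        self.assertAlmostEqual(relative_error(1.1, 1.0), 0.1)

    def test_zero_exact_raises(self):
        # Division by a zero reference value is rejected
        with self.assertRaises(ValueError):
            relative_error(1.0, 0.0)


if __name__ == "__main__":
    unittest.main(argv=["prog"], exit=False)
```

In a GitLab-based setup like the one they describe, a runner would invoke this suite automatically on each merge request, which is exactly the "frequent, automated testing of the code base" goal.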
So they’re at the beginning of this process the same as ACCESS-NRI.
Our initial idea for an implementation plan was centred on a Gantt chart for each WG summarising the key milestones in the implementation of their main aims and objectives similar to that in the Land-ice chapter. Uncertainties about the resources and funding that will be available, however, make it difficult to establish reliable/meaningful Gantt charts of this sort at this stage.
Instead we have started to construct tables of the tasks that each WG needs to undertake and the resources that these tasks will require. The content and format of these tables was agreed with the WGLs and the members of the NSC, and draft tables have been constructed.
As we’ve discussed at ACCESS-NRI, planning in an ever-changing environment like climate science, where there are always new developments and discoveries, is a real challenge. It’s a cliché, but being agile is important.
I was struck by the very different approach of MOM6: a federation of development sites that guarantee their respective test suites pass CI when merging code, but are otherwise free to develop as each node sees fit.
I can see the benefits of both approaches. While Alistair’s talk at the CLEX workshop was incredibly interesting and exciting, it seemed challenging to base a released product on something so dynamic.