Initial experiments with spack

I am using this topic to document the steps taken while investigating spack as a build-from-source package manager for gadi@NCI.

The initial steps are documented in this GitHub issue, but copied here for completeness:

Results of some preliminary testing of spack on NCI

Quick setup of spack:

git clone https://github.com/spack/spack
cd spack/
. share/spack/setup-env.sh 
spack compiler find

Picked up the following by default:

==> Available compilers
-- clang rocky8-x86_64 ------------------------------------------
clang@13.0.1

-- dpcpp rocky8-x86_64 ------------------------------------------
dpcpp@2022.1.0

-- gcc rocky8-x86_64 --------------------------------------------
gcc@8.5.0

-- intel centos8-x86_64 -----------------------------------------
intel@2021.2.0  intel@19.1.3.304  intel@19.1.2.254

-- intel rocky8-x86_64 ------------------------------------------
intel@2021.6.0

-- oneapi centos8-x86_64 ----------------------------------------
oneapi@2021.1

-- oneapi rocky8-x86_64 -----------------------------------------
oneapi@2022.1.0

Other compiler versions can be loaded and picked up as well:

module load intel-compiler-llvm/2022.1.0
spack compiler find

which added the compiler to the list:

-- oneapi rocky8-x86_64 -----------------------------------------
oneapi@2022.1.0

Get spack to find external tools:

spack external find

That picks up a bunch of system tools whose versions we're probably pretty agnostic about. Mostly we just need builds to be reproducible, so the less we have to install as dependencies the better.

$ grep spec ~/.spack/packages.yaml 
    - spec: cmake@3.20.2
    - spec: bison@3.0.4
    - spec: cvs@1.11.23
    - spec: groff@1.22.3
    - spec: texinfo@6.5
    - spec: tar@1.30
    - spec: openssh@8.0p1
    - spec: binutils@2.30.113
    - spec: subversion@1.10.2
    - spec: flex@2.6.1+lex
    - spec: automake@1.16.1
    - spec: ninja@1.8.2
    - spec: pkgconf@1.4.2
    - spec: gawk@4.2.1
    - spec: gmake@4.2.1
    - spec: openssl@1.1.1k
    - spec: m4@1.4.18
    - spec: autoconf@2.69
    - spec: libtool@2.4.6
    - spec: findutils@4.6.0
    - spec: diffutils@3.6
    - spec: git@2.31.1~tcltk

Modules can be loaded and spack asked to find them:

module load openmpi/4.1.4
spack external find openmpi
module switch openmpi/4.1.2
spack external find openmpi
spack external find perl
module load python3/3.10.4
spack external find python

Picked up the following:

    - spec: git@2.31.1~tcltk
    - spec: python@3.10.4+bz2+ctypes+dbm+ensurepip+lzma+nis+pyexpat~pythoncmd+readline+sqlite3+ssl+tix+tkinter+uuid+zlib
    - spec: python@2.7.18+bz2+ctypes+dbm+ensurepip+nis+pyexpat~pythoncmd+readline+sqlite3+ssl~tix~tkinter+uuid+zlib
    - spec: python@3.6.8+bz2+ctypes+dbm+ensurepip+lzma+nis+pyexpat~pythoncmd+readline+sqlite3+ssl~tix~tkinter+uuid+zlib
    - spec: perl@5.26.3~cpanm+shared+threads
    - spec: openmpi@4.1.2%gcc@8.5.0+cuda+cxx~cxx_exceptions~java+lustre~memchecker+pmi~static~wrapper-rpath
    - spec: openmpi@4.1.4%gcc@8.5.0+cuda+cxx~cxx_exceptions~java+lustre~memchecker+pmi~static~wrapper-rpath

Added an un-buildable virtual mpi package in ~/.spack/packages.yaml:

packages:
  # Add a virtual mpi package and set it to not be buildable, to force using available external MPI
  mpi:
    buildable: false

which forces any MPI dependency to use one of the installed MPI libraries.
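
Combined with the external entries that spack external find openmpi wrote earlier, the relevant part of packages.yaml ends up looking roughly like the sketch below (treat the exact prefix and module layout as illustrative):

packages:
  mpi:
    buildable: false
  openmpi:
    externals:
    - spec: openmpi@4.1.4%gcc@8.5.0
      prefix: /apps/openmpi/4.1.4
      modules:
      - openmpi/4.1.4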

Check what spack will install using the spec sub-command:

$ spack spec json-fortran %oneapi@2022.1.0                                                     
Input spec                                                                                                                 
--------------------------------                                                                                           
json-fortran%oneapi@2022.1.0                                                                 
                                                                                                                           
Concretized                                                                                                                
--------------------------------                                                                                           
json-fortran@8.3.0%oneapi@2022.1.0~ipo build_type=RelWithDebInfo arch=linux-rocky8-cascadelake                             
    ^cmake@3.20.2%oneapi@2022.1.0~doc+ncurses+ownlibs~qt build_type=Release arch=linux-rocky8-cascadelake           

Install the json-fortran dependency used in libaccessom2:

$ spack install json-fortran %oneapi@2022.1.0                                                  
[+] /usr (external cmake-3.20.2-hw3xgexqsgf6bjklxwit5fxhjoz6wdst)                                                          
==> Installing json-fortran-8.3.0-zyzrvosuv4732vqp3z3grchy3rqyg5og                
==> No binary for json-fortran-8.3.0-zyzrvosuv4732vqp3z3grchy3rqyg5og found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/5f/5fe9ad709a726416cec986886503e0526419742e288c4e43f63c1c22026d1e8a.tar.gz
==> No patches needed for json-fortran                                                                                     
==> json-fortran: Executing phase: 'cmake'                                         
==> json-fortran: Executing phase: 'build'                                                                                 
==> json-fortran: Executing phase: 'install'                                                                               
==> json-fortran: Successfully installed json-fortran-8.3.0-zyzrvosuv4732vqp3z3grchy3rqyg5og                               
  Fetch: 3.34s.  Build: 17.56s.  Total: 20.90s.                                                                            
[+] /scratch/x77/aph502/spack/opt/spack/linux-rocky8-cascadelake/oneapi-2022.1.0/json-fortran-8.3.0-zyzrvosuv4732vqp3z3grchy3rqyg5og

$ spack find
==> 2 installed packages
-- linux-rocky8-cascadelake / oneapi@2022.1.0 -------------------
cmake@3.20.2  json-fortran@8.3.0

Look for an existing datetime-fortran package

$ spack list datetime
==> 3 packages.
py-jdatetime  py-parsedatetime  r-assertive-datetimes

Not available, so make one

$ git checkout -b datetime-fortran
Switched to a new branch 'datetime-fortran'
$ spack create https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.7.0/datetime-fortran-1.7.0.tar.gz
==> This looks like a URL for datetime-fortran
==> Found 8 versions of datetime-fortran:

  1.7.0  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.7.0/datetime-fortran-1.7.0.tar.gz
  1.6.2  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.6.2/datetime-fortran-1.6.2.tar.gz
  1.6.1  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.6.1/datetime-fortran-1.6.1.tar.gz
  1.6.0  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.6.0/datetime-fortran-1.6.0.tar.gz
  1.5.0  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.5.0/datetime-fortran-1.5.0.tar.gz
  1.4.3  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.4.3/datetime-fortran-1.4.3.tar.gz
  1.4.2  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.4.2/datetime-fortran-1.4.2.tar.gz
  1.4.1  https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.4.1/datetime-fortran-1.4.1.tar.gz

==> How many would you like to checksum? (default is 1, q to abort) 8
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.7.0/datetime-fortran-1.7.0.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.6.2/datetime-fortran-1.6.2.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.6.1/datetime-fortran-1.6.1.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.6.0/datetime-fortran-1.6.0.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.5.0/datetime-fortran-1.5.0.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.4.3/datetime-fortran-1.4.3.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.4.2/datetime-fortran-1.4.2.tar.gz
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.4.1/datetime-fortran-1.4.1.tar.gz
==> This package looks like it uses the autotools build system
==> Created template for datetime-fortran package
==> Created package file: /scratch/x77/aph502/spack/var/spack/repos/builtin/packages/datetime-fortran/package.py

Add metadata to the auto-generated package:

$ spack edit datetime-fortran

Test install

$ spack install datetime-fortran %oneapi@2022.1.0
==> Installing datetime-fortran-1.7.0-smbai7glmpfkg3poxd6jdpm5hl4zjjeg
==> No binary for datetime-fortran-1.7.0-smbai7glmpfkg3poxd6jdpm5hl4zjjeg found: installing from source
==> Fetching https://github.com/wavebitscientific/datetime-fortran/releases/download/v1.7.0/datetime-fortran-1.7.0.tar.gz
==> No patches needed for datetime-fortran
==> datetime-fortran: Executing phase: 'autoreconf'
==> datetime-fortran: Executing phase: 'configure'
==> datetime-fortran: Executing phase: 'build'
==> datetime-fortran: Executing phase: 'install'
==> datetime-fortran: Successfully installed datetime-fortran-1.7.0-smbai7glmpfkg3poxd6jdpm5hl4zjjeg
  Fetch: 3.89s.  Build: 8.82s.  Total: 12.71s.
[+] /scratch/x77/aph502/spack/opt/spack/linux-rocky8-cascadelake/oneapi-2022.1.0/datetime-fortran-1.7.0-smbai7glmpfkg3poxd6jdpm5hl4zjjeg

$ spack find
==> 3 installed packages
-- linux-rocky8-cascadelake / oneapi@2022.1.0 -------------------
cmake@3.20.2  datetime-fortran@1.7.0  json-fortran@8.3.0

I have pushed the package description to a datetime-fortran branch on the ACCESS-NRI fork of spack.

The next test step was to build the OASIS3-MCT coupler used in ACCESS-OM2. This is a modified version of the OASIS code, but the same base code has been used in a number of projects.

There were no well-versioned tarballs to use with spack, so I pushed a 2.5 tag to the repo as a reasonable estimate of the version.

Experimented with creating an ACCESS-specific OASIS3-MCT build:

spack create -n oasis3-mct-access https://github.com/COSIMA/oasis3-mct/archive/refs/tags/2.5.tar.gz

spack correctly detected this as a Makefile-based build.

To edit the spack build configuration

spack edit oasis3-mct-access

Stopped at this point, as the OASIS build is a bit non-standard: the Makefile defaults to making an NCI-specific build.

The Makefile itself is here, and it sources this NCI-specific file.

This isn’t a currently supported configuration for spack, which only really supports editing (filtering) static files. It should work to filter the make.nci file, but I think it would probably be better to standardise the build infra. @harshula is now working on the OASIS3-MCT build infrastructure with the aim of making it consistent with other builds.

@dale.roberts suggested trying to build ESMF as a useful test of compiling with the OpenMPI libraries provided by NCI.

esmf is available as a package in spack

$ spack info esmf
MakefilePackage:   esmf

Description:
    The Earth System Modeling Framework (ESMF) is high-performance, flexible
    software infrastructure for building and coupling weather, climate, and
    related Earth science applications. The ESMF defines an architecture for
    composing complex, coupled modeling systems and includes data structures
    and utilities for developing individual models.

Homepage: https://www.earthsystemcog.org/projects/esmf/

Preferred version:  
    8.2.0     https://github.com/esmf-org/esmf/archive/ESMF_8_2_0.tar.gz

Safe versions:  
    8.2.0     https://github.com/esmf-org/esmf/archive/ESMF_8_2_0.tar.gz
    8.1.1     https://github.com/esmf-org/esmf/archive/ESMF_8_1_1.tar.gz
    8.0.1     https://github.com/esmf-org/esmf/archive/ESMF_8_0_1.tar.gz
    8.0.0     https://github.com/esmf-org/esmf/archive/ESMF_8_0_0.tar.gz
    7.1.0r    http://www.earthsystemmodeling.org/esmf_releases/public/ESMF_7_1_0r/esmf_7_1_0r_src.tar.gz

Deprecated versions:  
    None

Variants:
    Name [Default]           When    Allowed values    Description
    =====================    ====    ==============    ========================================

    debug [off]              --      on, off           Make a debuggable version of the library
    external-lapack [off]    --      on, off           Build with external LAPACK support
    mpi [on]                 --      on, off           Build with MPI support
    netcdf [on]              --      on, off           Build with NetCDF support
    pio [on]                 --      on, off           Enable ParallelIO support
    pnetcdf [on]             --      on, off           Build with pNetCDF support
    xerces [on]              --      on, off           Build with Xerces support

Build Dependencies:
    lapack  libxml2  mpi  netcdf-c  netcdf-fortran  parallel-netcdf  xerces-c  zlib

Link Dependencies:
    lapack  libxml2  mpi  netcdf-c  netcdf-fortran  parallel-netcdf  xerces-c  zlib

Run Dependencies:
    None

It can be installed with the gcc compiler:

$ spack install esmf %gcc
[+] /usr (external libxml2-2.9.7-3tg77dcqcq4njzzg6utcb2yyiggejkft)
[+] /usr (external cmake-3.20.2-epebyw4npsytp4fg7pmans2phn6kvpi6)
==> openmpi@4.1.4 : has external module in ['openmpi/4.1.4']
[+] /apps/openmpi/4.1.4 (external openmpi-4.1.4-ltxdephdcg3j5dajcykaoxewqzme2rza)
[+] /usr (external pkgconf-1.4.2-hdnpub6laejsmze2kea7fzmclzbz2qto)
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/zlib-1.2.12-kglbdsrq45ndx7rzxp2nhc2bx3u76n7t                                                  
[+] /usr (external m4-1.4.18-brzkr65auawocpnevfcevfhfxqgazosy)
[+] /usr (external perl-5.26.3-3ja5gkcnhrraexes2zicyudv3cy5jn2e)
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz                                                
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/hdf5-1.12.2-7zoo7ll6xwncn5yhbgzzz3ejx7d5odje                                                  
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/parallel-netcdf-1.12.2-xwtvsqjzvvx6tzb4xiwb6qloqzivhb6q                                       
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz                                               
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/netcdf-c-4.8.1-32qdhov44lvit6oq6sbutuu5episa4uj                                               
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/netcdf-fortran-4.5.4-kritoco5s7zxjc4outk2yp42kkplg2n6                                         
==> Installing esmf-8.2.0-n2k5tal7m5faybbah4m3n6jxv756vtx3
==> No binary for esmf-8.2.0-n2k5tal7m5faybbah4m3n6jxv756vtx3 found: installing from source                                                                          
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/36/3693987aba2c8ae8af67a0e222bea4099a48afe09b8d3d334106f9d7fc311485.tar.gz
==> No patches needed for esmf
==> esmf: Executing phase: 'edit'
==> esmf: Executing phase: 'build'
==> esmf: Executing phase: 'install'
==> esmf: Successfully installed esmf-8.2.0-n2k5tal7m5faybbah4m3n6jxv756vtx3
  Fetch: 0.16s.  Build: 19m 8.31s.  Total: 19m 8.47s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/esmf-8.2.0-n2k5tal7m5faybbah4m3n6jxv756vtx3

Successfully used the following system tools/libraries:

  • libxml2-2.9.7
  • cmake-3.20.2
  • openmpi-4.1.4
  • pkgconf-1.4.2
  • m4-1.4.18
  • perl-5.26.3

Built the following components:

  • libiconv-1.16
  • hdf5-1.12.2
  • parallel-netcdf-1.12.2
  • xerces-c-3.2.3
  • netcdf-c-4.8.1
  • netcdf-fortran-4.5.4

You can get a nice description of a package and its dependencies using spack find options, which can help distinguish what was built from source and what was used from the system:
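
(The exact options used for the listing below aren't shown; the same style of output comes from the form used later in this thread, e.g.)

$ spack find -dLfp esmf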

-- linux-rocky8-x86_64 / gcc@8.5.0 ------------------------------
n2k5tal esmf@8.2.0%gcc                   /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/esmf-8.2.0-n2k5tal7m5faybbah4m3n6jxv756vtx3
3tg77dc     libxml2@2.9.7%gcc            /usr
32qdhov     netcdf-c@4.8.1%gcc           /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/netcdf-c-4.8.1-32qdhov44lvit6oq6sbutuu5episa4uj
7zoo7ll         hdf5@1.12.2%gcc          /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/hdf5-1.12.2-7zoo7ll6xwncn5yhbgzzz3ejx7d5odje
epebyw4             cmake@3.20.2%gcc     /usr
ltxdeph             openmpi@4.1.4%gcc    /apps/openmpi/4.1.4
hdnpub6             pkgconf@1.4.2%gcc    /usr
kglbdsr             zlib@1.2.12%gcc      /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/zlib-1.2.12-kglbdsrq45ndx7rzxp2nhc2bx3u76n7t
brzkr65         m4@1.4.18%gcc            /usr
kritoco     netcdf-fortran@4.5.4%gcc     /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/netcdf-fortran-4.5.4-kritoco5s7zxjc4outk2yp42kkplg2n6
xwtvsqj     parallel-netcdf@1.12.2%gcc   /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/parallel-netcdf-1.12.2-xwtvsqjzvvx6tzb4xiwb6qloqzivhb6q
3ja5gkc         perl@5.26.3%gcc          /usr
7airba3     xerces-c@3.2.3%gcc           /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz
pl3qqm2         libiconv@1.16%gcc        /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz

However, if I try installing esmf with the intel compilers, it errors:

$ spack install esmf %intel                                                                                                              
[+] /usr (external libxml2-2.9.7-pbyfinpzbflkcnm5nfviagpcisb5le7e)   
[+] /usr (external cmake-3.20.2-pndskxfmljehku2ob5dc7fdveoyasl3l)
==> openmpi@4.1.4 : has external module in ['openmpi/4.1.4']
[+] /apps/openmpi/4.1.4 (external openmpi-4.1.4-ltxdephdcg3j5dajcykaoxewqzme2rza)
[+] /usr (external pkgconf-1.4.2-nwouzfkqxs3oq2qp3t6gehj5fnaygdlo)                                                                                                   
==> Installing zlib-1.2.12-fxbzqxk5ia4czmk4gtzyja64nu2h5xw5
==> No binary for zlib-1.2.12-fxbzqxk5ia4czmk4gtzyja64nu2h5xw5 found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/91/91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9.tar.gz
==> Applied patch /scratch/tm70/aph502/spack/var/spack/repos/builtin/packages/zlib/configure-cc.patch
==> zlib: Executing phase: 'install'
==> zlib: Successfully installed zlib-1.2.12-fxbzqxk5ia4czmk4gtzyja64nu2h5xw5
  Fetch: 0.05s.  Build: 6.69s.  Total: 6.74s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0-gcc/zlib-1.2.12-fxbzqxk5ia4czmk4gtzyja64nu2h5xw5
[+] /usr (external m4-1.4.18-bo5ik3k3itkc7wa2larjncvjqw7e7gur)
[+] /usr (external perl-5.26.3-5guoo7pfhpdofdabyeoi7pu7oui7roxv)
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz
==> Installing hdf5-1.12.2-raod4w2nlm2dn42qo533pidhszkzfvfv
==> No binary for hdf5-1.12.2-raod4w2nlm2dn42qo533pidhszkzfvfv found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/2a/2a89af03d56ce7502dcae18232c241281ad1773561ec00c0f0e8ee2463910f14.tar.gz
==> Ran patch() for hdf5
==> hdf5: Executing phase: 'cmake'
==> hdf5: Executing phase: 'build'
==> hdf5: Executing phase: 'install'
==> hdf5: Successfully installed hdf5-1.12.2-raod4w2nlm2dn42qo533pidhszkzfvfv
  Fetch: 0.11s.  Build: 3m 1.57s.  Total: 3m 1.68s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0-gcc/hdf5-1.12.2-raod4w2nlm2dn42qo533pidhszkzfvfv
==> Installing parallel-netcdf-1.12.2-meptwc23ca3a4jnf2zkhctq5rlysbybc
==> No binary for parallel-netcdf-1.12.2-meptwc23ca3a4jnf2zkhctq5rlysbybc found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/3e/3ef1411875b07955f519a5b03278c31e566976357ddfc74c2493a1076e7d7c74.tar.gz
==> No patches needed for parallel-netcdf
==> parallel-netcdf: Executing phase: 'autoreconf'
==> parallel-netcdf: Executing phase: 'configure'
==> Error: ProcessError: Command exited with status 1:
    '/scratch/v45/aph502/tmp/spack-stage/spack-stage-parallel-netcdf-1.12.2-meptwc23ca3a4jnf2zkhctq5rlysbybc/spack-src/configure' '--prefix=/scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0-gcc/parallel-netcdf-1.12.2-meptwc23ca3a4jnf2zkhctq5rlysbybc' '--with-mpi=/apps/openmpi/4.1.4' 'SEQ_CC=/scratch/tm70/aph502/spack/lib/spack/env/intel/icc' '--enable-cxx' '--enable-fortran' 'CFLAGS=-fPIC' 'CXXFLAGS=-fPIC' 'FCFLAGS=-fPIC' 'FFLAGS=-fPIC' '--enable-relax-coord-bound' '--enable-shared' '--enable-static' '--disable-silent-rules'

1 error found in build log:
     134    checking dynamic linker characteristics... (cached) GNU/Linux ld.so
     135    checking how to hardcode library paths into programs... immediate
     136    checking how to get verbose linking output from /apps/openmpi/4.1.4/bin/mpif77... configure: WARNING: compilation failed
     137
     138    checking for Fortran 77 libraries of /apps/openmpi/4.1.4/bin/mpif77...
     139    checking whether /apps/openmpi/4.1.4/bin/mpif77 is a valid MPI compiler... no
  >> 140    configure: error:
     141       -----------------------------------------------------------------------
     142         Invalid MPI Fortran 77 compiler: "/apps/openmpi/4.1.4/bin/mpif77"
     143         A working MPI Fortran 77 compiler is required. Please specify the
     144         location of a valid MPI Fortran 77 compiler, either in the MPIF77
     145         environment variable or through --with-mpi configure flag. Abort.
     146       -----------------------------------------------------------------------

See build log for details:
  /scratch/v45/aph502/tmp/spack-stage/spack-stage-parallel-netcdf-1.12.2-meptwc23ca3a4jnf2zkhctq5rlysbybc/spack-build-out.txt

==> Warning: Skipping build of esmf-8.2.0-gwakp4aj5c32p2fk5fel4zrudcqmabzy since parallel-netcdf-1.12.2-meptwc23ca3a4jnf2zkhctq5rlysbybc failed
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz
==> Installing netcdf-c-4.8.1-ti6nfpg2e2yq2c5tvap7ytrhabrwzjej
==> No binary for netcdf-c-4.8.1-ti6nfpg2e2yq2c5tvap7ytrhabrwzjej found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/bc/bc018cc30d5da402622bf76462480664c6668b55eb16ba205a0dfb8647161dd0.tar.gz
==> Applied patch /scratch/tm70/aph502/spack/var/spack/repos/builtin/packages/netcdf-c/4.8.1-no-strict-aliasing-config.patch
==> netcdf-c: Executing phase: 'autoreconf'
==> netcdf-c: Executing phase: 'configure'
==> netcdf-c: Executing phase: 'build'
==> netcdf-c: Executing phase: 'install'
==> netcdf-c: Successfully installed netcdf-c-4.8.1-ti6nfpg2e2yq2c5tvap7ytrhabrwzjej
  Fetch: 0.14s.  Build: 1m 43.73s.  Total: 1m 43.87s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0-gcc/netcdf-c-4.8.1-ti6nfpg2e2yq2c5tvap7ytrhabrwzjej
==> Installing netcdf-fortran-4.5.4-75elx25wrx6xym374usxml73fg36cabf
==> No binary for netcdf-fortran-4.5.4-75elx25wrx6xym374usxml73fg36cabf found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/0a/0a19b26a2b6e29fab5d29d7d7e08c24e87712d09a5cafeea90e16e0a2ab86b81.tar.gz
==> No patches needed for netcdf-fortran
==> netcdf-fortran: Executing phase: 'autoreconf'
==> netcdf-fortran: Executing phase: 'configure'
==> Error: ProcessError: Command exited with status 1:
    '/scratch/v45/aph502/tmp/spack-stage/spack-stage-netcdf-fortran-4.5.4-75elx25wrx6xym374usxml73fg36cabf/spack-src/configure' '--prefix=/scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0-gcc/netcdf-fortran-4.5.4-75elx25wrx6xym374usxml73fg36cabf' '--enable-static' '--enable-shared' '--disable-doxygen' '--disable-parallel-tests'

1 error found in build log:
     53    checking whether the compiler supports GNU Fortran... no
     54    checking whether /scratch/tm70/aph502/spack/lib/spack/env/intel/ifort accepts -g... no
     55    checking whether the compiler supports GNU Fortran 77... no
     56    checking whether /scratch/tm70/aph502/spack/lib/spack/env/intel/ifort accepts -g... no
     57    checking whether Fortran compiler is checked for ISO_C_BINDING support... yes
     58    checking for Fortran flag to compile .f90 files... unknown
  >> 59    configure: error: Fortran could not compile .f90 files

See build log for details:
  /scratch/v45/aph502/tmp/spack-stage/spack-stage-netcdf-fortran-4.5.4-75elx25wrx6xym374usxml73fg36cabf/spack-build-out.txt

==> Error: esmf-8.2.0-gwakp4aj5c32p2fk5fel4zrudcqmabzy: Package was not installed
==> Error: Installation request failed.  Refer to reported errors for failing package(s).

This looks familiar @Aidan. I think the issue is here: 'SEQ_CC=/scratch/tm70/aph502/spack/lib/spack/env/intel/icc'. My recollection is that spack is introducing its own compiler wrappers for the Intel compilers, which clobber the NCI wrappers. The NCI wrappers add the Intel subdirectories to paths set in the module file, so if the NCI wrappers aren't being run, then it won't be able to find libmpi_mpifh.so or the other MPI Fortran libraries, so it can't compile a simple application. What's odd is that the NCI gfortran wrapper does the same thing, but I guess spack leaves the system GNU compilers alone? The NCI wrappers also handle DT_RPATH settings, which I guess is why spack is trying to wrap the compilers in the first place. If you could figure out how to get it to trust that the system compilers are configured to set DT_RPATH in binaries, then perhaps that'd solve this problem.

Thanks @dale.roberts, you were on the money. I wasn't sure how to make spack use the compiler wrappers as-is, because it has its own machinery for path injection. Instead I followed the instructions on adding environment variables to compiler definitions and added some paths to the environment for the intel compiler.
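
The exact entry isn't reproduced here, but the relevant part is an environment block inside the intel compiler definition in compilers.yaml, something like the fragment below (the paths shown are illustrative, not necessarily the ones I used):

    environment:
      prepend_path:
        # illustrative paths only: expose the MPI Fortran libraries to the Intel compiler
        LIBRARY_PATH: /apps/openmpi/4.1.4/lib
        LD_LIBRARY_PATH: /apps/openmpi/4.1.4/lib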

and that seemed to work:

$ spack install esmf %intel@2021.6.0
[+] /usr (external libxml2-2.9.7-v3vfk3p6zrcphz6rf67r62opcnhgkxiy)
[+] /usr (external cmake-3.20.2-krqai4wfhu3u2do55zq3nfeg63wc3akd)
==> openmpi@4.1.4 : has external module in ['openmpi/4.1.4']
[+] /apps/openmpi/4.1.4 (external openmpi-4.1.4-ltxdephdcg3j5dajcykaoxewqzme2rza)
[+] /usr (external pkgconf-1.4.2-kbhodzapvipxdj6m36xif2hdhm7mj7qw)
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o                                             
[+] /usr (external m4-1.4.18-fhh4a2q3z4kdu2en5pj22bcmg4kau6ws)
[+] /usr (external perl-5.26.3-kzt7y2qs7lmyv6g4lq3f4jm47l6cx5dy)
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz                                                
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7                                             
==> Installing parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw
==> No binary for parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw found: installing from source                                                              
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/3e/3ef1411875b07955f519a5b03278c31e566976357ddfc74c2493a1076e7d7c74.tar.gz
==> No patches needed for parallel-netcdf
==> parallel-netcdf: Executing phase: 'autoreconf'
==> parallel-netcdf: Executing phase: 'configure'
==> parallel-netcdf: Executing phase: 'build'
==> parallel-netcdf: Executing phase: 'install'
==> parallel-netcdf: Successfully installed parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw                                                                  
  Fetch: 0.06s.  Build: 8m 27.37s.  Total: 8m 27.43s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw                                  
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz                                               
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz                                          
==> Installing netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c
==> No binary for netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c found: installing from source                                                                
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/0a/0a19b26a2b6e29fab5d29d7d7e08c24e87712d09a5cafeea90e16e0a2ab86b81.tar.gz
==> No patches needed for netcdf-fortran
==> netcdf-fortran: Executing phase: 'autoreconf'
==> netcdf-fortran: Executing phase: 'configure'
==> netcdf-fortran: Executing phase: 'build'
==> netcdf-fortran: Executing phase: 'install'
==> netcdf-fortran: Successfully installed netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c                                                                     
  Fetch: 0.05s.  Build: 2m 1.93s.  Total: 2m 1.97s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c                                    
==> Installing esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
==> No binary for esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj found: installing from source                                                                          
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/36/3693987aba2c8ae8af67a0e222bea4099a48afe09b8d3d334106f9d7fc311485.tar.gz
==> No patches needed for esmf
==> esmf: Executing phase: 'edit'
==> esmf: Executing phase: 'build'
==> esmf: Executing phase: 'install'
==> esmf: Successfully installed esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
  Fetch: 0.12s.  Build: 33m 0.64s.  Total: 33m 0.76s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
==> Updating view at /scratch/tm70/aph502/spack/var/spack/environments/esmf_intel/.spack-env/view
==> Warning: Skipping external package: libxml2@2.9.7%intel@2021.6.0~python arch=linux-rocky8-x86_64/v3vfk3p
==> Warning: Skipping external package: openmpi@4.1.4%gcc@8.5.0~atomics+cuda+cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers+lustre~memchecker+pmi+romio+rsh~singularity~static+vt~wrapper-rpath cuda_arch=none fabrics=ucx schedulers=tm arch=linux-rocky8-x86_64/ltxdeph
==> Warning: Skipping external package: pkgconf@1.4.2%intel@2021.6.0 arch=linux-rocky8-x86_64/kbhodza

And that seems to have used the system OpenMPI and other tools as expected:

$ spack find -dLfp esmf
==> In environment esmf_intel
==> Root specs
-- no arch / intel@2021.6.0 -------------------------------------
-------------------------------- esmf%intel@2021.6.0


==> 1 installed package
-- linux-rocky8-x86_64 / intel@2021.6.0 -------------------------
2eu34q4aa2fdohzkecewegmsjci3yrdj esmf@8.2.0%intel                   /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
v3vfk3p6zrcphz6rf67r62opcnhgkxiy     libxml2@2.9.7%intel            /usr
ivdrt2yextjx2rs7o6yz32uypo4bpwmz     netcdf-c@4.8.1%intel           /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz
vbqq44vf36sqxlyezh3uox5xmb3rdgc7         hdf5@1.12.2%intel          /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7
krqai4wfhu3u2do55zq3nfeg63wc3akd             cmake@3.20.2%intel     /usr
ltxdephdcg3j5dajcykaoxewqzme2rza             openmpi@4.1.4%gcc      /apps/openmpi/4.1.4
kbhodzapvipxdj6m36xif2hdhm7mj7qw             pkgconf@1.4.2%intel    /usr
sf2twx6y2kehnsae7odxskzjqs67zc3o             zlib@1.2.12%intel      /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o
fhh4a2q3z4kdu2en5pj22bcmg4kau6ws         m4@1.4.18%intel            /usr
cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c     netcdf-fortran@4.5.4%intel     /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c
g4x4mgo7heqesmkjhu3gn2zyhfq4rurw     parallel-netcdf@1.12.2%intel   /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw
kzt7y2qs7lmyv6g4lq3f4jm47l6cx5dy         perl@5.26.3%intel          /usr
7airba3wdwn55izglso27nv53dt4xkmz     xerces-c@3.2.3%gcc             /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz
pl3qqm2tp3pgmmhhcoc63vogalzefkwz         libiconv@1.16%gcc          /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz

Are there any other environment variables I’m missing there @dale.roberts?

So a drawback of this approach is that it assumes OpenMPI is being used with the Intel compiler. The compiler name could be changed to emphasise this, or these environment changes might be added to the OpenMPI package.

You could try lying to it about the underlying compiler? If spack is figuring out that the real Intel fortran compiler is at /apps/intel-oneapi/compiler/2022.1.0/linux/bin/intel64/ifort, perhaps you can convince it that it's actually /apps/intel-ct/wrapper/ifort? That way you'll get NCI's path manipulation after spack's.

Another option is to make a dummy openmpi/4.1.4%intel module to load as an external module that sets up the paths. It’s a shame you can’t prepend-path within packages.yaml like you can with the compilers.
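
For reference, an external entry in packages.yaml can point at a module rather than a prefix, so a dummy module along those lines would be wired in roughly as below (the openmpi/4.1.4-intel module name is hypothetical):

packages:
  openmpi:
    buildable: false
    externals:
    - spec: openmpi@4.1.4%intel@2021.6.0
      modules:
      - openmpi/4.1.4-intel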

Yeah I did wonder about that. I think the advantage of using a module is that you can inspect the other loaded modules, see whether Intel or GNU is being used, and set paths appropriately.

Ugh. What a waste of time. I stuffed up: the error above was because I was trying to use a funky mixed Intel/gcc hybrid compiler definition I was messing about with.

When I used the vanilla %intel@2021.6.0 with the following compiler definition:

- compiler:
    spec: intel@2021.6.0
    paths:
      cc: /apps/intel-ct/wrapper/icc
      cxx: /apps/intel-ct/wrapper/icpc
      f77: /apps/intel-ct/wrapper/ifort
      fc: /apps/intel-ct/wrapper/ifort
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules: [ intel-compiler/2021.6.0 ]
    environment: {}
    extra_rpaths: []

it worked fine:

$ spack install esmf %intel@2021.6.0
[+] /usr (external libxml2-2.9.7-v3vfk3p6zrcphz6rf67r62opcnhgkxiy)                  
[+] /usr (external cmake-3.20.2-krqai4wfhu3u2do55zq3nfeg63wc3akd)
==> openmpi@4.1.4 : has external module in ['openmpi/4.1.4']                                                               
[+] /apps/openmpi/4.1.4 (external openmpi-4.1.4-ltxdephdcg3j5dajcykaoxewqzme2rza)
[+] /usr (external pkgconf-1.4.2-kbhodzapvipxdj6m36xif2hdhm7mj7qw)                                   
==> Installing zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o                                                                                                    
==> No binary for zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/91/91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9.tar.gz
==> Applied patch /scratch/tm70/aph502/spack/var/spack/repos/builtin/packages/zlib/configure-cc.patch
==> zlib: Executing phase: 'install'        
==> zlib: Successfully installed zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o
  Fetch: 0.05s.  Build: 16.26s.  Total: 16.30s.                                                 
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o
[+] /usr (external m4-1.4.18-fhh4a2q3z4kdu2en5pj22bcmg4kau6ws)                                                                   
[+] /usr (external perl-5.26.3-kzt7y2qs7lmyv6g4lq3f4jm47l6cx5dy)
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz
==> Installing hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7                                                                                                    
==> No binary for hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7 found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/2a/2a89af03d56ce7502dcae18232c241281ad1773561ec00c0f0e8ee2463910f14.tar.gz
==> Ran patch() for hdf5          
==> hdf5: Executing phase: 'cmake'  
==> hdf5: Executing phase: 'build'                                          
==> hdf5: Executing phase: 'install'                   
==> hdf5: Successfully installed hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7                                          
  Fetch: 0.10s.  Build: 7m 0.25s.  Total: 7m 0.35s.                                        
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7
==> Installing parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw                                                                                         
==> No binary for parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw found: installing from source                
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/3e/3ef1411875b07955f519a5b03278c31e566976357ddfc74c2493a1076e7d7c74.tar.gz
==> No patches needed for parallel-netcdf
==> parallel-netcdf: Executing phase: 'autoreconf'
==> parallel-netcdf: Executing phase: 'configure'
==> parallel-netcdf: Executing phase: 'build'
==> parallel-netcdf: Executing phase: 'install'
==> parallel-netcdf: Successfully installed parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw
  Fetch: 0.06s.  Build: 8m 34.65s.  Total: 8m 34.71s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz
==> Installing netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz
==> No binary for netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/bc/bc018cc30d5da402622bf76462480664c6668b55eb16ba205a0dfb8647161dd0.tar.gz
==> Applied patch /scratch/tm70/aph502/spack/var/spack/repos/builtin/packages/netcdf-c/4.8.1-no-strict-aliasing-config.patch
==> netcdf-c: Executing phase: 'autoreconf'
==> netcdf-c: Executing phase: 'configure'
==> netcdf-c: Executing phase: 'build'
==> netcdf-c: Executing phase: 'install'
==> netcdf-c: Successfully installed netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz
  Fetch: 0.14s.  Build: 3m 48.86s.  Total: 3m 49.00s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz
==> Installing netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c
==> No binary for netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/0a/0a19b26a2b6e29fab5d29d7d7e08c24e87712d09a5cafeea90e16e0a2ab86b81.tar.gz                              
==> No patches needed for netcdf-fortran                                
==> netcdf-fortran: Executing phase: 'autoreconf'
==> netcdf-fortran: Executing phase: 'configure'
==> netcdf-fortran: Executing phase: 'build'
==> netcdf-fortran: Executing phase: 'install'
==> netcdf-fortran: Successfully installed netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c
  Fetch: 0.04s.  Build: 2m 15.77s.  Total: 2m 15.81s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c
==> Installing esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
==> No binary for esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj found: installing from source
==> Using cached archive: /scratch/tm70/aph502/spack/var/spack/cache/_source-cache/archive/36/3693987aba2c8ae8af67a0e222bea4099a48afe09b8d3d334106f9d7fc311485.tar.gz
==> No patches needed for esmf
==> esmf: Executing phase: 'edit'
==> esmf: Executing phase: 'build'
==> esmf: Executing phase: 'install'
==> esmf: Successfully installed esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
  Fetch: 0.13s.  Build: 32m 27.19s.  Total: 32m 27.32s.
[+] /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
==> Updating view at /scratch/tm70/aph502/spack/var/spack/environments/esmf/.spack-env/view
==> Warning: Skipping external package: libxml2@2.9.7%intel@2021.6.0~python arch=linux-rocky8-x86_64/v3vfk3p
==> Warning: Skipping external package: openmpi@4.1.4%gcc@8.5.0~atomics+cuda+cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers+lustre~memchecker+pmi+romio+rsh~singularity~static+vt~wrapper-rpath cuda_arch=none fabrics=ucx schedulers=tm arch=linux-rocky8-x86_64/ltxdeph
==> Warning: Skipping external package: pkgconf@1.4.2%intel@2021.6.0 arch=linux-rocky8-x86_64/kbhodza
$ spack find -dLfp esmf
==> In environment esmf
==> Root specs
-- no arch / intel@2021.6.0 -------------------------------------
-------------------------------- esmf%intel@2021.6.0


==> 1 installed package
-- linux-rocky8-x86_64 / intel@2021.6.0 -------------------------
2eu34q4aa2fdohzkecewegmsjci3yrdj esmf@8.2.0%intel                   /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/esmf-8.2.0-2eu34q4aa2fdohzkecewegmsjci3yrdj
v3vfk3p6zrcphz6rf67r62opcnhgkxiy     libxml2@2.9.7%intel            /usr
ivdrt2yextjx2rs7o6yz32uypo4bpwmz     netcdf-c@4.8.1%intel           /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-c-4.8.1-ivdrt2yextjx2rs7o6yz32uypo4bpwmz
vbqq44vf36sqxlyezh3uox5xmb3rdgc7         hdf5@1.12.2%intel          /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/hdf5-1.12.2-vbqq44vf36sqxlyezh3uox5xmb3rdgc7
krqai4wfhu3u2do55zq3nfeg63wc3akd             cmake@3.20.2%intel     /usr
ltxdephdcg3j5dajcykaoxewqzme2rza             openmpi@4.1.4%gcc      /apps/openmpi/4.1.4
kbhodzapvipxdj6m36xif2hdhm7mj7qw             pkgconf@1.4.2%intel    /usr
sf2twx6y2kehnsae7odxskzjqs67zc3o             zlib@1.2.12%intel      /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/zlib-1.2.12-sf2twx6y2kehnsae7odxskzjqs67zc3o
fhh4a2q3z4kdu2en5pj22bcmg4kau6ws         m4@1.4.18%intel            /usr
cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c     netcdf-fortran@4.5.4%intel     /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/netcdf-fortran-4.5.4-cxx4ujcxvhf6ubvtf2tikj3rtotqbf3c
g4x4mgo7heqesmkjhu3gn2zyhfq4rurw     parallel-netcdf@1.12.2%intel   /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/intel-2021.6.0/parallel-netcdf-1.12.2-g4x4mgo7heqesmkjhu3gn2zyhfq4rurw
kzt7y2qs7lmyv6g4lq3f4jm47l6cx5dy         perl@5.26.3%intel          /usr
7airba3wdwn55izglso27nv53dt4xkmz     xerces-c@3.2.3%gcc             /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/xerces-c-3.2.3-7airba3wdwn55izglso27nv53dt4xkmz
pl3qqm2tp3pgmmhhcoc63vogalzefkwz         libiconv@1.16%gcc          /scratch/tm70/aph502/spack/opt/spack/linux-rocky8-x86_64/gcc-8.5.0/libiconv-1.16-pl3qqm2tp3pgmmhhcoc63vogalzefkwz

Sorry to have wasted time with a wild goose chase.

Long story short: spack works fine for compiling on NCI hardware, including using important pre-built system libraries and tools such as compilers and OpenMPI.

I am now working on getting MOM5 compiling.

Awesome, glad that worked. In my experience, it's better to take advantage of work already done. In this case, NCI figured out when and how to manipulate paths to get mixed Intel/GNU builds working, and put all of that logic in those compiler wrappers. So rather than try to re-create that within spack, have the wrappers do it for spack.

100% agreed.

I’ve tested all three of these intel compilers and they all work fine:

compilers:
- compiler:
    spec: intel@2021.6.0
    paths:
      cc: /apps/intel-ct/wrapper/icc
      cxx: /apps/intel-ct/wrapper/icpc
      f77: /apps/intel-ct/wrapper/ifort
      fc: /apps/intel-ct/wrapper/ifort
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules: [intel-compiler/2021.6.0]
    environment: {}
    extra_rpaths: []
- compiler:
    spec: intel@2021.7.0
    paths:
      cc: /apps/intel-ct/wrapper/icc
      cxx: /apps/intel-ct/wrapper/icpc
      f77: /apps/intel-ct/wrapper/ifort
      fc: /apps/intel-ct/wrapper/ifort
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules: [intel-compiler/2021.7.0]
    environment: {}
    extra_rpaths: []
- compiler:
    spec: oneapi@2022.0.0
    paths:
      cc: /apps/intel-ct/wrapper/icx
      cxx: /apps/intel-ct/wrapper/icpx
      f77: /apps/intel-ct/wrapper/ifx
      fc: /apps/intel-ct/wrapper/ifx
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules: [intel-compiler-llvm/2022.0.0]
    environment: {}
    extra_rpaths: []

I decided the way to go was to create a script that generates the compilers.yaml file for all the available compilers. There isn't much to add, as they all use the wrappers, so it's just a matter of matching the module file to the spec. It's straightforward for the non-OneAPI versions.

Probably should also do something similar for OpenMPI.

If anyone else wants to do this feel free. I think we’re pretty close to a point where we should make shared configs and put them somewhere in /g/data/access so anyone can experiment with using spack and know we’re all using the same (working) config.

It seems that OneAPI provides both LLVM and traditional compilers. Is it just as simple as using the traditional wrappers, i.e. icc and ifort instead of icx and ifx?

The intel-compiler and intel-compiler-llvm modules provide access to the same set of compilers, but set some underlying environment variables differently such that OpenMPI and Intel MPI will use either the classic or LLVM-based intel compilers. The version number has diverged between the classic and LLVM compilers, so intel-compiler-llvm/2022.2.0 and intel-compiler/2021.7.0 are from the same Intel OneAPI release. I wrote it all down here: https://opus.nci.org.au/display/Help/Intel+oneAPI. So it is best to tell spack about both of them; that way, when you build MPI applications, the same compiler will be invoked for serial and parallel compilation.

Hey @dale.roberts

I have generated the compilers.yaml and packages.yaml files programmatically to control what ends up in there.

Can you take a look, specifically at compilers.yaml, and let me know if you think the general pattern is correct?

They follow a similar pattern to the examples below:

- compiler:
    spec: intel@2021.7.0
    paths:
      cc: /apps/intel-ct/wrapper/icc
      cxx: /apps/intel-ct/wrapper/icpc
      f77: /apps/intel-ct/wrapper/ifort
      fc: /apps/intel-ct/wrapper/ifort
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules:
    - intel-compiler/2021.7.0
    environment: {}
    extra_rpaths: []
- compiler:
    spec: oneapi@2022.0.0
    paths:
      cc: /apps/intel-ct/wrapper/icx
      cxx: /apps/intel-ct/wrapper/icpx
      f77: /apps/intel-ct/wrapper/ifx
      fc: /apps/intel-ct/wrapper/ifx
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules:
    - intel-compiler-llvm/2022.0.0
    environment: {}
    extra_rpaths: []

where the compilers are pointed to the wrappers and the corresponding module file is loaded when the compiler is used to build.

It seems to work; I just wanted to check I had the right wrappers, especially for gcc:

- compiler:
    spec: gcc@12.2.0
    paths:
      cc: /opt/nci/wrappers/gcc
      cxx: /opt/nci/wrappers/g++
      f77: /opt/nci/wrappers/gfortran
      fc: /opt/nci/wrappers/gfortran
    flags: {}
    operating_system: rocky8
    target: x86_64
    modules:
    - gcc/12.2.0
    environment: {}
    extra_rpaths: []

Thanks

Hi @Aidan. Sorry to be the bearer of bad news, but these wrapper paths aren't correct. The gcc modules have their wrappers in /apps/gcc/<version>/wrappers. Also, there is no guarantee that these wrappers will remain in the same paths forever, and you may wish to add the nvidia-hpc-sdk compilers at some point, which have a different set of wrappers.

Your script should programmatically derive the wrapper paths and compiler names too. What I would do is use the python interface to environment-modules, load each module as it's discovered, inspect the $CC, $CXX, $F77 and $FC environment variables (if they exist; you'll need to handle the gcc/default module as well) to determine the compiler names, then use e.g. shutil.which(compiler_name) to get the full path to the compiler wrapper.
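
A quick shell version of the same idea, for a single module (the module name is just an example, and not every module necessarily exports these variables):

module load intel-compiler/2021.6.0      # example module; repeat for each discovered module
echo "CC=${CC:-unset} FC=${FC:-unset}"   # the module may or may not export compiler names
command -v "${FC:-ifort}"                # resolve the name to the full wrapper path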

Oh no, not bad news at all. Thanks for the feedback. I initially had the wrappers for the different versions, but when I checked they were identical to each other and to the one in /opt:

$ for w in /apps/gcc/*/wrappers/gcc; do diff -s /opt/nci/wrappers/gcc $w; done
Files /opt/nci/wrappers/gcc and /apps/gcc/10.3.0/wrappers/gcc are identical
Files /opt/nci/wrappers/gcc and /apps/gcc/11.1.0/wrappers/gcc are identical
Files /opt/nci/wrappers/gcc and /apps/gcc/12.2.0/wrappers/gcc are identical

so I just use the one wrapper path. Is there something I’m missing there?

Note to self: this is nvhpc in spack land

They may not stay identical. They are deployed this way in case changes have to be made for the next installation. I kind of wish I'd done that for the intel compiler wrappers instead of just carrying over what was deployed on Raijin. If you look at the ifx wrapper, there is some ugly version checking needed to bypass a compiler bug. It would have been much nicer to just modify the wrapper for the affected versions.

It may be that this could be done in a much more straightforward manner, assuming the correct environment variables are set in the module: