Shared access spack configuration

I have started to set up a shared spack installation in /g/data/access. One of the motivations to do this now was to lower the barrier to collaboration and to have a common basis from which to work: spack is very sensitive to how it is configured when it comes to finding packages and determining what to install.

I’ve created a spack directory in /g/data/access/apps:

$ ls -lh /g/data/access/apps/spack
total 20K
dr-xr-sr-x+ 9 aph502 access.admin 4.0K Dec  1 17:13 0.18.0
dr-xr-sr-x+ 9 aph502 access.admin 4.0K Dec  1 13:58 0.19.0
drwxrwsr-x+ 5 aph502 access.admin 4.0K Dec  2 15:19 config
-rwxrwxr-x+ 1 aph502 access.admin  672 Dec  1 13:57 get_spack
drwxrwsr-x+ 3 aph502 access.admin 4.0K Dec  2 15:05 opt

Though it isn’t strictly necessary to support more than spack/0.19.0, this capability will be required, so I’ve started with the two most recent versions as a proof of concept.

Each spack version is a shallow clone of just the branch corresponding to the version tag. The get_spack script makes this process reproducible and relatively painless:

$ cat /g/data/access/apps/spack/get_spack 
#!/bin/bash

# Copyright 2022 ACCESS-NRI and contributors.
# SPDX-License-Identifier: Apache-2.0

help() {
cat <<EOF

Download a specific version of spack using shallow clone

Usage:

  get_spack <version>

General Options:
    -h:         Print help

EOF
}

if [ "$#" -ne 1 ]; then
    echo "Must specify a version tag"
    help
    exit 22
fi

# version supplied as a command line argument
VERSION=$1

# Prepend v to version to create spack compatible version tag
VERSION_TAG=v${VERSION}

# Make a shallow clone of only this version
git clone https://github.com/spack/spack.git --depth 1 --branch $VERSION_TAG $VERSION

# Remove write permissions
chmod -R a=rX $VERSION

Note that the script removes write permissions, as the goal is that nothing gets written into this directory after the clone.
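For example, adding a (hypothetical) future release is a single command run from the apps/spack directory:

cd /g/data/access/apps/spack
./get_spack 0.20.0    # shallow clones the v0.20.0 tag into ./0.20.0 and makes it read-only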

I’ve created spack loadable modules in /g/data/access/modules/spack:

$ ls -lrtha /g/data/access/modules/spack
total 16K
drwxrwx---+ 69 wml548 access.admin 4.0K Dec  1 10:26 ..
-rw-rw----+  1 aph502 access.admin   56 Dec  1 17:18 .modulerc
lrwxrwxrwx   1 aph502 access.admin    7 Dec  1 17:20 0.18.0 -> .common
lrwxrwxrwx   1 aph502 access.admin    7 Dec  1 17:20 0.19.0 -> .common
-rw-r-----+  1 aph502 access.admin 2.0K Dec  2 15:30 .common
drwxrws---+  2 aph502 access.admin 4.0K Dec  2 15:30 .

All that is required to add a new spack module version is to run the above get_spack script in apps/spack and then add a symbolic link in modules/spack pointing at /g/data/access/modules/spack/.common. The module file determines the version from the name of the symlink that points to it, and sets the appropriate environment variables:
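For example, after running get_spack for a (hypothetical) new release, adding the module is just a relative symlink:

cd /g/data/access/modules/spack
ln -s .common 0.20.0    # 'module load spack/0.20.0' now resolves the version from the link name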

$ cat /g/data/access/modules/spack/.common 
#%Module1.0
# Loads standard directory structure into environment variables
# 

set help   "spack build from source HPC package manager"
set install-contact "https://forum.access-hive.org.au/c/infrastructure/modules/25"
set url    "https://spack.io"
set spackhome "/g/data/access/apps/spack"

set version [lindex [split [module-info name] {/}] 1]

# Cannot have conda loaded, it sets way too many environment variables to play
# nicely with spack
conflict conda

# System python has a bug when accessing lustre. Need to load a version of python that 
# works, so insist on a known good version

if ![ is-loaded python3/3.11.0 ] {
  module load python3/3.11.0
}

set prefix $spackhome/$version
setenv SPACK_PYTHON /apps/python3/3.11.0/bin/python3 
setenv SPACK_SYSTEM_CONFIG_PATH $spackhome/config/system
setenv SPACK_INSTALL_TREE $spackhome/opt/spack/$version

# the env array is a global, so need to access with :: syntax
set project $::env(PROJECT)
set user $::env(USER)

if [ file isdirectory "/scratch" ] {
  setenv SPACK_USER_CACHE_PATH /scratch/$project/$user/spack_user_cache
}

if { [module-info mode load] } {

  puts stderr "spack needs to create a number of shell functions and set environment variables to"
  puts stderr "work correctly. Run the following command to configure your shell to use spack:"
  puts stderr ""
  puts stderr ". $prefix/share/spack/setup-env.sh" 
  puts stderr ""

} elseif { [module-info mode unload] } {

  puts stderr ""
  puts stderr "WARNING! This shell will still have spack specific paths and commands loaded"
  puts stderr "If this causes any issues close this shell and open a new one"
  puts stderr ""

}

module-whatis "[module-info name]: $help"

proc ModulesHelp { } {
    variable help
    variable url
    variable install-contact
    puts stderr "Module [module-info name]"
    puts stderr "$help"
    puts stderr "Information available at $url"
    puts stderr "Go to ${install-contact} for more information"
}

Some important points:

  1. The system python version has a bug with the lustre file system, so we need to load a known good python3 module and set the SPACK_PYTHON environment variable.
  2. The SPACK_SYSTEM_CONFIG_PATH environment variable is set to /g/data/access/apps/spack/config/system. This allows for system-wide configuration (see the docs about configuration scopes).
  3. SPACK_INSTALL_TREE is set to /g/data/access/apps/spack/opt/spack/$version. This is where packages will be installed to be used by others. It isn’t intended that everyone who uses spack installs into this location.
  4. To load the modules you need to run module use /g/data/access/modules; when you then run module load spack you get this message (a complete example is shown just below this list):
spack needs to create a number of shell functions and set environment variables to
work correctly. Run the following command to configure your shell to use spack:

. /g/data/access/apps/spack/0.19.0/share/spack/setup-env.sh

This is because I can’t figure out how to source the bash setup script from within the module file. Doing so is also against the design principle that environment modules can be unloaded to revert the shell to its previous state, which this setup script can’t do.
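Putting those pieces together, a complete (illustrative) session to get a working spack from a fresh shell looks like this:

module use /g/data/access/modules
module load spack            # or a specific version, e.g. spack/0.19.0
# then follow the instruction the module prints:
. /g/data/access/apps/spack/0.19.0/share/spack/setup-env.sh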

The site specific installation files are here:

$ ls -lah /g/data/access/apps/spack/config/system/
total 16K
drwxrwsr-x+ 2 aph502 access.admin 4.0K Dec  2 15:27 .
drwxrwsr-x+ 5 aph502 access.admin 4.0K Dec  2 15:19 ..
lrwxrwxrwx  1 aph502 access.admin   30 Dec  2 12:37 compilers.yaml -> ../spack_config/compilers.yaml
-r--r--r--+ 1 aph502 access.admin 1.4K Dec  2 15:11 config.yaml
lrwxrwxrwx  1 aph502 access.admin   29 Dec  2 12:37 packages.yaml -> ../spack_config/packages.yaml
-rw-rw-r--+ 1 aph502 access.admin   46 Dec  2 15:27 repos.yml

Note that compilers.yaml and packages.yaml are generated by find_compilers_packages.py in this repo.

This is still not working, but I wanted to get some eyes (principally @Scott and @dale.roberts) on this before I go too much further down the rabbit hole in case there are major issues with what I’ve done so far.

Also, I just wanted to document where I had got to so far.

For testing purposes I have created a module, spack/config, that contains just enough configuration to standardise compilers, MPI and external packages while letting you use your own spack tree.

To use this, do a shallow clone of spack into your own directory (on /scratch or /g/data):

git clone git@github.com:spack/spack.git --depth 1 --branch v0.19.0

then load the spack/config module

module use /g/data/access/modules/
module load spack/config

you’ll get a message like this:

spack needs to create a number of shell functions and set environment variables to
work correctly. Run the following command to configure your shell to use spack:

. <path to your spack directory>/share/spack/setup-env.sh

so do as it says and source the setup-env.sh script from the spack directory you cloned.
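As a quick sanity check that the shared configuration is actually being picked up (a suggestion, not part of the original instructions), you can ask spack what it now knows about:

spack compiler list               # compilers defined in the shared compilers.yaml
spack config get packages | head  # externals and preferences from the shared packages.yaml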

After that you can test spack with something like this:

$ spack spec netcdf-fortran %intel@2021.6.0 ^openmpi@4.1.4                                                          
Input spec
--------------------------------
netcdf-fortran%intel@2021.6.0
    ^openmpi@4.1.4

Concretized
--------------------------------
netcdf-fortran@4.6.0%intel@2021.6.0~doc+pic+shared build_system=autotools arch=linux-rocky8-cascadelake
    ^netcdf-c@4.9.0%intel@2021.6.0~dap~fsync~hdf4~jna+mpi+optimize~parallel-netcdf+pic+shared+zstd build_system=autotools arch=linux-rocky8-cascadelake
        ^hdf5@1.12.2%intel@2021.6.0~cxx~fortran+hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_system=cmake build_type=RelWithDebInfo arch=linux-rocky8-cascadelake
            ^cmake@3.24.2%intel@2021.6.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
            ^pkgconf@1.4.2%intel@2021.6.0 build_system=autotools arch=linux-rocky8-cascadelake
        ^m4@1.4.18%intel@2021.6.0+sigsegv build_system=autotools patches=3877ab5,fc9b616 arch=linux-rocky8-cascadelake
        ^openmpi@4.1.4%intel@2021.6.0~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake
        ^zlib@1.2.13%intel@2021.6.0+optimize+pic+shared build_system=makefile arch=linux-rocky8-cascadelake
        ^zstd@1.5.2%intel@2021.6.0~programs build_system=makefile libs=shared,static arch=linux-rocky8-cascadelake

and if that looks ok try

spack install netcdf-fortran %intel@2021.6.0 ^openmpi@4.1.4 

and afterwards you can check what was built, and what external packages were used:

$ spack find -dlfp netcdf-fortran
-- linux-rocky8-cascadelake / intel@2021.6.0 --------------------
l6jywju netcdf-fortran@4.6.0%intel          /scratch/v45/aph502/opt/spack/linux-rocky8-cascadelake/intel-2021.6.0/netcdf-fortran-4.6.0-l6jywjuwrnqdfheafvxrnslkvrafhpr7
t5vmcra     netcdf-c@4.9.0%intel            /scratch/v45/aph502/opt/spack/linux-rocky8-cascadelake/intel-2021.6.0/netcdf-c-4.9.0-t5vmcraajgpax4rnsuceljayogf5zmdb
xmsywdn         hdf5@1.12.2%intel           /scratch/v45/aph502/opt/spack/linux-rocky8-cascadelake/intel-2021.6.0/hdf5-1.12.2-xmsywdnkwqsjl5t3qqcngqeztbeybznn
tr5446k             cmake@3.24.2%intel      /apps/cmake/3.24.2
vqr4piq             pkgconf@1.4.2%intel     /usr
6lm7an5         m4@1.4.18%intel             /usr
yvq3ppq         openmpi@4.1.4%intel         /apps/openmpi/4.1.4
qbwlozg         zlib@1.2.13%intel           /scratch/v45/aph502/opt/spack/linux-rocky8-cascadelake/intel-2021.6.0/zlib-1.2.13-qbwlozg4cobphkvyaizqmi47xw3ara5o
hvjxleo         zstd@1.5.2%intel            /scratch/v45/aph502/opt/spack/linux-rocky8-cascadelake/intel-2021.6.0/zstd-1.5.2-hvjxleot65gcc6xeciefjgeimov4i72l

Current issues: it is defaulting to intel-mpi even though I thought I had created a preference for openmpi, and it is defaulting to a debug build of openmpi, even though, again, I thought I’d set a preference for a specific non-debug version.
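A hedged debugging sketch for tracking that down (I haven't confirmed this is the cause): check which file each preference is actually coming from, and compare the unconstrained choice against an explicitly constrained spec:

# show every packages.yaml entry annotated with the config file it came from
spack config blame packages | grep -B1 -A3 -i -E 'mpi|openmpi'

# what does the concretizer choose when the MPI isn't pinned?
spack spec netcdf-fortran %intel@2021.6.0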

Thanks Aidan, great to see the progress.

  • Should the spack environment be kept separate from the old unmanaged apps directory, e.g. install under /g/data/access/spack instead?
  • Further, can the central spack environment be controlled by CI somehow? Ideally I’d like us to have records of what’s been installed rather than things happening ad hoc

Yeah probably. Do you have a specific layout in mind? Happy to go with it if so. Bear in mind that annoying sourcing issue that doesn’t seem to play nicely with modules. I can see other HPC centres have spack loadable from modules, so it isn’t insurmountable it seems, but I can’t see how they did it.

Yeah that is definitely the goal, which I did mean to state in the original post. The ideal is that this would be generated by a service account under CI and that packages would be added via pull requests to a repo.

It is undoubtedly true that this would be easier to do from the NCI gitlab instance, but I’d prefer to have it more visible than that if possible.

The idea of starting this way was to get a bit more experience interactively with a shared configuration. If you don’t think that is necessary we can jump right into doing it via CI.

Looking at these examples from other HPC centres, it might be that this could be done much more concisely, but it would require the modules to set the correct environment variables,

e.g.

    netcdf:
        buildable: false
        modules:
            netcdf@4.4.1.1.6%gcc+parallel-netcdf+mpi:   cray-netcdf-hdf5parallel/4.4.1.1.6
            netcdf@4.4.1.1.6%intel+parallel-netcdf+mpi: cray-netcdf-hdf5parallel/4.4.1.1.6
            netcdf@4.4.1.1.6%cce+parallel-netcdf+mpi:   cray-netcdf-hdf5parallel/4.4.1.1.6
            netcdf@4.6.1.3%gcc+parallel-netcdf+mpi:     cray-netcdf-hdf5parallel/4.6.1.3
            netcdf@4.6.1.3%intel+parallel-netcdf+mpi:   cray-netcdf-hdf5parallel/4.6.1.3
            netcdf@4.6.1.3%cce+parallel-netcdf+mpi:     cray-netcdf-hdf5parallel/4.6.1.3
            netcdf@4.4.1.1.6%gcc~parallel-netcdf~mpi:   cray-netcdf/4.4.1.1.6
            netcdf@4.4.1.1.6%intel~parallel-netcdf~mpi: cray-netcdf/4.4.1.1.6
            netcdf@4.4.1.1.6%cce~parallel-netcdf~mpi:   cray-netcdf/4.4.1.1.6
            netcdf@4.6.1.3%gcc~parallel-netcdf~mpi:     cray-netcdf/4.6.1.3
            netcdf@4.6.1.3%intel~parallel-netcdf~mpi:   cray-netcdf/4.6.1.3
            netcdf@4.6.1.3%cce~parallel-netcdf~mpi:     cray-netcdf/4.6.1.3
    netcdf-fortran:
        buildable: false
        modules:
            netcdf-fortran@4.4.1.1.6%intel: cray-netcdf-hdf5parallel/4.4.1.1.6
            netcdf-fortran@4.4.1.1.6%cce:   cray-netcdf-hdf5parallel/4.4.1.1.6
            netcdf-fortran@4.4.1.1.6%gcc:   cray-netcdf-hdf5parallel/4.4.1.1.6
            netcdf-fortran@4.6.1.3%intel:   cray-netcdf-hdf5parallel/4.6.1.3
            netcdf-fortran@4.6.1.3%cce:     cray-netcdf-hdf5parallel/4.6.1.3
            netcdf-fortran@4.6.1.3%gcc:     cray-netcdf-hdf5parallel/4.6.1.3

Actually I might have changed my mind about this @Scott. The existing spack Gitlab CI support is a perfect tool for this purpose. It is used to generate the E4S software stack.

This is an example of using CI to install software at NERSC

Related: A nice summary of deploying an E4S software stack at NERSC

A simple spack pipeline demo

Part of the reason I might have changed my mind is that it could well be useful to know how easily this can be done on gitlab@NCI as a point of comparison with running CI directly from GitHub.
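For reference, the spack side of such a pipeline is fairly small (a sketch assuming spack 0.19, where the environment's spack.yaml carries a gitlab-ci: section; the runner, mirror and secrets setup are where the real work is):

# from within a checked-out spack environment that has a gitlab-ci: section
spack env activate .
spack ci generate --output-file pipeline.yml   # emits a child pipeline for GitLab to run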

I’ve got a minimal version of building spack modules from gitlab ci at BOM / Next Generation Modeling Team / spack-pipeline · GitLab (nci.org.au); it should be viewable with an NCI account. For some reason there have been intermittent issues uploading artifacts to the nci gitlab, hopefully that’s only a temporary outage.

For the moment it’s installing software into the bureau hc46 project. It doesn’t appear to automatically make modules.

Sweet! Did that take much work to get going?

Edit: Specifically, was the gitlab-ci stuff easy to configure? I see you’re picking up the shared config stuff, thanks, good to know it is working for you.

This is cool. Reading between the lines a bit, I assume you have a gitlab runner… somewhere. The existence of the hc46_gitlab user implies to me that you have some service user in control of this. Would this runner be on some Nirin/VMware VM, or is it a long-running service on some hardware inside Gadi’s VLAN? If it’s on a cloud node, that means someone must have planted a public key on Gadi to get it to build things. Just trying to figure out what kind of interesting things the cool people can get out of NCI…


It was a bit tricky to get an understanding of how exactly it works, and some very strange glitches in the gitlab archive system didn’t help. Once I’d worked out what environment variables needed to be set it came together though.

Yeah, our (bom ngm) gitlab runner is on nirin, with ssh to talk to the hc46_gitlab service user on Gadi. It can also run docker containers on the VM itself, which I’ve used to build singularity containers of conda environments (we kept running out of inodes when doing the builds on Gadi). Happy to share the ansible config if you’re interested.

That’d be good. I played around a bit with ansible in the past to, funnily enough, deploy and configure a gitlab runner.

I think after today’s meeting it’s clear we’ll all need a robust CI solution; however, those CI solutions will all have different purposes. A shared ansible config to get e.g. gitlab runners or other services in place (postgres?) could save a lot of time when it comes to provisioning these things for different parts of the community.

No problem, I’ve invited you both to a copy of the repository with secrets stripped out (SSH private key and CI tokens)


Ok, I’m having trouble configuring the intel-mkl installed on Gadi for use with spack.

Any ideas are welcome

/g/data/access/spack/config/spack_config/packages.yaml:

  intel-mkl:
    externals:
...
    - spec: intel-mkl@2022.2.0
      prefix: /apps/intel/compilers_and_libraries_2022.2.0/linux/mkl
      modules:
      - intel-mkl/2022.2.0

The directory doesn’t exist:

$ stat /apps/intel/compilers_and_libraries_2022.2.0/linux/mkl
stat: cannot statx '/apps/intel/compilers_and_libraries_2022.2.0/linux/mkl': No such file or directory

Odd. This one does:

$ ls -l /apps/intel/compilers_and_libraries_2020.2.254/linux/mkl/
total 28
drwxrwxr-x.  5 apps apps 4096 Jul 27  2020 benchmarks
drwxrwxr-x.  3 apps apps 4096 Jul 27  2020 bin
drwxrwxr-x.  2 apps apps 4096 Jul 27  2020 examples
drwxrwxr-x.  5 apps apps 4096 Jul 27  2020 include
drwxrwxr-x. 11 apps apps 4096 Jul 27  2020 interfaces
drwxrwxr-x.  4 apps apps 4096 Jul 27  2020 lib
drwxrwxr-x.  3 apps apps 4096 Jul 27  2020 tools

Odd that it is getting the wrong version: it picks it up from the output of module avail, and that version isn’t there.

$ module avail intel-mkl
---------------------------------------------------------------- /apps/Modules/modulefiles -----------------------------------------------------------------
intel-mkl/2019.3.199  intel-mkl/2019.5.281  intel-mkl/2020.1.217  intel-mkl/2020.3.304  intel-mkl/2021.2.0  intel-mkl/2021.4.0  intel-mkl/2022.1.0  
intel-mkl/2019.4.243  intel-mkl/2020.0.166  intel-mkl/2020.2.254  intel-mkl/2021.1.1    intel-mkl/2021.3.0  intel-mkl/2022.0.2  intel-mkl/2022.2.0  

Thanks. I’ll check that out.

Hi @Aidan. Intel compiler versions 2021 and later come from oneAPI, and Intel changed the installation layout compared to Parallel Studio XE. Your path is correct for the earlier versions, but for the later versions you’ll find MKL in /apps/intel-oneapi/mkl/<version>/.


Thanks @dale.roberts. It seems NCI has chosen to use the /apps/intel-ct directory to provide a consistent path to intel-mkl libraries.

$ ls -ld /apps/intel-ct/*/mkl*                                                                                                    
lrwxrwxrwx. 1 apps apps 56 Nov  1  2019 /apps/intel-ct/2019.3.199/mkl -> /apps/intel/compilers_and_libraries_2019.3.199/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  56 Nov 26  2019 /apps/intel-ct/2019.4.243/mkl -> /apps/intel/compilers_and_libraries_2019.4.243/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  56 Nov 28  2019 /apps/intel-ct/2019.5.281/mkl -> /apps/intel/compilers_and_libraries_2019.5.281/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  56 Jan 14  2020 /apps/intel-ct/2020.0.166/mkl -> /apps/intel/compilers_and_libraries_2020.0.166/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  56 May  6  2020 /apps/intel-ct/2020.1.217/mkl -> /apps/intel/compilers_and_libraries_2020.1.217/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  56 Jul 27  2020 /apps/intel-ct/2020.2.254/mkl -> /apps/intel/compilers_and_libraries_2020.2.254/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  56 Feb 24  2021 /apps/intel-ct/2020.3.304/mkl -> /apps/intel/compilers_and_libraries_2020.4.304/linux/mkl                                              
lrwxrwxrwx. 1 apps z30  31 May 10  2021 /apps/intel-ct/2021.1.1/mkl -> /apps/intel-oneapi/mkl/2021.1.1                                                                         
lrwxrwxrwx. 1 apps z30  31 Jun 22  2021 /apps/intel-ct/2021.2.0/mkl -> /apps/intel-oneapi/mkl/2021.2.0                                                                         
lrwxrwxrwx. 1 apps z30  31 Aug 31  2021 /apps/intel-ct/2021.3.0/mkl -> /apps/intel-oneapi/mkl/2021.3.0                                                                         
lrwxrwxrwx. 1 apps z30  31 Nov  8  2021 /apps/intel-ct/2021.4.0/mkl -> /apps/intel-oneapi/mkl/2021.4.0                                                                         
lrwxrwxrwx. 1 apps z30  31 Feb  8  2022 /apps/intel-ct/2022.0.2/mkl -> /apps/intel-oneapi/mkl/2022.0.2                                                                         
lrwxrwxrwx. 1 apps z30  31 Jun  1  2022 /apps/intel-ct/2022.1.0/mkl -> /apps/intel-oneapi/mkl/2022.1.0                                                                         
lrwxrwxrwx. 1 apps z30  31 Nov  7 20:42 /apps/intel-ct/2022.2.0/mkl -> /apps/intel-oneapi/mkl/2022.2.0

but as the internal directory layout is different, and spack has different classes for Intel and IntelOneAPI packages, they should be treated separately. I did this with the compilers, but didn’t with MKL. That was a mistake. Will fix.
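For the record, a minimal sketch of what the separate oneAPI-era entry might look like (untested; it assumes spack's intel-oneapi-mkl package wants the root directory containing mkl/<version> as its prefix, which should be checked against the recipe before this goes into the shared config):

  intel-oneapi-mkl:
    externals:
    - spec: intel-oneapi-mkl@2022.2.0
      prefix: /apps/intel-oneapi
      modules:
      - intel-mkl/2022.2.0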

I wanted to use a spack environment “stack” to test this shared config by compiling with a number of different combinations of compiler, MPI and intel-mkl.

Using an environment definition like this:

spack:
  include:
  - ../compilers.yaml
  - ../packages.yaml
  definitions:
  - packages: [amdscalapack@3.1]
  - compilers: [intel@2019.3.199, intel@2021.8.0]
  - mpis: [intel-mpi@2019.9.304, openmpi@4.1.4]
  - blases: [intel-mkl@2019.3.199]
  # add package specs to the `specs` list
  specs:
  - matrix:
    - [$packages]
    - [$%compilers]
    - [$^mpis]
    - [$^blases]
  view: false
  concretizer:
    unify: false

I was getting this:

$ spack concretize -f                                                                                                                                                         
==> Error: ^intel-mkl@2019.3.199 does not satisfy amdscalapack@3.1%intel@2019.3.199 ^openmpi@4.1.4

That same command worked fine with openblas instead of intel-mkl.

Note: I had not loaded the shared config module; the environment just includes the config from the directory above so as to test the current version.

Removing the explicit reference to a BLAS library from the matrix avoids this issue:

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  include:
  - ../compilers.yaml
  - ../packages.yaml
  definitions:
  - packages: [amdscalapack@3.1]
  - compilers: [intel@2019.3.199, intel@2021.8.0]
  - mpis: [intel-mpi@2019.9.304, openmpi@4.1.4]
  - blases: [openblas]
  # add package specs to the `specs` list
  specs:
  - matrix:
    - [$packages]
    - [$%compilers]
    - [$^mpis]
    # - [$^blases]
  view: false
  concretizer:
    unify: false

and builds a bunch of combinations successfully:

$ spack concretize -f          
==> Starting concretization pool with 2 processes                              
==> Environment concretized in 29.21 seconds.                             
==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-mpi@2019.9.304 ^openblas
 -   4u37wda  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   534uamu      ^intel-mpi@2019.9.304%intel@2019.3.199~external-libfabric build_system=generic arch=linux-rocky8-cascadelake
 -   q2dkwr2      ^openblas@0.3.21%intel@2019.3.199~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   zqxrc72          ^perl@5.26.3%intel@2019.3.199~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-mpi@2019.9.304 ^openblas
 -   khh5cqp  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
 -   c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   emdaboy      ^intel-mpi@2019.9.304%intel@2021.8.0~external-libfabric build_system=generic arch=linux-rocky8-cascadelake
 -   jk3c77c      ^openblas@0.3.21%intel@2021.8.0~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   avbz2gz          ^perl@5.26.3%intel@2021.8.0~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

[aph502@gadi-login-02 test]$ spack concretize -f
==> Starting concretization pool with 4 processes
==> Environment concretized in 20.36 seconds.
==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-mpi@2019.9.304 ^openblas
 -   4u37wda  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   534uamu      ^intel-mpi@2019.9.304%intel@2019.3.199~external-libfabric build_system=generic arch=linux-rocky8-cascadelake
 -   q2dkwr2      ^openblas@0.3.21%intel@2019.3.199~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   zqxrc72          ^perl@5.26.3%intel@2019.3.199~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2019.3.199 ^openblas ^openmpi@4.1.4
 -   lfewrot  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   q2dkwr2      ^openblas@0.3.21%intel@2019.3.199~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   zqxrc72          ^perl@5.26.3%intel@2019.3.199~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake
 -   l6qhjdb      ^openmpi@4.1.4%intel@2019.3.199~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-mpi@2019.9.304 ^openblas
 -   khh5cqp  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
 -   c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   emdaboy      ^intel-mpi@2019.9.304%intel@2021.8.0~external-libfabric build_system=generic arch=linux-rocky8-cascadelake
 -   jk3c77c      ^openblas@0.3.21%intel@2021.8.0~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   avbz2gz          ^perl@5.26.3%intel@2021.8.0~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^openblas ^openmpi@4.1.4
 -   yfuoaqc  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
 -   c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   jk3c77c      ^openblas@0.3.21%intel@2021.8.0~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   avbz2gz          ^perl@5.26.3%intel@2021.8.0~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake
 -   2la5nbk      ^openmpi@4.1.4%intel@2021.8.0~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake

with the effect that the BLAS library can no longer be explicitly specified as part of the combination of builds.

Try swapping the ordering:

  specs:
  - matrix:
    - [$packages]
    - [$^blases]
    - [$^mpis]

Not sure why this would matter though


Awesome! Thanks @Scott. I too am a little puzzled about that. There are definitely some rough edges and pot-holes with spack, but enough that works that it is worth sticking with and improving IMO.

I think one of the most important things is to get some experience about “what works” and “what doesn’t”; this is a bit random though.

So once I swapped the ordering it worked:

$ spack concretize -f
==> Starting concretization pool with 12 processes
==> Environment concretized in 102.96 seconds.
==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-mpi@2019.9.304 ^openblas
 -   4u37wda  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  534uamu      ^intel-mpi@2019.9.304%intel@2019.3.199~external-libfabric build_system=generic arch=linux-rocky8-cascadelake
 -   q2dkwr2      ^openblas@0.3.21%intel@2019.3.199~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   zqxrc72          ^perl@5.26.3%intel@2019.3.199~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2019.3.199 ^openblas ^openmpi@4.1.4
 -   lfewrot  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   q2dkwr2      ^openblas@0.3.21%intel@2019.3.199~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   zqxrc72          ^perl@5.26.3%intel@2019.3.199~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake
[+]  l6qhjdb      ^openmpi@4.1.4%intel@2019.3.199~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-oneapi-mpi@2021.8.0 ^openblas
 -   2jdxkww  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  4bh3urz      ^intel-oneapi-mpi@2021.8.0%intel@2019.3.199~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky8-cascadelake
 -   q2dkwr2      ^openblas@0.3.21%intel@2019.3.199~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   zqxrc72          ^perl@5.26.3%intel@2019.3.199~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-mkl ^intel-mpi@2019.9.304
[+]  5f6jhkc  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  stscatf      ^intel-mkl@2019.3.199%intel@2019.3.199~ilp64+shared build_system=generic threads=none arch=linux-rocky8-cascadelake
[+]  534uamu      ^intel-mpi@2019.9.304%intel@2019.3.199~external-libfabric build_system=generic arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-mkl ^openmpi@4.1.4
[+]  e4vw6zm  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  stscatf      ^intel-mkl@2019.3.199%intel@2019.3.199~ilp64+shared build_system=generic threads=none arch=linux-rocky8-cascadelake
[+]  l6qhjdb      ^openmpi@4.1.4%intel@2019.3.199~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2019.3.199 ^intel-mkl ^intel-oneapi-mpi@2021.8.0
[+]  msa63vk  amdscalapack@3.1%intel@2019.3.199~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  z7fur2l      ^cmake@3.24.2%intel@2019.3.199~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  stscatf      ^intel-mkl@2019.3.199%intel@2019.3.199~ilp64+shared build_system=generic threads=none arch=linux-rocky8-cascadelake
[+]  4bh3urz      ^intel-oneapi-mpi@2021.8.0%intel@2019.3.199~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-mpi@2019.9.304 ^openblas
 -   khh5cqp  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  emdaboy      ^intel-mpi@2019.9.304%intel@2021.8.0~external-libfabric build_system=generic arch=linux-rocky8-cascadelake
 -   jk3c77c      ^openblas@0.3.21%intel@2021.8.0~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   avbz2gz          ^perl@5.26.3%intel@2021.8.0~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^openblas ^openmpi@4.1.4
 -   yfuoaqc  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
 -   jk3c77c      ^openblas@0.3.21%intel@2021.8.0~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   avbz2gz          ^perl@5.26.3%intel@2021.8.0~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake
[+]  2la5nbk      ^openmpi@4.1.4%intel@2021.8.0~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-oneapi-mpi@2021.8.0 ^openblas
 -   v3ypp26  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  gfoanpc      ^intel-oneapi-mpi@2021.8.0%intel@2021.8.0~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky8-cascadelake
 -   jk3c77c      ^openblas@0.3.21%intel@2021.8.0~bignuma~consistent_fpcsr+fortran~ilp64+locking+pic+shared build_system=makefile patches=d3d9b15 symbol_suffix=none threads=none arch=linux-rocky8-cascadelake
 -   avbz2gz          ^perl@5.26.3%intel@2021.8.0~cpanm+shared+threads build_system=generic patches=8cf4302 arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-mkl ^intel-mpi@2019.9.304
[+]  wzjrdco  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  mhwznkq      ^intel-mkl@2020.3.304%intel@2021.8.0~ilp64+shared build_system=generic threads=none arch=linux-rocky8-cascadelake
[+]  emdaboy      ^intel-mpi@2019.9.304%intel@2021.8.0~external-libfabric build_system=generic arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-mkl ^openmpi@4.1.4
[+]  mwqmkps  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  mhwznkq      ^intel-mkl@2020.3.304%intel@2021.8.0~ilp64+shared build_system=generic threads=none arch=linux-rocky8-cascadelake
[+]  2la5nbk      ^openmpi@4.1.4%intel@2021.8.0~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker+romio+rsh~singularity+static+vt+wrapper-rpath build_system=autotools fabrics=none schedulers=none arch=linux-rocky8-cascadelake

==> Concretized amdscalapack@3.1%intel@2021.8.0 ^intel-mkl ^intel-oneapi-mpi@2021.8.0
[+]  673qa5b  amdscalapack@3.1%intel@2021.8.0~ilp64~ipo~pic+shared build_system=cmake build_type=Release arch=linux-rocky8-cascadelake
[+]  c3tk4bf      ^cmake@3.24.2%intel@2021.8.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-rocky8-cascadelake
[+]  mhwznkq      ^intel-mkl@2020.3.304%intel@2021.8.0~ilp64+shared build_system=generic threads=none arch=linux-rocky8-cascadelake
[+]  gfoanpc      ^intel-oneapi-mpi@2021.8.0%intel@2021.8.0~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky8-cascadelake

and what is more, I could do this to parallelise the builds, which is, to put it mildly, really effing cool:

$ spack -e . env depfile -o Makefile                                                                                                           
$ less Makefile                                                                                                                                
$ make -j16

and the end result:

$ spack find -lpfdx
....

==> Installed packages
-- linux-rocky8-cascadelake / intel@2019.3.199 ------------------
4u37wda amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/amdscalapack-3.1-4u37wdaissm5fqgrr2sgcq7ecjswlxp5
z7fur2l     cmake@3.24.2%intel                /apps/cmake/3.24.2
534uamu     intel-mpi@2019.9.304%intel        /apps/intel
q2dkwr2     openblas@0.3.21%intel             /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/openblas-0.3.21-q2dkwr2letohsw5xhc7yqp3qohzwktk2
zqxrc72         perl@5.26.3%intel             /usr

msa63vk amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/amdscalapack-3.1-msa63vkagtblzcllg7o2lyshemlafdor
z7fur2l     cmake@3.24.2%intel                /apps/cmake/3.24.2
stscatf     intel-mkl@2019.3.199%intel        /apps/intel
4bh3urz     intel-oneapi-mpi@2021.8.0%intel   /apps/intel-oneapi

lfewrot amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/amdscalapack-3.1-lfewrotk5vhwmzbj3hziqbpopv3rtbd5
z7fur2l     cmake@3.24.2%intel                /apps/cmake/3.24.2
q2dkwr2     openblas@0.3.21%intel             /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/openblas-0.3.21-q2dkwr2letohsw5xhc7yqp3qohzwktk2
zqxrc72         perl@5.26.3%intel             /usr
l6qhjdb     openmpi@4.1.4%intel               /apps/openmpi/4.1.4

5f6jhkc amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/amdscalapack-3.1-5f6jhkcrqxe6htrlegr242vdt2z6k7sn
z7fur2l     cmake@3.24.2%intel                /apps/cmake/3.24.2
stscatf     intel-mkl@2019.3.199%intel        /apps/intel
534uamu     intel-mpi@2019.9.304%intel        /apps/intel

e4vw6zm amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/amdscalapack-3.1-e4vw6zmg4d4pvg5xppd32amwql4l3r6u
z7fur2l     cmake@3.24.2%intel                /apps/cmake/3.24.2
stscatf     intel-mkl@2019.3.199%intel        /apps/intel
l6qhjdb     openmpi@4.1.4%intel               /apps/openmpi/4.1.4

2jdxkww amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/amdscalapack-3.1-2jdxkww5jxwwbqlepsrmz5cnn2yjkczh
z7fur2l     cmake@3.24.2%intel                /apps/cmake/3.24.2
4bh3urz     intel-oneapi-mpi@2021.8.0%intel   /apps/intel-oneapi
q2dkwr2     openblas@0.3.21%intel             /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2019.3.199/openblas-0.3.21-q2dkwr2letohsw5xhc7yqp3qohzwktk2
zqxrc72         perl@5.26.3%intel             /usr


-- linux-rocky8-cascadelake / intel@2021.8.0 --------------------
khh5cqp amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/amdscalapack-3.1-khh5cqpou55pgun2hgbqp5urgt6lyxkl
c3tk4bf     cmake@3.24.2%intel                /apps/cmake/3.24.2
emdaboy     intel-mpi@2019.9.304%intel        /apps/intel
jk3c77c     openblas@0.3.21%intel             /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/openblas-0.3.21-jk3c77cermdzecbid5zyx476ypf264x2
avbz2gz         perl@5.26.3%intel             /usr

yfuoaqc amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/amdscalapack-3.1-yfuoaqcspznkerefdbwot7lt3jlexdcj
c3tk4bf     cmake@3.24.2%intel                /apps/cmake/3.24.2
jk3c77c     openblas@0.3.21%intel             /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/openblas-0.3.21-jk3c77cermdzecbid5zyx476ypf264x2
avbz2gz         perl@5.26.3%intel             /usr
2la5nbk     openmpi@4.1.4%intel               /apps/openmpi/4.1.4

v3ypp26 amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/amdscalapack-3.1-v3ypp26czp4npmvd7kz4lrnwreavc4tb
c3tk4bf     cmake@3.24.2%intel                /apps/cmake/3.24.2
gfoanpc     intel-oneapi-mpi@2021.8.0%intel   /apps/intel-oneapi
jk3c77c     openblas@0.3.21%intel             /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/openblas-0.3.21-jk3c77cermdzecbid5zyx476ypf264x2
avbz2gz         perl@5.26.3%intel             /usr

wzjrdco amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/amdscalapack-3.1-wzjrdcoepbisu2us27ycwrmevo2rbt2r
c3tk4bf     cmake@3.24.2%intel                /apps/cmake/3.24.2
mhwznkq     intel-mkl@2020.3.304%intel        /apps/intel
emdaboy     intel-mpi@2019.9.304%intel        /apps/intel

mwqmkps amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/amdscalapack-3.1-mwqmkpskddd7go3speyhjpy5vzthyfi6
c3tk4bf     cmake@3.24.2%intel                /apps/cmake/3.24.2
mhwznkq     intel-mkl@2020.3.304%intel        /apps/intel
2la5nbk     openmpi@4.1.4%intel               /apps/openmpi/4.1.4

673qa5b amdscalapack@3.1%intel                /scratch/v45/aph502/spack/opt/spack/linux-rocky8-cascadelake/intel-2021.8.0/amdscalapack-3.1-673qa5bicixhnhfagvfecectg3jkoxog
c3tk4bf     cmake@3.24.2%intel                /apps/cmake/3.24.2
mhwznkq     intel-mkl@2020.3.304%intel        /apps/intel
gfoanpc     intel-oneapi-mpi@2021.8.0%intel   /apps/intel-oneapi

==> 12 installed packages