Getting custom conda environment loaded into ARE environment

Evening all.

The World Climate Research Program 2025 Hackathon is trying to build a temporary python environment to get around the current disk issues on gadi.

We want to build a python environment on /scratch/nf33 which can then be used in ARE sessions.

The .yaml file is available from tools/python_envs/environment.yaml on the main branch of the digital-earths-global-hackathon/tools repository on GitHub.

So I downloaded and installed miniconda (because I can’t rely on the existing hh5/xp65 conda installs on /g/data) at this location:
/scratch/nf33/public/miniconda
and ran

conda env create --name digital_earths_env --file environment.yaml

Which created an environment in
/scratch/nf33/public/miniconda/envs/digital_earths_env
This environment can be activated from the command line and the requisite modules imported.
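For anyone reproducing this, a quick sanity check from a login node might look like the following. This is only a sketch: the paths are the ones from this thread, and the xarray import is my guess at one of the packages environment.yaml provides.

```shell
# Sketch of a sanity check for the new environment. The paths below are
# the ones from this thread; the xarray import is an assumed package.
MINICONDA=/scratch/nf33/public/miniconda
if [ -f "$MINICONDA/etc/profile.d/conda.sh" ]; then
    # standard hook script shipped with miniconda installs
    . "$MINICONDA/etc/profile.d/conda.sh"
    conda activate digital_earths_env
    python -c "import xarray; print(xarray.__version__)"
else
    echo "miniconda not found at $MINICONDA (run this on gadi)"
fi
```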

Now, the tricky part: loading it into ARE.

The NRI ARE docs state:

“Python or Conda virtual environment base
Path to a Python or conda base environment to be activated for the JupyterLab session. It is the equivalent to running source <path/to/environment/bin/activate> on the command line.”

The problem is, this <environment>/bin/activate file doesn’t exist, so my ARE sessions can’t find any kernels associated with the environment.

This is a known issue with new conda, see

I’ve read the suggested reply at

but can’t make sense of it.

I understand that we prefer to containerise conda installs these days, per Conda hh5 environment setup — CLEX CMS Wiki (and Scott Wales’ template), but that isn’t going to solve our immediate problem.

The only activate script I can find in this new environment is
/scratch/nf33/public/miniconda/envs/digital_earths_env/lib/python3.12/venv/scripts/common/activate

Now, before I symlink <environment>/bin/activate to that file (and I’m afraid of unintended consequences), does anyone have a work-around?

Or is that the solution? I’ve noticed that the hh5 bin/activate file is very different from envs/digital_earths_env/lib/python3.12/venv/scripts/common/activate.

Currently, if I specify this environment path to ARE, it can find the bin/jupyter directory, but no kernels are available.

EDIT : A temporary work-around was to build a python virtual environment on top of the conda environment.

cd /scratch/nf33/public   # create the venv alongside the conda install
python -m venv digital_earths_venv
source /scratch/nf33/public/digital_earths_venv/bin/activate
python -m ipykernel install --prefix /scratch/nf33/public/digital_earths_venv --name digital_earths_venv --display-name "digital_earths_kernel"
cd ~/.local/share/jupyter/kernels/
ln -s /scratch/nf33/public/digital_earths_venv/share/jupyter/kernels/digital_earths_venv
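A quick way to confirm the kernel registered after those steps (a sketch; it assumes jupyter is on your PATH and is a no-op otherwise):

```shell
# Sketch: list registered kernelspecs. The new digital_earths_venv entry
# should appear if the symlink above resolved correctly. Guarded so this
# does nothing where jupyter isn't installed.
if command -v jupyter >/dev/null 2>&1; then
    jupyter kernelspec list
else
    echo "jupyter not on PATH"
fi
```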

This allows the digital_earths conda environment to be loaded into ARE.

But surely there must be an easier way?

Mods - this topic is similar to How to create a personal Conda environment on Gadi when you really need one? so feel free to merge if required.

Environment activation can be manually done with

eval $($CONDA_PREFIX/bin/conda shell.bash activate ENV)

(this is what conda activate runs). You could put that in a new script file $CONDA_PREFIX/bin/activate and point the virtual environment base to $CONDA_PREFIX.
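A sketch of what that wrapper might look like. This is untested by me; the base path and env name are taken from earlier in the thread, and the demo writes to a local demo_bin directory so the sketch is runnable anywhere — on gadi the target would be the real <base>/bin/activate.

```shell
# Sketch of the suggested wrapper: a bin/activate script at the conda base
# that ARE can `source`. Written to ./demo_bin here for safety; on gadi the
# target would be /scratch/nf33/public/miniconda/bin/activate.
CONDA_BASE=/scratch/nf33/public/miniconda
mkdir -p demo_bin
cat > demo_bin/activate <<EOF
# Equivalent of \`conda activate digital_earths_env\` for a plain \`source\`
eval "\$("$CONDA_BASE/bin/conda" shell.bash activate digital_earths_env)"
EOF
cat demo_bin/activate
```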

The activate script you found is part of the venv package, it would be for activating virtual environments rather than conda environments.


Thanks a lot Scott for that explanation.

We are running with the extra virtual environment at the moment, I will test your solution when I can.

Hi Paul,
Based on my interactions on my local machine, and what I can see of your conda install, it looks like the “Python or Conda virtual environment base” should be /scratch/nf33/public/miniconda/ (ARE will look for /scratch/nf33/public/miniconda/bin/activate). Even though that path doesn’t specify the environment, I think it would give you the option of a digital_earths_env kernel, assuming that either ipykernel or jupyterlab is installed.
I can’t test this at the moment (I’m not part of nf33) but I will once I can.

I think I have a conda working with your environment.
Python or Conda virtual environment base: /scratch/nf33/public/miniconda
Conda environment: digital_earths_env

[Screenshot (2025-05-13): the ARE JupyterLab session running with this configuration]

Hi @jemmajeffree

Which kernel have you selected? What other kernels can you select?

I selected something with a bunch of random numbers and letters (bdccf2dc), and I only had the one option, which surprised me


Ah. I have learnt something.

I have six kernels available. Five of them are in my ~/.local/share/jupyter/kernels/ directory. The sixth is ‘Python 3 (ipykernel)’.

When I used previous ARE environments, the default kernel was usually linked to a standard python installation which didn’t have any of the required modules.

However, I now realise the ‘Python 3 (ipykernel)’ kernel is actually located at
/scratch/nf33/public/miniconda/envs/digital_earths_env/share/jupyter/kernels/python3/kernel.json

more /scratch/nf33/public/miniconda/envs/digital_earths_env/share/jupyter/kernels/python3/kernel.json

{
 "argv": [
  "/scratch/nf33/public/miniconda/envs/digital_earths_env/bin/python",
  "-m",
  "ipykernel_launcher",
  "-f",
  "{connection_file}"
 ],
 "display_name": "Python 3 (ipykernel)",
 "language": "python",
 "metadata": {
  "debugger": true
 }
}
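For what it’s worth, Jupyter simply reads these kernel.json specs, substitutes the connection file into argv, and launches it — which is why this kernel runs the environment’s own interpreter and sees all its packages. A small sketch, assuming nothing beyond the JSON above:

```python
import json

# The kernel.json shown above, verbatim. Jupyter substitutes
# {connection_file} into argv and launches it, so this kernel runs the
# environment's own python rather than a system one.
kernel_json = """
{
 "argv": [
  "/scratch/nf33/public/miniconda/envs/digital_earths_env/bin/python",
  "-m",
  "ipykernel_launcher",
  "-f",
  "{connection_file}"
 ],
 "display_name": "Python 3 (ipykernel)",
 "language": "python",
 "metadata": {
  "debugger": true
 }
}
"""
spec = json.loads(kernel_json)
print(spec["display_name"])  # name shown in the JupyterLab launcher
print(spec["argv"][0])       # interpreter the kernel actually runs
```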

So in this case the default Python 3 kernel works fine, and I didn’t need to build another virtual environment on top of it.

Thanks for the investigation.

You can install the conda package nb_conda_kernels in the same environment as jupyterlab if you want to be able to select kernels from other environments (you probably don’t need this for your hackathon)
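For reference, that would be something like the following (a sketch, guarded on the env path from this thread so it is a no-op anywhere else):

```shell
# Sketch: install nb_conda_kernels into the env that hosts jupyterlab so
# kernels from sibling conda envs become selectable in the launcher.
ENV_DIR=/scratch/nf33/public/miniconda/envs/digital_earths_env
if [ -d "$ENV_DIR" ]; then
    conda install -y -p "$ENV_DIR" nb_conda_kernels
else
    echo "env not found at $ENV_DIR (run this on gadi)"
fi
```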
