I use dask_jobqueue to set up bespoke clusters within a JupyterLab notebook. I don't use the ARE, but rather a script to start the notebook, which seems to work fine with the new xp65 environment.
My problem occurs when I use dask_jobqueue to submit the PBS jobs that set up a cluster, which I do like this:
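Roughly, a setup along these lines (a hypothetical sketch; the resource options are illustrative, and the explicit python= path with the analysis3 version hard-coded is the part in question):

```python
from dask_jobqueue import PBSCluster

# Illustrative PBSCluster setup: the python= path has to be updated
# by hand whenever the analysis3 version changes.
cluster = PBSCluster(
    cores=4,
    memory="16GB",
    walltime="01:00:00",
    python="/g/data/xp65/public/apps/med_conda_scripts/analysis3-25.04.d/bin/python",
)
cluster.scale(jobs=2)
```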
@access-nri folks, it would be handy to have the path to python available as an environment variable from the module, rather than needing to set the version number here explicitly.
clairecarouge (Claire Carouge, ACCESS-NRI Land Modelling Team Lead)
@jpeter Can you please confirm that @Scott's answer solves your issue, so we can mark this topic as solved and close it?
Based on @Scott's suggestion, I have configured the DASK_JOBQUEUE__PBS__PYTHON environment variable to point to the Python interpreter in the container. That should fix your problem, @jpeter. It is available for all versions of the conda/analysis3 environment.
```
$ which python  # Should be used for jobqueue.pbs.python
/g/data/xp65/public/apps/med_conda_scripts/analysis3-25.04.d/bin/python
$ python -c 'import dask; import dask_jobqueue; print(dask.config.get("jobqueue.pbs.python"))'
None
```
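With DASK_JOBQUEUE__PBS__PYTHON set, Dask maps that environment variable onto the config key jobqueue.pbs.python, so the cluster can be created without passing python= explicitly. A minimal sketch, with illustrative resource options:

```python
from dask.distributed import Client
from dask_jobqueue import PBSCluster

# DASK_JOBQUEUE__PBS__PYTHON is picked up as jobqueue.pbs.python,
# so no explicit python= argument is needed here.
cluster = PBSCluster(
    cores=4,
    memory="16GB",
    walltime="01:00:00",
)
cluster.scale(jobs=2)
client = Client(cluster)
```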