Hello. I would like to calculate neutral density (Jackett & McDougall, 1997) from ACCESS-OM2 (0.1-degree version in this case) output data, for example depth-latitude sections of neutral density. I found that the neutralocean module for omega surfaces is available on Gadi, but what I would like is neutral surface labels identical to the JM97 definition, for comparison with other datasets rather than for optimal neutrality. While one can label omega surfaces with neutral-density-like labels, they don't necessarily remain very similar away from a defined reference point, and the referencing still requires one JM97 calculation at that reference point.
I have the legacy MATLAB package for neutral surface computation, but are there Python modules that can do this and are readily available on Gadi?
Hi Aviv,
Sounds like you're pretty familiar with the neutral landscape already.
I believe the best way to calculate 1997 neutral density values in Python is to use Eric Firing's pygamma
code, which is a Python wrapper for the original Jackett and McDougall (1997) FORTRAN code.
There should be two public functions: one to calculate the gamma^n value of a given bottle (i.e. given S, T, P, lon, lat values), and one to calculate the pressure of a given gamma^n = const. isosurface (using a slightly more advanced method than a simple linear interpolation of gamma^n values).
The second function requires first pre-calculating gamma^n values for all the bottles in a cast. It sounds like the second one is what you want.
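For what it's worth, here is a minimal sketch of how those two calls look (the gamma_n signature matches the usage further down this thread; the neutral_surfaces argument order is my assumption from the underlying JM97 Fortran, so please check the package's test_all.py):
import numpy as np
import pygamma

# Two illustrative casts (rows) x three bottles (columns): practical salinity,
# in-situ temperature [deg C], pressure [dbar]. Values are made up for the sketch.
SP = np.array([[34.60, 34.70, 34.72],
               [34.50, 34.68, 34.71]])
T = np.array([[10.0, 4.0, 1.5],
              [8.0, 3.5, 1.0]])
p = np.array([[10.0, 800.0, 2000.0],
              [10.0, 800.0, 2000.0]])
lon = np.array([45.0, 45.0])   # one position per cast
lat = np.array([-50.0, -55.0])

# (1) Label every bottle with gamma^n; dg_lo/dg_hi are the lower/upper error estimates
gamma, dg_lo, dg_hi = pygamma.gamma_n(SP, T, p, lon, lat)

# (2) S, T and pressure on chosen gamma^n isosurfaces; this argument order is an
# assumption based on the JM97 Fortran neutral_surfaces routine -- verify with test_all.py
glevels = np.array([27.0, 27.9])
sns, tns, pns, dsns, dtns, dpns = pygamma.neutral_surfaces(SP, T, p, gamma, glevels)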
Beware that 1997 gamma^n performs fairly well when your ocean data is similar to the Levitus (1982) atlas upon which it was built, but the further your ocean data is from Levitus, the worse gamma^n will perform in terms of keeping its isosurfaces close to neutral.
Cheers,
Geoff
Thanks Geoff, good to know! Sounds like that can be a great solution.
I looked at the instructions for requesting new packages, and it is stated that "as a general rule we will only install packages from the 'conda-forge' channel". Hmm. pygamma is not on conda-forge as far as I can see (I tried "conda search conda-forge::pygamma" and variations of that). Do you know if there is a way around this @Aidan?
You can always load a conda environment and then do pip install <packagename> --user to install a package into your ~/.local directory. This works well for Python-only packages. In this case it is a Python front end to some Fortran code, so it might work, but I'm not confident, and it may break in the future with updates to the conda environment. However, it is worth giving it a crack to see how you go.
Otherwise I would contact CLEX CMS Help if you're affiliated with CLEX. In some cases CMS have created a conda package in their channel.
Thanks Aidan! The pip install did not work for me, but I'll contact the CLEX helpdesk at this point and write the results here later.
I got help from @Paola-CMS, who recommended installing pygamma in a user's home folder (e.g. mine) via the following approach, which has worked for me:
module unload conda/analysis3-22.10
module load conda/analysis3  # or whichever environment you prefer
module load gcc
module load netcdf
Download pygamma (press "zip" on the pygamma webpage), scp it to your home folder, unzip it, rename the folder (pygamma-2172748e338d) to, e.g., pygamma, cd into the folder, and then:
pip install . --user
Then you can import the pygamma module in your Jupyter notebook as well as in a Python terminal session (via "import pygamma"). There are various README files in the package, and the test_all.py file there is also a good demonstration of the interface.
Btw, there is a completely different package called pygamma (has to do with magnetic resonance) which can be installed automatically with pip. This unrelated package might be installed if you do something like "pip install pygamma", so beware.
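A quick sanity check after installing, to make sure Python picked up the neutral-density pygamma and not the unrelated one (just a minimal sketch):
import pygamma

# Should print a path under your ~/.local (or wherever pip installed it)
print(pygamma.__file__)

# The JM97 package exposes gamma_n (used later in this thread); if this is False
# you most likely got the unrelated magnetic-resonance package
print(hasattr(pygamma, 'gamma_n'))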
I am trying to do this but I don't understand how to install pygamma once I have it as a folder in my home directory…
cd into the folder, and then do:
pip install . --user
Hi Aviv! How would you feel about uploading your code to do this to COSIMA recipes? I was interested in doing something similar and it would be great to see an example.
(By "this" I mean code that calculates neutral density for a given transect.)
Hi Julia
Sure, happy to contribute a recipe. Were you successful in installing and importing pygamma as a module, btw? I'm asking because I am now getting an error when I try to import pygamma into a notebook running through ARE. I originally did this when running notebooks through the gadi_jupyter command-line tool. I verified that the latter still works: no error is returned when I import pygamma through a gadi_jupyter notebook, and the neutral density looks fine when calculated and plotted. With ARE, I'm getting:
import pygamma

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
Cell In[3], line 1
----> 1 import pygamma

File ~/pygamma/__init__.py:35
     31 if not os.path.exists(ncfile):
     32     raise ImportError("Cannot find data file, %s" % ncfile)
---> 35 from pygamma import _gamma
     36 from pygamma._gamma import neutral_surfaces as _neutral_surfaces
     38 def _check(ierr):

ImportError: cannot import name '_gamma' from partially initialized module 'pygamma' (most likely due to a circular import) (/home/552/as2408/pygamma/__init__.py)
Not sure why there is an error (apparently a circular import) with ARE but not with gadi_jupyter. Does anyone have a clue what might be happening here?
I contacted the CMS helpdesk, and @dale.roberts found the issue and provided a simple fix, quoted below for completeness, for anyone who might install the package:
The issue is specifically that there is an init.py file found in /home/552/as2408/pygamma/. On ARE, the working directory for your jupyterlab sessions is always your home directory. When you run import pygamma, python has an order in which it will check whether there is something that it can import, and the top of that order is the current working directory. If you run import some_module, and your working directory has a directory called some_module with an init.py file in it, that is what python will attempt to import, regardless of what you've pip install'd previously. If that fails, it will not fall back to anything else, as the code within python that checks for things it thinks it can import is separate from the code that actually performs the import. If you rename the pygamma directory in your home directory to anything else, python will pick up the pip install'd pygamma, and the import will succeed.
I'll try to find time soon to add a COSIMA recipe (which I have never done) for the pygamma package. In the meantime, if anyone wants the specific code I wrote for my purpose, I can of course send it their way.
Hi @avivsolo, looks like the formatting ate the underscores there. For future reference, it's the presence of __init__.py that causes python to think a directory is an importable module. By default, the current working directory is the first place python checks for something to import, and on ARE the working directory defaults to your home directory.
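For anyone who hits this, a quick way to see which copy of pygamma Python is actually importing (a small diagnostic sketch):
import sys

# The working directory sits at (or near) the front of the search path in a Jupyter
# session, which is why a local ./pygamma/ containing __init__.py shadows the
# pip install'd copy
print(sys.path[:3])

import pygamma
# If this points into a checkout in your home directory rather than ~/.local,
# the local folder is the one being imported
print(pygamma.__file__)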
Thank you Aviv! Sorry for the late reply. I'd be interested in the code, but of course also happy to wait if you'd rather upload it to recipes.
No problem. Until I upload a cookbook recipe, here is a mini-recipe which loads ACCESS-OM2-01 cycle 3 data in a Southern Ocean latitudinal section, and calculates and plots the neutral density in that section (figure attached):
%matplotlib inline
import cosima_cookbook as cc
import matplotlib.pyplot as plt
import xarray as xr
import gsw
import pygamma
expt = '01deg_jra55v140_iaf_cycle3';
session = cc.database.create_session()
t1 = '1999-01-01 00:00:00'; t2 = '1999-02-01 00:00:00';
lat_slice = slice(-70,-40)
lon0 = 45 # degrees E
pot_rho_2 = cc.querying.getvar(expt=expt, variable='pot_rho_2', session=session, frequency='1 monthly',
start_time=t1, end_time=t2) #potential density rho_2
CT = cc.querying.getvar(expt=expt, variable='temp', session=session, frequency='1 monthly',
start_time=t1, end_time=t2) #Conservative temperature
SP = cc.querying.getvar(expt=expt, variable='salt', session=session, frequency='1 monthly',
start_time=t1, end_time=t2) #Practical salinity
CT = CT.sel(yt_ocean=lat_slice).sel(time=slice(t1,t2))
SP = SP.sel(yt_ocean=lat_slice).sel(time=slice(t1,t2))
CT_sec = CT.isel(time=0).sel(xt_ocean=lon0,method='nearest').load()-273.15 # convert from Kelvin to degrees Celsius
SP_sec = SP.isel(time=0).sel(xt_ocean=lon0,method='nearest').load()
st_ocean = cc.querying.getvar(expt,'st_ocean',session,n=1)
depth = -st_ocean.values # depth as negative height [m], as expected by gsw.p_from_z
depth = xr.DataArray(depth, coords = [st_ocean], dims = ['st_ocean'])
depth_array = depth*(CT_sec[0,...]*0+1) # broadcast depth to the 2D (depth, latitude) section shape
depth_array = depth_array.load()
depth_array.shape
yt_ocean = CT_sec.yt_ocean
lat_array = (depth_array*0+1)*yt_ocean # broadcast latitude to the 2D section shape
lat_array = lat_array.load()
xt_ocean = CT_sec.xt_ocean
lon_array = (depth_array*0+1)*xt_ocean # broadcast longitude to the 2D section shape
lon_array = lon_array.load()
pressure = gsw.p_from_z(depth_array,lat_array) # sea pressure [dbar] from height [m]
pressure = pressure.load()
SA_sec = gsw.SA_from_SP(SP_sec.values, pressure.values, lon_array.values, lat_array.values) # Absolute Salinity
T_sec = gsw.t_from_CT(SA_sec, CT_sec.values, pressure.values) # in-situ temperature, as required by gamma_n
gamma, dg_lo, dg_hi = pygamma.gamma_n(SP_sec.T, T_sec.T, pressure.T, (yt_ocean*0+1)*lon0, yt_ocean) # dg_lo/dg_hi are error estimates
gamma = gamma.T # back to (depth, latitude)
fig = plt.figure(figsize=(16, 7))
ax = plt.subplot(111);
p1 = plt.pcolor(CT_sec.yt_ocean,-CT_sec.st_ocean,gamma,vmin=26.5,vmax=28.3)
plt.contour(CT_sec.yt_ocean,-CT_sec.st_ocean,gamma,levels=[27,27.7,28,28.2,28.27],colors='k',linewidths=0.25)
plt.xlabel(r'Latitude [$^{\circ}$]'); plt.ylabel('Depth [m]')
plt.title(r'Neutral density $\gamma_n$ [kg m$^{-3}$], at longitude='+str(lon0)+r'$^{\circ}$E.')
ax_cb = plt.axes([0.9, 0.15, 0.007, 0.7])
cb = plt.colorbar(p1,cax=ax_cb, orientation='vertical');

fn = 'NeutralDensLatSec_lon'+str(lon0)+'E_example.png'
plt.savefig(fn,dpi=250,facecolor='w',bbox_inches = 'tight')
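As a small follow-up (not part of the recipe itself): if you also want the depth of a particular gamma^n surface along the section, a simple vertical interpolation of the gamma array computed above is enough for plotting purposes (pygamma's neutral_surfaces does this more carefully); the target value of 27.7 below is just an example:
import numpy as np

# Depth of the gamma_n = 27.7 surface at each latitude, by linear interpolation
# in the vertical; assumes gamma increases monotonically with depth in each column,
# and that missing values are NaN (mask any fill values, e.g. -99, first)
target = 27.7
zvals = -CT_sec.st_ocean.values               # negative depths [m], surface to bottom
surf_depth = np.full(gamma.shape[1], np.nan)
for j in range(gamma.shape[1]):
    col = gamma[:, j]
    good = np.isfinite(col)
    if good.sum() > 1 and col[good][0] <= target <= col[good][-1]:
        surf_depth[j] = np.interp(target, col[good], zvals[good])

plt.figure()
plt.plot(CT_sec.yt_ocean, surf_depth)
plt.xlabel(r'Latitude [$^{\circ}$]'); plt.ylabel('Depth [m]')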
This is great Aviv! Thank you very much
That's a great recipe! It just needs a few sentences of explanation, nothing much!
I'd urge you to open a PR with this in GitHub - COSIMA/cosima-recipes: Example recipes for analyzing model output using the cosima-cookbook infrastructure