# True Zonal Mean

Calculate the true zonal mean of a scalar quantity regardless of the horizontal mesh.

Specifically, we calculate the volume weighted mean along all grid cells whose centres fall within finite latitude intervals rather than the arithmetic mean of cells along the model’s curvilinear grid. The method presented can also be used to re-grid models onto the same latitudinal grid and the general principles can be used to define any multidimensional sum or average using the xhistogram package.

Requirements: Select the conda/analysis3-20.01 (or later) kernel. This code should work for just about any MOM5 configuration, since all we are grabbing is temperature and standard grid information. You can swap temperature for any other scalar variable, and in principle you can also swap latitude for another scalar.

Last updated by Jan Zika 8-4-2020

import cosima_cookbook as cc
import matplotlib.pyplot as plt
import xarray as xr
import numpy as np
from dask.distributed import Client
from xhistogram.xarray import histogram  # this package has only recently been added to conda/analysis3-20.01

client = Client(n_workers=4)
client


session = cc.database.create_session()


Choose the run and the variable we want to average. I am using temperature but you can choose any scalar.

expt =  '025deg_jra55v13_ryf8485_gmredi6' #Choose any run
variable = 'temp' #any scalar variable where volume weighted averaging makes sense
variable_to_average = cc.querying.getvar(expt, variable, session, ncfile='ocean.nc')
variable_to_average

[Output: the DataArray 'temp' with dimensions (time: 298, st_ocean: 50, yt_ocean: 1080, xt_ocean: 1440), a 92.69 GB float32 dask array with chunks of (1, 10, 216, 288). Its coordinates are yt_ocean (tcell latitude, degrees_N, -81.08 to 89.95), xt_ocean (tcell longitude, degrees_E, -279.9 to 79.88), st_ocean (tcell zstar depth, meters, positive down, 1.15 to 5253.66) and time (NOLEAP calendar, yearly timestamps from 1900-07-02 to 2197-07-02). The variable's long_name is 'Conservative temperature', with units of deg_C.]

First we will show the standard approach, which is to take the arithmetic mean of all grid cells along the quasi-longitudinal coordinate. For MOM5's tripolar grid this approach is in principle “okay” for the southern hemisphere, where grid cell areas are constant at fixed latitude. It does not, however, take partial cells into account.

xarray's .mean(dim='dimension') method applies numpy.mean() along that dimension. This is simply the arithmetic mean.

For some scalar $$T$$ the arithmetic mean is given by

$\left<T\right>_{j,k} = \frac{1}{I}\sum_{i=1}^{I} T_{i,j,k},$

where $$i$$, $$j$$ and $$k$$ are the indices in the $$x$$, $$y$$ and $$z$$ directions respectively of the curvilinear grid and $$I$$ is the number of indices along the $$x$$ axis.
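As a quick sanity check, the toy sketch below (dimension names mirror the model's, but the values are made up) confirms that xarray's .mean along a named dimension is exactly this sum:

```python
import numpy as np
import xarray as xr

# Toy data; dimension names mirror the model's but values are made up.
T = xr.DataArray(np.arange(24.0).reshape(2, 3, 4),
                 dims=['st_ocean', 'yt_ocean', 'xt_ocean'])

# xarray's .mean along a named dimension ...
xr_mean = T.mean(dim='xt_ocean')

# ... is exactly the arithmetic mean (1/I) * sum_i T_{i,j,k}.
explicit = T.sum(dim='xt_ocean') / T.sizes['xt_ocean']

print(bool(np.allclose(xr_mean, explicit)))  # True
```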

x_arith_mean = variable_to_average.groupby('time.year').mean(dim='time').mean(dim='xt_ocean')

plt.figure(figsize=(10, 5))
x_arith_mean.sel(year=2000).plot(yincrease=False, vmin=273, vmax=300, cmap='Oranges')
plt.title('x-coordinate arithmetic mean');


The main issue with this average is that the ‘latitude’ coordinate may be meaningless near the north pole, particularly when comparing with observational analyses or other models, which will have either a regular grid or a different curvilinear grid. Even different versions of MOM might have different grids.

Let us consider what the true zonal average looks like. That is, consider a set of latitude ‘edges’ $$\{\phi'_{1/2},\phi'_{1+1/2},...,\phi'_{\ell-1/2},\phi'_{\ell+1/2},...,\phi'_{L+1/2}\}$$ between which we want to compute an average of $$T$$, located at $$\{\phi'_{1},\phi'_{2},...,\phi'_{\ell},...,\phi'_{L}\}$$, such that

$\overline{T}(\phi'_\ell,\sigma) = \dfrac{\iint_{\phi'_{\ell-1/2} < \phi \leq \phi'_{\ell+1/2}} T(\phi,\lambda,\sigma)\frac{\partial z}{\partial \sigma}(\phi,\lambda,\sigma)\,\mathrm{d}A}{\iint_{\phi'_{\ell-1/2} < \phi \leq \phi'_{\ell+1/2}}\frac{\partial z}{\partial \sigma}(\phi,\lambda,\sigma)\,\mathrm{d}A},$

where $$\lambda$$ is longitude and $$\sigma$$ is an arbitrary vertical coordinate.

In discrete form this average is

$\overline{T}_{\ell,k} = \frac{\sum_{i=1}^{I}\sum_{j=1}^{J}\delta_{i,j}T_{i,j,k}\Delta Z_{i,j,k}\Delta Area_{i,j}}{\sum_{i=1}^{I}\sum_{j=1}^{J}\delta_{i,j}\Delta Z_{i,j,k}\Delta Area_{i,j}},$

where $$\delta_{i,j} = 1$$ if $$\phi'_{\ell-1/2}<\phi_{i,j}\leq \phi'_{\ell+1/2}$$ and $$\delta_{i,j} = 0$$ elsewhere, $$\Delta Z$$ is the grid cell vertical thickness and $$\Delta Area$$ is the grid cell horizontal area.

For our purposes we will use the latitude edges of the model's yt_ocean coordinate to define $$\phi'_{\ell+1/2}$$, so the number of ‘bins’ $$L$$ will be the same as the length of the quasi-latitude coordinate ($$J$$).

Fortunately, as you can see below, the two sums are weighted histograms (one for $$T$$ times volume and the other for just volume) and these can be rapidly computed using xhistogram.
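To illustrate that equivalence, here is a sketch on a made-up single-level grid, with np.histogram standing in for xhistogram (all names and values are toy data; note that numpy's bins are closed on the left rather than on the right as in the equation above, which rarely matters for cell centres):

```python
import numpy as np

# Toy single-level curvilinear grid; all values here are made up.
J, I = 6, 8
T = np.arange(J * I, dtype=float).reshape(J, I)   # scalar to average
lat = np.linspace(-79, 79, J * I).reshape(J, I)   # cell-centre latitudes
vol = 1.0 + np.arange(J * I).reshape(J, I) % 5    # cell volumes (dz * area)

edges = np.linspace(-90, 90, 7)                   # latitude bin edges

# The numerator and denominator of the discrete average are weighted
# histograms of latitude: one weighted by T*vol, one by vol alone.
num, _ = np.histogram(lat, bins=edges, weights=T * vol)
den, _ = np.histogram(lat, bins=edges, weights=vol)
binned_mean = num / den

# Cross-check the first bin against the explicit masked sums.
mask = (lat >= edges[0]) & (lat < edges[1])
explicit = (T * vol)[mask].sum() / vol[mask].sum()
print(np.isclose(binned_mean[0], explicit))  # True
```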

First let's load the scalar variable (latitude) we want to use as our coordinate, then define the bin edges.

coord = 'geolat_t'  # can be any scalar (2D, 3D, Eulerian, Lagrangian, etc.)
variable_as_coord = cc.querying.getvar(expt, coord, session, n=1)  # might need ncfile='ocean.nc' if a tracer

#Define the coordinate bins as the latitude edges of the T-cells
yu_ocean = cc.querying.getvar(expt, 'yu_ocean', session, n=1)
#make a numpy array (using .values) and add a first edge at -90
bins = np.insert(yu_ocean.values, 0, np.array(-90), axis=0)

#Alternatively we could just use a regular grid like this
#bins = np.linspace(-80, 90, 50)
#or use a grid from a different (coarser) model.
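A minimal check of this edge construction, using a hypothetical stand-in for yu_ocean.values:

```python
import numpy as np

# Hypothetical stand-in for yu_ocean.values (northern edges of T-cells).
yu = np.array([-80.0, -60.0, 0.0, 60.0, 90.0])

# Prepend -90 so cells south of the first edge still land in a bin.
bins = np.insert(yu, 0, -90.0)

print(len(bins) == len(yu) + 1)   # True: one more edge than before
print(np.all(np.diff(bins) > 0))  # True: edges strictly increasing
```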

dzt = cc.querying.getvar(expt, 'dzt', session, ncfile='ocean.nc') #thickness of cells
area_t = cc.querying.getvar(expt, 'area_t', session, n=1) #area of cells
dVol = dzt*area_t #Volume of cells


Now let’s compute the numerator and denominator of the equation above using xhistogram, then the time mean and then the zonal mean.

histVolCoordDepth = histogram(variable_as_coord.broadcast_like(dVol).where(~np.isnan(dVol)), bins=[bins], weights=dVol, dim=['yt_ocean', 'xt_ocean'])
histTVolCoordDepth = histogram(variable_as_coord.broadcast_like(dVol).where(~np.isnan(dVol)), bins=[bins], weights=dVol*variable_to_average, dim=['yt_ocean', 'xt_ocean'])
coord_mean = (histTVolCoordDepth/histVolCoordDepth).groupby('time.year').mean(dim='time')
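One reassuring property of this construction is that binning only redistributes the weights across bins, so the histograms conserve the global volume and tracer totals. A toy numpy sketch (np.histogram standing in for xhistogram, with made-up data):

```python
import numpy as np

# Toy data: latitudes all lie inside the outermost edges.
lat = np.linspace(-79, 79, 24).reshape(4, 6)
w = np.linspace(1.0, 2.0, 24).reshape(4, 6)   # stand-in for the T*vol weights

edges = np.linspace(-90, 90, 10)
binned, _ = np.histogram(lat, bins=edges, weights=w)

# Binning only redistributes the weights across bins, so the bin totals
# conserve the global sum; the same holds for the numerator and
# denominator histograms used for the true zonal mean.
print(np.isclose(binned.sum(), w.sum()))  # True
```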


Now we can plot the results, which thankfully retain all the DataArray metadata (variable names, axes, etc.).

plt.figure(figsize=(10, 5))
coord_mean.sel(year=2000).plot(yincrease=False, vmin=273, vmax=300, cmap='Oranges')
plt.title('True zonal mean');


Since we used the same bin edges as the standard yt_ocean coordinate, we can take the difference between the arithmetic mean along the model's x-axis and our mean along grid cells within latitude bands. The main differences are near the North Pole, where the grid is furthest from being regular. There are also differences near the Antarctic Shelf, suggesting partial cells also matter.

zonal_minus_x_mean = coord_mean.sel(year=2000)-x_arith_mean.sel(year=2000).values

plt.figure(figsize=(10, 5))
zonal_minus_x_mean.plot(yincrease=False, vmin=-0.8, vmax=0.8, cmap='RdBu_r')
plt.title('True zonal minus x-coordinate arithmetic mean');


xarray has new weighted functionality which allows it to compute weighted means instead of arithmetic means.

Let’s see how that works out… We choose dVol as the weights and only do the comparison for the year 2000.

variable_to_average_y2000 = variable_to_average.sel(time='2000')
weights_y2000 = dVol.sel(time='2000').fillna(0)
meanweighted_y2000 = variable_to_average_y2000.weighted(weights_y2000).mean(dim='xt_ocean').mean(dim='time')

zonal_minus_x_mean = coord_mean.sel(year=2000) - meanweighted_y2000.values

plt.figure(figsize=(10, 5))
zonal_minus_x_mean.plot(yincrease=False, vmin=-0.8, vmax=0.8, cmap='RdBu_r')
plt.title("True zonal minus xarray's x-coordinate weighted mean");
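As a sketch of what .weighted does under the hood (toy data, with dimension names mirroring the model's), the weighted mean is just sum(w·T)/sum(w) along the chosen dimension:

```python
import numpy as np
import xarray as xr

# Toy data; dimension names mirror the model's but values are made up.
da = xr.DataArray(np.arange(12.0).reshape(3, 4),
                  dims=['yt_ocean', 'xt_ocean'])
w = xr.DataArray(np.linspace(1.0, 2.0, 4), dims=['xt_ocean'])

# xarray's weighted mean along x ...
weighted = da.weighted(w).mean(dim='xt_ocean')

# ... is exactly sum(w*T)/sum(w) along the same dimension.
manual = (da * w).sum(dim='xt_ocean') / w.sum()

print(bool(np.allclose(weighted, manual)))  # True
```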


South of 65°N, where complications of the tripolar grid don’t matter, xarray’s weighted mean does the job. But in the region of the tripolar grid we need to be more careful.

Download python script: True_Zonal_Mean.py

Download Jupyter notebook: True_Zonal_Mean.ipynb