Aakash Sane et al.

We demonstrate the use of information theory metrics, Shannon entropy and mutual information, for measuring internal and forced variability in general circulation coastal and global ocean models. These metrics are applied to spatially and temporally averaged data. A combined metric reliably delineates intrinsic and extrinsic variability in a wider range of circumstances than previous approaches based on variance ratios, which assume Gaussian distributions. Shannon entropy and mutual information handle correlated fields, apply to any distribution, and are insensitive to outliers and to changes of units or scale. Different metrics are used to quantify internal versus forced variability in (1) idealized Gaussian and uniformly distributed data, (2) an initial-condition ensemble of a realistic coastal ocean model (OSOM), and (3) the GFDL-ESM2M climate model large ensemble. A metric based on information theory partly agrees with the traditional variance-based metric and identifies regions where non-linear correlations might exist. Mutual information and Shannon entropy are also used to quantify the impact of different boundary forcings in the coastal ocean model ensemble; information theory enables ranking the potential impacts of improving boundary and forcing conditions across multiple predicted variables with different dimensions. The climate model ensemble application shows that the information theory metrics remain robust even for a highly skewed probability distribution (Arctic sea surface temperature) resulting from sharply non-linear behavior near the freezing point.
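As a rough illustration of the kind of histogram-based estimators these metrics rely on (a minimal sketch, not the authors' implementation), the Python snippet below computes Shannon entropy and mutual information for a toy ensemble. The binning choice, the use of the time index as a proxy for the shared forcing, and the synthetic data are illustrative assumptions only.

```python
import numpy as np

def shannon_entropy(x, bins=32):
    """Shannon entropy (bits) of a 1-D sample via a histogram estimate."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=32):
    """Mutual information I(X;Y) in bits from a joint histogram estimate."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

# Toy ensemble at one grid point: shape (members, time).
# Mutual information between the ensemble values and the time index
# (a stand-in for the common forcing) gauges forced variability, while
# the entropy of anomalies about the ensemble mean gauges internal spread.
rng = np.random.default_rng(0)
forced = np.sin(np.linspace(0, 4 * np.pi, 200))        # shared forced signal
ens = forced + 0.3 * rng.standard_normal((20, 200))    # 20-member ensemble

t = np.broadcast_to(np.arange(200), ens.shape).ravel().astype(float)
print("H(anomaly) =", shannon_entropy((ens - ens.mean(axis=0)).ravel()))
print("I(X; t)    =", mutual_information(ens.ravel(), t))
```

Because both quantities are built from probabilities rather than second moments, they are unchanged under rescaling or unit changes of the data, which is the robustness property the abstract emphasizes.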