Ocean reanalysis is a method of combining historical ocean observations with a general ocean model (typically a numerical ocean circulation model), driven by historical estimates of surface winds, heat, and freshwater fluxes, by way of a data assimilation algorithm in order to reconstruct historical changes in the state of the ocean.
Historical observations are sparse and insufficient for understanding the history of the ocean and its circulation. By utilizing data assimilation techniques in combination with advanced computational models of the global ocean, researchers are able to interpolate the historical observations to all points in the ocean. This process has an analog in the construction of atmospheric reanalysis and is closely related to ocean state estimation.
Current projects
A number of efforts have been initiated in recent years to apply data assimilation to estimate the physical state of the ocean, including temperature, salinity, currents, and sea level.[1] There are three alternative state estimation approaches. The first approach is used by the 'no-model' analyses, in which temperature or salinity observations update a first guess provided by climatological monthly estimates.
The second approach is that of the sequential data assimilation analyses, which move forward in time from a previous analysis using a numerical simulation of the evolving temperature and other variables produced by an ocean general circulation model. The simulation provides the first guess of the state of the ocean at the next analysis time, and corrections are then made to this first guess based on observations of variables such as temperature, salinity, or sea level.
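As a schematic illustration only, the forecast-analysis cycle described above can be sketched in a few lines of Python. The toy "model", the observation operator H, and the fixed gain K below are hypothetical placeholders for the far larger operators used in a real reanalysis system; the sequential methods discussed later differ mainly in how the gain is computed.

```python
# Hypothetical sketch of a sequential forecast-analysis cycle:
# the model forecast supplies the first guess, and observations correct it.
import numpy as np

def forecast(x_prev):
    """Stand-in for one step of an ocean general circulation model (toy)."""
    return 0.99 * x_prev  # a real OGCM integrates the primitive equations

def analysis(x_f, y, H, K):
    """Correct the first guess x_f using observations y and a gain matrix K."""
    innovation = y - H @ x_f      # observation minus forecast
    return x_f + K @ innovation   # analyzed (corrected) state

# Toy setup: 5 state variables (e.g., temperatures), 2 observed locations.
H = np.array([[1, 0, 0, 0, 0],
              [0, 0, 1, 0, 0]], dtype=float)   # observation operator
K = 0.5 * H.T                                  # crude fixed gain; OI/Kalman filters derive K from error covariances

x = np.full(5, 15.0)                           # first guess: climatological temperature (deg C)
for y_obs in ([15.4, 14.8], [15.6, 14.9]):     # two successive analysis times
    x = forecast(x)
    x = analysis(x, np.asarray(y_obs), H, K)
print(x)
```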
The third approach is four-dimensional variational assimilation (4D-Var). In the implementation described here, the initial conditions and surface forcing are used as control variables that are adjusted, through iterative solution of a very large optimization problem, so that the resulting state is consistent with both the observations and a numerical representation of the equations of motion.
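For reference, the cost function minimized in a strong-constraint 4D-Var system can be written schematically as follows; the exact control vector and weighting matrices of any particular implementation (including the adjustments to surface forcing mentioned above) differ in detail, so this is only the generic textbook form:

$$
J(x_0, \mathbf{f}) \;=\; \tfrac{1}{2}\,(x_0 - x_b)^{\mathsf T}\mathbf{B}^{-1}(x_0 - x_b)
\;+\; \tfrac{1}{2}\sum_{k=0}^{N}\big(y_k - H_k(x_k)\big)^{\mathsf T}\mathbf{R}_k^{-1}\big(y_k - H_k(x_k)\big),
\qquad x_{k+1} = M_k(x_k, \mathbf{f}),
$$

where $x_0$ is the initial state, $\mathbf{f}$ the surface forcing adjustments, $x_b$ a prior (background) estimate, $\mathbf{B}$ and $\mathbf{R}_k$ the background and observation error covariances, $y_k$ the observations, $H_k$ the observation operators, and $M_k$ the numerical model advancing the state in time. A similar quadratic penalty on the forcing adjustments is typically added as well.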
Methodologies
No-model approach
ISHII and LEVITUS begin with a first guess of the climatological monthly upper-ocean temperature based on climatologies produced by the NOAA National Oceanographic Data Center. The innovations (the differences between the observations and this first guess) are mapped onto the analysis levels. ISHII uses an alternative 3D-Var approach to do the objective mapping, with a shorter decorrelation scale in midlatitudes (300 km) that elongates in the zonal direction by a factor of three at equatorial latitudes. LEVITUS begins similarly to ISHII, but uses the technique of Cressman and Barnes with a homogeneous scale of 555 km to objectively map the temperature innovations onto a uniform grid.
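A minimal sketch of such an objective mapping, assuming a simple Gaussian (Barnes-type) distance weighting with a single decorrelation scale, is given below. The function name, observation positions, and weights are illustrative and do not reproduce the actual ISHII or LEVITUS schemes, which use latitude-dependent, anisotropic scales and successive-correction refinements.

```python
# Hypothetical sketch of objective mapping of temperature innovations
# (observation minus climatological first guess) onto analysis grid points,
# using Gaussian distance weighting with a prescribed decorrelation scale.
import numpy as np

def objective_map(obs_xy, obs_innov, grid_xy, scale_km=300.0):
    """Map sparse innovations to grid points with Gaussian weights."""
    mapped = np.zeros(len(grid_xy))
    for i, g in enumerate(grid_xy):
        d = np.linalg.norm(obs_xy - g, axis=1)   # distances to each observation (km)
        w = np.exp(-(d / scale_km) ** 2)         # Gaussian weights
        if w.sum() > 1e-12:
            mapped[i] = np.dot(w, obs_innov) / w.sum()
    return mapped

# Toy example: two profile innovations mapped onto three grid points.
obs_xy = np.array([[0.0, 0.0], [400.0, 0.0]])    # observation positions (km)
obs_innov = np.array([+0.5, -0.2])               # temperature innovations (deg C)
grid_xy = np.array([[0.0, 0.0], [200.0, 0.0], [800.0, 0.0]])
print(objective_map(obs_xy, obs_innov, grid_xy))
```

The analyzed temperature at each grid point is then the climatological first guess plus the mapped innovation.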
Sequential approaches
The sequential approaches can be further divided into those using optimal interpolation and its more sophisticated cousin, the Kalman filter, and those using 3D-Var. Among those mentioned above, INGV and SODA use versions of optimal interpolation, while CERFACS, GODAS, and GFDL all use 3D-Var. "To date we are unaware of any attempt to use Kalman Filter for multi-decadal ocean reanalyses."[1] However, the 4-Dimensional Local Ensemble Transform Kalman Filter (4D-LETKF) has been applied to the Geophysical Fluid Dynamics Laboratory's (GFDL) Modular Ocean Model (MOM2) for a seven-year ocean reanalysis covering January 1997 to 2004.[2]
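For orientation, optimal interpolation and the Kalman filter share the same analysis step, written below in its standard textbook form; they differ in whether the background error covariance $\mathbf{B}$ is prescribed (optimal interpolation) or evolved in time (Kalman filter and ensemble variants such as the LETKF), while 3D-Var obtains an equivalent analysis by iteratively minimizing the corresponding cost function rather than forming the gain matrix explicitly:

$$
x_a \;=\; x_f + \mathbf{K}\,\big(y - H x_f\big),
\qquad
\mathbf{K} \;=\; \mathbf{B}H^{\mathsf T}\big(H\mathbf{B}H^{\mathsf T} + \mathbf{R}\big)^{-1},
$$

where $x_f$ is the model first guess, $x_a$ the analysis, $y$ the observations, $H$ the observation operator, and $\mathbf{R}$ the observation error covariance.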
Variational (4D-Var) approach
One innovative attempt has been made by GECCO to apply 4D-Var to the decadal ocean estimation problem. This approach faces daunting computational challenges, but it provides some interesting benefits, including the satisfaction of some conservation laws and the construction of an adjoint of the ocean model.
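To make the role of the adjoint concrete, the sketch below assumes a linear toy model whose adjoint is simply the matrix transpose, so the gradient of the observation misfit with respect to the initial state can be accumulated by a backward sweep. The matrices, observation times, and the steepest-descent loop are illustrative stand-ins for the ocean model adjoint and the quasi-Newton minimization used in a real 4D-Var system, and the background term of the cost function is omitted for brevity.

```python
# Hypothetical 4D-Var sketch for a linear toy model x_{k+1} = M @ x_k:
# a forward sweep computes the misfit, a backward (adjoint) sweep with M.T
# accumulates the gradient with respect to the initial state x0.
import numpy as np

M = np.array([[0.95, 0.10],
              [0.00, 0.90]])              # toy linear "ocean model"
H = np.array([[1.0, 0.0]])                # observe only the first state variable
R_inv = np.array([[1.0]])                 # inverse observation-error covariance

y_obs = {2: np.array([0.8]), 4: np.array([0.5])}   # observations at steps 2 and 4
N = 5                                     # number of model steps in the window

def cost_and_gradient(x0):
    # Forward sweep: integrate the model and store the trajectory.
    traj = [x0]
    for _ in range(N):
        traj.append(M @ traj[-1])
    # Observation misfits and the adjoint forcing they generate.
    J = 0.0
    forcing = {k: np.zeros(2) for k in range(N + 1)}
    for k, y in y_obs.items():
        d = H @ traj[k] - y
        J += 0.5 * float(d @ R_inv @ d)
        forcing[k] = H.T @ (R_inv @ d)
    # Backward (adjoint) sweep: accumulate the gradient w.r.t. x0.
    adj = np.zeros(2)
    for k in range(N, 0, -1):
        adj = M.T @ (adj + forcing[k])
    return J, adj + forcing[0]

# A few steepest-descent iterations (real systems use quasi-Newton methods).
x0 = np.zeros(2)
for _ in range(50):
    J, grad = cost_and_gradient(x0)
    x0 -= 0.3 * grad
print(J, x0)
```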
References
edit- ^ a b Carton, J.A., and A. Santorelli, 2008: Global upper ocean heat content as viewed in nine analyses, J. Clim., 21, 6015–6035.
[2] Hunt, B. R., E. J. Kostelich, and I. Szunyogh: Efficient Data Assimilation for Spatiotemporal Chaos: A Local Ensemble Transform Kalman Filter. arXiv:physics/0511236, v1 28 November 2005.