Oceanographic data interpolation: Objective analysis and splines
- 15 August 1990
- journal article
- Published by American Geophysical Union (AGU) in Journal of Geophysical Research: Oceans
- Vol. 95 (C8), 13529–13541
- https://doi.org/10.1029/jc095ic08p13529
Abstract
Two data interpolation methods are reviewed and compared. They are objective analysis (also known as objective mapping, statistical interpolation or Gauss‐Markov interpolation), and spline interpolation. The former is a statistical method based on minimizing the interpolation error variance; the latter is a deterministic method which seeks to obtain the smoothest interpolated field consistent with the data. The two methods are formally equivalent. This means that a single interpolated field may be viewed as deriving from the extremization of either a statistical or a deterministic quantity. This helps to explain why objective analysis remains a useful tool even when the statistics are poorly known. Two types of spline interpolation are considered: “norm splines”, for which smoothness is defined in terms of the field amplitude and curvature, and “semi‐norm splines”, which consider only the field curvature. The objective analysis algorithm may be used to perform norm spline interpolation simply by a suitable choice of covariance function. An example of such a choice is given. The statistical, norm spline, and semi‐norm spline methods are tested on 100 realizations of a random field with known covariance structure. It is shown that the interpolation error is a weak function of the length scale needed by objective analysis and norm splines, but that the statistical error estimate depends strongly on the length scale, and can be quite misleading. The data spacing is shown to have more influence on the interpolation error than does the length scale. Semi‐norm splines are shown to have poor performance near the boundaries of the data set, and do not provide an error estimate. However, they have the advantage of not requiring the specification of a length scale, and provide a more accurate interpolation if a good estimate of the length scale is not available. 
The objective analysis error estimate does not reflect any error in the estimate of the true mean of the field, a statistic which is required by the method. Fortunately, all the methods appear to be insensitive to errors in estimating the mean, with the exception of objective analysis with an oscillatory covariance function. This result is all the more surprising because the data sets were generated using exactly this covariance function.
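The objective analysis (Gauss-Markov) procedure summarized in the abstract can be sketched in a few lines: the estimate at each grid point is a covariance-weighted combination of the mean-removed data, and the statistical error variance is the prior variance minus the variance explained by the data. The following is a minimal NumPy sketch, not the paper's implementation; the Gaussian covariance, the length scale `L`, the noise variance, and the function name are illustrative assumptions.

```python
import numpy as np

# Sketch of objective analysis (Gauss-Markov interpolation), assuming a
# Gaussian covariance C(r) = exp(-r^2 / (2 L^2)). The paper reports that
# the interpolation error is only weakly sensitive to the length scale L,
# while the statistical error estimate depends on it strongly.

def objective_analysis(x_data, d, x_grid, L=1.0, noise_var=0.01):
    """Interpolate data d at x_data onto x_grid.

    Returns the interpolated field and the statistical error-variance
    estimate (prior variance minus variance explained by the data)."""
    def cov(a, b):
        r = a[:, None] - b[None, :]
        return np.exp(-r**2 / (2.0 * L**2))

    # Data-data covariance, with observational noise on the diagonal.
    C_dd = cov(x_data, x_data) + noise_var * np.eye(len(x_data))
    C_gd = cov(x_grid, x_data)  # grid-data covariance

    # The method requires an estimate of the true mean; here we use the
    # sample mean of the data, which the paper shows is usually adequate.
    mean = d.mean()
    w = np.linalg.solve(C_dd, d - mean)
    estimate = mean + C_gd @ w

    # Error variance at each grid point: C(0) - c^T (C_dd)^{-1} c.
    err_var = 1.0 - np.einsum('ij,ji->i', C_gd,
                              np.linalg.solve(C_dd, C_gd.T))
    return estimate, err_var

x = np.array([0.0, 1.0, 2.5, 4.0])
d = np.sin(x)
grid = np.linspace(0.0, 4.0, 9)
est, ev = objective_analysis(x, d, grid)
```

As the abstract notes, the same algorithm performs norm spline interpolation when the covariance function is chosen appropriately, so a routine like this serves both the statistical and the deterministic interpretations.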