Abstract
Time series modeling as the sum of a deterministic signal and an autoregressive (AR) process is studied. Maximum likelihood estimation of the signal amplitudes and AR parameters is seen to result in a nonlinear estimation problem. However, it is shown that for a given class of signals, the use of a parameter transformation can reduce the problem to a linear least squares one. When signal parameters other than the amplitudes are also unknown, the maximization can be reduced to one over only those additional signal parameters. The general class of signals for which such parameter transformations are applicable, thereby reducing estimator complexity drastically, is derived. This class includes sinusoids as well as polynomials and polynomial-times-exponential signals. The ideas are based on the theory of invariant subspaces for linear operators. The results form a powerful modeling tool in signal-plus-noise problems and therefore find application in a wide variety of statistical signal processing problems. The authors briefly discuss applications such as spectral analysis, broadband/transient detection using line array data, and fundamental frequency estimation for periodic signals.
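As a minimal sketch of the idea the abstract describes, consider the simplest case: a sinusoid of known frequency observed in AR(1) noise. Because a time-shifted sinusoid stays in the span of {cos(ωn), sin(ωn)} (the invariant-subspace property), the AR-filtered signal remains a linear combination of the same basis, and the jointly nonlinear likelihood maximization collapses to ordinary least squares in transformed parameters. The setup below (AR(1) order, variable names, specific values) is an illustrative assumption, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, omega = 2000, 0.6                     # samples, known frequency (rad/sample)
A_true, B_true, a_true = 2.0, -1.0, 0.8  # sinusoid amplitudes, AR(1) coefficient

# Generate AR(1) noise u[n] = a*u[n-1] + e[n] and data x[n] = s[n] + u[n]
e = rng.normal(0.0, 0.1, N)
u = np.zeros(N)
for n in range(1, N):
    u[n] = a_true * u[n - 1] + e[n]
n_idx = np.arange(N)
x = A_true * np.cos(omega * n_idx) + B_true * np.sin(omega * n_idx) + u

# Key identity: x[n] = a*x[n-1] + (s[n] - a*s[n-1]) + e[n].  By the
# invariant-subspace property, s[n] - a*s[n-1] = c1*cos(omega*n) +
# c2*sin(omega*n), so the problem is linear in (a, c1, c2).
H = np.column_stack([x[:-1],
                     np.cos(omega * n_idx[1:]),
                     np.sin(omega * n_idx[1:])])
a_hat, c1, c2 = np.linalg.lstsq(H, x[1:], rcond=None)[0]

# Invert the bilinear parameter transformation to recover (A, B):
#   c1 = A - a*(A*cos(w) - B*sin(w)),  c2 = B - a*(A*sin(w) + B*cos(w))
M = np.array([[1 - a_hat * np.cos(omega),  a_hat * np.sin(omega)],
              [-a_hat * np.sin(omega),     1 - a_hat * np.cos(omega)]])
A_hat, B_hat = np.linalg.solve(M, [c1, c2])
```

If the frequency ω were unknown, it would play the role of the "additional signal parameter" in the abstract: the linear step above is repeated inside a one-dimensional search over ω.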
