Martin Casdagli, Stephen Eubank, J Farmer, John Gibson

Paper #: 91-03-019

Takens' theorem demonstrates that in the absence of noise a multidimensional state space can be reconstructed from a scalar time series. This theorem gives little guidance, however, about practical considerations for reconstructing a good state space. We extend Takens' treatment, applying statistical methods to incorporate the effects of observational noise and estimation error. We define the ‘distortion matrix,’ which is proportional to the conditional covariance of a state, given a series of noisy measurements, and the ‘noise amplification,’ which is proportional to root-mean-square time series prediction errors with an ideal model. We derive explicit formulae for these quantities, and we prove that in the low-noise limit minimizing the distortion is equivalent to minimizing the noise amplification. We identify several different scaling regimes for distortion and noise amplification, and derive asymptotic scaling laws. When the dimension and Lyapunov exponents are sufficiently large, these scaling laws show that, no matter how the state space is reconstructed, there is an explosion in the noise amplification; from a practical point of view determinism is lost, and the time series is effectively a random process. In the low-noise, large-data limit we show that the technique of local singular value decomposition is an optimal coordinate transformation, in the sense that it achieves the minimum distortion in a state space of the lowest possible dimension. However, in numerical experiments we find that estimation error complicates this issue. For local approximation methods, we analyze the effect of reconstruction on estimation error, derive a scaling law, and suggest an algorithm for reducing estimation errors.
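The two reconstruction techniques named in the abstract, delay-coordinate embedding and local singular value decomposition, can be sketched as follows. This is a minimal illustration only, not the paper's algorithm: the function names, the test signal, and the parameter values (embedding dimension, delay, neighborhood size) are all assumptions chosen for the example.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Delay-coordinate reconstruction: map a scalar time series x
    into m-dimensional vectors (x[t], x[t-tau], ..., x[t-(m-1)*tau])."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[(m - 1 - j) * tau : (m - 1 - j) * tau + n]
                            for j in range(m)])

def local_svd_coords(Y, i, k, d):
    """Local SVD: find the k nearest neighbors of the delay vector Y[i],
    then project the centered neighborhood onto its top d principal
    directions, giving local coordinates of reduced dimension d."""
    dists = np.linalg.norm(Y - Y[i], axis=1)
    nbrs = Y[np.argsort(dists)[:k]]
    centered = nbrs - nbrs.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:d].T

# Noisy scalar observations of a sine wave, a stand-in time series
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.01 * rng.standard_normal(len(t))

Y = delay_embed(x, m=7, tau=5)              # over-embedded delay vectors
Z = local_svd_coords(Y, i=100, k=50, d=2)   # local 2-D coordinates
```

The over-embedding (m larger than strictly necessary) followed by a local projection mirrors the idea that local SVD can recover low-distortion coordinates of minimal dimension from a higher-dimensional delay reconstruction.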
