Approximate entropy (ApEn) as a complexity measure
- 1 March 1995
- journal article
- Published by AIP Publishing in Chaos: An Interdisciplinary Journal of Nonlinear Science
- Vol. 5 (1), 110-117
- https://doi.org/10.1063/1.166092
Abstract
Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity, which appears to have potential application to a wide variety of relatively short (greater than 100 points) and noisy time-series data. The development of ApEn was motivated by data length constraints commonly encountered, e.g., in heart rate, EEG, and endocrine hormone secretion data sets. We describe ApEn implementation and interpretation, indicating its utility to distinguish correlated stochastic processes and composite deterministic/stochastic models. We discuss the key technical idea that motivates ApEn: that one need not fully reconstruct an attractor to discriminate in a statistically valid manner; marginal probability distributions often suffice for this purpose. Finally, we discuss why algorithms to compute, e.g., correlation dimension and the Kolmogorov-Sinai (KS) entropy, often work well for true dynamical systems, yet sometimes operationally confound for general models, with the aid of visual representations of reconstructed dynamics for two contrasting processes. (c) 1995 American Institute of Physics.
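The ApEn statistic described in the abstract can be sketched directly from Pincus's standard definition: embed the series in vectors of length m, count vector pairs within a Chebyshev tolerance r, and take the difference of the log-count averages at dimensions m and m+1. A minimal NumPy sketch follows; the function name, the defaults m = 2 and r = 0.2 times the sample standard deviation, and the demonstration series are illustrative choices, not taken from the paper.

```python
import numpy as np

def approx_entropy(u, m=2, r=None):
    """Approximate entropy ApEn(m, r, N) of a 1-D time series u.

    Self-matches are included in the counts, as in the standard
    definition, so every count is at least 1 and the logs are finite.
    """
    u = np.asarray(u, dtype=float)
    n = len(u)
    if r is None:
        r = 0.2 * u.std()  # common heuristic: r = 0.2 * SD of the series

    def phi(m):
        # Embedding vectors x(i) = [u(i), ..., u(i+m-1)], i = 1 .. N-m+1
        x = np.array([u[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-norm) distance between every pair of vectors
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # C_i^m(r): fraction of vectors within tolerance r of x(i)
        c = np.mean(d <= r, axis=1)
        # Phi^m(r): average of log C_i^m(r)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A highly regular series yields a small ApEn; an uncorrelated
# random series of the same length yields a much larger value.
rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 40 * np.pi, 500))
noisy = rng.standard_normal(500)
print(approx_entropy(regular), approx_entropy(noisy))
```

The pairwise distance matrix makes this O(N^2) in memory and time, which is acceptable for the short (hundreds of points) records the abstract targets.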