An Empirical Comparison of Information-Theoretic Selection Criteria for Multivariate Behavior Genetic Models
- 1 November 2004
- journal article
- Published by Springer Nature in Behavior Genetics
- Vol. 34 (6), 593–610
- https://doi.org/10.1007/s10519-004-5587-0
Abstract
Information theory provides an attractive basis for statistical inference and model selection. However, little is known about the relative performance of different information-theoretic criteria in covariance structure modeling, especially in behavioral genetic contexts. To explore these issues, information-theoretic fit criteria were compared with regard to their ability to discriminate between multivariate behavioral genetic models under various model, distribution, and sample size conditions. Results indicate that performance depends on sample size, model complexity, and distributional specification. The Bayesian Information Criterion (BIC) is more robust to distributional misspecification than Akaike's Information Criterion (AIC) under certain conditions, and outperforms AIC in larger samples and when comparing more complex models. An approximation to the Minimum Description Length (MDL; Rissanen, J. (1996). IEEE Transactions on Information Theory 42:40–47; Rissanen, J. (2001). IEEE Transactions on Information Theory 47:1712–1717) criterion, involving the empirical Fisher information matrix, exhibits variable patterns of performance due to the complexity of estimating Fisher information matrices. Results indicate that a relatively new information-theoretic criterion, Draper's Information Criterion (DIC; Draper, 1995), which shares features of the Bayesian and MDL criteria, performs similarly to or better than BIC. Results emphasize the importance of further research into theory and computation of information-theoretic criteria.
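For orientation, the criteria compared in the article are usually written in the following standard textbook forms, where $\hat{L}$ denotes the maximized likelihood, $k$ the number of free parameters, and $n$ the sample size; these generic definitions are given for context only, and the exact penalty terms implemented in the article may differ in detail:

$$\mathrm{AIC} = -2\ln\hat{L} + 2k, \qquad \mathrm{BIC} = -2\ln\hat{L} + k\ln n.$$

The MDL approximation referred to above corresponds, in Rissanen's asymptotic expansion of stochastic complexity, to a criterion of roughly the form

$$-\ln\hat{L} + \frac{k}{2}\ln\frac{n}{2\pi} + \ln\int\!\sqrt{\det I(\theta)}\,d\theta,$$

where $I(\theta)$ is the Fisher information matrix; the need to estimate this matrix empirically underlies the variable performance noted in the abstract.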
This publication has 31 references indexed in Scilit:
- Distributions Generated by Perturbation of Symmetry with Emphasis on a Multivariate Skew t-Distribution. Journal of the Royal Statistical Society Series B: Statistical Methodology, 2003
- Detection of signals by information theoretic criteria: general asymptotic performance analysis. IEEE Transactions on Signal Processing, 2002
- Elements of Information Theory. Published by Wiley, 2001
- The minimum description length principle in coding and modeling. IEEE Transactions on Information Theory, 1998
- The multivariate skew-normal distribution. Biometrika, 1996
- An entropy criterion for assessing the number of clusters in a mixture model. Journal of Classification, 1996
- Variable Selection in Nonparametric Regression with Categorical Covariates. Journal of the American Statistical Association, 1992
- Minimum complexity density estimation. IEEE Transactions on Information Theory, 1991
- Genetics of Asthma and Hay Fever in Australian Twins. American Review of Respiratory Disease, 1990
- The genetic analysis of repeated measures. I. Simplex models. Behavior Genetics, 1987