Mixture Models Based on Neural Network Averaging
- 1 January 2006
- journal article
- Published by MIT Press in Neural Computation
- Vol. 18 (1), 1-9
- https://doi.org/10.1162/089976606774841576
Abstract
A modified version of the single hidden-layer perceptron architecture is proposed for modeling mixtures. A particularly flexible mixture model is obtained by implementing the Box-Cox transformation as the transfer function. In this case, the network response can be expressed in closed form as a weighted power mean. The quadratic Scheffé K-polynomial and the exponential Wilson equation turn out to be special forms of this general mixture model. Advantages of the proposed network architecture are that binary data sets suffice for “training” and that it is readily extended to incorporate additional mixture components while retaining all previously determined weights.
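To make the closed form concrete, here is a minimal numerical sketch (not the authors' code; the names `box_cox`, `inv_box_cox`, and `network_response` are illustrative) of a network whose hidden layer applies the Box-Cox transformation to the mixture fractions and whose output node inverts it. When the weights sum to one, the response collapses to the weighted power mean (Σ_i w_i x_i^λ)^(1/λ):

```python
import numpy as np

def box_cox(x, lam):
    # Box-Cox transformation used as the hidden-layer transfer function;
    # lam = 0 gives the log transform as the limiting case.
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def inv_box_cox(y, lam):
    # Inverse Box-Cox transformation applied at the output node.
    return np.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

def network_response(x, w, lam):
    # Single hidden-layer response: invert the Box-Cox of the weighted sum
    # of Box-Cox-transformed inputs. If the weights w sum to 1, the algebra
    # collapses to the weighted power mean (sum_i w_i * x_i**lam)**(1/lam).
    return inv_box_cox(np.dot(w, box_cox(x, lam)), lam)

# Illustrative check on a binary mixture: the network response equals the
# weighted power mean computed directly.
x = np.array([0.3, 0.7])   # mixture fractions
w = np.array([0.4, 0.6])   # network weights (sum to 1)
lam = 2.0
assert np.isclose(network_response(x, w, lam), (w @ x**lam) ** (1.0 / lam))
```

With lam = 1 the response is the weighted arithmetic mean; as lam → 0 it tends to the weighted geometric mean exp(Σ_i w_i ln x_i), consistent with the exponential (Wilson-type) special case mentioned in the abstract.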
References
- Mixture Experiments: ILL-Conditioning and Quadratic Model Specification. Technometrics, 2002
- Generalized power means and interpolating inequalities. Proceedings of the American Mathematical Society, 1999
- Mixture models based on homogeneous polynomials. Journal of Statistical Planning and Inference, 1998
- Exact limits of mixture properties and excess thermodynamic functions. Fluid Phase Equilibria, 1998
- On composition-dependent interaction coefficients. Fluid Phase Equilibria, 1990
- Additivity and interaction in three-component experiments with mixtures. Biometrika, 1985
- An Analysis of Transformations. Journal of the Royal Statistical Society Series B: Statistical Methodology, 1964
- Vapor-Liquid Equilibrium. XI. A New Expression for the Excess Free Energy of Mixing. Journal of the American Chemical Society, 1964
- Experiments with Mixtures. Journal of the Royal Statistical Society Series B: Statistical Methodology, 1958