A Calibration Tutorial for Spectral Data. Part 2. Partial Least Squares Regression Using Matlab and Some Neural Network Results
- 1 January 1996
- journal article
- Published by SAGE Publications in Journal of Near Infrared Spectroscopy
- Vol. 4 (1), 243-255
- https://doi.org/10.1255/jnirs.94
Abstract
Part 1 explained multiplicative scatter correction (MSC), the building of a principal component regression (PCR) model and how the test data can be used in prediction. Emphasis was on data pretreatment for linearisation and on spectral/chemical interpretation of the results. Part 2 discusses partial least squares (PLS or PLSR) regression. The data set prepared in Part 1 is also used here; details of data pretreatment are therefore not repeated. Some details of PLS modelling are explained using the calculations of the example, and the interpretation of the PLS model is also discussed. Neural network calculation results are included for comparison. Artificial neural networks (ANNs) are non-linear, so linearisation is not considered necessary for them. Latent variable regression methods such as PLS and PCR, and ANNs, are all successive approximations to the unknown function y = f(x) that forms the basis of all calibration methods. In latent variable regression, the rank of the model determines the degree of approximation; in ANNs, the number of hidden nodes and the number of iterations determine the degree of approximation.
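As an illustration of the latent variable idea described above, the following is a minimal sketch of a PLS1 model built with the NIPALS algorithm in Matlab. It is not the code from the tutorial; the function name, the variable names and the choice of NIPALS (rather than, for example, a kernel algorithm) are assumptions made purely for this sketch. The number of components A plays the role of the model rank mentioned in the abstract.

function [B, T, P, W, q] = pls1_nipals(X, y, A)
% Minimal PLS1 (NIPALS) sketch: X (n x m) spectra and y (n x 1) reference
% values are assumed to be mean-centred; A is the model rank (number of
% latent variables). Names and structure are illustrative only.
    [n, m] = size(X);
    W = zeros(m, A);  P = zeros(m, A);  T = zeros(n, A);  q = zeros(A, 1);
    for a = 1:A
        w = X' * y;                   % weight: direction of maximum covariance with y
        w = w / norm(w);
        t = X * w;                    % score vector (latent variable a)
        p = X' * t / (t' * t);        % X-loading
        q(a) = (y' * t) / (t' * t);   % y-loading
        X = X - t * p';               % deflate X
        y = y - t * q(a);             % deflate y
        W(:, a) = w;  P(:, a) = p;  T(:, a) = t;
    end
    B = W / (P' * W) * q;             % regression vector for the centred data
end

Prediction for mean-centred test spectra then follows as yhat = Xtest_c * B + mean(y_cal); increasing A from 1 upwards gives the successive approximation of y = f(x) that the abstract refers to.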