Using Relevance to Reduce Network Size Automatically
- 1 January 1989
- research article
- Published by Taylor & Francis in Connection Science
- Vol. 1 (1) , 3-16
- https://doi.org/10.1080/09540098908915626
Abstract
This paper proposes a means of using the knowledge in a network to determine the functionality, or relevance, of individual units, both for understanding the network's behavior and for improving its performance. The basic idea is to iteratively train the network to a certain performance criterion, compute a measure of relevance that identifies which input or hidden units are most critical to performance, and automatically remove the least relevant units. This skeletonization technique can be used to simplify networks by eliminating units that convey redundant information; to improve learning performance by first learning with spare hidden units and then removing the unnecessary ones, thereby constraining generalization; and to understand the behavior of networks in terms of minimal 'rules'.
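The train-measure-prune loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it trains an oversized single-hidden-layer network on XOR with plain backprop, then estimates each hidden unit's relevance directly as the increase in error when that unit is ablated (the conceptual definition of relevance; the paper itself uses a cheaper derivative-based approximation). The network architecture, task, and hyperparameters here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny XOR task with a deliberately oversized ("spare") hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_hidden = 8  # spare hidden units, to be skeletonized later
W1 = rng.normal(0, 1, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)

def forward(mask):
    h = sigmoid(X @ W1 + b1) * mask   # mask gates (ablates) hidden units
    return h, sigmoid(h @ W2 + b2)

def error(mask):
    _, out = forward(mask)
    return float(np.mean((out - y) ** 2))

# 1) Train to a performance criterion with ordinary backprop.
mask = np.ones(n_hidden)
lr = 2.0
for _ in range(20000):
    h, out = forward(mask)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    if error(mask) < 1e-3:
        break

# 2) Relevance of unit i: error with unit i removed minus error with it present.
base = error(mask)
relevance = []
for i in range(n_hidden):
    ablated = mask.copy(); ablated[i] = 0.0
    relevance.append(error(ablated) - base)

# 3) Skeletonize: remove the least relevant unit. In practice this would be
#    repeated (retraining between removals) until performance degrades.
least = int(np.argmin(relevance))
mask[least] = 0.0
```

Repeating steps 1-3 until the error criterion can no longer be met yields the skeleton network the abstract describes, with the surviving units indicating a minimal 'rule'-like solution.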