Building precise classifiers with automatic rule extraction

Abstract
An algorithm is presented for training a special kind of local basis function classifier. The so-called "rectangular basis function network" (RecBFN) consists of hidden units, each covering a rectangular area of the input space with a trapezoidal activation function. The underlying training algorithm allows easy and fast construction of these networks; no parameters need to be adjusted, and only normalization of the input data is required. The classification performance of the RecBFN is shown to be comparable to that of state-of-the-art classifiers on eight datasets from the StatLog archive. In addition, the resulting network allows easy extraction of the learned rules in the form of if-then statements. These rules also have soft boundaries, yielding a membership value, i.e. a degree of possibility, for each class. The extraction of meaningful rules is demonstrated on several datasets. The resulting rules can be ranked by importance, so that only the few most relevant rules need be kept when the rule base is large. It is shown that the performance of the network degrades smoothly with the number of rules excluded from the final rule set.
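The abstract itself gives no code. As a minimal sketch of the idea it describes, a rectangular unit with a trapezoidal activation per dimension producing a soft membership value, one might write the following (the function names and the 2-D rule are illustrative, not taken from the paper):

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal activation: rises on [a, b], equals 1 on the core
    [b, c], and falls back to 0 on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def rule_membership(sample, rule):
    """Membership of a sample in one rectangular rule: the minimum of
    the per-dimension trapezoidal activations (a common fuzzy AND)."""
    return min(trapezoid(x, *edges) for x, edges in zip(sample, rule))

# Hypothetical 2-D rule: core [2,4] x [1,3], soft support [1,5] x [0,4]
rule = [(1.0, 2.0, 4.0, 5.0), (0.0, 1.0, 3.0, 4.0)]
print(rule_membership([3.0, 2.0], rule))  # inside the core -> 1.0
print(rule_membership([1.5, 2.0], rule))  # on the soft boundary -> 0.5
print(rule_membership([6.0, 2.0], rule))  # outside the support -> 0.0
```

A sample inside the rectangular core gets membership 1, a sample in the soft boundary region gets a graded value, and a sample outside the support gets 0, which is the sense in which the extracted if-then rules provide a "possibility of membership" rather than a hard decision.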
