Abstract
It is shown that for n-valued conditionally independent features a large family of classifiers can be expressed as an (n-1)st-degree polynomial discriminant function. The usefulness of the polynomial expansion is discussed and demonstrated for the ternary-feature case by considering the first-order Minkowski metric, the Euclidean distance, and the Bayes classifier. Finally, some interesting side observations on the classifiers are made with respect to optimality and computational requirements.
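The core idea can be sketched numerically. The following is a minimal illustration, not the paper's construction: for ternary features (n = 3) that are conditionally independent given the class, the Bayes log-likelihood-ratio discriminant is a sum of per-feature terms, and each term takes only three values, so a 2nd-degree polynomial interpolating those three points reproduces it exactly. All variable names and the random probability tables below are hypothetical.

```python
import numpy as np

# Sketch (assumed setup): ternary features x_i in {0, 1, 2}, conditionally
# independent given the class. The naive-Bayes discriminant
#   g(x) = sum_i [ log P(x_i | c1) - log P(x_i | c2) ]
# depends on each x_i only through three values, so a quadratic in x_i
# (degree n-1 = 2) interpolates it exactly.

rng = np.random.default_rng(0)
d = 4                                    # number of features (arbitrary)
p1 = rng.dirichlet(np.ones(3), size=d)   # P(x_i = v | class 1), shape (d, 3)
p2 = rng.dirichlet(np.ones(3), size=d)   # P(x_i = v | class 2), shape (d, 3)

# Log-likelihood-ratio values at the three feature levels v = 0, 1, 2.
L = np.log(p1) - np.log(p2)              # shape (d, 3)

# Quadratic coefficients a + b*v + c*v^2 that interpolate L[i, v] at
# v = 0, 1, 2: solve one 3x3 Vandermonde system for all features.
V = np.vander([0.0, 1.0, 2.0], 3, increasing=True)  # rows: [1, v, v^2]
coef = np.linalg.solve(V, L.T).T         # shape (d, 3): columns a, b, c

def g_table(x):
    """Discriminant by direct lookup in the log-ratio table."""
    return sum(L[i, x[i]] for i in range(d))

def g_poly(x):
    """The same discriminant as a 2nd-degree polynomial in the features."""
    v = np.asarray(x, dtype=float)
    return float(np.sum(coef[:, 0] + coef[:, 1] * v + coef[:, 2] * v**2))

x = [2, 0, 1, 2]
assert np.isclose(g_table(x), g_poly(x))  # the two forms agree exactly
```

The same interpolation argument extends to n-valued features, where n points determine a polynomial of degree n-1, which is the degree bound stated in the abstract.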