Neural network transformation of arbitrary Boolean functions

Abstract
Boolean logic is considered a good source of classification problems, an area dominated by neural networks. Although quite a few algorithms exist for training and implementing neural networks, no technique exists that can guarantee the transformation of an arbitrary Boolean function into a neural network. This paper describes a method that accomplishes exactly that. The algorithm is tested on the classic character recognition problem using translated, rotated, deformed, and noisy patterns, and initial simulation results are presented. The proposed network is compared with several popular existing networks, and its advantages are outlined. Directions for future research are also discussed.
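
As background for the abstract's central claim, the sketch below (in Python, chosen only for illustration) shows the classical sum-of-products construction: any Boolean function can be realized exactly by a two-layer threshold network with one hidden unit per true minterm. This is a generic textbook construction, not the algorithm proposed in the paper; all function and variable names here are illustrative.

# Classical sum-of-products construction: realize any Boolean function on n
# inputs as a two-layer threshold network, one hidden unit per true minterm.
# Illustrative background only; not the paper's proposed algorithm.
from itertools import product

def step(x):
    """Heaviside threshold activation."""
    return 1 if x >= 0 else 0

def build_minterm_network(truth_table):
    """Return hidden-layer (weights, bias) pairs for a threshold net.

    truth_table maps each n-bit input tuple to 0 or 1. Each hidden unit
    fires only on its own true minterm: weight +1 where the minterm has a 1,
    -1 where it has a 0, and bias = -(number of ones), so the unit's
    pre-activation is 0 on its minterm and <= -1 everywhere else.
    """
    hidden = []
    for bits, out in truth_table.items():
        if out == 1:
            w = [1 if b == 1 else -1 for b in bits]
            b = -sum(bits)
            hidden.append((w, b))
    return hidden

def evaluate(hidden, x):
    """Output unit computes an OR: fire iff any hidden unit fires."""
    h = [step(sum(wi * xi for wi, xi in zip(w, x)) + b) for w, b in hidden]
    return step(sum(h) - 1)

# Example: 3-bit parity (XOR), which no single perceptron can realize.
table = {bits: sum(bits) % 2 for bits in product((0, 1), repeat=3)}
net = build_minterm_network(table)
assert all(evaluate(net, bits) == out for bits, out in table.items())
print("two-layer threshold net reproduces 3-bit parity exactly")

The construction guarantees correctness but can require up to 2^n hidden units; compact transformations such as the one the paper claims are what make the problem non-trivial.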