Performance measurement and data base design
Alfonso P. Cardenas, Larry F. Bowman, et al.
ACM Annual Conference 1975
Neural networks offer a valuable alternative to Bayesian classifiers in evaluating a posteriori class probabilities for classifying stochastic patterns. In contrast to the Bayesian classifier, the “neural” classifier makes no assumptions about the probabilistic nature of the problem, and is thus universal in the sense that it is not restricted to an underlying probabilistic model. Instead, it adjusts itself to a given training set by a learning algorithm, and thus can learn the stochastic properties of the specific problem. The a posteriori probabilities can, in principle, be computed by stochastic networks such as the Boltzmann machine. However, these networks are computationally extremely inefficient. In this paper we show that the a posteriori class probabilities can be efficiently computed by a deterministic feedforward network which we call the Boltzmann Perceptron Classifier (BPC). Maximum a posteriori (MAP) classifiers are also constructed as a special case of the BPC. Structural relationships between the BPC and a conventional multilayer Perceptron (MLP) are given, and it is demonstrated that rather intricate boundaries between classes can be formed even with a relatively modest number of network units. Simulation results show that the BPC is comparable in performance to a Bayesian classifier, although no assumptions about the probabilistic model of the problem are made for the BPC. © 1990, IEEE
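The abstract's central idea, a deterministic feedforward network whose outputs estimate a posteriori class probabilities, with the MAP decision recovered as the argmax, can be sketched minimally as below. This is a hypothetical illustration, not the BPC architecture from the paper: it uses a single linear layer with hand-picked weights (`W`, `b`) followed by a softmax, purely to show how posterior estimates and the MAP special case relate.

```python
import math

def softmax(scores):
    # Numerically stable softmax: outputs are nonnegative and sum to 1,
    # so they can serve as estimates of P(class | x).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def posteriors(x, weights, biases):
    # One linear score per class, then softmax -> posterior estimates.
    scores = [sum(w * xi for w, xi in zip(wk, x)) + b
              for wk, b in zip(weights, biases)]
    return softmax(scores)

def map_class(x, weights, biases):
    # MAP classification: pick the class with the largest posterior.
    p = posteriors(x, weights, biases)
    return max(range(len(p)), key=lambda k: p[k])

# Toy two-class example with illustrative (not learned) weights.
W = [[2.0, -1.0], [-1.0, 2.0]]
b = [0.0, 0.0]
print(posteriors([1.0, 0.0], W, b))  # probabilities summing to 1
print(map_class([1.0, 0.0], W, b))   # -> 0
```

In practice the weights would be fitted to a training set by a learning algorithm, which is what lets such a classifier adapt to the stochastic properties of the problem without an explicit probabilistic model.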
Erich P. Stuntebeck, John S. Davis II, et al.
HotMobile 2008
Michael C. McCord, Violetta Cavalli-Sforza
ACL 2007
Rolf Clauberg
IBM J. Res. Dev