Multiple Neural Networks and Bayesian Belief Revision for a Never-Ending Unsupervised Learning

Title: Multiple Neural Networks and Bayesian Belief Revision for a Never-Ending Unsupervised Learning
Publication Type: Book Chapter
Year of Publication: 2012
Authors: Baldassarri, P., A. F. Dragoni, and G. Vallesi
Book Title: Methods, Models, Simulations And Approaches Towards A General Theory Of Change
Chapter: 21
Pagination: 297-307
Keywords: Bayes rule, hybrid systems, multiple neural networks, supervised learning
Abstract

We propose a Hybrid System for dynamic environments, in which a Multiple Neural Networks system works together with the Bayes Rule. One or more of the neural nets may no longer be able to operate properly, due to partial changes in some of the characteristics of the individuals. We assume that each network has a reliability factor that can be dynamically re-evaluated on the basis of the global recognition performed by the overall group. Since a net's degree of reliability is defined as the probability that the net is giving the desired output, in case of conflicts between the outputs of the nets the re-evaluation of their degrees of reliability can be performed simply by means of the Bayes Rule. The new reliability vector is then used to make the final choice, by applying two algorithms, the Inclusion-based and the Weighted one, over all the maximally consistent subsets of the global outcome.
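The chapter itself details the Bayesian re-evaluation and the Inclusion-based and Weighted selection over maximally consistent subsets; the following is only a minimal illustrative sketch in Python, not the authors' implementation. It assumes each net emits a single class label, treats reliability as the prior probability that the net's output is correct, and uses reliability-weighted agreement among the other nets as a stand-in likelihood (so the maximally consistent subsets degenerate here to simple label agreement).

from collections import Counter

def bayes_update_reliability(outputs, reliabilities):
    # Re-evaluate each network's reliability via Bayes' rule when outputs conflict.
    # outputs:       one class label per network (hypothetical format)
    # reliabilities: prior probability that each network's output is correct
    posteriors = []
    for i, (label, prior) in enumerate(zip(outputs, reliabilities)):
        # Reliability-weighted agreement of the other nets, used as a stand-in
        # likelihood P(evidence | net i is correct); the chapter's model may differ.
        others = [(o, r) for j, (o, r) in enumerate(zip(outputs, reliabilities)) if j != i]
        agree = sum(r for o, r in others if o == label)
        disagree = sum(r for o, r in others if o != label)
        likelihood_correct = (1.0 + agree) / (1.0 + agree + disagree)
        likelihood_wrong = 1.0 - likelihood_correct
        # Bayes' rule: P(correct | evidence) is proportional to P(evidence | correct) * P(correct)
        num = likelihood_correct * prior
        den = num + likelihood_wrong * (1.0 - prior)
        posteriors.append(num / den if den > 0 else prior)
    return posteriors

def weighted_choice(outputs, reliabilities):
    # "Weighted"-style final decision: pick the label with the largest summed reliability.
    scores = Counter()
    for label, r in zip(outputs, reliabilities):
        scores[label] += r
    return scores.most_common(1)[0][0]

if __name__ == "__main__":
    outputs = ["A", "A", "B"]   # conflicting recognitions from three nets
    priors = [0.9, 0.7, 0.8]    # current reliability of each net
    posteriors = bayes_update_reliability(outputs, priors)
    print(posteriors)                            # the dissenting net's reliability drops
    print(weighted_choice(outputs, posteriors))  # -> "A"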

URL: http://www.worldscientific.com/doi/abs/10.1142/9789814383332_0021
DOI: 10.1142/9789814383332_0021