A comparison of fuzzy ARTMAP and Gaussian ARTMAP neural networks for incremental learning

Granger, Eric and Connolly, Jean François and Sabourin, Robert

Proceedings of the International Joint Conference on Neural Networks 2008

Abstract: Automatic pattern classifiers that allow for incremental learning can adapt internal class models efficiently in response to new information, without having to retrain from the start using all the cumulative training data. In this paper, the performance of two such classifiers – the fuzzy ARTMAP and Gaussian ARTMAP neural networks – is characterized and compared for supervised incremental learning in environments where class distributions are fixed. Their potential for incremental learning of new blocks of training data, after having previously been trained, is assessed in terms of generalization error and resource requirements on several synthetic pattern recognition problems. The advantages and drawbacks of these architectures are discussed for incremental learning with different data block sizes and data set structures. Overall results indicate that Gaussian ARTMAP is the more suitable of the two for incremental learning, as it usually provides an error rate comparable to that of batch learning across the data sets and for a wide range of training block sizes. This better performance is a result of the representation of categories as Gaussian distributions, and of the use of a category-specific learning rate that decreases during the training process. For fuzzy ARTMAP, the error rate obtained through incremental learning is usually significantly higher than that obtained through batch learning on all data sets. Training fuzzy ARTMAP and Gaussian ARTMAP through incremental learning often requires fewer training epochs to converge, and leads to more compact networks. © 2008 IEEE.
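To illustrate the category-specific, decreasing learning rate that the abstract credits for Gaussian ARTMAP's advantage, the sketch below shows a simplified per-category update in the spirit of Gaussian ARTMAP (categories as per-dimension Gaussians updated with rate 1/n). It is a minimal sketch, not the paper's implementation; the class name GaussianCategory, the init_sigma parameter, and the update method are illustrative assumptions.

```python
import numpy as np

class GaussianCategory:
    """Minimal sketch of one Gaussian ARTMAP-style category (per-dimension Gaussian)."""

    def __init__(self, x, init_sigma=0.5):
        # A new category is centred on the first sample it is assigned.
        self.mu = np.asarray(x, dtype=float)             # per-dimension mean
        self.sigma = np.full_like(self.mu, init_sigma)   # per-dimension std. dev.
        self.n = 1                                       # samples assigned so far

    def update(self, x):
        """Incremental update: the effective learning rate 1/n shrinks as n grows."""
        x = np.asarray(x, dtype=float)
        self.n += 1
        lr = 1.0 / self.n
        self.mu = (1.0 - lr) * self.mu + lr * x
        # Variance pulled toward the new sample's squared deviation from the mean.
        self.sigma = np.sqrt((1.0 - lr) * self.sigma**2 + lr * (x - self.mu)**2)

# Usage: the category adapts to new data blocks without retraining from scratch.
cat = GaussianCategory([0.2, 0.7])
for sample in ([0.25, 0.65], [0.18, 0.72], [0.30, 0.68]):
    cat.update(sample)
print(cat.mu, cat.sigma, cat.n)
```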