Sequences of Bayes Gaussian Classifiers

Date

2008-04-22T02:41:15Z

Authors

Publisher

Electrical Engineering

Abstract

A new method for designing sequences of Bayes Gaussian classifiers is presented in this thesis. First, a basic Bayes Gaussian classifier is designed under the assumption that the data are Gaussian. The Output Weight Optimization-Back Propagation (OWO-BP) technique is then used to iteratively modify the classifier's coefficients, reducing the classification error. Next, input features are ordered from most useful to least useful by means of an iterative Gram-Schmidt procedure used to train linear functional link networks. Another important development in this thesis is the generation of nested feature subsets, which ensures that the curve of error percentage versus number of features is monotonically non-increasing. Based on the ordered feature list, nested feature subsets are produced and a Bayes Gaussian classifier is designed for each subset. These classifiers exhibit a reduced probability of error as the subset size (the number of selected inputs) increases. Various real-world data sets have been used to test and verify the classifiers' performance.
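
To make the pipeline concrete, the following is a minimal illustrative sketch (not the thesis code) of a Bayes Gaussian classifier evaluated over nested feature subsets, assuming Python with NumPy. The class and function names are hypothetical, the OWO-BP coefficient refinement is omitted, and the feature ordering is taken as a given input rather than computed by the iterative Gram-Schmidt procedure described above.

    import numpy as np

    class BayesGaussianClassifier:
        """Class-conditional Gaussian (quadratic discriminant) classifier."""
        def fit(self, X, y):
            self.classes_ = np.unique(y)
            self.params_ = []
            for c in self.classes_:
                Xc = X[y == c]
                mean = Xc.mean(axis=0)
                # Regularize the covariance so it stays invertible for small subsets.
                cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
                prior = len(Xc) / len(X)
                self.params_.append((mean, cov, prior))
            return self

        def predict(self, X):
            scores = []
            for mean, cov, prior in self.params_:
                diff = X - mean
                inv = np.linalg.inv(cov)
                # Log Gaussian discriminant:
                # -0.5 (x-m)^T S^-1 (x-m) - 0.5 log|S| + log P(c)
                g = (-0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff)
                     - 0.5 * np.linalg.slogdet(cov)[1]
                     + np.log(prior))
                scores.append(g)
            return self.classes_[np.argmax(scores, axis=0)]

    def nested_subset_errors(X, y, feature_order):
        """Design one classifier per nested subset (a prefix of the ordered
        features) and report its error rate on the given data."""
        errors = []
        for k in range(1, len(feature_order) + 1):
            cols = feature_order[:k]
            clf = BayesGaussianClassifier().fit(X[:, cols], y)
            errors.append(np.mean(clf.predict(X[:, cols]) != y))
        return errors

In the thesis the subsets are nested, so each classifier reuses every feature of the previous one; the sketch mirrors that by always taking a prefix of the ordered feature list before designing the next classifier.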
