Title: OPTIMAL CLASSIFIER MODEL STATUS SELECTION USING BAYES BOUNDARY UNCERTAINTY
Abstract: We propose a method to select the optimal parameter status for any classifier model. In the statistical pattern recognition framework, optimal classification is defined as achieving the minimum classification error probability (Bayes error). Although the error probability is defined over infinite data, in practice only a finite amount of data is available. Using the same finite data for both classifier training and evaluation yields a serious underestimate of the Bayes error. Traditional solutions hold out some of the available data for evaluation, which unavoidably reduces the data available for either training or evaluation. By contrast, our proposed method uses the same data for training and evaluation in a single training run without splitting, which is made possible by evaluating the ideality of the classifier's classification boundary instead of estimating the error probability. Here, the ideal classification boundary (Bayes boundary) refers to the boundary that leads to the Bayes error. We use the fact that the Bayes boundary consists solely of uncertain samples, namely samples whose class posterior probability is equal for the two classes separated by the boundary. Tests on several real-life datasets and experimental comparison to Cross-Validation clearly show the potential of our method.
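The abstract does not spell out the exact selection criterion, so the following is only an illustrative sketch of the general idea: score a candidate parameter status by how uncertain (posterior close to 0.5) the training samples lying near its decision boundary are, using the same data for training and evaluation. The helper name `boundary_uncertainty_score`, the uncertainty band, and the use of scikit-learn's LogisticRegression as a stand-in classifier are all assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only, not the paper's criterion: select a "parameter status"
# (here, a regularization strength) whose near-boundary samples are most uncertain,
# i.e. whose estimated class posteriors are closest to 0.5, without any data split.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def boundary_uncertainty_score(model, X, band=0.1):
    """Mean absolute deviation from 0.5 of the posteriors of near-boundary samples.

    Samples whose estimated posterior lies within `band` of 0.5 are treated as
    boundary samples; a lower score means the boundary samples are more uncertain,
    i.e. the boundary looks closer to a Bayes boundary.
    """
    p = model.predict_proba(X)[:, 1]
    near_boundary = np.abs(p - 0.5) < band
    if not np.any(near_boundary):
        return np.inf  # degenerate status: no samples near the boundary at all
    return float(np.mean(np.abs(p[near_boundary] - 0.5)))

# Toy two-class data; the paper itself reports tests on several real-life datasets.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Candidate parameter statuses of the stand-in classifier.
candidates = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {}
for C in candidates:
    model = LogisticRegression(C=C, max_iter=1000).fit(X, y)  # train on all data
    scores[C] = boundary_uncertainty_score(model, X)          # evaluate on the same data

best_C = min(scores, key=scores.get)
print("scores:", scores)
print("selected status (C):", best_C)
```

Note that, unlike Cross-Validation, no portion of the data is withheld at any point; the selection signal comes entirely from the uncertainty of samples on the estimated boundary.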
Publication Year: 2018
Publication Date: 2018-09-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 4