Article type: Research Article
Authors: Kubat, Miroslav [a] | Cooperson Jr., Martin [b]
Affiliations: [a] Department of Electrical and Computer Engineering, University of Miami, Coral Gables, FL 33146, USA | [b] Center for Advanced Computer Studies, University of Louisiana at Lafayette, Lafayette, LA 70504-4330, USA
Abstract: An important issue in nearest-neighbor classifiers is how to reduce the size of large sets of examples. Whereas many researchers recommend replacing the original set with a carefully selected subset, we investigate a mechanism that creates three or more such subsets. The idea is to ensure that each of them, when used as a 1-NN subclassifier, tends to err in a different part of the instance space, so that failures of individual subclassifiers can be corrected by voting. The cost of our example-selection procedure is linear in the size of the original training set and, as our experiments demonstrate, dramatic data reduction can be achieved without a major drop in classification accuracy.
Keywords: nearest-neighbor classifiers, example selection, voting
DOI: 10.3233/IDA-2001-5603
Journal: Intelligent Data Analysis, vol. 5, no. 6, pp. 463-476, 2001
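
The abstract describes a voting ensemble of 1-NN subclassifiers, each built from a different reduced subset of the training data. The sketch below is only a minimal illustration of that overall scheme, not the paper's method: the paper's actual example-selection procedure (its main contribution) is not reproduced here, and the random disjoint partition, the function names, and the parameter choices are assumptions made for demonstration.

```python
# Illustrative sketch only: a stand-in for the paper's example-selection step.
# Each subclassifier receives a random disjoint subset of the training data;
# query points are labeled by majority vote over the 1-NN subclassifiers.
import numpy as np

def partition_training_set(X, y, n_subsets, rng):
    """Split (X, y) into n_subsets random, roughly equal-sized parts."""
    idx = rng.permutation(len(X))
    return [(X[part], y[part]) for part in np.array_split(idx, n_subsets)]

def one_nn_predict(X_train, y_train, X_query):
    """Label each query point with the class of its nearest training example."""
    # Pairwise squared Euclidean distances, shape (n_query, n_train).
    d = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    return y_train[d.argmin(axis=1)]

def voting_1nn_predict(subsets, X_query):
    """Majority vote over the 1-NN predictions of all subclassifiers."""
    votes = np.stack([one_nn_predict(Xs, ys, X_query) for Xs, ys in subsets])
    # For each query point, pick the most frequent predicted label.
    return np.array([np.bincount(col).argmax() for col in votes.T])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy two-class data: two Gaussian blobs.
    X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)
    subsets = partition_training_set(X, y, n_subsets=3, rng=rng)
    print(voting_1nn_predict(subsets, np.array([[0.0, 0.0], [3.0, 3.0]])))
```

With three or more subclassifiers that err in different regions of the instance space, a point misclassified by one subclassifier can still be labeled correctly by the majority, which is the intuition the abstract appeals to.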