Abstract
In some ordinal classification problems we know beforehand that the class label should be increasing (or decreasing) in the attributes. Such relations between the class label and the attributes are called monotone. We attempt to exploit such monotonicity constraints to reduce label noise. Label noise may cause violations of the monotonicity constraint in the data set. To reduce this noise, we make the data set monotone by relabeling data points. Through experiments on artificial data, we demonstrate that relabeling almost always produces an improved data set.
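To illustrate what "making the data set monotone by relabeling" means, below is a minimal Python sketch. It counts monotonicity violations (pairs where one point dominates another componentwise but carries a lower class label) and removes them with a simple greedy relabeling that assigns each point the maximum original label among the points it dominates. The functions `make_monotone`, `count_violations`, and `dominates` are hypothetical helpers for illustration only, not the relabeling algorithm evaluated in the paper.

```python
import numpy as np

def dominates(a, b):
    """True if point a is componentwise <= point b."""
    return np.all(a <= b)

def count_violations(X, y):
    """Number of ordered pairs (i, j) with x_i <= x_j but y_i > y_j."""
    n = len(y)
    return sum(
        1
        for i in range(n)
        for j in range(n)
        if dominates(X[i], X[j]) and y[i] > y[j]
    )

def make_monotone(X, y):
    """Return a relabeled copy of y such that (X, y_new) is monotone:
    whenever x_i <= x_j componentwise, y_new[i] <= y_new[j].

    Simple (assumed) greedy scheme: each point receives the maximum
    original label among the points it dominates, itself included."""
    n = len(y)
    y_new = y.copy()
    for i in range(n):
        y_new[i] = max(y[j] for j in range(n) if dominates(X[j], X[i]))
    return y_new

# Tiny example with one noisy label: the second point is dominated by
# the third but has a higher class label, violating monotonicity.
X = np.array([[1, 1], [2, 2], [3, 3]])
y = np.array([1, 3, 2])
print(count_violations(X, y))                # 1
y_mono = make_monotone(X, y)
print(y_mono, count_violations(X, y_mono))   # [1 3 3] 0
```

This scheme always yields a monotone labeling (the set of dominated points only grows as the attribute vector increases), but it is not guaranteed to change the fewest labels; the paper's experiments concern how relabeling affects label noise, not this particular heuristic.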
| Original language | English |
|---|---|
| Title of host publication | 2016 International Joint Conference on Neural Networks |
| Publisher | IEEE |
| Pages | 2148-2155 |
| Number of pages | 8 |
| ISBN (Electronic) | 978-1-5090-0620-5 |
| DOIs | |
| Publication status | Published - 2016 |
Keywords
- Prediction algorithms
- Neural networks
- Labeling
- Upper bound
- Electronic mail