Truncated Loss Smooth Support Vector Ordinal Regression
Abstract
Support vector ordinal regression (SVOR) has proven to be a promising algorithm for solving ordinal regression problems. However, its performance tends to be strongly degraded by outliers in the training data. To remedy this drawback, a truncated loss smooth SVOR (TLS-SVOR) is proposed. While learning the ordinal regression model, the loss s of a misranked sample is bounded between 0 and the truncation coefficient u. First, s is approximated by a piecewise polynomial function with parameter u. Then, following the strategy of the smooth support vector machine for classification, the optimization problem is reformulated as an unconstrained objective that is twice continuously differentiable. The algorithm employs Newton's method to obtain the unique discriminant hyperplane. The optimal parameter combination of TLS-SVOR is determined by a two-stage uniform-design model selection methodology. Experimental results on benchmark datasets show that TLS-SVOR achieves higher accuracy than other ordinal regression approaches.
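The abstract's key idea is a loss that is bounded above by a truncation coefficient u and then smoothed so that Newton's method can be applied. The following is a minimal illustrative sketch, not the paper's exact formulation: it assumes a C^2 piecewise-quartic smoothing of the plus function max(0, z) and constructs the bounded loss via the identity min(max(0, z), u) = max(0, z) - max(0, z - u). The function names smooth_plus and truncated_smooth_loss, and the smoothing parameter eps, are hypothetical and only stand in for the piecewise polynomial approximation described in the paper.

```python
import numpy as np


def smooth_plus(z, eps=0.5):
    """C^2 piecewise-quartic approximation of the plus function max(0, z).

    Returns 0 for z <= -eps, z for z >= eps, and a quartic polynomial in
    between whose value, first, and second derivatives match at z = +/- eps,
    so the result is twice continuously differentiable (illustrative choice).
    """
    z = np.asarray(z, dtype=float)
    mid = -z**4 / (16 * eps**3) + 3 * z**2 / (8 * eps) + z / 2 + 3 * eps / 16
    return np.where(z <= -eps, 0.0, np.where(z >= eps, z, mid))


def truncated_smooth_loss(z, u=2.0, eps=0.5):
    """Smooth surrogate of the truncated hinge loss min(max(0, z), u).

    Writes the truncated loss as a difference of two plus functions and
    replaces each with its smooth approximation, so the loss stays
    (approximately) bounded between 0 and the truncation coefficient u,
    limiting the influence of outliers with large ranking errors.
    """
    return smooth_plus(z, eps) - smooth_plus(z - u, eps)


if __name__ == "__main__":
    # Losses saturate near u = 2 for large violations instead of growing linearly.
    z = np.linspace(-2.0, 6.0, 9)
    print(np.round(truncated_smooth_loss(z, u=2.0, eps=0.5), 4))
```

Because the surrogate is smooth and bounded, its gradient and Hessian exist everywhere, which is what makes an unconstrained Newton-type solver applicable in the way the abstract describes.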