Sunday, Aug 6: 4:45 PM - 5:05 PM

Topic-Contributed Paper Session
Metro Toronto Convention Centre

Receiver Operating Characteristic (ROC) curves are commonly used in binary classification, and can also be used to evaluate learned penalty functions in the context of supervised changepoint detection. Since the Area Under the Curve (AUC) is a piecewise constant function of the predicted values, it cannot be directly optimized by gradient descent. We recently showed that minimizing a piecewise linear surrogate loss, the AUM (Area Under the Min of false positives and false negatives), results in maximizing AUC. In this talk we propose a new algorithm for AUM minimization, which exploits the piecewise linear structure to efficiently compute an exact line search at every step of gradient descent. Because the exact line search is quadratic time in the worst case, we additionally propose an approximate line search which is log-linear time in the worst case (asymptotically the same as a constant step size). Our empirical results show that the proposed algorithm is more computationally efficient than other variants of gradient descent (constant step size, line search using grid search, etc.).
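To make the surrogate concrete, the sketch below computes an AUM-style quantity for binary labels: predicting positive when score > c, the false positive count FP(c) and false negative count FN(c) are piecewise constant in the threshold c, changing only at the predicted values, so the area under min(FP, FN) reduces to a finite sum over the intervals between consecutive sorted predictions. This is an illustrative implementation under those assumptions, not the authors' code; the function name `aum` is ours.

```python
import numpy as np

def aum(pred, labels):
    """Area Under the Min of FP and FN counts (binary-label sketch).

    pred   : array of real-valued prediction scores.
    labels : array of 0/1 labels of the same length.

    For a threshold c below all predictions, every example is predicted
    positive, so FP = (number of negatives) and FN = 0.  As c crosses
    each sorted prediction, that example flips to predicted negative,
    updating FP or FN by one.  Between consecutive sorted predictions
    min(FP, FN) is constant, so the integral is a sum of
    min(FP, FN) * interval width.
    """
    order = np.argsort(pred)
    p = pred[order]
    y = labels[order]
    fp = int(np.sum(y == 0))  # all predicted positive initially
    fn = 0
    total = 0.0
    for i in range(len(p) - 1):
        # Threshold crosses p[i]: example i becomes predicted negative.
        if y[i] == 0:
            fp -= 1
        else:
            fn += 1
        total += min(fp, fn) * (p[i + 1] - p[i])
    return total

# A perfectly ranked sample has AUM = 0; ranking errors make it positive.
print(aum(np.array([1.0, 2.0, 3.0, 4.0]), np.array([0, 0, 1, 1])))  # 0.0
print(aum(np.array([1.0, 2.0, 3.0, 4.0]), np.array([0, 1, 0, 1])))  # 1.0
```

Because the loss is piecewise linear in the predictions, its value along any descent direction is also piecewise linear, which is the structure the proposed exact line search exploits.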