Penalized Trimmed Squares and a Modification of Support Vectors for Unmasking Outliers in Linear Regression
DOI: https://doi.org/10.57805/revstat.v5i1.45

Keywords: robust regression, mixed integer programming, penalty method, least trimmed squares, identifying outliers, support vector machines

Abstract
We consider the problem of identifying multiple outliers in linear regression models. We propose a penalized trimmed squares (PTS) estimator, in which penalty costs for discarding outliers are inserted into the loss function, and we suggest suitable penalties for unmasking multiple high-leverage outliers. The robust procedure is formulated as a Quadratic Mixed Integer Programming (QMIP) problem, which is computationally tractable for small-sample data. The computational load and the effectiveness of the procedure are further improved by borrowing the idea of the ε-insensitive loss function from support vector machine regression: small errors are ignored, and the mathematical formulation gains the sparseness property. The good performance of the PTS estimator allows identification of multiple outliers while avoiding masking effects.
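The PTS objective trades each observation's squared residual against a penalty cost for discarding that observation; the paper solves this trade-off exactly as a QMIP. The sketch below is not the authors' QMIP procedure: it is a minimal greedy heuristic in Python that illustrates the same trade-off under assumed ingredients, namely a hypothetical constant penalty per deletion and an `eps` parameter that mimics the ε-insensitive treatment of small residuals.

```python
import numpy as np

def pts_greedy(X, y, penalties, eps=0.0, max_iter=50):
    """Greedy sketch of the penalized trimmed squares (PTS) idea.

    An observation is moved to the deleted set when its squared
    (epsilon-insensitive) residual exceeds the penalty for discarding it;
    the fit is then recomputed on the kept observations. This is a
    heuristic illustration, not the exact QMIP formulation of the paper.
    """
    n = len(y)
    kept = np.ones(n, dtype=bool)
    beta = None
    for _ in range(max_iter):
        # Ordinary least squares on the currently kept observations.
        beta, *_ = np.linalg.lstsq(X[kept], y[kept], rcond=None)
        resid = y - X @ beta
        # Epsilon-insensitive residuals: small errors are ignored.
        r = np.maximum(np.abs(resid) - eps, 0.0)
        # Keep an observation only if its squared residual costs
        # less than the penalty for discarding it.
        new_kept = r**2 <= penalties
        if np.array_equal(new_kept, kept):
            break
        kept = new_kept
    return beta, kept

# Small synthetic example with two gross outliers (illustrative data only).
rng = np.random.default_rng(0)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + 0.2 * rng.normal(size=n)
y[:2] += 10.0                      # contaminate two observations
penalties = np.full(n, 3.0)        # hypothetical constant penalty per deletion
beta_hat, kept = pts_greedy(X, y, penalties, eps=0.1)
print(beta_hat, np.where(~kept)[0])
```

In the paper the penalties are chosen to unmask high-leverage outliers and the minimization is carried out globally by the QMIP solver; the greedy loop above only conveys how penalizing deletions keeps the estimator from trimming well-fitting points.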
Copyright (c) 2007 REVSTAT-Statistical Journal
This work is licensed under a Creative Commons Attribution 4.0 International License.