Penalized Trimmed Squares and a Modification of Support Vectors for Unmasking Outliers in Linear Regression

Authors

  • G. Zioutas, Aristotle University of Thessaloniki
  • A. Avramidis, Aristotle University of Thessaloniki
  • L. Pitsoulis, Aristotle University of Thessaloniki

DOI:

https://doi.org/10.57805/revstat.v5i1.45

Keywords:

robust regression, mixed integer programming, penalty method, least trimmed squares, identifying outliers, support vector machines

Abstract

We consider the problem of identifying multiple outliers in linear regression models. We propose a penalized trimmed squares (PTS) estimator, in which penalty costs for discarding outliers are inserted into the loss function, and we propose suitable penalties for unmasking multiple high-leverage outliers. The robust procedure is formulated as a Quadratic Mixed Integer Programming (QMIP) problem, which is computationally tractable for small-sample data. The computational load and the effectiveness of the new procedure are improved by using the idea of the ε-insensitive loss function from support vector machines regression: small errors are ignored, and the mathematical formulation gains the sparseness property. The good performance of the PTS estimator allows identification of multiple outliers while avoiding masking effects.
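To make the idea of the abstract concrete, the sketch below illustrates the penalized trimmed squares objective on a toy dataset: a subset of observations may be discarded at a per-observation penalty cost, and the remaining points are fitted by (ε-insensitive) least squares. This is only an illustrative brute-force search, not the authors' QMIP formulation; the penalty values, the ε threshold, and the toy data are arbitrary assumptions for demonstration.

```python
# Illustrative sketch of the penalized trimmed squares (PTS) objective:
# minimize the (epsilon-insensitive) squared residuals of the kept points
# plus a penalty for every discarded point. Brute force over subsets is
# used here instead of mixed integer programming, so it only works for
# very small samples.

from itertools import combinations
import numpy as np


def eps_insensitive_sq(residuals, eps):
    """Squared epsilon-insensitive loss: residuals below eps contribute zero."""
    excess = np.maximum(np.abs(residuals) - eps, 0.0)
    return np.sum(excess ** 2)


def pts_brute_force(X, y, penalties, eps=0.0, max_discard=None):
    """Search over subsets of discarded observations for the PTS minimum."""
    n = len(y)
    if max_discard is None:
        max_discard = n // 2                      # discard at most half the sample
    best_cost, best_beta, best_out = np.inf, None, None
    for k in range(max_discard + 1):
        for out in combinations(range(n), k):
            keep = np.setdiff1d(np.arange(n), out)
            beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
            loss = eps_insensitive_sq(y[keep] - X[keep] @ beta, eps)
            cost = loss + sum(penalties[i] for i in out)
            if cost < best_cost:
                best_cost, best_beta, best_out = cost, beta, out
    return best_cost, best_beta, best_out


# Toy data: a clean line plus two gross outliers (assumed values for illustration).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 12)
y = 2.0 * x + 1.0 + rng.normal(scale=0.2, size=x.size)
y[3] += 15.0
y[9] -= 12.0
X = np.column_stack([np.ones_like(x), x])

penalties = np.full(x.size, 4.0)                  # cost of discarding each observation
cost, beta, outliers = pts_brute_force(X, y, penalties, eps=0.1)
print("estimated coefficients:", beta)
print("observations flagged as outliers:", outliers)
```

In this sketch a point is discarded only when its (ε-insensitive) squared residual exceeds its penalty, which is the trade-off the PTS loss encodes; the paper's QMIP formulation expresses the same choice with binary variables and solves it exactly.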

Published

2007-03-30

How to Cite

Zioutas, G., Avramidis, A., & Pitsoulis, L. (2007). Penalized Trimmed Squares and a Modification of Support Vectors for Unmasking Outliers in Linear Regression. REVSTAT-Statistical Journal, 5(1), 115–136. https://doi.org/10.57805/revstat.v5i1.45