Robust Variable Selection Via Reciprocal Elastic Net in High-Dimensional Regression
https://doi.org/10.14419/m6v7vs95
Keywords: Robust Regression; Variable Selection; Reciprocal Elastic Net; Huber Loss; High-Dimensional Data; Financial Risk Analysis
Abstract
Variable selection in high-dimensional regression models is crucial for improving interpretability and predictive accuracy. Traditional penalized regression methods, such as the LASSO and Elastic Net, suffer from sensitivity to outliers, which can lead to biased coefficient estimation and incorrect variable selection. In this study, we propose a robust variable selection method based on the reciprocal elastic net penalty, which enhances sparsity while maintaining stability in the presence of extreme values. To further improve robustness, we integrate Huber loss and M-estimators, thereby mitigating the influence of outliers on the regression coefficients. The proposed method is evaluated through an extensive simulation study under different contamination levels and applied to a financial risk dataset, where the presence of anomalies is common. Performance is assessed using mean absolute error and breakdown point as evaluation criteria. The results demonstrate that the robust reciprocal elastic net outperforms traditional penalized regression models and provides more reliable variable selection in the presence of outliers.
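The combination the abstract describes (a bounded loss to tame outliers plus a sparsity-inducing penalty) can be illustrated with a minimal proximal-gradient sketch. This uses the standard elastic net penalty as a stand-in for the paper's reciprocal elastic net, which is nonconvex and requires a specialised solver; all function names and parameter values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def huber_grad(r, delta=1.345):
    """Derivative of the Huber loss with respect to the residuals r."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_enet(X, y, lam1=0.05, lam2=0.05, delta=1.345, lr=0.1, iters=500):
    """Minimise (1/n) * sum Huber(y - Xb) + lam1*||b||_1 + lam2*||b||_2^2
    by proximal gradient descent (illustrative stand-in for the paper's
    reciprocal elastic net)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        r = X @ beta - y
        # gradient of the smooth part: Huber data term + ridge term
        g = X.T @ huber_grad(r, delta) / n + 2.0 * lam2 * beta
        beta = beta - lr * g
        # proximal step for the L1 term: soft-thresholding
        beta = np.sign(beta) * np.maximum(np.abs(beta) - lr * lam1, 0.0)
    return beta

# Demo on contaminated data: sparse truth, 10% of responses are gross outliers.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0], beta_true[3] = 3.0, -2.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)
y[:20] += 20.0  # contaminate 10% of the responses
beta_hat = robust_enet(X, y)
```

Because the Huber gradient is bounded, each contaminated observation contributes at most `delta` to the score, so the fitted coefficients stay near the sparse truth even with 10% gross outliers, whereas a squared-error fit would be pulled toward them.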
References
- Alhamzawi R, Ali H T, & Matar M (2023), A new reciprocal elastic net penalty for high-dimensional regression models. Statistical Methods & Applications, 32(1), 85–104.
- AL-Sabbah S A, Mohammed L A, & Raheem S H (2021), Sliced inverse regression (SIR) with robust group LASSO. International Journal of Agricultural & Statistical Sciences, 17(1). https://doi.org/10.55562/jrucs.v46i1.101.
- Hastie T, Tibshirani R, & Friedman J (2009), The elements of statistical learning: Data mining, inference, and prediction. Springer. https://doi.org/10.1007/978-0-387-84858-7.
- Huber P J (1964), Robust estimation of a location parameter. Annals of Mathematical Statistics, 35(1), 73–101. https://doi.org/10.1214/aoms/1177703732.
- Huber P J (1981), Robust statistics. Wiley. https://doi.org/10.1002/0471725250.
- Maronna R A, Martin R D, & Yohai V J (2019), Robust statistics: Theory and methods (with R). Wiley. https://doi.org/10.1002/9781119214656.
- Rousseeuw P J, & Yohai V J (1984), Robust regression by means of S-estimators. In Robust and Nonlinear Time Series Analysis, 256–272. Springer. https://doi.org/10.1007/978-1-4615-7821-5_15.
- Tibshirani R (1996), Regression shrinkage and selection via the LASSO. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 58(1), 267–288. https://doi.org/10.1111/j.2517-6161.1996.tb02080.x.
- Zou H, & Hastie T (2005), Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67(2), 301–320. https://doi.org/10.1111/j.1467-9868.2005.00503.x.
- Kepplinger D (2023), Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression. Computational Statistics & Data Analysis, 183, 107730. https://doi.org/10.1016/j.csda.2023.107730.
- Wang W, Liang J, Liu R, Song Y, & Zhang M (2022), A robust variable selection method for sparse online regression via the elastic net penalty. Mathematics, 10(16), 2985. https://doi.org/10.3390/math10162985.
How to Cite
Raheem, S. H. (2025). Robust Variable Selection Via Reciprocal Elastic Net in High-Dimensional Regression. International Journal of Advanced Mathematical Sciences, 11(2), 67–73. https://doi.org/10.14419/m6v7vs95
