Coordinate Descent Algorithm for Least Absolute Deviations Regression
Authors
Zehaan Naik, Debasis Kundu
Journal
No journal information available
Year
2026
Category
Country
United Kingdom
📝 Abstract
Least Absolute Deviations (LAD) regression provides a robust alternative to ordinary least squares by minimizing the sum of absolute residuals. However, its widespread use has been limited by the computational cost of existing solvers, particularly simplex-based methods in high-dimensional settings. We propose a coordinate descent algorithm for LAD regression that avoids matrix inversion, naturally accommodates the non-differentiability of the objective function, and remains well-defined even when the number of predictors exceeds the number of observations. The key observation is that each coordinate update reduces to a one-dimensional minimization admitting a closed-form solution given by a median or weighted median. The resulting algorithm has per-iteration complexity $O(p\,n \log n)$ and is provably convergent due to the convexity of the LAD objective and the exactness of each coordinate update. Experiments on synthetic and real datasets show that the method matches the accuracy of linear-programming-based LAD solvers while offering improved scalability and stability in high-dimensional regimes, including cases where $p \ge n$. The method is easy to implement, requires no specialized optimization software, and provides a practical tool for robust linear models.
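As a rough illustration of the coordinate update described in the abstract, the sketch below implements one possible coordinate descent loop for the LAD objective in Python/NumPy. The update for each coordinate is the weighted median of the partial residuals, as the abstract states; however, the function names (`weighted_median`, `lad_coordinate_descent`), the cyclic sweep order, and the stopping rule are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half the total weight."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return v[idx]

def lad_coordinate_descent(X, y, n_iter=100, tol=1e-8):
    """Minimize sum_i |y_i - x_i^T beta| by cyclic coordinate descent.

    Each coordinate update has the closed form: beta_j is the weighted
    median of (r_i + x_ij * beta_j) / x_ij with weights |x_ij|, taken over
    observations with x_ij != 0.  This is a hypothetical sketch of the
    technique described in the abstract, not the paper's code.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta                      # current residuals
    for _ in range(n_iter):
        max_change = 0.0
        for j in range(p):
            xj = X[:, j]
            mask = xj != 0.0              # zero entries do not affect beta_j
            if not mask.any():
                continue
            # partial residuals expressed in units of beta_j
            z = (r[mask] + xj[mask] * beta[j]) / xj[mask]
            new_bj = weighted_median(z, np.abs(xj[mask]))
            delta = new_bj - beta[j]
            if delta != 0.0:
                r -= delta * xj           # O(n) residual update
                beta[j] = new_bj
                max_change = max(max_change, abs(delta))
        if max_change < tol:
            break
    return beta
```

In this sketch each sweep costs $O(p\,n \log n)$, since every coordinate update sorts at most $n$ partial residuals to compute the weighted median, which matches the per-iteration complexity quoted in the abstract.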
📊 Article Statistics
Basic Stats
171 Views
0 Downloads
16 Citations
[Charts not rendered: Citation Trend, Country Distribution, Institution Distribution, Monthly Views]