LASSO is a computationally attractive technique for parsimonious model estimation. Though highly successful in many applications, LASSO can fail to identify the true model when the predictors are correlated. Recent work suggests that, instead of $\ell_1$-norm-based regularization such as that used in LASSO, one should use an Ordered Weighted $\ell_1$ (OWL) norm regularization. In this talk we will discuss the efficacy of OWL for model estimation and show that OWL induces unnecessary bias. Using a submodular perspective, we motivate Smoothed OWL (SOWL), a new norm which significantly alleviates this problem. We establish several algorithmic and theoretical properties of SOWL, including group identification and model consistency. We also provide algorithmic tools to compute the SOWL norm and its proximal operator, whose $O(d \log d)$ computational complexity is significantly better than that of general-purpose solvers.
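The OWL norm mentioned in the abstract has a standard definition: $\Omega_w(x) = \sum_i w_i |x|_{[i]}$, where $|x|_{[i]}$ is the $i$-th largest entry of $x$ in absolute value and the weights $w_1 \ge w_2 \ge \dots \ge 0$ are non-increasing. A minimal sketch of evaluating it (the function name `owl_norm` is ours, not from the talk; the $O(d \log d)$ cost comes from the sort):

```python
import numpy as np

def owl_norm(x, w):
    """Ordered Weighted L1 (OWL) norm: sum_i w_i * |x|_[i],
    where |x|_[1] >= |x|_[2] >= ... are the sorted magnitudes of x
    and the weights w are assumed non-increasing and non-negative."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    mags = np.sort(np.abs(x))[::-1]  # magnitudes in decreasing order: O(d log d)
    return float(np.dot(w, mags))

# With all weights equal, OWL reduces to the plain l1 norm:
x = [3.0, -1.0, 2.0]
print(owl_norm(x, [1.0, 1.0, 1.0]))  # 6.0
# With strictly decreasing weights, larger entries are penalized more:
print(owl_norm(x, [3.0, 2.0, 1.0]))  # 3*3 + 2*2 + 1*1 = 14.0
```

Note that equal weights recover LASSO's $\ell_1$ penalty, while decreasing weights are what allow OWL to cluster the coefficients of correlated predictors.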
Chiranjib Bhattacharyya (Indian Institute of Science, Bangalore)
Date & Time
Mon, 12 March 2018, 15:00 to 16:00
Emmy Noether Seminar Room, ICTS Campus, Bangalore