Statistical analysis of some sparsity-based estimators
Dr. Benjamin Poignard (Osaka University)
We study the asymptotic properties of a new version of the Sparse Group Lasso estimator, called the adaptive SGL. This version involves two distinct regularization parameters, one for the Lasso penalty and one for the Group Lasso penalty, and we consider the adaptive variant of this regularization, in which both penalties are weighted by preliminary random coefficients. The asymptotic properties are established in a general framework, where the data are dependent and the loss function is convex. We prove that this estimator satisfies the oracle property (Fan and Li, 2001): the sparsity-based estimator recovers the true underlying sparse model and is asymptotically normally distributed. We also study its asymptotic properties in a double-asymptotic framework, where the number of parameters diverges with the sample size.
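To make the penalty structure concrete, here is a minimal numerical sketch of the proximal operator of an adaptive sparse-group penalty of the form lam1 * sum_j w_j |theta_j| + lam2 * sum_g xi_g ||theta_g||_2, using the standard closed form (elementwise soft-thresholding followed by groupwise shrinkage). The function name and the weight arrays `w` (Lasso weights) and `xi` (Group Lasso weights) are illustrative assumptions, not notation from the talk:

```python
import numpy as np

def prox_adaptive_sgl(v, groups, lam1, lam2, w, xi):
    """Proximal operator of the adaptive sparse-group penalty
    lam1 * sum_j w_j |v_j|  +  lam2 * sum_g xi_g ||v_g||_2.

    Closed form: adaptive soft-thresholding on each coordinate,
    then groupwise norm shrinkage on each block of indices.
    """
    # Step 1: elementwise (adaptive) soft-thresholding for the Lasso part.
    u = np.sign(v) * np.maximum(np.abs(v) - lam1 * w, 0.0)
    theta = np.zeros_like(u)
    # Step 2: groupwise shrinkage for the Group Lasso part.
    for g, idx in groups.items():
        norm_g = np.linalg.norm(u[idx])
        if norm_g > 0.0:
            theta[idx] = max(0.0, 1.0 - lam2 * xi[g] / norm_g) * u[idx]
    return theta

# Illustration: one group of three coefficients, unit weights.
v = np.array([3.0, -1.0, 0.5])
out = prox_adaptive_sgl(v, {0: np.array([0, 1, 2])}, 1.0, 1.0,
                        np.ones(3), {0: 1.0})
# Soft-thresholding gives [2, 0, 0]; group shrinkage halves it to [1, 0, 0].
```

In the adaptive version studied in the talk, the weights would be built from preliminary (random) estimates, e.g. inverse powers of a first-step estimator's coefficients.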
In another work, we study the estimation of high-dimensional factor models. Based on a first-step estimator of the idiosyncratic covariance matrix, we show that the least squares loss function satisfies the restricted strong convexity property (Negahban, Ravikumar, Wainwright, Yu, 2012) with respect to the factor loading matrix, allowing for non-convexity in both the loss and regularization functions. Under this property and proper regularity conditions on the penalty term, we provide finite-sample precision bounds for the regularized factor loading matrix.
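As a hedged sketch of the setting (the notation below is assumed for illustration, not taken from the abstract), a penalized least squares criterion for the loading matrix can be written as

$$
\widehat{\Lambda} \in \arg\min_{\Lambda}\; \big\| \widehat{\Sigma} - \Lambda \Lambda^{\top} - \widehat{\Psi} \big\|_F^2 \;+\; p\big(\lambda_n, \Lambda\big),
$$

where $\widehat{\Sigma}$ is the sample covariance matrix, $\widehat{\Psi}$ is the first-step estimator of the idiosyncratic covariance, and $p(\lambda_n,\cdot)$ is a possibly non-convex penalty. When the loss satisfies restricted strong convexity with curvature $\kappa$ and $\lambda_n$ dominates the gradient of the loss at the true loadings, bounds in the style of Negahban et al. (2012) typically take the form

$$
\big\| \widehat{\Lambda} - \Lambda^{*} \big\|_F \;\lesssim\; \frac{\lambda_n \sqrt{s}}{\kappa},
$$

with $s$ the number of nonzero loadings; the exact constants and conditions in the talk may differ.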
Date: March 23, 2018 (Fri) 14:00 - 15:00