Speaker: Yichi Zhang, Ph.D. Candidate, Department of Statistics, NC State University
Title: Spectral Method in Action: Inference and Large-scale Computation
Spectral methods are popular in statistics and machine learning for their succinct algorithms and solid theoretical properties. By viewing diverse statistical inference problems through the general “signal-plus-noise” model, the spectral method exploits the leading singular value decomposition (SVD) of the data matrix to approximate the underlying low-rank truth, and its theoretical optimality is established via the rapidly developing matrix perturbation theory for the classical SVD.
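To make the signal-plus-noise idea concrete, here is a minimal sketch (not the talk's estimator): observe Y = M + E with a low-rank truth M, and estimate M by the rank-r truncated SVD of Y. All dimensions, the rank, and the noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 200, 100, 3                       # matrix size and true rank (assumed)
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, p))  # low-rank signal
Y = M + 0.5 * rng.standard_normal((n, p))   # noisy observation Y = M + E

# Spectral estimate: keep only the leading r singular triplets of Y
u, s, vt = np.linalg.svd(Y, full_matrices=False)
M_hat = u[:, :r] @ np.diag(s[:r]) @ vt[:r, :]

# The truncated SVD recovers M far better than the raw observation does
err_raw = np.linalg.norm(Y - M) / np.linalg.norm(M)
err_svd = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
print(err_svd < err_raw)
```

The point of the sketch is that truncating at the true rank discards most of the noise energy, which is exactly the behavior that matrix perturbation theory quantifies.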
This talk consists of two parts. In Part I, I will introduce a new spectral method for distribution-free covariance estimation of high-dimensional matrix-valued data. The minimax-optimal convergence rate of the proposed estimator is established under certain divergence regimes. The superior finite-sample performance of our method is demonstrated in simulations and in a real application to a gridded temperature anomalies dataset.
In Part II, a new matrix perturbation framework for the randomized SVD (RSVD) is established. Thanks to this framework, spectral methods, such as our covariance estimator in Part I, can alternatively be driven by RSVD instead of the classical SVD, which significantly eases the computational burden at large scale while keeping the theoretical guarantees solid. Specifically, 2-norm and 2-to-infinity perturbation bounds between the RSVD-based singular vectors and their true counterparts are derived. A sharp phase transition is further observed: a smaller signal-to-noise ratio requires more power iterations in RSVD to guarantee the optimal perturbation rates. Finally, we illustrate the theoretical results by deriving nearly optimal performance guarantees when RSVD is applied to two additional statistical inference problems, namely community detection in networks and matrix completion.
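For readers unfamiliar with RSVD, the following is a hedged sketch of the standard sketch-and-power-iteration scheme (in the style of Halko, Martinsson, and Tropp), not the talk's specific analysis. The oversampling amount and the number of power iterations `q` are illustrative choices; `q` plays the role of the power-iteration count in the phase transition described above.

```python
import numpy as np

def rsvd(A, rank, oversample=10, q=2, rng=None):
    """Approximate rank-`rank` SVD of A via random sketching with q power iterations."""
    rng = rng or np.random.default_rng()
    m, n = A.shape
    # Gaussian test matrix and an orthonormal basis for the sketched range of A
    Omega = rng.standard_normal((n, rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)
    # Power iterations sharpen the captured subspace when singular values decay slowly
    for _ in range(q):
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    # Project onto the small subspace and take an exact SVD there
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Usage: top-3 singular triplets of a low-rank-plus-noise matrix
rng = np.random.default_rng(1)
M = rng.standard_normal((500, 3)) @ rng.standard_normal((3, 400))
Y = M + 0.1 * rng.standard_normal((500, 400))
U, s, Vt = rsvd(Y, rank=3, q=2, rng=rng)
```

Only small QR and SVD factorizations touch the full matrix through matrix-vector products, which is why RSVD eases large-scale computation relative to a full classical SVD.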