Explore the effectiveness of nonconvex optimization for noisy tensor completion in this 33-minute conference talk from the Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021 workshop. Delve into Yuxin Chen's presentation on a two-stage nonconvex algorithm that addresses the high-volatility issue in sample-starved regimes, achieving linear convergence, minimal sample complexity, and minimax-optimal statistical accuracy. Learn how the distribution of the nonconvex estimator is characterized and then used to construct entrywise confidence intervals for both unseen tensor entries and the unknown tensor factors. Gain insight into how statistical models enable efficient nonconvex statistical learning with provable guarantees, covering topics including imperfect data acquisition, statistical-computational gaps, the challenges facing gradient descent, and key proof ideas such as leave-one-out decoupling.
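The two-stage recipe mentioned in the description (a spectral initialization followed by gradient descent on the observed entries) can be sketched for the simplest case, a rank-1 symmetric tensor with noiseless Bernoulli-sampled entries. This is an illustrative toy implementation, not the talk's exact algorithm: the dimension, sampling rate, step size, and the squared observed-entry loss are all assumptions made for the demo.

```python
import numpy as np

# Minimal sketch of two-stage nonconvex tensor completion (rank-1, symmetric).
# Stage 1: spectral initialization from the rescaled observations.
# Stage 2: gradient descent on a squared loss over the observed entries.
# All sizes and hyperparameters below are illustrative choices, not the talk's.

rng = np.random.default_rng(0)
n, p = 20, 0.3  # tensor dimension and Bernoulli sampling rate (assumed)

# Ground-truth factor u and tensor T = u (outer) u (outer) u
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
T = np.einsum("i,j,k->ijk", u, u, u)

mask = rng.random((n, n, n)) < p   # observed-entry pattern
Y = np.where(mask, T, 0.0) / p     # inverse-probability-weighted observations

# Stage 1: spectral initialization via the top singular vector of the unfolding
M = Y.reshape(n, n * n)            # mode-1 unfolding (n x n^2)
U, s, _ = np.linalg.svd(M, full_matrices=False)
x = np.cbrt(s[0]) * U[:, 0]
# Resolve the sign ambiguity using only the observed data
if np.einsum("ijk,i,j,k->", Y, x, x, x) < 0:
    x = -x

# Stage 2: gradient descent on f(x) = (1/2p) * sum_obs (x_i x_j x_k - T_ijk)^2
eta = 0.1
for _ in range(500):
    R = np.where(mask, np.einsum("i,j,k->ijk", x, x, x) - T, 0.0) / p
    grad = (np.einsum("ajk,j,k->a", R, x, x)
            + np.einsum("iak,i,k->a", R, x, x)
            + np.einsum("ija,i,j->a", R, x, x))
    x -= eta * grad

# Full-tensor recovery error (noiseless data, so near-exact completion)
err = np.linalg.norm(np.einsum("i,j,k->ijk", x, x, x) - T)
```

At realistic scale the talk's setting adds observation noise, higher rank, and the uncertainty-quantification step (entrywise confidence intervals built from the estimator's distributional characterization), none of which this toy sketch attempts.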
The Effectiveness of Nonconvex Tensor Completion - Fast Convergence and Uncertainty Quantification