Zhihui Zhu, Department of Electrical and Computer Engineering, University of Denver
Provable Nonsmooth Nonconvex Approaches for Low-Dimensional Models
As technological advances in fields such as the Internet, medicine, finance, and remote sensing produce ever larger and more complex data sets, we face the challenge of efficiently and effectively extracting meaningful information from large-scale, high-dimensional signals and data. Many modern approaches to this challenge naturally lead to nonconvex optimization formulations. Although finding even a local minimizer of a general nonconvex problem can be computationally hard in theory, recent progress has shown that many practical (smooth) nonconvex problems possess benign geometric structure and can be solved efficiently to global optimality.
In this talk, I will extend this powerful geometric analysis to robust low-dimensional models in which the data or measurements are corrupted by outliers taking arbitrary values. We consider nonsmooth nonconvex formulations that employ an L1 loss to robustify the solution against outliers. We characterize a sufficiently large basin of attraction around the global minima, which enables us to develop subgradient-based optimization algorithms that, with a data-driven initialization, converge rapidly to a global minimum. I will also present our very recent work on general nonsmooth optimization over the Stiefel manifold, which arises widely in engineering. I will illustrate the efficiency of this approach on robust subspace recovery, robust low-rank matrix recovery, and orthogonal dictionary learning.
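To make the flavor of these methods concrete, the following is a minimal, hypothetical sketch (not the speaker's actual algorithm) of a subgradient method for robust low-rank matrix recovery: we minimize the nonsmooth L1 loss ||UU^T - Y||_1 over a low-rank factor U, using normalized subgradient steps with geometrically decaying step sizes and an initialization near the ground truth, standing in for the data-driven initialization discussed in the talk. All problem sizes, corruption rates, and step-size choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all parameters are illustrative): a rank-r ground truth
# observed with sparse, large-magnitude outliers on ~10% of entries.
n, r = 30, 2
U_star = rng.standard_normal((n, r))
Y = U_star @ U_star.T                       # true low-rank matrix
mask = rng.random((n, n)) < 0.1             # outlier locations
Y_obs = Y + mask * (10.0 * rng.standard_normal((n, n)))

def f(U):
    """Nonsmooth L1 loss of the symmetric factorization residual."""
    return np.abs(U @ U.T - Y_obs).sum()

# Subgradient method: normalized subgradient steps with geometrically
# decaying step sizes, initialized near the truth (a stand-in for the
# data-driven initialization that lands inside the basin of attraction).
U = U_star + 0.1 * rng.standard_normal((n, r))
f0 = f(U)                                   # loss at initialization
step = 0.05
for _ in range(500):
    S = np.sign(U @ U.T - Y_obs)            # subgradient of |.| entrywise
    G = (S + S.T) @ U                       # chain rule through U @ U.T
    U -= step * G / max(np.linalg.norm(G), 1e-12)
    step *= 0.99                            # geometric decay

print("loss at init:", f0, " loss after subgradient method:", f(U))
```

Because the L1 loss penalizes the sparse outliers only linearly, the minimizer stays near the true factor rather than chasing the corrupted entries; the geometrically decaying step size is what allows fast local convergence for this kind of sharp, nonsmooth objective.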
Zhihui Zhu received a Ph.D. degree in electrical engineering from the Colorado School of Mines, Golden, CO, in 2017, and was a Postdoctoral Fellow in the Mathematical Institute for Data Science at Johns Hopkins University, Baltimore, MD, from 2018 to 2019. He is an Assistant Professor in the Department of Electrical and Computer Engineering at the University of Denver, CO. His research interests span data science, machine learning, signal processing, and optimization. His current research largely focuses on the theory and applications of nonconvex optimization and low-dimensional models in large-scale machine learning and signal processing problems.