Published: March 5, 2019

Optimization for High-dimensional Analysis and Estimation


High-dimensional signal analysis and estimation arise in many signal processing applications, including modal analysis and parameter estimation for spectrally sparse signals. The underlying low-dimensional structure of these high-dimensional signals motivates the development of optimization-based techniques, with theoretical guarantees, for these fundamental signal processing problems.

In many applications, high-dimensional signals admit concise representations: a signal can be written as a linear combination of a small number of atoms drawn from a dictionary whose elements lie in the signal space. In compressive sensing, L1-minimization is a widely used framework for finding such sparse representations. It has recently been shown that atomic norm minimization, a generalization of L1-minimization, is an efficient and powerful way to exactly recover unobserved time-domain samples and identify the unknown frequencies of signals with sparse frequency spectra; that is, to find concise representations of spectrally sparse signals. This technique operates over a continuous dictionary and thus completely avoids the basis mismatch that can plague conventional grid-based compressive sensing methods.

The objective of this presentation is to analyze and estimate high-dimensional signals, and the parameters contained in them, using optimization-based techniques.
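To make the L1-minimization idea concrete, here is a minimal sketch of sparse recovery via iterative soft-thresholding (ISTA), one standard solver for the L1-regularized least-squares (lasso) problem. The matrix sizes, sparsity level, and regularization weight below are illustrative choices, not values from the presentation.

```python
import numpy as np

def ista(A, y, lam=0.01, iters=2000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - y))  # gradient step on the smooth term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return x

# Recover a 3-sparse vector from 40 random measurements of a length-100 signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]
y = A @ x_true
x_hat = ista(A, y)
```

With far fewer measurements (40) than the ambient dimension (100), the L1 penalty still drives the iterates to the correct 3-element support, which is exactly the "concise representation" the text refers to.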
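The basis mismatch phenomenon mentioned above can also be demonstrated numerically: a sinusoid whose frequency lies exactly on a DFT grid point has a one-sparse spectrum, while a sinusoid only half a bin away leaks energy across many bins and is no longer sparse in that fixed dictionary. The signal length and frequencies below are illustrative assumptions.

```python
import numpy as np

n = 64
t = np.arange(n)
f_on = 8.0 / n    # frequency exactly on a DFT bin
f_off = 8.5 / n   # frequency halfway between two DFT bins
x_on = np.exp(2j * np.pi * f_on * t)
x_off = np.exp(2j * np.pi * f_off * t)

def frac_energy_topk(x, k):
    """Fraction of spectral energy captured by the k largest DFT coefficients."""
    c = np.abs(np.fft.fft(x)) ** 2
    return np.sort(c)[-k:].sum() / c.sum()

# On-grid: a single bin holds essentially all the energy.
# Off-grid: even the 3 largest bins miss a noticeable share (spectral leakage).
```

Atomic norm minimization sidesteps this by searching over a continuous dictionary of frequencies rather than a fixed grid, so off-grid tones like `x_off` can still be represented by a single atom.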