Statistics, Optimization, and Machine Learning Seminar - Pratyush Tiwary (Virtual)

April 28, 2020

Pratyush Tiwary; Department of Chemistry & Biochemistry and Institute for Physical Science and Technology; University of Maryland
From atoms to emergent dynamics (with help from statistical physics and artificial intelligence)
ABSTRACT: The ability to rapidly learn from high-dimensional data to make reliable predictions about the future of a given system...

Stats, Optimization, and Machine Learning Seminar - Anindya De

Feb. 25, 2020

Anindya De, Department of Computer and Information Science, University of Pennsylvania
Testing noisy linear functions for sparsity
Consider the following basic problem in sparse linear regression: an algorithm gets labeled samples of the form (x, ⟨w, x⟩ + \eps), where w is an unknown n-dimensional vector, x is drawn from a...
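
A minimal numerical sketch of this sampling model. The choices below are assumptions for illustration only: x standard Gaussian, Gaussian noise, and an exactly k-sparse w; the support estimate is a naive correlation baseline, not the tester discussed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k, sigma = 100, 5000, 5, 0.1      # dimension, samples, sparsity, noise level

# k-sparse unknown vector w (assumed model, for illustration)
w = np.zeros(n)
w[rng.choice(n, size=k, replace=False)] = 1.0

X = rng.normal(size=(m, n))             # x ~ N(0, I_n)
y = X @ w + sigma * rng.normal(size=m)  # label = <w, x> + eps

# Since E[x_i * y] = w_i for Gaussian x, empirical correlations reveal the support.
w_hat = X.T @ y / m
print("estimated support:", np.flatnonzero(np.abs(w_hat) > 0.5))
print("true support:     ", np.flatnonzero(w))
```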

Stats, Optimization, and Machine Learning Seminar - Stephen Becker

Jan. 28, 2020

Stephen Becker, Department of Applied Mathematics, University of Colorado Boulder
Stochastic Subspace Descent: Stochastic gradient-free optimization, with applications to PDE-constrained optimization
We describe and analyze a family of algorithms that generalize block-coordinate descent, where we assume one can take directional derivatives (for low-precision optimization, this can be approximated with finite...
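
A rough sketch of the general idea, under assumptions not taken from the talk: random orthonormal directions drawn each iteration, forward finite differences for the directional derivatives, a fixed step size, and a toy quadratic objective.

```python
import numpy as np

def stochastic_subspace_descent(f, x0, step=0.1, n_dirs=2, h=1e-6, iters=1000, seed=0):
    """Illustrative random-subspace, derivative-free descent (not the talk's tuned method).

    Each iteration draws a few random orthonormal directions, estimates the
    directional derivatives of f by forward finite differences, and steps along
    the resulting subspace gradient estimate.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(iters):
        Q, _ = np.linalg.qr(rng.normal(size=(n, n_dirs)))  # random orthonormal directions
        fx = f(x)
        d = np.array([(f(x + h * Q[:, j]) - fx) / h for j in range(n_dirs)])
        x -= step * Q @ d                                  # step within the random subspace
    return x

# Toy usage: minimize a quadratic using only function evaluations.
A = np.diag(np.linspace(1.0, 5.0, 20))
f = lambda z: 0.5 * z @ A @ z
x = stochastic_subspace_descent(f, np.ones(20))
print("objective:", f(np.ones(20)), "->", f(x))
```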

Stats, Optimization, and Machine Learning Seminar - Zhihui Zhu

Jan. 21, 2020

Zhihui Zhu, Department of Electrical and Computer Engineering, University of Denver
Provable Nonsmooth Nonconvex Approaches for Low-Dimensional Models
As technological advances in fields such as the Internet, medicine, finance, and remote sensing have produced larger and more complex data sets, we are faced with the challenge of efficiently and effectively...

Stats, Optimization, and Machine Learning Seminar - Amir Ajalloeian, Maddela Avinash, Ayoub Ghriss

Dec. 10, 2019

Amir Ajalloeian; Department of Electrical, Computer, and Energy Engineering; University of Colorado Boulder
Inexact Online Proximal-gradient Method for Time-varying Convex Optimization
This paper considers an online proximal-gradient method to track the minimizers of a composite convex function that may continuously evolve over time. The online proximal-gradient method is "inexact," in...
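
An illustrative sketch of a single online proximal-gradient loop. The specific loss (time-varying least squares with a drifting target), the l1 regularizer, and the additive gradient noise standing in for the "inexact" gradient are assumptions for this demo, not the paper's setup.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

rng = np.random.default_rng(0)
n, gamma, lam, T = 10, 0.1, 0.05, 200

A = rng.normal(size=(n, n)) / np.sqrt(n) + np.eye(n)
x = np.zeros(n)
for t in range(T):
    target = np.sin(0.05 * t) * np.ones(n)               # slowly drifting minimizer
    grad = A.T @ (A @ x - target)                         # gradient of 0.5 * ||A x - target||^2
    grad += 0.01 * rng.normal(size=n)                     # gradient error -> "inexact" update
    x = soft_threshold(x - gamma * grad, gamma * lam)     # one online prox-gradient step
print("final tracking residual:", np.linalg.norm(A @ x - target))
```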

Stats, Optimization, and Machine Learning Seminar - Sriram Sankaranarayanan

Nov. 12, 2019

Sriram Sankaranarayanan, Department of Computer Science, University of Colorado Boulder
Reasoning about Neural Feedback Systems
Data-driven components such as feedforward neural networks are increasingly being used in safety-critical systems such as autonomous vehicles and closed-loop medical devices. Neural networks compute nonlinear functions, and relatively tiny networks present enormous challenges for...

Stats, Optimization, and Machine Learning Seminar - Mohsen Imani

Nov. 5, 2019

Mohsen Imani; Department of Computer Science and Engineering; University of California, San Diego
Towards Learning with Brain Efficiency
Modern computing systems are plagued with significant issues in efficiently performing learning tasks. In this talk, I will present a new brain-inspired computing architecture that supports a wide range of learning tasks...

Stats, Optimization, and Machine Learning Seminar - Alec Dunton

Oct. 22, 2019

Alec Dunton, Department of Applied Mathematics, University of Colorado Boulder
Learning a kernel matrix for nonlinear dimensionality reduction (Weinberger et al., 2004)
We investigate how to learn a kernel matrix for high-dimensional data that lies on or near a low-dimensional manifold. Noting that the kernel matrix implicitly maps...
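
A hedged sketch of the semidefinite program behind the Weinberger et al. (2004) approach on a toy data set: learn a centered positive semidefinite kernel that preserves k-nearest-neighbor distances while maximizing its trace, then read off a low-dimensional embedding from its top eigenvectors. The data set, neighborhood size, and solver below are illustrative choices, not those from the talk.

```python
import numpy as np
import cvxpy as cp
from scipy.spatial.distance import pdist, squareform

# Toy "noisy spiral" data lying near a low-dimensional manifold (assumed example).
rng = np.random.default_rng(0)
t = rng.uniform(0, 3 * np.pi, 40)
X = np.column_stack([t * np.cos(t), t * np.sin(t), rng.normal(scale=0.05, size=40)])

D2 = squareform(pdist(X)) ** 2
k = 4
neighbors = np.argsort(D2, axis=1)[:, 1:k + 1]   # k nearest neighbors (excluding self)

n = X.shape[0]
K = cp.Variable((n, n), PSD=True)
constraints = [cp.sum(K) == 0]                    # centering constraint
for i in range(n):
    for j in neighbors[i]:
        # preserve squared distances between neighboring points
        constraints.append(K[i, i] + K[j, j] - 2 * K[i, j] == D2[i, j])
prob = cp.Problem(cp.Maximize(cp.trace(K)), constraints)
prob.solve()

vals, vecs = np.linalg.eigh(K.value)
embedding = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0))   # 2-D embedding from top eigenvectors
print("top eigenvalues of learned kernel:", np.round(vals[-4:], 2))
```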

Stats, Optimization, and Machine Learning Seminar - Purnendu

Oct. 15, 2019

Purnendu, ATLAS Institute, University of Colorado Boulder
The Mathematical Secrets of Computational Origami
Origami is the Japanese name for the centuries-old art of folding paper into representations of birds, insects, animals, plants, human figures, inanimate objects, and abstract shapes. In the purest form of origami, the figure is folded from...

Stats, Optimization, and Machine Learning Seminar - Yury Makarychev

Oct. 1, 2019

Yury Makarychev, Toyota Technological Institute at Chicago (TTIC)
Performance of Johnson-Lindenstrauss Transform for k-Means and k-Medians Clustering
Consider an instance of Euclidean k-means or k-medians clustering. We show that the cost of the optimal solution is preserved up to a factor of (1+ε) under a projection onto a random O(log(k/ε)/ε^2)-dimensional...
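
A quick numerical illustration of this statement on synthetic clustered data: project with a random Gaussian map to roughly log(k/ε)/ε² dimensions and compare the k-means cost before and after. The data, target dimension, and constants here are illustrative, not the theorem's exact bound.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n, D, k, eps = 500, 200, 5, 0.2

# Clustered synthetic data in D dimensions.
centers = rng.normal(size=(k, D)) * 5
X = centers[rng.integers(k, size=n)] + rng.normal(size=(n, D))

d = int(np.ceil(np.log(k / eps) / eps**2))   # illustrative target dimension ~ log(k/eps)/eps^2
G = rng.normal(size=(D, d)) / np.sqrt(d)     # random Gaussian projection (approximate isometry)
Y = X @ G

cost_high = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
cost_low = KMeans(n_clusters=k, n_init=10, random_state=0).fit(Y).inertia_
print(f"k-means cost: original {cost_high:.1f}, after projection {cost_low:.1f}")
```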
