Sparse representations and compressive sensing
Abstract: Do you know when and how we can solve linear systems with many more columns than rows? What if I told you that behind this question lies the notion of representing a mathematical object sparsely, i.e., writing it as a combination of as few functions or vectors as possible? In this three-day course, we will dive into the fascinating world of sparse representations and compressive sensing. We will explore the mathematical foundations and practical implications of these techniques, which have revolutionized signal processing and data science and form the basis of many methods in machine learning. We will see how this simple but fundamental question has deep connections to various mathematical fields, including concentration of measure, the geometry of Banach spaces, non-linear optimization, and harmonic analysis. If time allows, we will also learn about algorithms for finding sparse representations and discuss exciting applications and open problems in the field.
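
To make the opening question concrete, here is a minimal sketch (not part of the course materials) of what "solving a linear system with many more columns than rows" can mean when the solution is sparse. It assumes NumPy and SciPy are available; the dimensions, the Gaussian measurement matrix, and the choice of ℓ1 minimization (basis pursuit) solved with scipy.optimize.linprog are illustrative choices only.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# 40 equations, 120 unknowns, only 5 nonzero entries in the true solution
m, n, k = 40, 120, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
b = A @ x_true                                  # underdetermined measurements

# Basis pursuit: minimize ||x||_1 subject to A x = b,
# rewritten as a linear program in (u, v) with x = u - v and u, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With these parameters the ℓ1 minimizer typically coincides with the sparse vector that generated the data, even though the system A x = b has infinitely many solutions; when and why this happens is exactly the kind of question the course addresses.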