Seminars


UPCOMING



Mating quadratic maps with the modular group

Speaker: Luna Lomonaco - IMPA
Thu 19 May 2022, 15:30 - Room 228 | Dynamical Systems and Ergodic Theory Seminar

Abstract: Holomorphic correspondences are multi-valued maps defined by polynomial relations P(z,w)=0. We consider a specific one-(complex-)parameter family of (2:2) correspondences (every point has 2 images and 2 preimages), which we show encodes both the dynamics of a rational map and the dynamics of the modular group. We show that the connectedness locus of this family is homeomorphic to the parabolic Mandelbrot set, which is itself homeomorphic to the Mandelbrot set. Joint work with S. Bullett.
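As a point of reference, here is a toy illustration of what a (2:2) correspondence means (my own example, not the family considered in the talk): a single polynomial relation in which every z has two images w and every w has two preimages z.

```latex
% Toy (2:2) correspondence, for illustration only (not the family from the talk):
% every point has 2 images and 2 preimages.
\[
  P(z,w) \;=\; w^{2} + z^{2} - 1 \;=\; 0,
  \qquad
  w \;=\; \pm\sqrt{1 - z^{2}},
  \qquad
  z \;=\; \pm\sqrt{1 - w^{2}}.
\]
```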


Mating quadratic maps with the modular group

Speaker: Luna Lomonaco - IMPA
Fri 20 May 2022, 15:30 - Auditorium 2 | Special Lecture

Abstract: Holomorphic correspondences are multivalued maps defined by polynomial relations of the form P(z,w)=0. We consider a specific family of (2:2) correspondences (each point has 2 images and 2 preimages) which we show exhibits both the dynamics of rational maps and that of the modular group. We will show how the connectedness locus of this family is homeomorphic to the parabolic Mandelbrot set, itself homeomorphic to the Mandelbrot set. This is joint work with S. Bullett.


A data-centric approach to machine learning application development

Speaker: Lucas Rolim - HURB - Hotel Urbano
Mon 23 May 2022, 09:00 - Room 232 | Centro Pi

Abstract: Data quality plays a vital role in machine learning, which consists of models trying to extract signals from patterns within datasets. In most industries and companies, the available data are messy, poorly labeled, unstructured, and susceptible to drifts in their distribution and nature. A data-centric approach embraces these issues and sheds light on techniques for optimizing datasets for machine learning applications. We will discuss techniques where the spotlight is on the dataset: instead of iterating over different model architectures, we explore different data manipulations that better separate signal from noise in the data used to feed arbitrary machine learning models. This talk will discuss the pros and cons of this approach, why it may better fit the reality of most companies, and how it can become a vibrant topic for research.
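A minimal sketch of the data-centric loop described above, on a synthetic dataset with scikit-learn (my own toy example, not the speaker's pipeline): the model is held fixed, and only the dataset variant changes between runs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def score(X, y):
    """Fixed, simple model; only the data changes between runs."""
    return cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
y = (X[:, 0] + 0.3 * rng.standard_normal(500) > 0).astype(int)

# Simulate label noise on a known subset, as a stand-in for mislabeled rows.
noisy = rng.choice(500, size=75, replace=False)
y_noisy = y.copy()
y_noisy[noisy] = 1 - y_noisy[noisy]

# Data-centric iteration: baseline vs. a "cleaned" variant of the same dataset
# (here we pretend a labeling audit identified the noisy rows).
keep = np.setdiff1d(np.arange(500), noisy)
print("baseline:", score(X, y_noisy))
print("cleaned :", score(X[keep], y_noisy[keep]))
```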


A Cyclic Douglas-Rachford Iteration Scheme

Speaker: Di Liu - IMPA
Wed 25 May 2022, 15:30 - Room 228 | Optimization Seminar

Abstract: In this lecture, I will talk about the cyclic Douglas-Rachford iteration scheme proposed by J. M. Borwein and M. K. Tam in 2014. This scheme can be applied directly to N-set convex feasibility problems in Hilbert space. The main result is the weak convergence of the method to a point whose nearest-point projections onto each of the N sets coincide. For affine subspaces, convergence is in norm. Numerical results are compared with the classical (product-space) Douglas-Rachford scheme.
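A small sketch of the scheme in the hyperplane case, where nearest-point projections are closed-form (an illustration under my own toy setup, not the speaker's code):

```python
import numpy as np

def proj_hyperplane(x, a, b):
    """Nearest-point projection onto the hyperplane {x : <a, x> = b}."""
    return x - (a @ x - b) / (a @ a) * a

def dr_step(x, proj_A, proj_B):
    """Two-set Douglas-Rachford operator T_{A,B} = (I + R_B R_A) / 2,
    where R_C = 2 P_C - I is the reflection through the set C."""
    rA = 2.0 * proj_A(x) - x
    rB = 2.0 * proj_B(rA) - rA
    return 0.5 * (x + rB)

def cyclic_dr(x0, projs, n_sweeps=200):
    """One sweep applies T_{C1,C2}, T_{C2,C3}, ..., T_{CN,C1} in turn."""
    x, N = x0, len(projs)
    for _ in range(n_sweeps):
        for i in range(N):
            x = dr_step(x, projs[i], projs[(i + 1) % N])
    return x

# Three hyperplanes in R^5 with a common point x* = (1, ..., 1).
rng = np.random.default_rng(0)
x_star = np.ones(5)
A = rng.standard_normal((3, 5))
b = A @ x_star
projs = [lambda x, a=A[i], bi=b[i]: proj_hyperplane(x, a, bi) for i in range(3)]

x = cyclic_dr(rng.standard_normal(5), projs)
y = projs[0](x)  # the projections of the limit onto each set coincide at a feasible point
print([float(abs(a @ y - bi)) for a, bi in zip(A, b)])  # residuals near zero
```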


Biological waves from the perspective of shock-wave theory

Speaker: Arthur Bizzi - IMPA
Thu 26 May 2022, 13:30 - Room 224 | Applied and Computational Mathematics Seminar

Abstract: We'll discuss some preliminary results regarding a connection between the reaction-diffusion equations of mathematical biology and the nonlinear wave equations of fluid dynamics.

By means of a generalized inverse Cole-Hopf transformation, we mean to study the travelling waves of FKPP-type systems from the perspective of hyperbolic balance laws. In particular, these new 'wave-like' forms are amenable to scaling, which ultimately allows us to obtain an inviscid first-order approximation from which most of the canonical results on FKPP travelling waves follow easily. We'll finish with a discussion on possible generalizations to multiple space dimensions and to systems of equations.
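For orientation, here are the classical objects alluded to above (the talk concerns a generalized inverse Cole-Hopf transformation; this is only the textbook version): the FKPP equation, and the Cole-Hopf substitution that maps solutions of the heat equation to solutions of viscous Burgers' equation.

```latex
% FKPP reaction-diffusion equation:
\[
  u_t \;=\; D\,u_{xx} + r\,u(1-u).
\]
% Classical Cole-Hopf: if \phi solves the heat equation, then
% u = -2\nu \phi_x / \phi solves viscous Burgers' equation.
\[
  u \;=\; -2\nu\,\frac{\phi_x}{\phi},
  \qquad
  \phi_t = \nu\,\phi_{xx}
  \;\;\Longrightarrow\;\;
  u_t + u\,u_x = \nu\,u_{xx}.
\]
```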


On a Common Ground Between Geological Modeling and Machine Learning

Speaker: Viacheslav Borovitskiy - ETH Zürich
Mon 13 Jun 2022, 09:00 - Online Lecture | Centro Pi

Abstract: One of the central problems of geological modeling has always been the interpolation of spatial data. The sparseness of this data usually makes accurate deterministic interpolation impossible. Because of this, probabilistic interpolation driven by Gaussian processes is the standard tool of geological modeling. The very same Gaussian processes are used as machine learning models for various applications, including optimization and control.

The geological modeling community and the machine learning community study similar problems related to Gaussian processes, but stay essentially separate, with only occasional interchange of ideas and results. I will talk about the problem of efficiently sampling from posterior (a.k.a. conditional) Gaussian processes given a large dataset, which may be rather easily solved by combining the ideas of the two communities and which is relevant for both.
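One standard formulation of that sampling problem is "pathwise conditioning" (Matheron's rule), sketched below on a one-dimensional toy dataset with NumPy (my own illustration, not necessarily the method discussed in the talk).

```python
import numpy as np

# Pathwise conditioning / Matheron's rule:
#   (f | y)(x*) = f(x*) + K(x*, X) (K(X, X) + s^2 I)^{-1} (y - f(X) - eps),
# where f is a draw from the prior and eps ~ N(0, s^2 I).

def rbf(a, b, lengthscale=0.25):
    """Squared-exponential kernel on 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(1)
X = np.linspace(0.0, 1.0, 20)                  # observed inputs
y = np.sin(6.0 * X) + 0.1 * rng.standard_normal(X.size)
Xs = np.linspace(0.0, 1.0, 100)                # query inputs
noise = 0.1

# One joint draw from the prior at [X, Xs].
Z = np.concatenate([X, Xs])
L = np.linalg.cholesky(rbf(Z, Z) + 1e-6 * np.eye(Z.size))
f = L @ rng.standard_normal(Z.size)
f_X, f_Xs = f[:X.size], f[X.size:]

# Condition that prior draw on the observed data.
eps = noise * rng.standard_normal(X.size)
Kxx = rbf(X, X) + noise**2 * np.eye(X.size)
posterior_sample = f_Xs + rbf(Xs, X) @ np.linalg.solve(Kxx, y - f_X - eps)
```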


Weakly convex foliations in mechanical systems

Speaker: Alexsandro Schneider - Unicentro
Tue 14 Jun 2022, 15:30 - Room 236 | Differential Geometry Seminar

Abstract: In this presentation, I will talk about some special foliations of Hamiltonian systems in which each leaf is a transversal section. In our main result we assume a mechanical system with a critical energy level having a finite number of saddle-center equilibrium points. We prove that if the critical energy level satisfies a certain dynamical condition, then there exists a region on each energy level slightly above the critical one that admits a special foliation called a weakly convex foliation. Since the dynamical condition is easily checkable, we apply the main result to some classical mechanical systems. This is joint work with de Paulo, Salomão, Kim and Vanderlinde.
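For context, "mechanical system" refers to Hamiltonians of kinetic-plus-potential type; below is a minimal illustrative example (my own, not one of the systems in the talk) of a saddle-center equilibrium and its critical energy level.

```latex
% Mechanical Hamiltonian with a saddle-center equilibrium at the origin:
% the linearization there has one hyperbolic pair of eigenvalues (from q_1)
% and one elliptic pair (from q_2). The critical energy level is H = H(0,0) = 0,
% and the talk concerns energy levels slightly above it.
\[
  H(q,p) \;=\; \tfrac{1}{2}\lvert p\rvert^{2} + V(q),
  \qquad
  V(q_{1},q_{2}) \;=\; \tfrac{1}{2}\bigl(q_{2}^{2} - q_{1}^{2}\bigr).
\]
```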


Robust linear regression in high-dimensions and stochastic gradient descent

Speaker: Philip Thompson - Purdue University, Krannert School of Management
Wed 15 Jun 2022, 15:30 - Room 228 | Optimization Seminar

Abstract: We revisit the problem of robust linear regression, where a fraction of the sample is contaminated by outliers and the distribution is heavy-tailed (both in the features and covariates). Little work has been done in the setting where the number of samples is smaller than the dimension, assuming a sparse parameter. We develop an iterative algorithm that achieves the optimal estimation rate for this class of problems in reasonable time complexity. Our analysis combines ideas from the recent literature on algorithmic robust statistical learning, proximal gradient descent, and precise concentration inequalities. If time permits, we will discuss improvements on the recent literature on robust stochastic gradient descent. This is joint work with R.I. Oliveira and Zoraida Fernández-Rico.
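As a generic illustration of the ingredients mentioned above — sparsity via an l1 penalty, robustness via a bounded-influence (Huber) loss, and proximal gradient descent — here is a toy NumPy sketch (my own example, not the authors' algorithm):

```python
import numpy as np

# min_w (1/n) sum_i huber_delta(y_i - x_i @ w) + lam * ||w||_1,
# solved by proximal gradient descent (gradient step on the smooth Huber part,
# soft-thresholding for the l1 penalty).

def huber_grad(r, delta):
    """Derivative of the Huber loss evaluated at the residuals r."""
    return np.clip(r, -delta, delta)

def soft_threshold(w, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def robust_lasso(X, y, lam=0.1, delta=1.0, n_iter=2000):
    n, d = X.shape
    step = n / (np.linalg.norm(X, 2) ** 2)   # 1 / Lipschitz constant of the smooth part
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = -X.T @ huber_grad(y - X @ w, delta) / n
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Toy data: n < d, sparse ground truth, a few gross outliers in the responses.
rng = np.random.default_rng(2)
n, d = 80, 200
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:5] = 3.0
y = X @ w_true + 0.1 * rng.standard_normal(n)
y[:5] += 50.0                                # contaminated responses
w_hat = robust_lasso(X, y)
print(np.linalg.norm(w_hat - w_true))        # estimation error of the toy estimator
```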