Minicourses

EBP - Minicourse 1:

“Determinantal processes and zeros of Gaussian analytic functions”
Yuval Peres (UC Berkeley and Microsoft)

Lecture 1: POINT PROCESSES AND REPULSION.

Point processes (random scatters of points in space) have applications in many areas, including statistics and cosmology. Recently, there has been increasing interest in processes that exhibit “repulsion”. We will see why zeros of random polynomials have this property, and describe the effect of repulsion on matching and allocation problems.

Lecture 2: ZEROS OF GAUSSIAN ANALYTIC FUNCTIONS

Zeros of Gaussian analytic functions have a remarkable rigidity property, discovered by M. Sodin: the first-order intensity determines the whole process. In several senses (e.g. hole probabilities, gravitational potential), the planar Gaussian zeros behave like a four-dimensional Poisson process.

Lecture 3: DETERMINANTAL PROCESSES

Discrete and continuous point processes where the joint intensities are determinants arise in combinatorics (random spanning trees) and physics (fermions, eigenvalues of random matrices). For these processes, the number of points in a region can be represented as a sum of independent zero-one valued random variables, one for each eigenvalue of the relevant operator.
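In symbols (a standard formulation, not taken from the course notes): a point process is determinantal with kernel K when its joint intensities are determinants, and the Bernoulli decomposition described above then reads, for a region D on which the restricted kernel K_D is trace class with eigenvalues λ_j ∈ [0,1],

```latex
\rho_k(x_1,\dots,x_k) \;=\; \det\bigl[K(x_i,x_j)\bigr]_{i,j=1}^{k},
\qquad
N(D) \;\stackrel{d}{=}\; \sum_{j} \operatorname{Bernoulli}(\lambda_j),
```

with the Bernoulli variables independent.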

Lectures 4 and 5: ZEROS OF THE I.I.D. GAUSSIAN POWER SERIES.

The power series with i.i.d. complex Gaussian coefficients has zeros that form an isometry-invariant determinantal process in the disk model of the hyperbolic plane. This allows an exact calculation of the law of the number of zeros in a subdisk. We also analyze the dynamic version where the coefficients perform Brownian motion.
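As an illustration of the exact law mentioned above, here is a minimal simulation sketch (my own, not from the course notes). By the Peres–Virág theorem, the number of zeros of f(z) = Σ a_n z^n in the disk of Euclidean radius r is distributed as an independent sum Σ_{k≥1} Bernoulli(r^{2k}), whose mean is r²/(1−r²). The sketch truncates the series at a finite degree N, which is an approximation; the truncation parameters are my choices.

```python
import numpy as np

# Sketch: truncate f(z) = sum_n a_n z^n at degree N (assumption: N is large
# enough that the zeros in |z| < r are essentially those of the full series).
rng = np.random.default_rng(0)
N, r, trials = 100, 0.5, 200

counts = []
for _ in range(trials):
    a = rng.standard_normal(N + 1) + 1j * rng.standard_normal(N + 1)
    roots = np.roots(a[::-1])          # np.roots expects highest degree first
    counts.append(int(np.sum(np.abs(roots) < r)))

mean_count = np.mean(counts)
expected = r**2 / (1 - r**2)           # = 1/3 for r = 1/2
print(mean_count, expected)
```

The empirical mean over the trials should be close to r²/(1−r²); increasing `trials` tightens the agreement.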

 

EBP - Minicourse 2:

“Self-similarity and long-range dependence”
Murad S. Taqqu (Boston University)

Lecture 1: “SELF-SIMILARITY AND COMPUTER NETWORK TRAFFIC”

In this lecture we will introduce self-similarity in the context of computer network traffic. This lecture will show why self-similarity is important in this area and will motivate the subsequent lectures. Ethernet local area network traffic appears to be approximately statistically self-similar. This discovery, made about eight years ago, has had a profound impact on the field. I will try to explain what statistical self-similarity means, how it is detected, and indicate how one can construct random processes with that property by aggregating a large number of “on-off” renewal processes. If the number of replications grows to infinity then, after rescaling, the limit turns out to be the Gaussian self-similar process called fractional Brownian motion. If, however, the rewards (the values carried during the renewal periods) are heavy-tailed as well, then the limit is a stable non-Gaussian process with infinite variance and dependent increments. Since linear fractional stable motion is the stable counterpart of the Gaussian fractional Brownian motion, a natural conjecture is that the limit process is linear fractional stable motion. This conjecture, it turns out, is false. The limit is a new type of infinite-variance self-similar process.
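The Gaussian limit sketched above can be stated compactly (this is the Taqqu–Willinger–Sherman formulation; the notation here is mine): if the on and/or off periods have a heavy tail with index 1 < α < 2, and W_M(t) denotes the cumulative load of M aggregated on-off sources with mean rate μ, then

```latex
\frac{W_M(Tt) - M\mu T t}{T^{H}\sqrt{M}}
\;\Longrightarrow\; \sigma\, B_H(t),
\qquad
H = \frac{3-\alpha}{2} \in \Bigl(\tfrac{1}{2},\,1\Bigr),
```

where the limit is taken as M → ∞ first and then T → ∞, and B_H is fractional Brownian motion. Note that heavier tails (smaller α) give stronger long-range dependence (larger H).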

Lecture 2: “FRACTIONAL BROWNIAN MOTION, LONG-RANGE DEPENDENCE AND FARIMA MODELS”

Long-range dependence in a stationary time series occurs when the covariances tend to zero like a power function and so slowly that their sums diverge. It is often observed in nature, for example in economics, telecommunications and hydrology. It is closely related to self-similarity. Self-similarity refers to invariance in distribution under a suitable change of scale. To understand the relationship between self-similarity and long-range dependence, suppose that the self-similar process has stationary increments. Then these increments form a stationary time series which can display long-range dependence. Conversely, start with a stationary time series (with long-range dependence). Then a central limit-type theorem will yield a self-similar process with stationary increments. The intensity of long-range dependence is related to the scaling exponent of the self-similar process. We shall provide here a tutorial on fractional Brownian motion, the Gaussian self-similar process with stationary increments, on its increment process known as fractional Gaussian noise, which displays long-range dependence, and on a large class of long-range dependent stationary sequences called FARIMA, which are commonly used in modeling such physical phenomena.
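To make the statement “covariances tend to zero like a power function and so slowly that their sums diverge” concrete, here is a small sketch (the function names are mine) of the exact autocovariance of fractional Gaussian noise and its power-law decay for H > 1/2:

```python
import numpy as np

def fgn_autocov(k, H, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise at integer lag k:
    gamma(k) = (sigma2/2) * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})."""
    k = np.abs(np.asarray(k, dtype=float))
    return 0.5 * sigma2 * (np.abs(k + 1)**(2 * H)
                           - 2 * k**(2 * H)
                           + np.abs(k - 1)**(2 * H))

H = 0.7                                   # long-range dependent case: H > 1/2
lags = np.arange(1, 10_001)
gamma = fgn_autocov(lags, H)

# gamma(k) ~ H(2H-1) k^{2H-2} as k grows; this is summable only when
# 2H-2 < -1, i.e. H < 1/2, so for H = 0.7 the partial sums diverge.
asymptotic = H * (2 * H - 1) * lags**(2 * H - 2)
```

At lag 0 the formula returns sigma2, and for large lags the ratio gamma(k) / (H(2H−1) k^{2H−2}) approaches 1.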

Lecture 3: “SELF-SIMILARITY AND LONG-RANGE DEPENDENCE THROUGH THE WAVELET LENS”

We provide a brief introduction to wavelets and describe how self-similar and long-range dependent processes can be detected using the discrete wavelet transform. We discuss the nature of the wavelet coefficients and their statistical properties. The Logscale Diagram is introduced as a natural means to study scaling in data, and we show how it can be used to obtain unbiased semi-parametric estimates of the scaling exponent. We then focus on the case of long-range dependence and address the problem of defining a lower cutoff scale corresponding to where scaling starts. We also discuss some related problems arising from the application of wavelet analysis to discrete time series. Numerical examples using many discrete-time models are then presented to show the quality of the wavelet-based estimator and how it compares with alternative ones. The examples include strong short-range dependence and non-Gaussian series with both finite and infinite variance.
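A minimal, self-contained sketch of the idea (using a plain Haar transform rather than a wavelet library, and ordinary Brownian motion, i.e. H = 1/2, as test data; both choices are mine, not the lecture's): for fractional Brownian motion the wavelet-coefficient variance at octave j scales like 2^{j(2H+1)}, so the slope of the logscale diagram estimates 2H + 1.

```python
import numpy as np

def haar_details(x, levels):
    """L2-normalized Haar detail coefficients, one array per octave."""
    s = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        d = (s[0::2] - s[1::2]) / np.sqrt(2)   # detail at this octave
        s = (s[0::2] + s[1::2]) / np.sqrt(2)   # smooth, carried to next octave
        details.append(d)
    return details

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(2**16))      # Brownian motion: H = 1/2

details = haar_details(x, 8)
octaves = np.arange(3, 9)                      # skip the finest octaves, where
                                               # discretization biases the fit
log2_var = [np.log2(np.var(details[j - 1])) for j in octaves]

# Logscale diagram: log2(variance) vs. octave is close to a line of slope 2H+1.
slope = np.polyfit(octaves, log2_var, 1)[0]
H_hat = (slope - 1) / 2
```

The estimate `H_hat` should come out close to 1/2 here; the choice of lower cutoff octave is exactly the issue the lecture discusses.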

Lecture 4: “SELF-SIMILAR STABLE PROCESSES AND FLOWS”

Self-similarity involves invariance of the probability distribution under scaling and is characterized by a parameter H. Brownian motion, for example, is self-similar with H = 1/2. Fractional Brownian motion is a stochastic process parameterized by H with three characteristics: it is Gaussian, is self-similar and has stationary increments. It is the unique process with these characteristics. If the Gaussian distribution is replaced by an infinite-variance symmetric alpha-stable distribution, then one does not have uniqueness anymore. There are in fact infinitely many processes X that are symmetric alpha-stable and self-similar with stationary increments. We want to classify a subclass of them, the so-called “mixed moving average” ones, by relating their representations to flows. We obtain a decomposition of the process X, unique in distribution, into three independent components, which we characterize and associate with flows. The first component is associated with a dissipative flow. Examples include the telecom process, the so-called “random wavelet expansion” and Takenaka processes. The second component is associated with a conservative flow. Particular cases include linear fractional stable motions.
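In the notation above (H the self-similarity exponent), the defining scaling property and the covariance of fractional Brownian motion B_H read:

```latex
\{X(ct)\}_{t\ge 0} \;\stackrel{d}{=}\; \{c^{H} X(t)\}_{t\ge 0}
\quad\text{for all } c>0,
\qquad
\mathbb{E}\bigl[B_H(t)\,B_H(s)\bigr]
  \;=\; \tfrac{1}{2}\bigl(|t|^{2H} + |s|^{2H} - |t-s|^{2H}\bigr).
```

Setting H = 1/2 recovers the Brownian covariance min(t, s).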

Both minicourses lasted five hours. Lecture notes were printed and distributed to the participants.