# Agenda: Talking Across Fields

## Friday, January 31, 2020

- **9:30am**
- **10:25am**
- **11:30am**
- **2:00pm**
- **2:55pm**
- **4:00pm**
## Saturday, February 1, 2020

- **9:30am**
- **10:25am**
- **11:30am**
- **2:00pm**
- **2:55pm**
- **4:00pm**
## Sunday, February 2, 2020

- **9:30am**
- **10:25am**
- **11:30am**

Registration check-in and breakfast will be available at 8:45am each day; morning breaks are scheduled for 11:10am and afternoon breaks (Fri/Sat) for 3:40pm. The conference will adjourn for lunch on Friday and Saturday at 12:15pm.

Stephen Stigler (University of Chicago)

"How did the rabbit get into the hat?"

The question of my title is asked every time someone witnesses a certain feat of stage magic, no matter how young or old the witness. Statistical versions of essentially the same question are becoming increasingly rare in the Big Data era, where magical results can make headlines without such questions being raised. Issues arising from this phenomenon will be discussed, including susceptibility to zombie data, whether post-selection inference is really a “thing”, and what some of Persi's earliest work has to tell us related to this issue.

Donald Knuth (Stanford)

"Stratified importance sampling"

I will discuss Pang Chen's method for estimating the size of backtrack search trees, a beautiful technique that deserves to be much better known.
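
Chen's method builds on Knuth's classical unbiased estimator: follow a single random root-to-leaf path, multiplying branching factors as you go. A minimal sketch of that basic estimator (the stratification itself is not shown, and the function names are mine):

```python
import random

def knuth_estimate(children, root, rng=random):
    """One sample of Knuth's unbiased estimator of the number of nodes
    in a backtrack search tree.  `children(node)` returns the list of
    child nodes.  Pang Chen's method stratifies these random walks;
    this is the plain estimator that it refines."""
    estimate, weight, node = 1, 1, root
    while True:
        kids = children(node)
        if not kids:
            return estimate
        weight *= len(kids)   # reciprocal of this path's probability so far
        estimate += weight    # each level contributes the product of degrees
        node = rng.choice(kids)

# Example: the complete binary tree of depth 10, encoded by its depth,
# has 2**11 - 1 = 2047 nodes, and here every walk returns exactly that.
def tree(depth):
    return [depth + 1, depth + 1] if depth < 10 else []
```

On irregular trees a single walk has high variance, which is precisely what averaging many walks, and Chen's stratification, are meant to control.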

Georgia Benkart (University of Wisconsin-Madison)

"McKay matrices and Markov chains"

Quivers constructed by tensoring modules for finite groups or finite-dimensional Hopf algebras and their adjacency matrices (McKay matrices) can provide results on invariants, chip-firing, and Markov chains. The analysis to determine sharp rates of convergence to stationarity of these chains can be challenging, and the chains may exhibit unexpected phenomena, as in our joint work with Persi, Martin Liebeck, and Pham Tiep.

Jim Pitman (UC Berkeley)

"Probabilistic interpretations of Bernoulli, Stirling and Eulerian numbers"

It is known that the Bernoulli, Stirling and Eulerian numbers are involved in the description of the probability functions, moments and cumulants of various classical probability distributions. These include the binomial, Poisson, geometric and negative binomial, the distribution of a sum of independent uniform variables, both discrete and continuous, and some distributions related to Brownian motion and the Riemann zeta function. I will review some of these relations, which involve a number of Persi's favorite topics: excursions of random walks, Fourier analysis, random partitions, cycles and descents of permutations, and stationary one-dependent sequences.
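
One of these classical relations is easy to check directly: the Eulerian number A(n, k) counts permutations of {1, ..., n} with exactly k descents, and A(n, k)/n! is the probability that a sum of n independent uniform [0, 1] variables has integer part k. A small sketch (function names are mine):

```python
from itertools import permutations

def eulerian(n, k):
    """Eulerian number A(n, k), via the standard recurrence
    A(n, k) = (k+1) A(n-1, k) + (n-k) A(n-1, k-1).
    A(n, k)/n! is also the probability that the integer part of a sum
    of n independent uniform [0, 1] variables equals k."""
    if n == 0:
        return 1 if k == 0 else 0
    if k < 0 or k >= n:
        return 0
    return (k + 1) * eulerian(n - 1, k) + (n - k) * eulerian(n - 1, k - 1)

def descents(p):
    """Number of positions i with p[i] > p[i+1]."""
    return sum(p[i] > p[i + 1] for i in range(len(p) - 1))

# Brute-force check that A(4, k) counts permutations of a 4-element
# set with k descents: the row of Eulerian numbers [1, 11, 11, 1].
counts = [sum(1 for p in permutations(range(4)) if descents(p) == k)
          for k in range(4)]
```

Summing the row recovers n!, consistent with A(n, k)/n! being a probability distribution over k.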

Evita Nestoridi (Princeton)

"Mixing time of the upper triangular matrix walk over $\mathbb{Z}/m \mathbb{Z}$"

In joint work with Allan Sly, we study a natural random walk on the upper triangular matrices, with entries in $\mathbb{Z}/m \mathbb{Z}$, generated by steps which add or subtract a uniformly random row to the row above. We show that the mixing time of this random walk is $O(n^2m^2)$. This generalizes a result of Peres and Sly and answers a question of Stong, Arias-Castro, Diaconis and Stanley. When $m$ is prime, we get a sharper upper bound for the mixing time by using the work of Diaconis and Hough.
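
One step of this walk is easy to state in code. A minimal sketch, with matrices stored as lists of rows with ones on the diagonal (the indexing conventions are my own assumption):

```python
import random

def step(A, m, rng=random):
    """One move of the random walk on n x n upper triangular matrices
    over Z/mZ with ones on the diagonal: pick a row uniformly from
    rows 2..n, then add or subtract it, entrywise mod m, to the row
    above.  The result stays unipotent upper triangular."""
    n = len(A)
    i = rng.randrange(1, n)       # 0-based index of the row being added
    s = rng.choice((1, -1))       # add or subtract with equal probability
    A[i - 1] = [(a + s * b) % m for a, b in zip(A[i - 1], A[i])]
    return A
```

Since row i has zeros in columns before i, the move never touches the diagonal or the entries below it, so the walk genuinely lives on the unipotent upper triangular group.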

Persi Diaconis (Stanford) and Sandy Zabell (Northwestern)

"In praise of J.V. Uspensky"

The two of us have shared a fascination with James Victor Uspensky's book *Introduction to Mathematical Probability* ever since our graduate student days: it contains many interesting results not found in other books on the same subject in the English language, together with many non-trivial examples, all clearly stated with careful proofs. We would like to present some of Uspensky's gems to a modern audience, hoping to tempt people to read Uspensky for themselves, and also to report on some of the other mathematical topics he wrote on (for example, his book on number theory contains results about perfect shuffles).

Uspensky also had an interesting life: a member of the Russian Academy of Sciences, he spoke at the 1924 ICM in Toronto before fleeing Russia for the US and, eventually, Stanford. His students included Vinogradov and Kuzmin in Russia, and Carl Olds in the US. Comparatively little has been written about him outside of Russian; we hope to remedy this.

Charles Radin (University of Texas at Austin)

"How cereal settles in a shaking box"

Around 1960 Bernal and Scott did experiments on the settling of ball bearings in a large box as the box was shaken, and discovered the curious phenomenon of random close packing. Recent experimental advances on this phenomenon now show that when shaken gently enough, and long enough, the balls spontaneously begin to reorganize into a global crystalline configuration. Theoretical explanation requires the same stochastic reasoning invented by Boltzmann and used to model the freezing of water molecules into crystalline ice. It is here applied to the deterministic evolution of ball bearings (or cereal) in a precisely shaken box. This offers an accessible but physically realistic setting to mathematically study the entry of probability into macroscopic physical laws.

Susan Holmes (Stanford)

"Dependencies and contingencies"

Count data are often recorded and analysed as contingency tables whose within-row or within-column categories are treated as independent. In recent work on the human microbiome, we have come across cases where the categories exhibit dependencies that invalidate the traditional “degrees of freedom” and the associated decompositions. For instance, if the rows are different species of bacteria whose phylogenetic relationships are known, an analysis that ignores this classification discards important information. The framework developed by Persi and his co-authors, in which tables come with given row and column sums, does not apply here, and I will present some more realistic approaches that attempt to overcome this challenge.
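
For reference, the classical framework being critiqued is the Pearson chi-square analysis of a contingency table, with its textbook (rows - 1)(cols - 1) degrees of freedom. A minimal sketch of that baseline computation (not one of the talk's proposed approaches):

```python
def chi_square(table):
    """Pearson chi-square statistic and the classical degrees of
    freedom (rows-1)*(cols-1) for a contingency table, under the
    textbook assumption of independent categories, the assumption
    that the dependencies described above can invalidate."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = sum((table[i][j] - rows[i] * cols[j] / total) ** 2
               / (rows[i] * cols[j] / total)
               for i in range(len(rows)) for j in range(len(cols)))
    df = (len(rows) - 1) * (len(cols) - 1)
    return stat, df
```

When rows are, say, phylogenetically related species, the reference chi-square distribution with these degrees of freedom is no longer the right yardstick, which is the point of the abstract.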

Alexei Borodin (MIT)

"Integrable growth of random surfaces"

While a variety of integrable (or exactly solvable) random interface models in (1+1)-dimensions are known, there is only one class of integrable examples in (2+1)-dimensions, and it would not have appeared if it were not for Persi. The goal of the talk is to tell that story.

Laurent Miclo (University of Toulouse)

"Construction of set-valued/measure dual processes via random mappings"

The strong stationary times introduced by Aldous and Diaconis (1986) provide a probabilistic approach to quantitative convergence to equilibrium. They are often obtained as the absorption times of intertwining dual processes, following a method due to Diaconis and Fill (1990). We will see how to deduce explicit constructions from certain random mappings, related to the coupling-from-the-past algorithm of Propp and Wilson (1996) and to the evolving sets of Morris and Peres (2005). This approach can be adapted, via the coalescing stochastic flows of Le Jan and Raimond (2006) associated with Tanaka's equation, to recover Pitman's theorem (1975) on the intertwining relation between Brownian motion and the Bessel-3 process. Nevertheless, the theory of coalescing stochastic flows is not sufficiently developed to go further in this direction, and the talk will end with an alternative approach proposed in collaboration with Arnaudon and Coulibaly-Pasquier.
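
To fix ideas, here is the textbook coupling-from-the-past construction for a small monotone chain, driven by shared random update maps (a generic sketch in the spirit of the random mappings above, not the construction of the talk):

```python
import random

def cftp_reflecting_walk(n, rng=random):
    """Propp--Wilson coupling from the past for the reflecting +-1
    random walk on {0,..,n} (steps held at the boundary).  The update
    x -> clamp(x + s) is monotone, so it suffices to run coupled
    copies from the top state n and the bottom state 0 with shared
    randomness from times -T, -2T, ...; once they coalesce, the common
    value is an exact sample from the stationary distribution (here
    uniform, since the chain is doubly stochastic)."""
    signs = []                        # shared random steps for times -1, -2, ...
    T = 1
    while True:
        while len(signs) < T:
            signs.append(1 if rng.random() < 0.5 else -1)
        lo, hi = 0, n
        for s in reversed(signs[:T]):     # apply the maps from time -T up to 0
            lo = max(0, min(n, lo + s))
            hi = max(0, min(n, hi + s))
        if lo == hi:
            return lo                     # exact stationary sample
        T *= 2                            # go further back in time and reuse signs
```

The crucial point, as in the dual-process constructions of the talk, is that the randomness at each past time is fixed once and reused as the starting time recedes.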

Steven N. Evans (UC Berkeley)

"Radix sort and PATRICIA bridges"

Given a finite collection of pairwise distinct infinite binary words, there is a minimal length finite prefix for each word such that the resulting collection of finite prefixes is also pairwise distinct. The set of initial segments of these finite prefixes may be identified with the vertices of a finite rooted tree in which each interior vertex has a left child, a right child or both, and the finite prefixes themselves correspond to the leaves of the tree. A depth first search of this so-called radix sort tree produces an ordering of the leaves that agrees with the lexicographic ordering of the corresponding infinite binary words. The radix sort tree stores information that is redundant for the purpose of sorting the input infinite binary words into lexicographic order. Indeed, if one deletes the out-degree one vertices in the radix sort tree and “closes up the gaps”, then the resulting PATRICIA tree maintains all the information that is necessary for sorting. We investigate the infinite bridges corresponding to the radix sort and PATRICIA chains; that is, the tree-valued Markov chains built by successively feeding the radix sort and PATRICIA algorithms the elements of an infinite i.i.d. sequence of infinite binary words. In particular, we show that the infinite PATRICIA bridges coincide with those associated with a chain introduced by Rémy for successively generating uniform random binary trees with larger and larger numbers of leaves, and we obtain a concrete representation of all such bridges in terms of sampling from an underlying real tree.
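
The minimal distinguishing prefixes described above are easy to compute directly. A small sketch, using finite truncations in place of infinite words (function names are mine):

```python
def minimal_prefixes(words):
    """For pairwise distinct binary words, the shortest prefix of each
    word that distinguishes it from all the others: one symbol past
    its longest common prefix with any other word.  These prefixes are
    the leaves of the radix sort tree; contracting the out-degree-one
    interior vertices of that tree yields the PATRICIA tree.
    (Finite truncations stand in for the infinite words of the talk.)"""
    def lcp(a, b):
        k = 0
        while k < min(len(a), len(b)) and a[k] == b[k]:
            k += 1
        return k
    return [w[:1 + max(lcp(w, v) for v in words if v != w)] for w in words]
```

Sorting the prefixes lexicographically sorts the original words, which is the redundancy-free content that the PATRICIA tree retains.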

Brian Skyrms (UC Irvine)

"Reinforcement in social dynamics"

I will talk about reinforcement processes in some social dynamics models of philosophical interest. There are interesting open problems.
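
As a point of reference, the simplest reinforcement process is the Pólya urn, in which past outcomes make themselves more likely. A minimal sketch (a generic illustration, not a model from the talk):

```python
import random

def polya_urn(steps, red=1, black=1, rng=random):
    """Basic Polya urn: repeatedly draw a ball at random and return it
    together with one more ball of the same colour, so each draw
    reinforces its own colour.  The red fraction converges almost
    surely to a random Beta(red, black)-distributed limit."""
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1
        else:
            black += 1
    return red, black
```

The random limit, different on every run, is the signature behaviour that makes reinforcement models subtle to analyse.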

Arun Ram (University of Melbourne)

"Level 0 and the Bethe ansatz"

This is about the relation between representations of affine Lie algebras, the energy (Hamiltonian) for the Bethe ansatz and the combinatorics of Macdonald polynomials. Although I probably won't get to talk about the shuffles in the background (sorry Persi), I probably will talk about Young tableaux, quantum groups, and Heisenberg spin chains.

Richard Stanley (MIT)

"Persification"

“Persification” can be defined as the process of turning a mathematical result into a “story” explaining how this result applies to a concrete or real-world situation, in the manner of Persi Diaconis. We will give several examples of persification related to algebraic combinatorics.

David Aldous (UC Berkeley)

"The nearest unvisited vertex walk on random graphs"

We revisit an old topic in algorithms, the deterministic walk on a connected finite graph-with-edge-lengths which always moves at speed one toward the nearest unvisited vertex, until every vertex is visited. There is an elementary relation between this cover time and ball-covering (metric entropy) measures. For some familiar models of random connected graphs, this relation allows the order of magnitude of the cover time to be deduced easily from first passage percolation estimates. Obtaining sharper estimates of this cover time remains a challenging problem.
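
The walk itself is simple to state in code. A minimal sketch (the adjacency-list format and function names are my own conventions), using a Dijkstra search from the current position to find the nearest unvisited vertex:

```python
import heapq

def nearest_unvisited_cover(graph, start):
    """Deterministic walk on a connected graph-with-edge-lengths that
    always moves at speed one toward the nearest unvisited vertex.
    Returns the cover time, i.e. the total distance travelled until
    every vertex is visited.  `graph[v]` is a list of (neighbour,
    length) pairs with positive lengths."""
    visited = {start}
    total, cur = 0.0, start
    while len(visited) < len(graph):
        # Dijkstra from the current vertex to the nearest unvisited one.
        dist = {cur: 0.0}
        heap = [(0.0, cur)]
        while heap:
            d, v = heapq.heappop(heap)
            if d > dist.get(v, float("inf")):
                continue            # stale heap entry
            if v not in visited:
                visited.add(v)      # reached the nearest unvisited vertex
                total += d
                cur = v
                break
            for w, length in graph[v]:
                nd = d + length
                if nd < dist.get(w, float("inf")):
                    dist[w] = nd
                    heapq.heappush(heap, (nd, w))
    return total
```

Note that only the target needs marking: with positive edge lengths, any intermediate vertex on a shortest path to the nearest unvisited vertex would be strictly closer, so it must already have been visited.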