# Calendar

## Yesterday

Title: Introduction to R Abstract: R is a free programming language for statistical computing and graphics. In this workshop, I will guide you through the installation and setup of RStudio, so bring your laptop if you want to follow along. We will also cover basic statistical and data manipulation tools, as well as graphing using the ggplot2 package.

## Today

## Next Week

### Geometry/Geometric Analysis Seminar, Ruobing Zhang, Stony Brook University, MATH 731

Monday, Oct 22 3:30 pm - 4:30 pm

Title: Nilpotent structure and collapsing Ricci-flat metrics on K3 surfaces Abstract: We exhibit families of Ricci-flat Kähler metrics on K3 surfaces which collapse to an interval, with Tian-Yau and Taub-NUT metrics occurring as bubbles. There is a corresponding continuous surjective map from the K3 surface to the interval, with regular fibers diffeomorphic to either 3-tori or Heisenberg nilmanifolds. This is joint work with H. Hein, J. Viaclovsky and S. Sun.

Title: Blended Coarse Gradient Descent for Full Quantization of Deep Neural Networks Abstract: Quantized deep neural networks (QDNNs) are attractive due to their much lower memory storage and faster inference speed than their regular full precision counterparts. To maintain the same performance level especially at low bit-widths, QDNNs must be retrained. Their training involves minimizing a piecewise constant non-convex objective in high dimension subject to a discrete constraint, hence mathematical challenges arise. We introduce the notion of coarse derivative and propose the blended coarse gradient descent (BCGD) algorithm. Coarse gradient is generally not a gradient of any function but an artificial descent direction. The network weight update of BCGD goes by coarse gradient correction of an average of the full precision weights and their quantization (the so-called blending), which yields sufficient descent in the objective value and accelerates the training. Our experiments demonstrate that this simple blending technique is very effective for quantization at extremely low bit-width such as binarization. For theoretical understanding, we show convergence analysis of coarse gradient descent on a two-layer neural network model with Gaussian input data, and prove that the expected coarse gradient correlates positively with the underlying true gradient.
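The blended update described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the quantizer is a simple 1-bit (sign) quantizer scaled by the mean magnitude, the blending parameter `rho` and learning rate `lr` are hypothetical, and the coarse gradient is taken as a given input.

```python
import numpy as np

def quantize(w):
    """1-bit quantization: sign of each weight, scaled by the mean magnitude."""
    scale = np.mean(np.abs(w))
    return scale * np.sign(w)

def bcgd_step(w_float, coarse_grad, lr=0.1, rho=0.5):
    """One blended coarse gradient descent step (sketch).

    The full-precision weights are blended with their quantization,
    then corrected by the coarse gradient, as described in the abstract.
    """
    blend = (1.0 - rho) * w_float + rho * quantize(w_float)
    return blend - lr * coarse_grad
```

With `rho = 0` this reduces to an ordinary (coarse) gradient step on the full-precision weights; with `rho = 1` the step is taken from the quantized weights.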

### Probability Seminar, Robin Pemantle, University of Pennsylvania, UNIV 101

Wednesday, Oct 24 1:30 pm - 2:20 pm

Title: The heat kernel on a contact manifold degenerating under diabatic limit Abstract: In many situations in mechanics, movement is constrained to 'horizontal directions'. In the simplest settings this corresponds to moving along submanifolds; at the opposite extreme, manifolds with a 'completely non-integrable' collection of horizontal directions are known as contact manifolds and arise often, e.g., as level sets of Hamiltonians. Questions in analysis and geometry have natural analogues on contact manifolds, but with differentiation and dynamics restricted to horizontal directions.

### Spectral and Scattering Theory Seminar, Chris Kottke, New College of Florida, REC 309

Wednesday, Oct 24 2:30 pm - 3:00 pm

Title: Compactification of monopole moduli spaces Abstract: I will discuss joint work with Michael Singer and Karsten Fritzsch on compactifications of the moduli spaces M_k of SU(2) magnetic monopoles on R^3. Via a geometric gluing procedure, we construct manifolds with corners compactifying the M_k, the boundaries of which represent monopoles of charge k decomposing into widely separated 'monopole clusters' of lower charge. The hyperkähler metric on M_k has a complete asymptotic expansion, the leading terms of which generalize the asymptotic metric discovered by Bielawski, Gibbons and Manton in the case that the monopoles are all widely separated. From the structure of the compactification, we are able to make partial progress toward proving Sen's conjecture for the L^2 cohomology of the moduli spaces.

### Automorphic Forms and Representation Theory Seminar, Ling Long, Louisiana State University, BRNG B238

Thursday, Oct 25 1:30 pm - 2:30 pm

Title: Supercongruences for rigid hypergeometric Calabi--Yau threefolds Abstract: In 2003, Rodriguez-Villegas conjectured some supercongruences for rigid hypergeometric Calabi--Yau threefolds over Q. We will explain what they are and outline the proof of his conjecture based on the theory of hypergeometric motives proposed by Katz and computed extensively by Roberts, Rodriguez-Villegas and Watkins. In particular, we will adapt techniques from the recent work of Beukers, Cohen and Mellit on finite hypergeometric sums over Q. This is joint work with Fang-Ting Tu, Noriko Yui and Wadim Zudilin.

Title: How Good Talks Happen Abstract: This meta-talk will be a blend of my two loves: cognitive psychology and telling people what to do. We will explore some of the fundamental things that cognitive psychology tells us about the brain and human behavior and how to exploit them for the nefarious purpose of making people enjoy our talks. Through examples, I'll provide a framework that you can use to construct successful talks for any purpose. I will primarily focus on how to give academic talks, but most of the advice I give is general enough that it could be applied to any form of public speaking, including your teaching!

## Two Weeks

### CCAM Seminar, Prof. Mimi Dai, University of Illinois, Chicago, REC 114

Monday, Oct 29 4:30 pm - 5:30 pm

TBA

Title: Almost orthogonality in Fourier analysis: From singular integrals, to function spaces, to Leibniz rules for fractional derivatives Abstract: Decomposition techniques such as atomic, molecular, wavelet and wave-packet expansions provide a multi-scale refinement of Fourier analysis and exploit a rather simple concept: “waves with very different frequencies are almost invisible to each other”. Starting with the classical Calderón-Zygmund and Littlewood-Paley decompositions, many of these useful techniques have been developed around the study of singular integral operators. By breaking an operator, or splitting the functions on which it acts, into non-interacting almost orthogonal pieces, these tools capture subtle cancellations and quantify properties of an operator in terms of norm estimates in function spaces. This type of analysis has been used to study linear operators with tremendous success. More recently, similar decomposition techniques have been pushed to the analysis of new multilinear operators that arise in the study of (para)product-like operations, commutators, null forms and other nonlinear functional expressions. In this talk we will present some of our contributions to the study of multilinear singular integrals and function spaces, and their applications to the development of Leibniz rules for fractional derivatives.
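The "almost invisible to each other" principle can be illustrated numerically. The toy sketch below splits a periodic signal into dyadic Fourier blocks with sharp frequency cutoffs (the actual Littlewood-Paley theory uses smooth projections; the sharp masks here are a deliberate simplification): the blocks reassemble the signal exactly, and distinct blocks are orthogonal because their frequency supports are disjoint.

```python
import numpy as np

def dyadic_pieces(f):
    """Crude Littlewood-Paley-style decomposition of a periodic signal.

    Splits the Fourier modes into dyadic frequency blocks
    (|k| = 0, then 2^{j-1} <= |k| < 2^j) and inverts each block.
    """
    n = len(f)
    fhat = np.fft.fft(f)
    freqs = np.abs(np.fft.fftfreq(n, d=1.0 / n))  # integer |frequency|
    pieces = []
    lo, hi = 0, 1
    while lo < n:
        mask = (freqs >= lo) & (freqs < hi)
        if mask.any():
            block = np.where(mask, fhat, 0)
            # each block has conjugate-symmetric spectrum, so ifft is real
            pieces.append(np.fft.ifft(block).real)
        lo, hi = hi, hi * 2
    return pieces
```

By Parseval's identity, pieces supported on disjoint sets of frequencies are exactly orthogonal, which is the sharpest form of the almost-orthogonality the abstract describes.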

## Three Weeks

### Spectral and Scattering Theory Seminar, Chris Judge, Indiana University, REC 309

Wednesday, Nov 7 2:30 pm - 3:20 pm

Title: Triangles have no interior hot spots Abstract: The temperature distribution $T_t$ of an insulated object tends to a constant function as time $t$ tends to infinity. But there is an expansion of $T_t$ in powers of $e^{-t}$, where the powers are eigenvalues of the Neumann Laplacian and the coefficients involve the corresponding eigenfunctions. The term 'hot spot' of the title refers to the maxima of the eigenfunction $u_2$ corresponding to the first nonconstant term. About 45 years ago, J. Rauch suggested that $u_2$ has no interior maxima (or minima). We show that this holds for objects which are triangles in the plane. This is joint work with Sugata Mondal.
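The expansion described in the abstract can be written out explicitly (a sketch of the standard spectral formula, with notation chosen here): writing $0 = \mu_1 < \mu_2 \le \mu_3 \le \cdots$ for the Neumann eigenvalues with eigenfunctions $u_k$,

$$T_t \,=\, c_1 \,+\, c_2\, e^{-\mu_2 t}\, u_2 \,+\, c_3\, e^{-\mu_3 t}\, u_3 \,+\, \cdots,$$

where the constants $c_k$ depend on the initial temperature. As $t \to \infty$ every nonconstant term is dominated by $c_2\, e^{-\mu_2 t}\, u_2$, so the long-time hot (and cold) spots of $T_t$ track the extrema of $u_2$; this is why the conjecture concerns that particular eigenfunction.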

### CCAM Seminar, Prof. Nathan Kutz, University of Washington, UNIV 201

Friday, Nov 9 11:30 am - 12:30 pm

Title: Networked dynamical systems for function and learning: Paradigms for data-driven control and learning in neurosensory systems Abstract: High-dimensional networked biological systems are ubiquitous and are characterized by a large connectivity graph whose structure determines how the system operates as a whole. Typically the connectivity is so complex (and largely unknown) that the functionality, control and robustness of the network of interest are impossible to characterize with currently available methods. A full understanding of the computational process by which a nervous system transforms sensory input into motor representations requires the ability to generate proxy models for the activity of sensory neurons, decision-making circuits, and motor circuits in a behaving animal. Our objective is to use emerging data-driven methods to extract the underlying engineering principles of cognitive capability, namely those that allow complex networks to learn and enact control and functionality in the robust manner observed in neurosensory systems. Mathematically, the challenges center on understanding how networked dynamical systems produce robust functionality and coordinated activity.
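As a toy instance of the proxy-model idea mentioned in the abstract (a hypothetical illustration, not the speaker's specific method), one can fit a linear operator to a sequence of measured states by least squares, in the spirit of dynamic mode decomposition:

```python
import numpy as np

def fit_linear_proxy(X):
    """Fit a linear proxy model x_{k+1} ~ A x_k to snapshot data.

    X has one state snapshot per column; A is the least-squares
    solution A = X_1 X_0^+, the core computation behind dynamic
    mode decomposition.
    """
    X0, X1 = X[:, :-1], X[:, 1:]
    return X1 @ np.linalg.pinv(X0)
```

When the data really are generated by linear dynamics, the fit recovers the generating operator exactly; for a biological network the same computation yields only a linear proxy for the (unknown, nonlinear) connectivity, which is precisely its appeal as a data-driven method.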