# Calendar

## Yesterday

### Basic Skills Workshop, Sarah Percival, BRNG B222

Thursday, Oct 18 2:00 pm - 3:00 pm

Title: Introduction to R

Abstract: R is a free programming language for statistical computing and graphics. In this workshop, I will guide you through the installation and setup of RStudio, so bring your laptop if you want to follow along. We will also cover basic statistical and data manipulation tools, as well as graphing using the ggplot2 package.

### Refreshments, MATH Library Lounge

Thursday, Oct 18 3:00 pm - 3:30 pm

## Today

### Refreshments, MATH Library Lounge

Friday, Oct 19 3:00 pm - 3:30 pm

## Next Week

### Refreshments, MATH Library Lounge

Monday, Oct 22 3:00 pm - 3:30 pm

### Geometry/Geometric Analysis Seminar, Ruobing Zhang, Stony Brook University, MATH 731

Monday, Oct 22 3:30 pm - 4:30 pm

Title: Nilpotent structure and collapsing Ricci-flat metrics on K3 surfaces

Abstract: We exhibit families of Ricci-flat Kähler metrics on K3 surfaces which collapse to an interval, with Tian-Yau and Taub-NUT metrics occurring as bubbles. There is a corresponding continuous surjective map from the K3 surface to the interval, with regular fibers diffeomorphic to either 3-tori or Heisenberg nilmanifolds. This is joint work with H. Hein, S. Sun and J. Viaclovsky.

### CCAM Seminar, Prof. Jack Xin, Univ. of California, Irvine, REC 114

Monday, Oct 22 4:30 pm - 5:30 pm

Title: Blended Coarse Gradient Descent for Full Quantization of Deep Neural Networks

Abstract: Quantized deep neural networks (QDNNs) are attractive due to their much lower memory storage and faster inference speed than their regular full-precision counterparts. To maintain the same performance level, especially at low bit-widths, QDNNs must be retrained. Their training involves minimizing a piecewise constant, non-convex objective in high dimension subject to a discrete constraint, hence mathematical challenges arise. We introduce the notion of a coarse derivative and propose the blended coarse gradient descent (BCGD) algorithm. A coarse gradient is generally not the gradient of any function but an artificial descent direction. The network weight update of BCGD applies a coarse gradient correction to an average of the full-precision weights and their quantization (the so-called blending), which yields sufficient descent in the objective value and accelerates the training. Our experiments demonstrate that this simple blending technique is very effective for quantization at extremely low bit-widths such as binarization. For theoretical understanding, we present a convergence analysis of coarse gradient descent on a two-layer neural network model with Gaussian input data, and prove that the expected coarse gradient correlates positively with the underlying true gradient.
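The blended weight update described in the abstract can be sketched in a few lines. This is an illustrative toy, not the speaker's implementation: the binarizing quantizer, the quadratic toy loss, the learning rate `lr`, and the blending weight `rho` are all assumptions made here for demonstration.

```python
import numpy as np

def quantize(w):
    # 1-bit quantizer: binarize to +/- (mean magnitude). This is one
    # common choice; the abstract does not specify a particular scheme.
    return np.sign(w) * np.mean(np.abs(w))

def bcgd_step(w, coarse_grad, lr=0.1, rho=0.05):
    # BCGD update: blend the full-precision weights with their
    # quantization, then apply the coarse gradient correction
    # evaluated at the quantized weights.
    q = quantize(w)
    return (1.0 - rho) * w + rho * q - lr * coarse_grad(q)

# Toy problem: drive the quantized weights toward a +/-1 target.
# As a stand-in for a coarse gradient, we use the true gradient of a
# quadratic loss evaluated at the quantized point (straight-through style).
target = np.array([1.0, -1.0, 1.0, -1.0])
loss = lambda q: np.sum((q - target) ** 2)
grad = lambda q: 2.0 * (q - target)

w = np.array([0.5, -0.3, 0.8, -0.1])
for _ in range(50):
    w = bcgd_step(w, grad)
```

With these choices, the blending term keeps the full-precision weights close to their quantization while the coarse-gradient term drives the loss down; on this toy problem the quantized loss shrinks from about 1.3 to essentially zero over the 50 steps.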

### Topology Seminar, Phil Tosteson, U Michigan, MATH 431

Tuesday, Oct 23 1:30 pm - 2:30 pm

### Refreshments, MATH Library Lounge

Tuesday, Oct 23 3:00 pm - 3:30 pm

### Probability Seminar, Robin Pemantle, University of Pennsylvania, UNIV 101

Wednesday, Oct 24 1:30 pm - 2:20 pm

Title: The heat kernel on a contact manifold degenerating under diabatic limit

Abstract: In many situations in mechanics, movement is constrained to 'horizontal directions'. In the simplest settings this corresponds to moving along submanifolds; at the opposite extreme, manifolds with a completely 'non-integrable' collection of horizontal directions are known as contact manifolds, and they arise often, e.g., as level sets of Hamiltonians. Questions in analysis and geometry have natural analogues on contact manifolds, but with differentiation and dynamics restricted to horizontal directions.

### Spectral and Scattering Theory Seminar, Chris Kottke, New College of Florida, REC 309

Wednesday, Oct 24 2:30 pm - 3:00 pm

Title: Compactification of monopole moduli spaces

Abstract: I will discuss joint work with Michael Singer and Karsten Fritzsch on compactifications of the moduli spaces M_k of SU(2) magnetic monopoles on R^3. Via a geometric gluing procedure, we construct manifolds with corners compactifying the M_k, the boundaries of which represent monopoles of charge k decomposing into widely separated 'monopole clusters' of lower charge. The hyperkähler metric on M_k has a complete asymptotic expansion, the leading terms of which generalize the asymptotic metric discovered by Bielawski, Gibbons and Manton in the case that the monopoles are all widely separated. From the structure of the compactification, we are able to make partial progress toward proving Sen's conjecture for the L^2 cohomology of the moduli spaces.

### Refreshments, MATH Library Lounge

Wednesday, Oct 24 3:00 pm - 3:30 pm

### Automorphic Forms and Representation Theory Seminar, Ling Long, Louisiana State University, BRNG B238

Thursday, Oct 25 1:30 pm - 2:30 pm

Title: Supercongruences for rigid hypergeometric Calabi–Yau threefolds

Abstract: In 2003, Rodriguez-Villegas conjectured some supercongruences for rigid hypergeometric Calabi–Yau threefolds over Q. We will explain what they are and outline the proof of his conjecture based on the theory of hypergeometric motives proposed by Katz and computed extensively by Roberts, Rodriguez-Villegas and Watkins. In particular, we will adapt the techniques from the recent work of Beukers, Cohen and Mellit on finite hypergeometric sums over Q. This is joint work with Fang-Ting Tu, Noriko Yui and Wadim Zudilin.

### Refreshments, MATH Library Lounge

Thursday, Oct 25 3:00 pm - 3:30 pm

### Basic Skills Workshop, Zach Letterhos, BRNG B206

Thursday, Oct 25 3:30 pm - 4:30 pm

Title: How Good Talks Happen

Abstract: This meta-talk will be a blend of my two loves: cognitive psychology and telling people what to do. We will explore some of the fundamental things that cognitive psychology tells us about the brain and human behavior, and how to exploit them for the nefarious purpose of making people enjoy our talks. Through examples, I'll provide a framework that you can use to construct successful talks for any purpose. I will primarily focus on how to give academic talks, but most of the advice I give is general enough that it could be applied to any form of public speaking, including your teaching!

### Refreshments, MATH Library Lounge

Friday, Oct 26 3:00 pm - 3:30 pm

## Two Weeks

### Refreshments, MATH Library Lounge

Monday, Oct 29 3:00 pm - 3:30 pm

### CCAM Seminar, Prof. Mimi Dai, University of Illinois, Chicago, REC 114

Monday, Oct 29 4:30 pm - 5:30 pm

TBA

### Refreshments, MATH Library Lounge

Tuesday, Oct 30 3:00 pm - 3:30 pm

### Colloquium, Rodolfo Torres, University of Kansas, MATH 175

Tuesday, Oct 30 3:30 pm - 4:30 pm

Title: Almost orthogonality in Fourier analysis: From singular integrals, to function spaces, to Leibniz rules for fractional derivatives

Abstract: Decomposition techniques such as atomic, molecular, wavelet and wave-packet expansions provide a multi-scale refinement of Fourier analysis and exploit a rather simple concept: "waves with very different frequencies are almost invisible to each other." Starting with the classical Calderón-Zygmund and Littlewood-Paley decompositions, many of these useful techniques have been developed around the study of singular integral operators. By breaking an operator, or splitting the functions on which it acts, into non-interacting almost orthogonal pieces, these tools capture subtle cancellations and quantify properties of an operator in terms of norm estimates in function spaces. This type of analysis has been used to study linear operators with tremendous success. More recently, similar decomposition techniques have been pushed to the analysis of new multilinear operators that arise in the study of (para)product-like operations, commutators, null-forms and other nonlinear functional expressions. In this talk we will present some of our contributions in the study of multilinear singular integrals and function spaces, and their applications to the development of Leibniz rules for fractional derivatives.

### Refreshments, MATH Library Lounge

Wednesday, Oct 31 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Thursday, Nov 1 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Friday, Nov 2 3:00 pm - 3:30 pm

## Three Weeks

### Refreshments, MATH Library Lounge

Monday, Nov 5 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Tuesday, Nov 6 3:00 pm - 3:30 pm

### Colloquium, Gabriel Paternain, University of Cambridge, MATH 175

Tuesday, Nov 6 3:30 pm - 4:30 pm

### Spectral and Scattering Theory Seminar, Chris Judge, Indiana University, REC 309

Wednesday, Nov 7 2:30 pm - 3:20 pm

Title: Triangles have no interior hot spots

Abstract: The temperature distribution $T_t$ of an insulated object tends to a constant function as time $t$ tends to infinity. But there is an expansion of $T_t$ in powers of $e^{-t}$, where the powers are eigenvalues of the Neumann Laplacian and the coefficients involve the corresponding eigenfunctions. The 'hot spot' of the title refers to the maxima of the eigenfunction $u_2$ corresponding to the first nonconstant term. About 45 years ago, J. Rauch suggested that $u_2$ has no interior maxima (or minima). We show that this holds for objects which are triangles in the plane. This is joint work with Sugata Mondal.
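The expansion mentioned in the abstract takes the following standard form (a sketch; the symbols $\lambda_k$, $u_k$, $c_k$ are not from the abstract itself):

$$T_t(x) = c_1 + \sum_{k \ge 2} c_k\, e^{-\lambda_k t}\, u_k(x), \qquad 0 = \lambda_1 < \lambda_2 \le \lambda_3 \le \cdots,$$

where the $\lambda_k$ are the Neumann eigenvalues of the Laplacian on the object, the $u_k$ are the corresponding eigenfunctions, and the $c_k$ are determined by the initial temperature. As $t \to \infty$, the $k = 2$ term dominates the deviation from the constant, so the long-run hottest and coldest points sit at the extrema of $u_2$.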

### Refreshments, MATH Library Lounge

Wednesday, Nov 7 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Thursday, Nov 8 3:00 pm - 3:30 pm

### CCAM Seminar, Prof. Nathan Kutz, University of Washington, UNIV 201

Friday, Nov 9 11:30 am - 12:30 pm

Title: Networked dynamical systems for function and learning: Paradigms for data-driven control and learning in neurosensory systems

Abstract: High-dimensional networked biological systems are ubiquitous and characterized by a large connectivity graph whose structure determines how the system operates as a whole. Typically the connectivity is so complex (and often unknown) that the functionality, control and robustness of the network of interest is impossible to characterize using currently available methods. A full understanding of the computational process encoded throughout a nervous system, which transforms sensory input into motor representations, requires the ability to generate proxy models for the activity of sensory neurons, decision-making circuits, and motor circuits in a behaving animal. Our objective is to use emerging data-driven methods to extract the underlying engineering principles of cognitive capability, namely those that allow complex networks to learn and enact control and functionality in the robust manner observed in neurosensory systems. Mathematically, the challenges center around understanding how networked dynamical systems produce robust functionality and coordinated activity.

### Refreshments, MATH Library Lounge

Friday, Nov 9 3:00 pm - 3:30 pm

## November

### Refreshments, MATH Library Lounge

Monday, Nov 12 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Tuesday, Nov 13 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Wednesday, Nov 14 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Thursday, Nov 15 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Friday, Nov 16 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Monday, Nov 19 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Tuesday, Nov 20 3:00 pm - 3:30 pm

### Thanksgiving Break

Wednesday, Nov 21 - Friday, Nov 23

### Refreshments, MATH Library Lounge

Monday, Nov 26 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Tuesday, Nov 27 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Wednesday, Nov 28 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Thursday, Nov 29 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Friday, Nov 30 3:00 pm - 3:30 pm

## December

### Refreshments, MATH Library Lounge

Monday, Dec 3 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Tuesday, Dec 4 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Wednesday, Dec 5 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Thursday, Dec 6 3:00 pm - 3:30 pm

### Refreshments, MATH Library Lounge

Friday, Dec 7 3:00 pm - 3:30 pm

### Finals

Monday, Dec 10 - Saturday, Dec 15

### University Closed: Winter Holiday

Monday, Dec 24 - Tuesday, Jan 1

## 2019

### Colloquium, Francesco Maggi, University of Texas at Austin, MATH 175

Tuesday, Apr 16 3:30 pm - 4:30 pm