Nick Winovich is a Ph.D. candidate in the Department of Mathematics at Purdue University. He majored in mathematics and Spanish at the University of Notre Dame
and received funding from the *Kellogg Institute for International Studies* to teach English at a primary school in *Pacuare, Costa Rica* through the *WorldTeach* program.
He subsequently received a master's degree in mathematics at the University of Oregon
and is currently funded as an NSF IGERT
fellow as part of the *sustainable electronics program* headed by Dr. Carol Handwerker.

He is advised by Dr. Guang Lin in the Department of Mathematics and co-advised by Dr. Karthik Ramani in the Department of Mechanical Engineering.
His current research focuses on applications of machine learning to differential equations, uncertainty quantification, and continuum mechanics with publications in the *Journal of Computational Physics* and the *Journal of Peridynamics and Nonlocal Modeling*.

He is currently participating in a year-round student internship through Sandia National Laboratories,
and volunteers as a student consultant for the Purdue Data Science Consulting Service.
He has also served as a mentor for undergraduate students through the NSF *Summer Undergraduate Research Fellowship* (SURF) and *Network for Computational Nanotechnology* (NCN) *Undergraduate Research Experience* (URE) programs.

Gaussian processes are a special case of stochastic processes with the defining property that any finite collection of input points has a multivariate normal joint distribution. This is of particular importance for numerical implementations of Gaussian processes, since calculations for the discretized process can be carried out efficiently using standard linear algebra techniques.
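As a quick sketch of this point, sampling from a discretized Gaussian process prior reduces to a Cholesky factorization and a matrix-vector product; the RBF kernel, grid, and jitter value below are illustrative choices rather than anything prescribed:

```python
import numpy as np

# Squared-exponential (RBF) kernel: k(x, x') = exp(-|x - x'|^2 / (2 * l^2))
def rbf_kernel(x1, x2, length_scale=1.0):
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return np.exp(-0.5 * sq_dists / length_scale**2)

x = np.linspace(0.0, 5.0, 50)
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))  # jitter for numerical stability

# A finite collection of GP values is multivariate normal, so sampling
# reduces to linear algebra: a Cholesky factor applied to white noise.
L = np.linalg.cholesky(K)
rng = np.random.default_rng(0)
samples = L @ rng.standard_normal((len(x), 3))  # three prior draws
```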

Gaussian processes also provide a natural, rigorous form of uncertainty quantification for regression problems. For example, given a collection of noisy input observations, a Gaussian process can be fit to the data to provide an estimated mean as well as predictive variances, as illustrated below:
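Concretely, the posterior mean and predictive variances follow from conditioning a multivariate normal, which takes only a few lines of linear algebra; the training points, noise level, and kernel length scale in this sketch are invented for illustration:

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0):
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / length_scale**2)

# Hypothetical noisy observations of an underlying function
x_train = np.array([0.0, 1.0, 2.5, 4.0])
y_train = np.sin(x_train) + 0.1 * np.array([0.3, -0.2, 0.1, 0.4])
x_test = np.linspace(0.0, 5.0, 100)

noise_var = 0.1**2
K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
K_s = rbf_kernel(x_train, x_test)
K_ss = rbf_kernel(x_test, x_test)

# Condition the joint Gaussian on the observations
alpha = np.linalg.solve(K, y_train)
mean = K_s.T @ alpha                                  # posterior mean
cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)          # posterior covariance
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))       # predictive uncertainty
```

Note that the predictive standard deviation shrinks near the observations and grows back toward the prior away from them, which is exactly the uncertainty behavior described above.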

A great introductory reference for Gaussian processes is the definitive text Gaussian Processes for Machine Learning by Carl Rasmussen and Christopher Williams (the text is also freely available on the book webpage).

Feed-forward neural networks provide a rich class of modeling tools which are particularly well-suited for data analysis and regression. In particular, these networks are *universal approximators* in the sense that they are capable of approximating any continuous function on a bounded domain to any specified level of accuracy.
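The functional form appearing in universal-approximation results is just a weighted sum of shifted, scaled nonlinearities; a minimal one-hidden-layer network makes this concrete (the width, activation, and random weights below are arbitrary illustrations):

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    h = np.tanh(x @ W1 + b1)  # hidden layer: the nonlinearity is essential
    return h @ W2 + b2        # linear output layer

# Randomly initialized weights for a width-16 network on scalar inputs
rng = np.random.default_rng(0)
W1 = rng.standard_normal((1, 16))
b1 = rng.standard_normal(16)
W2 = rng.standard_normal((16, 1))
b2 = np.zeros(1)

x = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
y = mlp_forward(x, W1, b1, W2, b2)  # one output per input point
```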

The training procedure for fitting neural network models is particularly suited for scalability as well; this is primarily a result of efficient batch stochastic gradient descent algorithms and recent developments in automatic differentiation. Backpropagation also played a crucial role in the advancement of neural network architectures, as it allowed for much deeper networks to be trained by "propagating" gradient information back through the network; a simple illustration of the associated computational flow is given in the animation below:
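The computational flow of backpropagation can be sketched directly in NumPy for a one-hidden-layer network: the forward pass caches intermediates, and the backward pass sends gradient information through each layer in reverse. The target function, architecture, and hyperparameters here are illustrative choices:

```python
import numpy as np

# Fit sin(x) with full-batch gradient descent on a small tanh network
rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(x)

W1 = rng.standard_normal((1, 32)) * 0.5
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 1)) * 0.1
b2 = np.zeros(1)
lr = 0.05

for step in range(3000):
    # Forward pass, caching intermediates needed for the backward pass
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    loss = np.mean(err**2)

    # Backward pass: gradients "propagate" from the loss to each layer
    g_pred = 2 * err / len(x)
    g_W2 = h.T @ g_pred
    g_b2 = g_pred.sum(axis=0)
    g_h = g_pred @ W2.T
    g_z = g_h * (1 - h**2)        # derivative of tanh
    g_W1 = x.T @ g_z
    g_b1 = g_z.sum(axis=0)

    # Gradient descent update on every parameter
    for p, g in ((W1, g_W1), (b1, g_b1), (W2, g_W2), (b2, g_b2)):
        p -= lr * g
```

In practice these gradients are produced by automatic differentiation rather than written by hand, but the hand-derived version shows what the machinery is computing.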

An abundance of deep learning resources and references can be found at the Awesome Deep Learning and Awesome - Most Cited Deep Learning Papers GitHub repositories. A basic introduction to the foundational principles and underlying mathematical framework for neural networks is also provided in the Deep Learning section of this site.

The theory of differential equations involves the study of mathematical systems which specify relations between functions and their derivatives. Such systems arise naturally throughout the physical sciences as well as in economics. In order to guarantee the uniqueness of a solution, these systems typically include a set of initial and/or boundary conditions associated with the specific function of interest.
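A small worked example: the initial-value problem y'(t) = -y(t) with y(0) = 1 has the unique solution y(t) = exp(-t), and the initial condition is what selects it from the family of solutions c·exp(-t). A forward Euler sketch (step count chosen arbitrarily) approximates this solution starting from the initial condition alone:

```python
import numpy as np

def forward_euler(f, y0, t0, t1, n_steps):
    t = np.linspace(t0, t1, n_steps + 1)
    y = np.empty(n_steps + 1)
    y[0] = y0  # the initial condition pins down which solution we compute
    h = (t1 - t0) / n_steps
    for k in range(n_steps):
        y[k + 1] = y[k] + h * f(t[k], y[k])  # one explicit Euler step
    return t, y

# Solve y' = -y, y(0) = 1 on [0, 2]; exact solution is exp(-t)
t, y = forward_euler(lambda t, y: -y, y0=1.0, t0=0.0, t1=2.0, n_steps=1000)
```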

While the exact solutions of systems of differential equations are often intractable analytically, a wide range of effective numerical techniques exist for computing approximate solutions. Among the most prominent numerical methods are the finite element method (FEM) and the multigrid method (MG); an example of an approximate FEM solution for a parabolic differential equation is illustrated below:
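A full finite element implementation is too long for a short sketch, so the following uses a simple explicit finite-difference scheme instead to illustrate the same idea of numerically approximating a parabolic equation, here the 1D heat equation u_t = u_xx; the grid sizes are illustrative and the time step is chosen to satisfy the explicit scheme's stability condition:

```python
import numpy as np

nx, nt = 51, 2000
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2           # stability requires dt <= dx^2 / 2 for this scheme
x = np.linspace(0.0, 1.0, nx)

u = np.sin(np.pi * x)      # initial condition; boundaries u(0) = u(1) = 0
for _ in range(nt):
    # Second-difference approximation of u_xx at the interior points
    u[1:-1] = u[1:-1] + dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

# This problem has a known exact solution for comparison:
# u(x, t) = exp(-pi^2 t) * sin(pi x)
t_final = nt * dt
u_exact = np.exp(-np.pi**2 * t_final) * np.sin(np.pi * x)
```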