Abstract
Consider the following randomly forced reaction-diffusion equation: \(\partial_t u = \tfrac12\partial^2_x u + b(u) + \sigma(u)\dot{W},\) subject to "nice initial data," where \(\dot{W}\) (the random forcing) denotes a "space-time white noise." The function \(\sigma\) (the interaction term) is assumed to be positive, bounded, globally Lipschitz continuous, and bounded uniformly away from the origin, and the function \(b\)
(the sink/source term) is assumed to be positive, locally Lipschitz continuous, and nondecreasing. We prove that the classical Osgood condition \(\int_1^\infty {\rm d}y/b(y)<\infty\) implies that, with probability one, every reasonable solution theory leads to instantaneous and complete blowup. In light of the celebrated work of Fujita (1966) and many others on the blowup problem for nonrandom PDEs, our result warns that the introduction of even an iota of random forcing can fundamentally change the qualitative behavior of the solution to a near-critical reaction-diffusion equation. The main ingredients of the proof are a hitting-time bound for a class of differential inequalities, and a study of the "spatial ergodicity of stochastic convolutions" via the Malliavin calculus and the Poincaré inequalities that were recently developed by Le Chen, David Nualart, Fei Pu, and the speaker (2021, 2022). This is based on joint work with Mohammud Foondun and Eulalia Nualart.
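
To see why the Osgood condition is the natural dividing line, it may help to recall its classical ODE interpretation (a standard separation-of-variables computation, included here for orientation only and not part of the abstract's statement): if \(y\) solves \(y'(t)=b(y(t))\) with \(y(0)=y_0>0\), then the blowup time is
\[
T_{\mathrm{blowup}} \;=\; \int_{y_0}^{\infty}\frac{{\rm d}y}{b(y)},
\]
so \(y\) reaches infinity in finite time precisely when \(\int_1^\infty {\rm d}y/b(y)<\infty\); since \(b\) is positive and locally Lipschitz, the finiteness of this integral does not depend on the choice of \(y_0\).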
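
The Poincaré inequalities alluded to above yield covariance bounds roughly of the following form (this display is a hedged sketch based on the cited 2021 work, with \(F,G\) Malliavin-differentiable functionals of the solution at time \(t\) and \(D\) denoting the Malliavin derivative):
\[
\bigl|\mathrm{Cov}(F,G)\bigr| \;\le\; \int_0^t {\rm d}s \int_{\mathbb{R}} {\rm d}z\, \|D_{s,z}F\|_{L^2(\Omega)}\, \|D_{s,z}G\|_{L^2(\Omega)}.
\]
Spatial ergodicity of the stochastic convolution then follows by showing that the right-hand side vanishes as the spatial separation between the points that define \(F\) and \(G\) tends to infinity.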