Coffee-Break Problem, 10/01/2022

This is a twist on the well-known Monty Hall problem. If you aren’t familiar with it, solve the original problem first.

The rules are the same as ever: there are three doors, behind one of which is a car, with only goats behind the other two. The host knows which is which, but I do not; I will pick a door, and then the host opens a door other than the one I picked to reveal a goat, before offering me the chance to change my choice to the one remaining door. This time, when I am walking up onto the stage, I notice that exactly one of the three doors smells faintly of petrol: I estimate that the probability of a door smelling of petrol, given that the car is behind it, is $p \in (0,1)$, while the probability of a door smelling of petrol, given that a goat is behind it, is $q \in (0, p)$.

In light of this new information, what is my best strategy?
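If you want to experiment before committing to an answer, here is a Monte Carlo sketch. It bakes in one natural reading of the smell mechanism (my assumption, not part of the problem): each door smells independently, with probability $p$ if it hides the car and $q$ if it hides a goat, conditioned on the event that exactly one door smells.

```python
import random

def play(p, q, pick_smelly, switch, rng=random):
    """One conditioned round; returns True if my final door hides the car.

    Model assumption (mine): each door smells independently, with probability
    p if it hides the car and q otherwise, conditioned on exactly one smelling.
    """
    while True:  # rejection sampling for the "exactly one door smells" event
        car = rng.randrange(3)
        smells = [rng.random() < (p if d == car else q) for d in range(3)]
        if sum(smells) == 1:
            break
    smelly = smells.index(True)
    pick = smelly if pick_smelly else rng.choice([d for d in range(3) if d != smelly])
    opened = rng.choice([d for d in range(3) if d not in (pick, car)])  # host reveals a goat
    if switch:
        pick = next(d for d in range(3) if d not in (pick, opened))
    return pick == car

if __name__ == "__main__":
    p, q, n = 0.9, 0.1, 100_000
    for pick_smelly in (True, False):
        for switch in (True, False):
            wins = sum(play(p, q, pick_smelly, switch) for _ in range(n))
            print(f"pick smelly door: {pick_smelly}, switch: {switch}, "
                  f"win rate: {wins / n:.3f}")
```

Comparing the four strategies for a few values of $p$ and $q$ should suggest how the best strategy depends on them – though note that this menu is not exhaustive, since the decision to switch may also use which door the host opened.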

Coffee-Break Problem, 20/12/2021

Let $Q=(q_{xy})_{x,y\in E}$ be the generator of a continuous-time, irreducible Markov chain $(X_t)_{t\ge 0}$ on a finite space $E$, which is reversible with respect to the invariant distribution $\pi=(\pi(x))_{x\in E}$, and let $P_t$ be the associated transition semigroup. Verify that the relative entropy

$H(\mu P_t|\pi)=\sum_x \frac{(\mu P_t)(x)}{\pi(x)}\log \left(\frac{(\mu P_t)(x)}{\pi(x)}\right)\pi(x)$

is decreasing in time, with

$-\frac{d}{dt} H(\mu P_t|\pi) = D_Q(\mu P_t) = \frac{1}{2}\sum_{x\neq y} \left(\frac{(\mu P_t)(x)}{\pi(x)} - \frac{(\mu P_t)(y)}{\pi(y)}\right) \left(\log \frac{(\mu P_t)(x)}{\pi(x)} - \log \frac{(\mu P_t)(y)}{\pi(y)} \right) \pi(x)q_{xy}.$

Now, observe that if $\widetilde{Q}$ is the generator of a different chain $Y$, which is also reversible with the same invariant distribution $\pi$, and $\widetilde{q}_{xy}\ge q_{xy}$ for all $x\neq y$, then $D_{\widetilde{Q}}(\mu)\ge D_Q(\mu)$ for every $\mu$: each summand above has the form $(a-b)(\log a - \log b)\,\pi(x)q_{xy}$ with $(a-b)(\log a-\log b)\ge 0$, and so is non-decreasing in $q_{xy}$.

Does it follow that the corresponding semigroup satisfies $H(\mu \widetilde{P}_t|\pi)\le H(\mu P_t|\pi)$ for any $t\ge 0$?
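Before deciding, it can help to watch the quantities numerically. The sketch below (the chain and all names are my own choices) builds a small reversible birth–death generator on three states, checks detailed balance, and confirms that $t\mapsto H(\mu P_t|\pi)$ decreases along a time grid.

```python
import numpy as np
from scipy.linalg import expm

# A small birth-death generator on {0,1,2}: such chains are reversible
# with respect to their invariant distribution by construction.
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  2.0, -2.0]])

# Invariant distribution: solve pi Q = 0 with the mass normalised to 1.
A = np.vstack([Q.T, np.ones(3)])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)[0]
assert np.allclose(pi[:, None] * Q, (pi[:, None] * Q).T)  # detailed balance

def H(mu):
    """Relative entropy H(mu | pi), with the convention 0 log 0 = 0."""
    h = mu / pi
    mask = h > 0
    return float(np.sum(pi[mask] * h[mask] * np.log(h[mask])))

mu0 = np.array([1.0, 0.0, 0.0])               # point mass at state 0
ts = np.linspace(0.0, 5.0, 200)
Hs = [H(mu0 @ expm(Q * t)) for t in ts]
assert all(b <= a + 1e-10 for a, b in zip(Hs, Hs[1:]))  # decreasing in t
print(Hs[0], Hs[50], Hs[-1])
```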

Coffee-Break Problem, 13/12/2021

Let $\rho$ be a probability measure on the two-dimensional torus $\mathbb{T}^2$ with finite entropy $\int_{\mathbb{T}^2} f(x)\log f(x)\, dx<\infty$, where $f=\frac{d\rho}{d\lambda}$ is the density of $\rho$ with respect to the Lebesgue measure $\lambda$. Must it be true that $\rho \in H^{-1}(\mathbb{T}^2)$ – that is, must $\varphi \mapsto \int_{\mathbb{T}^2} \varphi \, d\rho$ define a bounded linear functional on the Sobolev space $H^1(\mathbb{T}^2)$?
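For concreteness (this reformulation is mine; constants depend on the normalisation of the norm): writing $\hat{\rho}(k)=\int_{\mathbb{T}^2} e^{-2\pi i k\cdot x}\, \rho(dx)$ for the Fourier coefficients, $\rho\in H^{-1}(\mathbb{T}^2)$ precisely when

$\sum_{k\in\mathbb{Z}^2} \frac{|\hat{\rho}(k)|^2}{1+|k|^2} < \infty.$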

Coffee-Break Problem, 06/12/2021

Consider spinning a roulette wheel: the wheel spins in one direction while a ball runs along a track around the outside of the wheel in the opposite direction, and as the ball loses momentum it eventually falls into one of $n$ equally sized pockets.

We often think of the outcome $X$ of a roulette spin as being a uniform random variable over the $n$ pockets. Of course, the roulette wheel is completely deterministic: the outcome of the spin is dictated by Newton’s laws of motion. Moreover, at very low speeds, the outcome will not be at all surprising – the ball will fall into the $0^\text{th}$ or $1^\text{st}$ pocket if the relative speeds are very small.

What simple mathematical fact justifies treating the outcome $X$ as a random variable when the speeds of the wheel and the ball are very large?
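Before answering, it can be fun to see the effect numerically. The toy model below is entirely my own: under a constant deceleration $a$, a ball launched at speed $v$ travels a total angle $\theta=v^2/2a$ before stopping, and lands in pocket $\lfloor n\{\theta/2\pi\}\rfloor$. A fixed, tiny uncertainty in the launch speed then produces a pocket distribution that flattens as the speed grows.

```python
import numpy as np

rng = np.random.default_rng(0)
n, friction = 37, 1.0   # number of pockets; deceleration (arbitrary units)

def pocket_counts(v0, jitter=0.01, samples=100_000):
    """Histogram of pockets for launch speeds v0 + small Gaussian noise."""
    v = v0 + jitter * rng.standard_normal(samples)
    theta = v ** 2 / (2 * friction)          # total angle travelled
    pockets = np.floor(n * ((theta / (2 * np.pi)) % 1.0)).astype(int)
    return np.bincount(pockets, minlength=n)

for v0 in (3.0, 30.0, 300.0, 3000.0):
    counts = pocket_counts(v0)
    print(f"v0 = {v0:7.1f}: most likely pocket has probability "
          f"{counts.max() / counts.sum():.3f} (uniform would be {1 / n:.3f})")
```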

Coffee-Break Problem, 18/10/2021

For two probability measures $\mu$ and $\nu$ on a common space $E$, the relative entropy is defined by

$H(\mu|\nu)=\begin{cases} \int_E h(x)\log h(x)\nu(dx) & \text{ if }\mu \ll \nu, h=\frac{d\mu}{d\nu}; \\ \infty & \text{else.} \end{cases}$

Let $\gamma$ be the standard Gaussian distribution $\gamma(dx)=\frac{1}{(2\pi)^{d/2}} e^{-|x|^2/2} dx$ in $E=\mathbb{R}^d$. Show that any probability measure $\mu$ with finite entropy $H(\mu|\gamma)$ has finite second moment $\int_{\mathbb{R}^d} |x|^2 \mu(dx)<\infty$, and in fact, for any $a\in (0,\infty)$,

$\sup\left\{\int_{\mathbb{R}^d} |x|^2 \mu(dx): H(\mu|\gamma) \le a\right\} <\infty.$
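One standard tool for the first part (a suggestion of mine – the problem leaves the method open) is the Gibbs variational inequality: for any measurable $g$ with $\int_{\mathbb{R}^d} e^g \, d\gamma<\infty$,

$\int_{\mathbb{R}^d} g \, d\mu \le H(\mu|\gamma) + \log \int_{\mathbb{R}^d} e^g \, d\gamma.$

Taking $g(x)=\lambda|x|^2$ with a fixed $\lambda\in(0,\tfrac{1}{2})$ makes the Gaussian integral $(1-2\lambda)^{-d/2}$ finite, and bounds the second moment in terms of $H(\mu|\gamma)$ alone.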

Is it true that

$\sup\left\{\int_{|x|\ge R} |x|^2 \mu(dx): H(\mu|\gamma) \le a\right\} \to 0 \hspace{1cm} \text{as } R\to \infty?$

Coffee-Break Problem, 22/03/2021

Toss two coins, $A$ and $B$, independently of one another, with probabilities $p, q \in (0,1)$ respectively of giving heads. What is the probability

$\mathbb{P}(\text{Coin }A\text{ is heads }\implies \text{ coin }B\text{ is heads})?$

…but the coins are independent! Explain.
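If numbers help, here is an exact enumeration over the four outcomes, reading the arrow as material implication – which reading is intended is, of course, part of what needs explaining.

```python
from itertools import product

def implication_prob(p, q):
    """P(A heads => B heads), reading "=>" as material implication,
    by exact enumeration over the four outcomes (no sampling)."""
    total = 0.0
    for a, b in product([True, False], repeat=2):
        weight = (p if a else 1 - p) * (q if b else 1 - q)
        if (not a) or b:              # the implication fails only on (H, T)
            total += weight
    return total

p, q = 0.7, 0.4
print(implication_prob(p, q))         # agrees with 1 - p * (1 - q)
print(1 - p * (1 - q))
print(q)                              # P(B heads), for contrast
```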

Coffee-Break Problem, 15/03/2021

Construct a complex-analytic function $f: D(0,1)\to \mathbb{C}$ on the open unit disk $D(0,1)$ such that all derivatives $f^{(n)}$ admit continuous extensions to the closed disk $\overline{D(0,1)}$, but such that $f$ cannot be analytically continued to any strictly larger domain $\Omega \supsetneq D(0,1)$.

Coffee-Break Problem, 08/03/2021

I mentioned Sanov’s theorem in a previous post. This is a fun variant on the same argument.

Let $p_x$, $x\ge 0$, be the Geometric distribution on $\{0,1,\dots\}$ given by $p_x=2^{-1-x}$, so that the mean is $\sum_x xp_x = 1$, and let $X_1, X_2,\dots$ be independent and identically distributed samples from this distribution. For $N\ge 1$, let $p^N$ be the empirical distribution of $X_1,\dots,X_N$, that is, $p^N_x = N^{-1} \#\{i\le N: X_i=x\}.$

Construct explicitly probability measures $\mathbb{Q}^N \ll \mathbb{P}$, where $\mathbb{P}$ is the law of the sequence $(X_i)_{i\ge 1}$, and a constant $a \in (0, \infty)$ such that, for all $\epsilon>0$,

$\mathbb{Q}^N\left(\sum_x |p_x - p^N_x| \le \epsilon, |\sum_x x p^N_x - 2| \le \epsilon, \frac{1}{N} \log \frac{d\mathbb{Q}^N}{d\mathbb{P}}\le a\right) \to 1.$

If you know about the entropy of probability measures, show also that these changes of measure have entropy growing at most linearly in $N$:

$\limsup_N \frac{1}{N}H\left(\mathbb{Q}^N \big| \mathbb{P} \right) < \infty.$
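For product-form candidates $\mathbb{Q}^N=(q^{(N)})^{\otimes N}$ with finitely supported $q^{(N)}$, both displays are easy to test numerically. The harness below is my own (all names are mine), and the candidate it tests is the naive mean-two geometric tilt – instructive precisely because it fails the total-variation condition, so something cleverer is needed.

```python
import numpy as np

rng = np.random.default_rng(1)
p_mass = lambda x: 2.0 ** (-1.0 - x)      # p_x = 2^{-1-x}

def check(q, N, eps, a, trials=500):
    """For the product measure with per-sample masses q[0..K], Monte-Carlo
    the probability of the displayed event and return it together with the
    per-sample relative entropy H(q | p)."""
    xs = np.arange(len(q))
    tail = 2.0 ** (-len(q))               # mass of p beyond the support of q
    hits = 0
    for _ in range(trials):
        sample = rng.choice(xs, size=N, p=q)
        emp = np.bincount(sample, minlength=len(q)) / N
        tv = np.abs(p_mass(xs) - emp).sum() + tail
        log_density = np.log(q[sample] / p_mass(sample)).sum()
        hits += (tv <= eps) and (abs(xs @ emp - 2.0) <= eps) and (log_density <= a * N)
    entropy = float(np.sum(q * np.log(q / p_mass(xs))))
    return hits / trials, entropy

# Naive candidate: geometric with mean 2, q_x = (1/3)(2/3)^x, truncated.
naive = (1 / 3) * (2 / 3) ** np.arange(61.0)
naive /= naive.sum()
print(check(naive, N=2000, eps=0.1, a=1.0))  # event probability stays near 0
```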

Coffee-Break Problem, 01/03/2021

There are more counterexamples in heaven and earth than are dreamt of in your real analysis course – Hamlet, deleted scene.

Show that there exists a smooth function $f:\mathbb{R}\to\mathbb{R}$ such that, for all $x_0$, the Taylor series $\sum_{n=0}^\infty (x-x_0)^n f^{(n)}(x_0)/n!$ has zero radius of convergence.

Note that this is different from this previous problem. Why?

Bonus points for an explicit construction.

Coffee-Break Problem, 22/02/2021

Let $\mu$ be a compactly supported probability measure on $\mathbb{R}$. For $t>0$, let $g_t(x)=\exp(-x^2/2t)/\sqrt{2\pi t}$ be the Gaussian density of variance $t$.

Fix a continuous function $\varphi: \mathbb{R} \to \mathbb{R}$, and for $x\in \mathbb{R}, t>0$, define a local average

$\varphi^t(x):=\left.\int_\mathbb{R} \varphi(y)g_t(y-x)\mu(dy) \right/ \int_\mathbb{R} g_t(z-x) \mu(dz).$

Suppose that $\mu$ has a density $f\ge 0$. Making any assumptions necessary on the regularity of $f$, show that $\varphi^t(x)$ converges, as $t\downarrow 0$, to a limit $\varphi^0(x)$ for all $x\in \mathbb{R}$. Must $\varphi^0$ be continuous?

What happens if we drop the density assumption and work with a general compactly supported probability measure $\mu$?
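As a quick numerical illustration of the atomic case (the choices of $\mu$ and $\varphi$ below are mine): for $\mu=\frac{1}{2}(\delta_0+\delta_1)$, the Gaussian weights concentrate as $t\downarrow 0$ on the atom nearest to $x$, and the limit is visibly discontinuous at the midpoint.

```python
import numpy as np

atoms = np.array([0.0, 1.0])       # mu = (delta_0 + delta_1) / 2
weights = np.array([0.5, 0.5])
phi = lambda y: y                  # a simple continuous test function

def local_average(x, t):
    """phi^t(x) for an atomic mu: the 1/sqrt(2 pi t) factor cancels."""
    g = weights * np.exp(-(atoms - x) ** 2 / (2 * t))
    return float(np.sum(phi(atoms) * g) / np.sum(g))

for x in (0.2, 0.5, 0.8):
    print(x, [round(local_average(x, t), 4) for t in (1.0, 0.1, 0.01, 0.001)])
```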