Coffee-Break Problem, 10/01/2022

This is a twist on the well-known Monty Hall problem. If you aren’t familiar with the original problem, solve that first.

The rules are the same as ever: there are three doors, behind one of which is a car, with only goats behind the other two. The host knows which is which, but I do not; I will pick a door, and then the host opens a door other than the one I picked to reveal a goat, before offering me the chance to change my choice to the one remaining door. This time, when I am walking up onto the stage, I notice that exactly one of the three doors smells faintly of petrol: I estimate that the probability of a door smelling of petrol, given that the car is behind it, is p \in (0,1), while the probability of a door smelling of petrol, given that a goat is behind it, is q \in (0, p).

In light of this new information, what is my best strategy?
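If you want to sanity-check a candidate strategy numerically, the following Monte Carlo sketch (Python, standard library only) may help; the function name, the two strategies compared, and the values p=0.9, q=0.1 are my own illustrative choices, not part of the problem. It only compares strategies that switch after the host’s reveal; strategies that stay can be plugged in the same way.

import random

def win_probability(p, q, first_pick, trials=200_000):
    """Estimate the chance of winning when my initial door is first_pick(smelly door) and I then switch."""
    wins = done = 0
    while done < trials:
        car = random.randrange(3)
        # each door smells independently: probability p if it hides the car, q if a goat
        smells = [random.random() < (p if d == car else q) for d in range(3)]
        if sum(smells) != 1:
            continue                       # condition on exactly one door smelling, as in the problem
        done += 1
        first = first_pick(smells.index(True))
        host = random.choice([d for d in range(3) if d != first and d != car])
        final = next(d for d in range(3) if d != first and d != host)
        wins += (final == car)
    return wins / trials

# start on the smelly door and then switch, vs. start on a non-smelly door and then switch
print(win_probability(0.9, 0.1, lambda smelly: smelly))
print(win_probability(0.9, 0.1, lambda smelly: (smelly + 1) % 3))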

Coffee-Break Problem, 20/12/2021

Let Q=(q_{xy})_{x,y\in E} be the generator of a continuous-time, irreducible Markov chain (X_t)_{t\ge 0} on a finite state space E, which is reversible with respect to the invariant distribution \pi=(\pi(x))_{x\in E}, and let P_t be the associated transition semigroup. Verify that the relative entropy

H(\mu P_t|\pi)=\sum_x \frac{(\mu P_t)(x)}{\pi(x)}\log \left(\frac{(\mu P_t)(x)}{\pi(x)}\right)\pi(x)

is decreasing in time, with

-\frac{d}{dt} H(\mu P_t|\pi) = D_Q(\mu P_t) = \frac{1}{2}\sum_{x\neq y} \left(\frac{(\mu P_t)(x)}{\pi(x)} - \frac{(\mu P_t)(y)}{\pi(y)}\right) \left(\log \frac{(\mu P_t)(x)}{\pi(x)} - \log \frac{(\mu P_t)(y)}{\pi(y)} \right) \pi(x)q_{xy}.

Now, observe that if \widetilde{Q} is the generator of a different chain Y, which is also reversible and has the same invariant distribution, and \widetilde{q}_{xy}\ge q_{xy} for any x\neq y, then it follows that D_{\widetilde{Q}}(\mu)\ge D_Q(\mu) for any \mu.

Does it follow that the corresponding semigroup satisfies H(\mu \widetilde{P}_t|\pi)\le H(\mu P_t|\pi) for any t\ge 0?
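Before looking for a proof or a counterexample, it can help to experiment numerically. The sketch below (Python, assuming numpy and scipy are available) builds two reversible birth-death generators with the same invariant distribution, the second with strictly larger rates on one edge, and prints the two relative entropies at a few times; the helper names and the particular rates are my own choices, and the output is only numerical evidence, not an answer to the question.

import numpy as np
from scipy.linalg import expm

def bd_generator(up, down):
    """Generator of a birth-death chain with the given up/down rates; such chains are reversible."""
    n = len(up) + 1
    Q = np.zeros((n, n))
    for i, r in enumerate(up):
        Q[i, i + 1] = r
    for i, r in enumerate(down):
        Q[i + 1, i] = r
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def stationary(Q):
    """Invariant distribution, built from detailed balance along consecutive states."""
    pi = np.ones(Q.shape[0])
    for i in range(Q.shape[0] - 1):
        pi[i + 1] = pi[i] * Q[i, i + 1] / Q[i + 1, i]
    return pi / pi.sum()

def rel_entropy(mu, pi):
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))

up, down = np.array([1.0, 2.0, 1.0]), np.array([2.0, 1.0, 3.0])
up2, down2 = up.copy(), down.copy()
up2[1] *= 5.0                          # boosting both rates on one edge keeps the
down2[1] *= 5.0                        # detailed-balance ratios, hence pi, unchanged

Q, Qtilde = bd_generator(up, down), bd_generator(up2, down2)
pi = stationary(Q)
mu0 = np.array([1.0, 0.0, 0.0, 0.0])   # start from a point mass at state 0

for t in (0.1, 0.3, 1.0, 3.0):
    H = rel_entropy(mu0 @ expm(t * Q), pi)
    Ht = rel_entropy(mu0 @ expm(t * Qtilde), pi)
    print(f"t={t:4.1f}   H(mu P_t | pi) = {H:.5f}   H(mu Ptilde_t | pi) = {Ht:.5f}")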

Coffee-Break Problem, 06/12/2021

Consider spinning a roulette wheel: the wheel spins in one direction while a ball runs along a track on the outside of the wheel in the opposite direction, and as the ball loses momentum it eventually falls into one of n equally sized pockets.

We often think of the outcome X of a roulette spin as being a uniform random variable over the n pockets. Of course, the roulette wheel is completely deterministic: the outcome of the spin is dictated by Newton’s laws of motion. Moreover, at very low speeds, the outcome will not be at all surprising: the ball will fall into the 0^\text{th} or 1^\text{st} pocket if the relative speeds are very small.

What simple mathematical fact justifies treating the outcome X as a random variable when the speeds of the wheel and the ball are very large?
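One way to explore this numerically (a toy model only, not an answer): under a constant angular deceleration a, a ball released at relative speed v travels a total angle v^2/2a before it stops, and the pocket is read off from the fractional number of revolutions. The sketch below (Python, assuming numpy; the deceleration, the speed distribution and the speed scales are my own illustrative choices) prints how far the pocket frequencies are from uniform as the typical speed grows.

import numpy as np

rng = np.random.default_rng(0)
n = 37                                     # number of pockets
a = 2.0                                    # constant angular deceleration
v = rng.normal(1.0, 0.05, size=200_000)    # release speeds with a modest spread

for scale in (1, 3, 10, 30):
    total_angle = (scale * v) ** 2 / (2 * a)      # angle travelled before the ball stops
    frac = (total_angle / (2 * np.pi)) % 1.0      # fractional number of revolutions
    pockets = np.floor(n * frac).astype(int)
    freq = np.bincount(pockets, minlength=n) / len(pockets)
    print(f"speed scale {scale:3d}: max |freq - 1/n| = {np.abs(freq - 1 / n).max():.4f}")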

Coffee-Break Problem, 08/03/2021

I mentioned Sanov’s theorem in a previous post. This is a fun variant on the same argument.

Let (p_x)_{x\ge 0} be the geometric distribution on \{0,1,\ldots\} given by p_x=2^{-1-x}, so that the mean is \sum_x xp_x = 1, and let X_1, X_2,\ldots, X_N,\ldots be independent and identically distributed samples from this distribution. For N\ge 1, let p^N be the empirical distribution of X_1,\ldots, X_N, that is, p^N_x = N^{-1} \#\{i\le N: X_i=x\}.

Construct explicitly probability measures \mathbb{Q}^N \ll \mathbb{P} and a \in (0, \infty) such that, for all \epsilon>0,

\mathbb{Q}^N\left(\sum_x |p_x - p^N_x| \le \epsilon, |\sum_x x p^N_x - 2| \le \epsilon, \frac{1}{N} \log \frac{d\mathbb{Q}^N}{d\mathbb{P}}\le a\right) \to 1.

If you know about entropy of probability measures, show also that these changes of measure have asymptotically finite relative entropy:

\limsup_N \frac{1}{N}H\left(\mathbb{Q}^N \big| \mathbb{P} \right) < \infty.
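To see why a change of measure is needed at all, here is a small sketch of the setup under the original measure \mathbb{P} itself (Python, assuming numpy; the variable names are mine): the empirical mean concentrates near 1, so the event in the display above, which asks for the empirical mean to sit near 2 while p^N stays close to p in total variation, is very unlikely under \mathbb{P} for large N.

import numpy as np

rng = np.random.default_rng(1)
N = 10_000
X = rng.geometric(0.5, size=N) - 1          # samples from p_x = 2^{-1-x} on {0, 1, ...}

K = X.max() + 1
pN = np.bincount(X, minlength=K) / N        # empirical distribution p^N on {0, ..., K-1}
p = 0.5 ** (1 + np.arange(K))               # p restricted to the same range

print("empirical mean:", X.mean())                                    # close to 1, not 2
print("sum_x |p_x - p^N_x| =", np.abs(p - pN).sum() + (1 - p.sum()))   # tail adds sum_{x >= K} p_x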

Coffee-Break Problem, 22/02/2021

Let \mu be a compactly supported probability measure on \mathbb{R}. For t>0, let g_t(x)=\exp(-x^2/2t)/\sqrt{2\pi t} be the Gaussian density of variance t.

Fix a continuous function \varphi: \mathbb{R} \to \mathbb{R}, and for x\in \mathbb{R}, t>0, define a local average

\varphi^t(x):=\left.\int_\mathbb{R} \varphi(y)g_t(y-x)\mu(dy) \right/ \int_\mathbb{R} g_t(z-x) \mu(dz).

Suppose that \mu has a density f\ge 0. Making any assumptions necessary on the regularity of f, show that \varphi^t(x) converges to a limit \varphi^0(x) as t\downarrow 0, for all x\in \mathbb{R}. Must \varphi^0 be continuous?

What happens if we drop the density assumption and work only with a general compactly supported probability measure \mu?
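For exploration, the following sketch (Python, assuming numpy) approximates \varphi^t(x) by quadrature for one concrete choice, \mu uniform on [0,1] and \varphi(y)=\sin(3y), at a point inside the support, at the boundary, and outside it; these choices and the grid are mine. The normalising constant of g_t cancels in the ratio, and the exponent is shifted by its maximum to avoid underflow at small t.

import numpy as np

def phi_t(x, t, ys, f, phi):
    """Quadrature approximation of phi^t(x) for mu(dy) = f(y) dy on the grid ys."""
    logg = -(ys - x) ** 2 / (2 * t)
    w = np.exp(logg - logg.max()) * f(ys)   # 1/sqrt(2 pi t) cancels; the shift avoids underflow
    return np.sum(phi(ys) * w) / np.sum(w)

ys = np.linspace(0.0, 1.0, 20_001)          # grid on the support of mu
f = lambda y: np.ones_like(y)               # uniform density on [0, 1]
phi = lambda y: np.sin(3 * y)

for x in (0.5, 1.0, 1.5):                   # interior, boundary, outside the support
    print(x, [round(phi_t(x, t, ys, f, phi), 4) for t in (1e-1, 1e-2, 1e-3, 1e-4)])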