
Coffee-Break Problem, 08/03/2021

I mentioned Sanov’s theorem in a previous post. This is a fun variant on the same argument.

Let p_x, x\ge 0 be the geometric distribution on \{0,1,2,\dots\} given by p_x=2^{-1-x}, so that the mean is \sum_x xp_x = 1, and let X_1, X_2,\dots,X_N,\dots be independent and identically distributed samples from this distribution. For N\ge 1, let p^N be the empirical distribution of X_1,\dots,X_N, that is, p^N_x = N^{-1} \#\{i\le N: X_i=x\}.

Construct explicitly probability measures \mathbb{Q}^N \ll \mathbb{P} and a \in (0, \infty) such that, for all \epsilon>0 ,

\mathbb{Q}^N\left(\sum_x |p_x - p^N_x| \le \epsilon, |\sum_x x p^N_x - 2| \le \epsilon, \frac{1}{N} \log \frac{d\mathbb{Q}^N}{d\mathbb{P}}\le a\right) \to 1.
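If you want a feel for the two statistics in this event before constructing anything, here is a quick numerical sketch under \mathbb{P} itself, sampling p via a shifted geometric:

```python
import numpy as np

rng = np.random.default_rng(0)

# Under P: p_x = 2^{-1-x} on {0,1,2,...} is Geometric(1/2) counting failures;
# numpy's geometric counts trials, so shift by one.
N = 10_000
X = rng.geometric(p=0.5, size=N) - 1

# Empirical distribution p^N and the two statistics appearing in the event.
xs = np.arange(X.max() + 1)
pN = np.bincount(X) / N
p = 2.0 ** (-1 - xs)

tv = np.abs(p - pN).sum() + 2.0 ** (-len(xs))  # add the tail of p beyond max(X)
print(f"sum_x |p_x - p^N_x| ~ {tv:.4f}, empirical mean ~ {X.mean():.4f}")
# Under P the TV-type distance is small, but the mean concentrates near 1,
# not 2 -- Q^N has to shift the mean while keeping p^N close to p.
```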

If you know about the relative entropy of probability measures, show also that these changes of measure have asymptotically finite relative entropy per sample:

\limsup_N \frac{1}{N}H\left(\mathbb{Q}^N \big| \mathbb{P} \right) < \infty.
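For experimentation, here is one candidate family of tilted measures (an assumption of mine, not necessarily the intended construction): keep X_1,\dots,X_{N-1} i.i.d. from p and plant a single sample of size N.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate (assumed, not necessarily the intended answer): under Q^N the
# first N-1 samples are i.i.d. from p and X_N = N is planted deterministically,
# so dQ^N/dP = 1{X_N = N} / p_N = 2^{N+1} 1{X_N = N}.
N = 10_000
X = rng.geometric(p=0.5, size=N) - 1
X[-1] = N  # plant one sample of size N: shifts the empirical mean by about 1

xs = np.arange(X.max() + 1)
pN = np.bincount(X) / N
tv = np.abs(2.0 ** (-1 - xs) - pN).sum() + 2.0 ** (-len(xs))
log_density = (N + 1) * np.log(2) / N  # (1/N) log dQ^N/dP, Q^N-a.s.
print(f"TV ~ {tv:.4f}, mean ~ {X.mean():.4f}, (1/N) log dQ/dP ~ {log_density:.4f}")
# TV -> 0 and mean -> 2 as N grows, while (1/N) log dQ^N/dP -> log 2, so any
# a > log 2 works; here H(Q^N | P) = (N+1) log 2, matching the entropy bound.
```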

Coffee-Break Problem, 22/02/2021

Let \mu be a compactly supported probability measure on \mathbb{R}. For t>0, let g_t(x)=\exp(-x^2/2t)/\sqrt{2\pi t} be the density of the Gaussian distribution of variance t.

Fix a continuous function \varphi: \mathbb{R} \to \mathbb{R} , and for x\in \mathbb{R}, t>0 , define a local average

\varphi^t(x):=\left.\int_\mathbb{R} \varphi(y)g_t(y-x)\mu(dy) \right/ \int_\mathbb{R} g_t(z-x) \mu(dz).

Suppose that \mu has a density f\ge 0. Making any assumptions you need on the regularity of f, show that \varphi^t(x) converges to a limit \varphi^0(x) as t\downarrow 0, for all x\in \mathbb{R}. Must \varphi^0 be continuous?
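If you want to see the convergence numerically before proving anything, here is a minimal sketch with assumed illustrative choices: \mu uniform on [0,1] (so f=\mathbf{1}_{[0,1]}) and \varphi(y)=\sin(\pi y).

```python
import numpy as np

# Illustrative assumptions: mu = Uniform[0,1] (density f = 1 on [0,1]) and
# phi(y) = sin(pi y); the grid approximates both integrals against mu(dy).
def phi(y):
    return np.sin(np.pi * y)

def phi_t(x, t, n=200_001):
    y = np.linspace(0.0, 1.0, n)           # support of mu
    w = np.exp(-((y - x) ** 2) / (2 * t))  # Gaussian weights (normalisation cancels)
    return (phi(y) * w).sum() / w.sum()

for t in [1e-1, 1e-2, 1e-3, 1e-4]:
    print(t, phi_t(0.5, t), phi_t(1.2, t))
# Inside the support, phi^t(0.5) -> phi(0.5) = 1 as t -> 0; for x = 1.2,
# outside the support, the weights pile up at the nearest support point y = 1,
# so phi^t(1.2) -> phi(1) = 0.
```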

What happens if we drop the density assumption and work only with a general compactly supported probability measure \mu?

Coffee-Break Problem, 15/02/2021

You are offered a choice between two envelopes: envelope A contains \pounds x, while envelope B contains \pounds y with probability p\in (0,1), and is otherwise empty, where py>x. With constant absolute risk aversion, i.e. utility U(x)=-\exp(-\gamma x), which option do you prefer?
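A quick way to experiment: with exponential utility the comparison reduces to certainty equivalents. The numbers below are illustrative assumptions satisfying py > x.

```python
import numpy as np

# CARA comparison: prefer B iff p*U(y) + (1-p)*U(0) > U(x), i.e. iff the
# certainty equivalent of B exceeds x. Values of x, y, p, gamma are
# illustrative assumptions with p*y > x.
def ce_B(y, p, gamma):
    # certainty equivalent of B: solves U(ce) = p*U(y) + (1-p)*U(0)
    return -np.log(p * np.exp(-gamma * y) + (1 - p)) / gamma

x, y, p = 1.0, 3.0, 0.5        # p*y = 1.5 > x = 1
for gamma in [0.1, 1.0, 3.0]:
    print(f"gamma={gamma}: prefer B? {ce_B(y, p, gamma) > x}")
# Mild risk aversion still favours the risky envelope B; larger gamma flips
# the preference to the sure envelope A despite B's higher mean.
```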

A prankster wishes to waste your time by forcing you to make a large number of choices, while keeping the same total amount of money available in prizes. You will be given a large number N of choices between envelopes A, which contain \pounds x/N, and envelopes B, which contain \pounds y/N with probability p, independently of each other. What is your strategy in this case? Do you complain about the prank?
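Here is a sketch of the same comparison for the N-choice version, with the same illustrative numbers; the factorisation below relies on the envelopes being independent and the utility exponential.

```python
import numpy as np

# With U(w) = -exp(-gamma*w) and independent envelopes, the expected utility
# of taking all N B-envelopes factorizes: E[U(S)] = -(p*exp(-gamma*y/N) + 1-p)^N.
def ce_all_B(y, p, gamma, N):
    m = p * np.exp(-gamma * y / N) + (1 - p)  # per-envelope factor
    return -N * np.log(m) / gamma             # certainty equivalent of the total

x, y, p, gamma = 1.0, 3.0, 0.5, 3.0           # gamma chosen so A wins at N = 1
for N in [1, 10, 100, 1000]:
    print(f"N={N}: CE of all-B = {ce_all_B(y, p, gamma, N):.4f} (A gives {x})")
# The certainty equivalent climbs towards p*y = 1.5 > x: splitting the prize
# across many independent envelopes diversifies away the risk.
```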

Does your answer change if you are allowed to open each envelope before making your next choice, or if you only open all of them at the end?

What is your optimal strategy if you are instead told that exactly \lfloor pN\rfloor of the B envelopes contain the prize, but you never learn the contents of the envelopes you do not open?
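One observation to test numerically (same illustrative numbers as above): once exactly \lfloor pN\rfloor of the B envelopes are known to contain the prize, taking every B envelope yields a deterministic total.

```python
import numpy as np

# Variant with exactly floor(p*N) prizes among the B envelopes: taking every
# B envelope gives a deterministic total, whatever order they are opened in.
rng = np.random.default_rng(2)
x, y, p, N = 1.0, 3.0, 0.5, 1000
prizes = np.zeros(N)
prizes[: int(np.floor(p * N))] = y / N
total = rng.permutation(prizes).sum()   # the order is random, the sum is not
print(f"all-B total = {total} vs all-A total = {x}")  # 1.5 > 1, risk-free
```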