Fermat-Weber Points of Probability Measures μ
We've been reviewing a Mathoverflow post here about finding formulas for the Fermat-Weber points of a probability measure $\mu$ on ${\bf{R}}^n$. The Fermat-Weber point is the point $\bar{x}$ minimizing the average distance to the measure, i.e. $\bar{x}$ is the minimizer of $$\min_x f_\mu(x)=\min_{x} \int ||x-y|| ~d\mu(y),$$ where $x$ ranges over ${\bf{R}}^n$. It's remarkable that this problem admits no good formula. The obstruction is that $f_\mu$ defined above is not everywhere differentiable on the domain: specifically, it fails to be differentiable along the support of $\mu$.
Now the minimizer is readily computed approximately, i.e. numerically. We can start from essentially any initial point $x_0$ and follow a type of gradient flow, obtaining a sequence $x_1$, $x_2$, $\ldots$ which converges to a minimizer $\lim_{k\to +\infty}x_k= x_\infty$. In practice, this is good enough.
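As a concrete sketch of such a numerical scheme, assume for illustration that $\mu$ is a discrete measure supported on finitely many points; then one standard choice of iteration is Weiszfeld's classical fixed-point scheme (our choice here, not prescribed by the post, and the names `weiszfeld`, `pts` are ours):

```python
import math

def weiszfeld(points, weights, x0, iters=200):
    """Approximate the Fermat-Weber point of a discrete measure.

    points: support of mu (tuples), weights: the masses mu assigns.
    Weiszfeld's iteration: x <- (sum_i w_i y_i / d_i) / (sum_i w_i / d_i),
    where d_i = ||x - y_i||.  One concrete instance of the "gradient flow"
    mentioned in the text.
    """
    x = list(x0)
    for _ in range(iters):
        num = [0.0] * len(x)
        den = 0.0
        for y, w in zip(points, weights):
            d = math.dist(x, y)
            if d == 0.0:          # iterate landed on a support point,
                return tuple(x)   # where the gradient formula breaks down
            num = [n + w * yi / d for n, yi in zip(num, y)]
            den += w / d
        x = [n / den for n in num]
    return tuple(x)

# Three equal masses at the vertices of a near-equilateral triangle:
pts = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.9)]
fw = weiszfeld(pts, [1/3, 1/3, 1/3], x0=(0.2, 0.2))
```

At the limit point the sum of unit vectors toward the three masses vanishes, which is the first-order condition discussed below.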
But what about the formula? This is typical of naive mathematicians, always seeking out a "formula". But formulas are the exception. There's probably an easy argument that numbers given by a "formula" are exceedingly rare, i.e. of measure zero on the real number line ${\bf{R}}$. For example, consider a problem arising frequently in probability: if $y=F(x):=Prob(X \leq x)=\int_{-\infty}^x f(z)dz$ is the cumulative distribution function and $F$ is strictly increasing, then the inverse $x=F^{-1}(y)$ exists, but does not typically admit a formula.
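A minimal illustration of this point, using the standard normal CDF as our own choice of example: the inverse exists and is easy to approximate by bisection, even though it admits no elementary formula.

```python
import math

def normal_cdf(x):
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2; the integral has no
    # elementary antiderivative, and Phi^{-1} has no elementary formula
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def inverse_cdf(F, y, lo=-10.0, hi=10.0, iters=80):
    """Invert a strictly increasing CDF by bisection: solve F(x) = y."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if F(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# The median of the standard normal, recovered numerically:
x_med = inverse_cdf(normal_cdf, 0.5)
```

So "no formula" is no practical obstruction: the quantile is computable to any accuracy, just as the Fermat-Weber point is.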
Let's consider the Legendre-Fenchel transform ${f_\mu}^*(n)$ of the function $f_\mu(x):=\int ||x-y|| ~d\mu(y)$. Is a formula for $f^*_\mu$ available? Here we must acknowledge the unfortunate fact that Legendre transforms of convex combinations are not immediately computable. That is, we find only the general inequality $$\left(\frac{1}{2}f_0 + \frac{1}{2}f_1\right)^* \leq \frac{1}{2}f_0^* + \frac{1}{2} f_1^*.$$
This implies ${f^*_\mu}(n)$ satisfies the pointwise inequality $${f^*_\mu}(n)\leq \nu(n),$$ where $\nu$ is the "naive" convex recombination $$\nu(n):=\int \langle n, y\rangle ~d\mu(y)$$ if $||n||\leq 1$, and $$\nu(n)=+\infty$$ otherwise. Notice that the naive transform $\nu$ is simply a linear function on its domain (the unit ball), namely $\langle n, \bar{y} \rangle$ where $\bar{y} :=\int y ~d\mu(y)$ is the barycenter of $\mu$. It would be convenient to have a general rule or formula for $f^*_\mu$, but this is not achievable at this stage.
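For a discrete measure the inequality $f^*_\mu(n)\leq \nu(n)$ can be checked numerically. A crude grid search gives a lower bound on the supremum defining $f^*_\mu(n)$, and that lower bound must sit below $\nu(n)=\langle n,\bar{y}\rangle$; the function names below are ours, and this is a sketch rather than a careful conjugate computation.

```python
import math

def f_mu(x, pts, w):
    # f_mu(x) = sum_i w_i ||x - y_i||  (discrete measure mu)
    return sum(wi * math.dist(x, y) for y, wi in zip(pts, w))

def conjugate_lower_bound(n, pts, w, R=4.0, steps=81):
    # crude grid search for sup_x <n,x> - f_mu(x) over [-R,R]^2;
    # any grid maximum is a lower bound on f_mu^*(n)
    grid = [-R + 2 * R * i / (steps - 1) for i in range(steps)]
    return max(n[0] * x1 + n[1] * x2 - f_mu((x1, x2), pts, w)
               for x1 in grid for x2 in grid)

pts = [(0.0, 0.0), (2.0, 0.0), (1.0, 2.0)]
w = [0.5, 0.3, 0.2]
ybar = tuple(sum(wi * y[i] for y, wi in zip(pts, w)) for i in range(2))

n = (0.3, 0.4)                    # ||n|| = 0.5 <= 1, so nu(n) = <n, ybar>
nu = n[0] * ybar[0] + n[1] * ybar[1]
fstar_lb = conjugate_lower_bound(n, pts, w)   # <= f_mu^*(n) <= nu
```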
This shows again that the naive $L^1$ cost $c=d^1$ might seem convenient, but really is not, due to its nondifferentiability along the diagonal. In particular, $f_\mu$ is not differentiable along the support of $\mu$. It's clear that $f_\mu$ is finite and continuous along $spt(\mu)$; however, formally differentiating under the integral gives the gradient formula $$\nabla_x f_\mu(x) = \int \frac{x-y}{||x-y||} ~d\mu(y), $$ and the integrand is undefined whenever $x=y$ for some $y\in spt(\mu)$. Notice the integral defining $\nabla_x f_\mu$ is an average of unit vectors.
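The "average of unit vectors" observation is easy to see concretely for a discrete measure (again our own assumption; the helper `grad_f` is hypothetical): the gradient is a convex combination of unit vectors, hence always has norm at most $1$, and the formula simply breaks down on $spt(\mu)$.

```python
import math

def grad_f(x, pts, w):
    """Gradient of f_mu at x for a discrete measure; undefined on spt(mu)."""
    g = [0.0, 0.0]
    for y, wi in zip(pts, w):
        d = math.dist(x, y)
        if d == 0.0:
            raise ValueError("x in spt(mu): integrand (x-y)/||x-y|| undefined")
        g[0] += wi * (x[0] - y[0]) / d
        g[1] += wi * (x[1] - y[1]) / d
    return g

pts = [(0.0, 0.0), (2.0, 0.0), (1.0, 2.0)]
w = [0.5, 0.3, 0.2]
g = grad_f((5.0, 5.0), pts, w)
# g is a convex combination of unit vectors, so ||g|| <= 1 always
```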
Here arises a question from first year calculus: what are the critical points of $f_\mu$? Strictly speaking, the critical points consist of those points $\bar{x}$ where $\nabla f_\mu(\bar{x})=0$ together with those points where $f_\mu$ is not differentiable. Therefore we find $$Crit(f_\mu) = \{\bar{x}~|~\nabla f_\mu(\bar{x})=0\} \cup spt(\mu).$$
This makes identifying the minimizer rather difficult, because we must also compare the critical value at $\bar{x}$ with the values of $f_\mu$ restricted to $spt(\mu)$. But this restriction $f_\mu|_{spt(\mu)}$ is not differentiable, so there are no distinguished critical points to speak of: every point $x\in spt(\mu)$ is a critical point according to the strict definition! Herein lies the basic difficulty in using $L^1$ distances, and the inability to identify the minimizer.
The question arises: "Is the restriction of ${f_\mu}$ to the support $spt(\mu)$ really nondifferentiable?"
Here the properties of $spt(\mu)$ as a subset of ${\bf{R}}^n$ have to play a role in deciding the behaviour of $f_\mu$ when restricted to the support $spt(\mu)$. But this is a question for another time...