$$ \newcommand{\st}{\text{ s.t. }} \newcommand{\and}{\text{ and }} \DeclareMathOperator*{\argmin}{arg\,min} \DeclareMathOperator*{\argmax}{arg\,max} \newcommand{\R}{\mathbb{R}} \newcommand{\N}{\mathbb{N}} \newcommand{\O}{\mathcal{O}} \newcommand{\dist}{\text{dist}} \newcommand{\vec}[1]{\mathbf{#1}} \newcommand{\diag}{\mathrm{diag}} \newcommand{\d}{\mathrm{d}} \newcommand{\L}{\mathcal{L}} \newcommand{\Tr}{\mathrm{\mathbf{Tr}}} \newcommand{\E}{\mathbb{E}} \newcommand{\Var}{\mathrm{Var}} \newcommand{\Cov}{\mathrm{Cov}} \newcommand{\indep}{\perp \!\!\! \perp} \newcommand{\KL}[2]{\mathrm{KL}(#1 \parallel #2)} \newcommand{\W}{\mathbf{W}} % Wasserstein distance \newcommand{\SW}{\mathbf{SW}} % Sliced-Wasserstein distance $$

Sinkhorn: Entropic Regularization

Discrete Measures. Definition (Shannon-Boltzmann entropy). Let $P \in U(a, b)$ be a coupling matrix for discrete measures with weight vectors $a$ and $b$. Then the Shannon-Boltzmann entropy is $$ H(P) = - \sum_{i, j} P_{i, j} \log{(P_{i, j})}, $$ with the convention $0 \log 0 = 0$. Note that $$ \nabla^2 H(P) = - \diag(P_{i, j}^{-1}), $$ so $H$ is strictly concave. Let us add a regularization term to the discrete Kantorovich problem; the weight $\varepsilon$ acts as a kind of “temperature”. ...
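As a rough sketch of where this entropy term leads, the snippet below computes $H(P)$ with the $0 \log 0 = 0$ convention and runs plain Sinkhorn iterations for the entropy-regularized problem $\min_{P \in U(a,b)} \langle C, P \rangle - \varepsilon H(P)$; the function names, iteration count, and the absence of log-domain stabilization are illustrative choices, not the post's actual implementation.

```python
import numpy as np

def entropy(P):
    # Shannon-Boltzmann entropy H(P) = -sum_ij P_ij log P_ij,
    # using the convention 0 log 0 = 0.
    P = np.asarray(P, dtype=float)
    mask = P > 0
    return -np.sum(P[mask] * np.log(P[mask]))

def sinkhorn(a, b, C, eps, n_iter=500):
    # Entropy-regularized OT: minimize <C, P> - eps * H(P) over U(a, b).
    # Alternately rescale the Gibbs kernel to match both marginals.
    K = np.exp(-C / eps)       # Gibbs kernel, entrywise exp(-C_ij / eps)
    u = np.ones_like(a, dtype=float)
    for _ in range(n_iter):
        v = b / (K.T @ u)      # enforce column marginals b
        u = a / (K @ v)        # enforce row marginals a
    return u[:, None] * K * v[None, :]
```

For a quick sanity check, the returned coupling should have row sums $a$ and column sums $b$, and larger $\varepsilon$ pushes $P$ toward the maximum-entropy coupling $a b^\top$.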

November 12, 2025 · 19 min

Dynamic Optimal Transport and Flow Matching

Summary. Up to this point we have studied: the Monge and Kantorovich problems; duality of Kantorovich and the Wasserstein metric; slicing OT as a way to lower-bound $\W_p$. Now we are interested in solving the Kantorovich problem in an alternative way. The goal here is to find a condition that is sufficient to retrieve a map from $\mathcal{X}$ to $\mathcal{Y}$. The optimal transport is then given by the objects that minimize a certain criterion subject to this condition. ...

November 5, 2025 · 15 min