pith. machine review for the scientific record.

arxiv: 2502.03749 · v2 · submitted 2025-02-06 · 💻 cs.LG · math.OC


PINS: Proximal Iterations with Sparse Newton and Sinkhorn for Optimal Transport

keywords: sinkhorn, entropic, pins, cost, instances, optimal, outer, plateau
Abstract

Optimal transport (OT) is a widely used tool in machine learning, but computing high-accuracy solutions for large instances remains costly. Entropic regularization and the Sinkhorn algorithm improve scalability; however, when the regularization parameter is small, Sinkhorn convergence slows, and the iterates approach an entropic solution that remains separated from the true OT plan by an entropic-bias plateau. We introduce PINS (Proximal Iterations with sparse Newton and Sinkhorn), a two-loop solver designed to move beyond this plateau. The outer loop applies an entropic proximal-point method, solving the original OT problem through a sequence of entropic subproblems with shifted cost matrices. Each inner subproblem is then solved by a Sinkhorn warm-up followed by sparse-Newton refinement. We prove that PINS converges globally to an optimal solution of the unregularized OT problem and that the inner Hessian admits a sparsification at every outer iteration with a structure independent of the cost matrix. On synthetic and augmented-MNIST instances, PINS achieves much lower relative cost errors than Sinkhorn-type baselines, which stall at the entropic-bias plateau, and is $5$--$73\times$ faster than Sinkhorn with the same outer loop at matched accuracy. On large-scale DOTmark instances, a streaming implementation reduces peak memory by $24$--$54\%$ compared with the network-simplex linear programming (LP) solver and remains feasible under per-process memory budgets for which the LP solver fails.



Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Beyond Expected Information Gain: Stable Bayesian Optimal Experimental Design with Integral Probability Metrics and Plug-and-Play Extensions

stat.ML · 2026-04 · unverdicted · novelty 7.0

    An IPM-based framework for Bayesian optimal experimental design is proposed that replaces KL-based expected information gain with Wasserstein, MMD, and energy distances, delivering stronger stability guarantees and pl...

  2. A Provably Convergent and Practical Algorithm for Gromov--Wasserstein Optimal Transport

cs.LG · 2026-05 · unverdicted · novelty 6.0

    An inexact projected-gradient framework for GWOT with a verifiable feasibility-residual condition that proves subsequential and full-sequence convergence to stationary points under a mild tolerance decay.