pith. machine review for the scientific record.


Generative Modeling via Drifting

24 Pith papers cite this work. Polarity classification is still indexing.

abstract

Generative modeling can be formulated as learning a mapping f such that its pushforward distribution matches the data distribution. The pushforward behavior can be carried out iteratively at inference time, for example in diffusion and flow-based models. In this paper, we propose a new paradigm called Drifting Models, which evolve the pushforward distribution during training and naturally admit one-step inference. We introduce a drifting field that governs the sample movement and achieves equilibrium when the distributions match. This leads to a training objective that allows the neural network optimizer to evolve the distribution. In experiments, our one-step generator achieves state-of-the-art results on ImageNet at 256 x 256 resolution, with an FID of 1.54 in latent space and 1.61 in pixel space. We hope that our work opens up new opportunities for high-quality one-step generation.
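To make the abstract's mechanism concrete, here is a minimal 1-D toy sketch of the general idea only, not the paper's algorithm: the Gaussian-kernel drift, the attractive-repulsive form, the drift step size, and the least-squares update are all assumptions. A one-step generator f(z) = a·z + b is repeatedly regressed onto its own samples displaced along a drift field that pulls toward data samples and pushes generated samples apart, so the field vanishes (equilibrium) when the generated and data distributions match.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_grad(x, y, h=1.0):
    """Mean gradient w.r.t. x of a Gaussian kernel k(x, y) over samples y."""
    d = x[:, None] - y[None, :]
    k = np.exp(-d**2 / (2 * h**2))
    return (-d / h**2 * k).mean(axis=1)

# 1-D toy: generator f(z) = a*z + b should push N(0,1) noise onto N(2, 0.5^2).
a, b = 1.0, 0.0
lr, eta = 0.05, 0.5          # learning rate and drift step size (assumptions)
data = rng.normal(2.0, 0.5, size=512)

for _ in range(2000):
    z = rng.normal(size=256)
    x = a * z + b                          # one-step generation
    # Drift field: attraction toward data samples, repulsion between
    # generated samples; it vanishes when the two distributions match.
    v = kernel_grad(x, data) - kernel_grad(x, x)
    target = x + eta * v                   # drifted samples, treated as constants
    # Regress the generator onto its own drifted output (least-squares grads).
    a -= lr * np.mean((x - target) * z)
    b -= lr * np.mean(x - target)

print(a, b)  # b should drift toward the data mean (about 2)
```

The point of the sketch is that no iterative sampling happens at inference time: the distribution evolves during training, and generation afterward is a single forward pass through f.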


years: 2026 (24 papers)

representative citing papers

Representation Fréchet Loss for Visual Generation

cs.CV · 2026-04-30 · unverdicted · novelty 8.0

Fréchet Distance optimized as FD-loss in representation space by decoupling population size from batch size improves generator quality, enables one-step generation from multi-step models, and motivates a multi-representation metric FDr^k.

One-Step Generative Modeling via Wasserstein Gradient Flows

cs.LG · 2026-05-12 · conditional · novelty 7.0

W-Flow achieves state-of-the-art one-step ImageNet 256x256 generation at 1.29 FID by training a static neural network to follow a Wasserstein gradient flow that minimizes Sinkhorn divergence, delivering roughly 100x faster sampling than comparable multi-step models.

Geometry-Aware Discretization Error of Diffusion Models

cs.LG · 2026-05-08 · unverdicted · novelty 7.0

First-order asymptotic expansions of weak and Fréchet discretization errors in diffusion sampling are derived, explicit under Gaussian data through covariance geometry and robust to other data geometries.

Speech Enhancement Based on Drifting Models

cs.SD · 2026-04-27 · unverdicted · novelty 7.0 · 2 refs

DriftSE achieves one-step speech enhancement by evolving the pushforward distribution of a mapping function to match the clean speech distribution using a learned drifting field.

Receding-Horizon Control via Drifting Models

cs.AI · 2026-04-06 · unverdicted · novelty 7.0

Drifting MPC produces a unique distribution over trajectories that trades off data support against optimality and enables efficient receding-horizon planning under unknown dynamics.

Continuous Latent Diffusion Language Model

cs.CL · 2026-05-07 · unverdicted · novelty 6.0

Cola DLM proposes a hierarchical latent diffusion model that learns a text-to-latent mapping, fits a global semantic prior in continuous space with a block-causal DiT, and performs conditional decoding, establishing latent prior modeling as an alternative to token-level autoregressive language models.

SymDrift: One-Shot Generative Modeling under Symmetries

cs.LG · 2026-05-07 · unverdicted · novelty 6.0

SymDrift makes drifting models produce symmetry-invariant samples in one step via symmetrized coordinate drifts or G-invariant embeddings, outperforming prior one-shot baselines on molecular benchmarks and cutting compute by up to 40x.

Energy Generative Modeling: A Lyapunov-based Energy Matching Perspective

cs.LG · 2026-05-07 · unverdicted · novelty 6.0

Training and sampling in static scalar energy generative models are two instances of the same Lyapunov-driven density transport dynamics on Wasserstein space, differing only by initial condition, which yields a finite stopping criterion for Langevin sampling and additive composition rules.

Generative Drifting for Conditional Medical Image Generation

cs.CV · 2026-04-21 · unverdicted · novelty 6.0

GDM reformulates 3D conditional medical image generation as attractive-repulsive drifting with multi-level feature banks to balance distribution plausibility, patient fidelity, and one-step inference, outperforming GANs, flows, and SDEs on MRI-to-CT and sparse CT tasks.

Positive-Only Drifting Policy Optimization

cs.LG · 2026-04-15 · unverdicted · novelty 6.0

PODPO is a likelihood-free generative policy optimization method for online RL that steers actions to high-return regions using only positive-advantage samples and local contrastive drifting.

Lookahead Drifting Model

cs.LG · 2026-04-10 · unverdicted · novelty 6.0

The lookahead drifting model improves upon the drifting model by sequentially computing multiple drifting terms that incorporate higher-order gradient information, leading to better performance on toy examples and CIFAR10.

ELT: Elastic Looped Transformers for Visual Generation

cs.CV · 2026-04-10 · unverdicted · novelty 6.0

Elastic Looped Transformers share weights across recurrent blocks and apply intra-loop self-distillation to deliver 4x parameter reduction while matching competitive FID and FVD scores on ImageNet and UCF-101.

Drifting Fields are not Conservative

cs.LG · 2026-04-07 · unverdicted · novelty 6.0 · 2 refs

Drift fields are not conservative except for Gaussian kernels; sharp normalization makes them conservative for any radial kernel by equating them to score differences of kernel density estimates.

MRI-to-CT synthesis using drifting models

eess.IV · 2026-03-30 · unverdicted · novelty 6.0

Drifting models outperform diffusion, CNN, VAE, and GAN baselines in MRI-to-CT synthesis on two pelvis datasets with higher SSIM/PSNR, lower RMSE, and millisecond one-step inference.

Consistency Regularised Gradient Flows for Inverse Problems

stat.ML · 2026-05-08 · unverdicted · novelty 5.0

A consistency-regularized Euclidean-Wasserstein-2 gradient flow performs joint posterior sampling and prompt optimization in latent space for efficient low-NFE inverse problem solving with diffusion models.

On the Wasserstein Gradient Flow Interpretation of Drifting Models

cs.LG · 2026-05-06 · unverdicted · novelty 5.0

GMD algorithms correspond to limiting points of Wasserstein gradient flows on the KL divergence with Parzen smoothing and bear resemblance to Sinkhorn divergence fixed points, with extensions to MMD and other divergences.
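The "Drifting Fields are not Conservative" entry above rests on a checkable identity: for a Gaussian kernel, the kernel mean shift normalized by the kernel sum is exactly the score (gradient of the log density) of a kernel density estimate, so the normalized attractive-minus-repulsive drift is a difference of KDE scores and therefore conservative. A toy numeric check of that identity follows; the bandwidth, sample sizes, and evaluation grid are arbitrary choices, not values from any of the papers.

```python
import numpy as np

rng = np.random.default_rng(1)
h = 0.7
data = rng.normal(2.0, 0.5, size=64)   # stand-in "data" samples
gen = rng.normal(0.0, 1.0, size=64)    # stand-in "generated" samples
x = np.linspace(-2.0, 4.0, 41)         # evaluation points

def log_kde(x, y):
    """Log of a Gaussian KDE (up to a constant) built from samples y."""
    d = x[:, None] - y[None, :]
    return np.log(np.exp(-d**2 / (2 * h**2)).mean(axis=1))

def kde_score(x, y):
    """Kernel mean shift with sharp normalization by the kernel sum."""
    d = x[:, None] - y[None, :]
    k = np.exp(-d**2 / (2 * h**2))
    return (-d / h**2 * k).sum(axis=1) / k.sum(axis=1)

# Normalized drift: attraction to data minus repulsion from own samples.
drift = kde_score(x, data) - kde_score(x, gen)

# Verify it is the gradient of a potential, log p_data_kde - log p_gen_kde,
# via central finite differences, i.e. the normalized field is conservative.
eps = 1e-5
potential_grad = (log_kde(x + eps, data) - log_kde(x - eps, data)
                  - log_kde(x + eps, gen) + log_kde(x - eps, gen)) / (2 * eps)
print(np.max(np.abs(drift - potential_grad)))  # tiny (finite-difference error)
```

Without that normalization the raw kernel-sum drift is generally not a gradient field, which is the entry's claim for non-Gaussian radial kernels.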

citing papers explorer

Showing 24 of 24 citing papers.

  • Representation Fréchet Loss for Visual Generation · cs.CV · 2026-04-30 · unverdicted · ref 6

    Fréchet Distance optimized as FD-loss in representation space by decoupling population size from batch size improves generator quality, enables one-step generation from multi-step models, and motivates a multi-representation metric FDr^k.

  • DriftXpress: Faster Drifting Models via Projected RKHS Fields · cs.LG · 2026-05-12 · unverdicted · ref 5

    DriftXpress approximates drifting kernels via projected RKHS fields to lower training cost of one-step generative models while matching original FID scores.

  • One-Step Generative Modeling via Wasserstein Gradient Flows · cs.LG · 2026-05-12 · conditional · ref 11

    W-Flow achieves state-of-the-art one-step ImageNet 256x256 generation at 1.29 FID by training a static neural network to follow a Wasserstein gradient flow that minimizes Sinkhorn divergence, delivering roughly 100x faster sampling than comparable multi-step models.

  • Geometry-Aware Discretization Error of Diffusion Models · cs.LG · 2026-05-08 · unverdicted · ref 5

    First-order asymptotic expansions of weak and Fréchet discretization errors in diffusion sampling are derived, explicit under Gaussian data through covariance geometry and robust to other data geometries.

  • Speech Enhancement Based on Drifting Models · cs.SD · 2026-04-27 · unverdicted · ref 29 · 2 links

    DriftSE achieves one-step speech enhancement by evolving the pushforward distribution of a mapping function to match the clean speech distribution using a learned drifting field.

  • Identifiability and Stability of Generative Drifting with Companion-Elliptic Kernel Families · stat.ML · 2026-04-27 · conditional · ref 1 · 2 links

    For companion-elliptic kernels, vanishing drifting fields identify target measures exactly, and field convergence yields weak convergence once mass escape to infinity is detected by a single C0 scalar.

  • MISTY: High-Throughput Motion Planning via Mixer-based Single-step Drifting · cs.RO · 2026-04-23 · unverdicted · ref 11

    MISTY delivers state-of-the-art closed-loop scores on nuPlan Test14-hard (80.32 non-reactive, 82.21 reactive) at 10.1 ms latency via single-step MLP-Mixer inference and a latent drifting loss that encourages proactive maneuvers.

  • Receding-Horizon Control via Drifting Models · cs.AI · 2026-04-06 · unverdicted · ref 17

    Drifting MPC produces a unique distribution over trajectories that trades off data support against optimality and enables efficient receding-horizon planning under unknown dynamics.

  • Drifting Field Policy: A One-Step Generative Policy via Wasserstein Gradient Flow · cs.LG · 2026-05-08 · unverdicted · ref 10

    DFP is a one-step generative policy using Wasserstein gradient flow on a drifting model backbone, with a top-K behavior cloning surrogate, that reaches SOTA on Robomimic and OGBench manipulation tasks.

  • Continuous Latent Diffusion Language Model · cs.CL · 2026-05-07 · unverdicted · ref 19

    Cola DLM proposes a hierarchical latent diffusion model that learns a text-to-latent mapping, fits a global semantic prior in continuous space with a block-causal DiT, and performs conditional decoding, establishing latent prior modeling as an alternative to token-level autoregressive language models.

  • SymDrift: One-Shot Generative Modeling under Symmetries · cs.LG · 2026-05-07 · unverdicted · ref 3

    SymDrift makes drifting models produce symmetry-invariant samples in one step via symmetrized coordinate drifts or G-invariant embeddings, outperforming prior one-shot baselines on molecular benchmarks and cutting compute by up to 40x.

  • Energy Generative Modeling: A Lyapunov-based Energy Matching Perspective · cs.LG · 2026-05-07 · unverdicted · ref 47

    Training and sampling in static scalar energy generative models are two instances of the same Lyapunov-driven density transport dynamics on Wasserstein space, differing only by initial condition, which yields a finite stopping criterion for Langevin sampling and additive composition rules.

  • ReflectDrive-2: Reinforcement-Learning-Aligned Self-Editing for Discrete Diffusion Driving · cs.RO · 2026-05-06 · unverdicted · ref 22 · 2 links

    ReflectDrive-2 combines masked discrete diffusion with RL-aligned self-editing to generate and refine driving trajectories, reaching 91.0 PDMS on NAVSIM camera-only and 94.8 in best-of-6.

  • Generative Drifting for Conditional Medical Image Generation · cs.CV · 2026-04-21 · unverdicted · ref 14

    GDM reformulates 3D conditional medical image generation as attractive-repulsive drifting with multi-level feature banks to balance distribution plausibility, patient fidelity, and one-step inference, outperforming GANs, flows, and SDEs on MRI-to-CT and sparse CT tasks.

  • Attraction, Repulsion, and Friction: Introducing DMF, a Friction-Augmented Drifting Model · cs.LG · 2026-04-20 · unverdicted · ref 3

    DMF augments kernel-based drifting models with scheduled friction to guarantee convergence and matches Optimal Flow Matching on FFHQ adult-to-child translation at 16x lower training cost.

  • Positive-Only Drifting Policy Optimization · cs.LG · 2026-04-15 · unverdicted · ref 1

    PODPO is a likelihood-free generative policy optimization method for online RL that steers actions to high-return regions using only positive-advantage samples and local contrastive drifting.

  • Lookahead Drifting Model · cs.LG · 2026-04-10 · unverdicted · ref 5

    The lookahead drifting model improves upon the drifting model by sequentially computing multiple drifting terms that incorporate higher-order gradient information, leading to better performance on toy examples and CIFAR10.

  • ELT: Elastic Looped Transformers for Visual Generation · cs.CV · 2026-04-10 · unverdicted · ref 11

    Elastic Looped Transformers share weights across recurrent blocks and apply intra-loop self-distillation to deliver 4x parameter reduction while matching competitive FID and FVD scores on ImageNet and UCF-101.

  • Drifting Fields are not Conservative · cs.LG · 2026-04-07 · unverdicted · ref 2 · 2 links

    Drift fields are not conservative except for Gaussian kernels; sharp normalization makes them conservative for any radial kernel by equating them to score differences of kernel density estimates.

  • MRI-to-CT synthesis using drifting models · eess.IV · 2026-03-30 · unverdicted · ref 31

    Drifting models outperform diffusion, CNN, VAE, and GAN baselines in MRI-to-CT synthesis on two pelvis datasets with higher SSIM/PSNR, lower RMSE, and millisecond one-step inference.

  • MicroDiffuse3D: A Foundation Model for 3D Microscopy Imaging Restoration · cs.CV · 2026-05-08 · unverdicted · ref 44

    MicroDiffuse3D is a foundation model that restores 3D microscopy images under sparse super-resolution, joint degradation, and low-SNR denoising, reporting 10.58% segmentation and 15.59% line-profile gains over baselines.

  • Consistency Regularised Gradient Flows for Inverse Problems · stat.ML · 2026-05-08 · unverdicted · ref 17

    A consistency-regularized Euclidean-Wasserstein-2 gradient flow performs joint posterior sampling and prompt optimization in latent space for efficient low-NFE inverse problem solving with diffusion models.

  • Teacher-Feature Drifting: One-Step Diffusion Distillation with Pretrained Diffusion Representations · cs.CV · 2026-05-08 · unverdicted · ref 2

    A simplified one-step diffusion distillation uses pretrained teacher features directly for drifting loss plus a mode coverage term, achieving FID 1.58 on ImageNet-64 and 18.4 on SDXL.

  • On the Wasserstein Gradient Flow Interpretation of Drifting Models · cs.LG · 2026-05-06 · unverdicted · ref 9

    GMD algorithms correspond to limiting points of Wasserstein gradient flows on the KL divergence with Parzen smoothing and bear resemblance to Sinkhorn divergence fixed points, with extensions to MMD and other divergences.