pith. machine review for the scientific record.

arxiv: 1810.01367 · v3 · submitted 2018-10-02 · 💻 cs.LG · cs.CV · stat.ML

Recognition: unknown

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

Authors on Pith: no claims yet
classification 💻 cs.LG · cs.CV · stat.ML
keywords: generative models · architectures · density · distribution estimation · invertible · jacobian
read the original abstract

A promising class of generative models maps points from a simple distribution to a complex distribution through an invertible neural network. Likelihood-based training of these models requires restricting their architectures to allow cheap computation of Jacobian determinants. Alternatively, the Jacobian trace can be used if the transformation is specified by an ordinary differential equation. In this paper, we use Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density. The result is a continuous-time invertible generative model with unbiased density estimation and one-pass sampling, while allowing unrestricted neural network architectures. We demonstrate our approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
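The key trick the abstract mentions is Hutchinson's trace estimator: for a Jacobian J, the expectation E[vᵀJv] over random probe vectors v with zero mean and identity covariance equals tr(J), so the trace can be estimated from Jacobian-vector products alone, without ever materializing J. A minimal sketch of the estimator in NumPy (not the paper's implementation, which applies it to the Jacobian of an ODE's dynamics via automatic differentiation; the function and variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def hutchinson_trace(matvec, dim, n_samples=10_000, rng=rng):
    """Unbiased Monte Carlo estimate of tr(J), given only matvec(v) = J @ v.

    Uses Rademacher probe vectors (entries ±1), for which E[v^T J v] = tr(J).
    """
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=dim)  # Rademacher probe
        total += v @ matvec(v)                  # one Jacobian-vector product
    return total / n_samples

# Toy example: for a fixed linear map f(x) = A @ x, the Jacobian is A itself,
# so the estimate should be close to np.trace(A).
A = rng.normal(size=(5, 5))
est = hutchinson_trace(lambda v: A @ v, dim=5)
exact = np.trace(A)
```

In FFJORD the same idea lets the log-density change along the ODE trajectory be estimated with a single vector-Jacobian product per step, which is what removes the architectural restrictions that discrete normalizing flows need for cheap determinants.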

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 12 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Kurtosis-Guided Denoising Score Matching for Tabular Anomaly Detection

    cs.LG 2026-05 unverdicted novelty 7.0

    K-DSM uses per-feature kurtosis to set noise scales in DSM, enabling effective single-scale anomaly detection on tabular benchmarks in both semi-supervised and unsupervised settings.

  2. Generative Modeling with Orbit-Space Particle Flow Matching

    cs.GR 2026-05 unverdicted novelty 7.0

    OGPP is a particle flow-matching method using orbit-space canonicalization and geometric paths that achieves lower error and fewer steps than prior approaches on 3D benchmarks.

  3. Differentiable free energy surface: a variational approach to directly observing rare events using generative deep-learning models

    physics.comp-ph 2026-04 unverdicted novelty 7.0

    VaFES constructs a latent space from reversible collective variables and variationally optimizes a tractable-density generative model to produce a continuous free energy surface from which rare events are directly sampled.

  4. Training-Free Refinement of Flow Matching with Divergence-based Sampling

    cs.CV 2026-04 unverdicted novelty 7.0

    Flow Divergence Sampler refines flow matching by computing velocity field divergence to correct ambiguous intermediate states during inference, improving fidelity in text-to-image and inverse problem tasks.

  5. Progressive Distillation for Fast Sampling of Diffusion Models

    cs.LG 2022-02 unverdicted novelty 7.0

    Progressive distillation halves sampling steps repeatedly in diffusion models, reaching 4 steps with FID 3.0 on CIFAR-10 from 8192-step samplers.

  6. Conservative Flows: A New Paradigm of Generative Models

    cs.LG 2026-05 unverdicted novelty 6.0

    Conservative flows generate by running probability-preserving stochastic dynamics initialized at data points rather than noise, using corrected Langevin or predictor-corrector mechanisms on top of any pretrained flow ...

  7. Quantum Dynamics via Score Matching on Bohmian Trajectories

    quant-ph 2026-04 unverdicted novelty 6.0

    Neural networks learn the score of the probability density on Bohmian trajectories to recover exact Schrödinger dynamics via self-consistent minimization for nodeless wave functions, demonstrated on double-well splitt...

  8. Region-Constrained Group Relative Policy Optimization for Flow-Based Image Editing

    cs.CV 2026-04 unverdicted novelty 6.0

    RC-GRPO-Editing constrains GRPO exploration to editing regions via localized noise and attention rewards, improving instruction adherence and non-target preservation in flow-based image editing.

  9. MIOFlow 2.0: A unified framework for inferring cellular stochastic dynamics from single cell and spatial transcriptomics data

    cs.LG 2026-03 unverdicted novelty 6.0

    MIOFlow 2.0 learns stochastic cellular trajectories from transcriptomics data via neural SDEs, unbalanced optimal transport for growth, and a joint latent space unifying gene expression with spatial features.

  10. Behavioral Mode Discovery for Fine-tuning Multimodal Generative Policies

    cs.LG 2026-05 unverdicted novelty 5.0

    Unsupervised behavioral mode discovery combined with mutual information rewards enables RL fine-tuning of multimodal generative policies that achieves higher success rates without losing action diversity.

  11. A Unified Measure-Theoretic View of Diffusion, Score-Based, and Flow Matching Generative Models

    cs.LG 2026-05 unverdicted novelty 4.0

    Diffusion, score-based, and flow matching models are unified as instances of learning time-dependent vector fields inducing marginal distributions governed by continuity and Fokker-Planck equations.

  12. Flow Matching Guide and Code

    cs.LG 2024-12 unverdicted novelty 2.0

    Flow Matching is a generative modeling framework with mathematical foundations, design choices, extensions, and open-source PyTorch code for applications like image and text generation.