pith. machine review for the scientific record.

arxiv: 1808.04730 · v3 · submitted 2018-08-14 · 💻 cs.LG · stat.ML


Analyzing Inverse Problems with Invertible Neural Networks

keywords INNs, inverse, networks, neural, parameter, measurement, parameters, ambiguous
original abstract

In many tasks, in particular in natural science, the goal is to determine hidden system parameters from a set of measurements. Often, the forward process from parameter- to measurement-space is a well-defined function, whereas the inverse problem is ambiguous: one measurement may map to multiple different sets of parameters. In this setting, the posterior parameter distribution, conditioned on an input measurement, has to be determined. We argue that a particular class of neural networks is well suited for this task -- so-called Invertible Neural Networks (INNs). Although INNs are not new, they have, so far, received little attention in literature. While classical neural networks attempt to solve the ambiguous inverse problem directly, INNs are able to learn it jointly with the well-defined forward process, using additional latent output variables to capture the information otherwise lost. Given a specific measurement and sampled latent variables, the inverse pass of the INN provides a full distribution over parameter space. We verify experimentally, on artificial data and real-world problems from astrophysics and medicine, that INNs are a powerful analysis tool to find multi-modalities in parameter space, to uncover parameter correlations, and to identify unrecoverable parameters.
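The abstract's core idea — an invertible map whose forward pass predicts measurements plus latent variables, and whose exact inverse yields parameter samples — can be sketched with a single RealNVP-style affine coupling block, the building block typically used in INNs. This is a minimal illustration, not the paper's implementation: the class name and the fixed random linear maps standing in for the learned subnetworks are hypothetical stand-ins.

```python
import numpy as np

class AffineCoupling:
    """One invertible affine coupling block (RealNVP-style).

    Splits the input into two halves; the first half parameterizes a
    scale and shift applied to the second, so the transform has an
    exact analytic inverse.
    """
    def __init__(self, dim, rng):
        self.d = dim // 2
        # Toy "subnetworks": fixed random linear maps standing in for
        # the learned scale s(.) and translation t(.) networks.
        self.Ws = rng.normal(size=(self.d, self.d)) * 0.1
        self.Wt = rng.normal(size=(self.d, self.d)) * 0.1

    def forward(self, x):
        # parameters -> [measurement, latent]-like output
        x1, x2 = x[:self.d], x[self.d:]
        s, t = self.Ws @ x1, self.Wt @ x1
        y2 = x2 * np.exp(s) + t          # invertible affine transform
        return np.concatenate([x1, y2])

    def inverse(self, y):
        # output (with sampled latents) -> parameters, exactly
        y1, y2 = y[:self.d], y[self.d:]
        s, t = self.Ws @ y1, self.Wt @ y1
        x2 = (y2 - t) * np.exp(-s)       # exact analytic inverse
        return np.concatenate([y1, x2])

rng = np.random.default_rng(0)
block = AffineCoupling(4, rng)
x = rng.normal(size=4)           # hidden system parameters
y = block.forward(x)             # forward pass
x_rec = block.inverse(y)         # inverse pass reconstructs x exactly
print(np.allclose(x, x_rec))     # prints True
```

In a full INN, many such blocks are stacked with permutations in between; at inference time the measurement part of `y` is fixed and the latent part is resampled, so repeated inverse passes produce a distribution over parameter space.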

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 6 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. DualTCN: A Physics-Constrained Temporal Convolutional Network for Time-Domain Marine CSEM Inversion

    cs.LG 2026-05 unverdicted novelty 7.0

    DualTCN is the first deep-learning model for time-domain marine CSEM inversion that regresses four earth parameters, achieves high accuracy on simulated data, and runs up to 21,000 times faster than classical optimizers.

  2. Designing Solutions to Geophysical Inverse Problems by Changing Variables

    physics.geo-ph 2026-04 unverdicted novelty 6.0

    Different parametrizations of the same geophysical inverse problem yield inconsistent Bayesian posterior distributions and deterministic inversion results even when they encode identical information.

  3. Reversible Deep Learning for 13C NMR in Chemoinformatics: On Structures and Spectra

    cs.LG 2026-02 unverdicted novelty 6.0

    A conditional invertible neural network unifies forward prediction of 13C NMR spectra from structures and inverse generation of structure candidates from spectra.

  4. Beyond Structure: Revolutionising Materials Discovery via AI-Driven Synthesis Protocol-Property Relationships

    cond-mat.mtrl-sci 2026-05 unverdicted novelty 5.0

    A perspective proposes a synthesis-first paradigm for AI-driven materials discovery, treating protocols rather than structures as the key variables to close the synthesizability gap via machine-readable recipes, gener...

  5. Generative Design of a Gas Turbine Combustor Using Invertible Neural Networks

    cs.AI 2026-04 unverdicted novelty 5.0

    Invertible Neural Networks are used to generate gas turbine combustor designs that meet specified performance criteria from a training database of parameterized designs and simulations.

  6. Generative models for decision-making under distributional shift

    cs.LG 2026-04 unverdicted novelty 3.0

    Generative models via pushforward maps, Fokker-Planck equations, and Wasserstein geometry enable learning nominal uncertainty, stressed distributions for robustness, and conditional posteriors under distributional shift.