Importance Weighted Autoencoders
5 Pith papers cite this work.
representative citing papers
- Density estimation using Real NVP
  Real NVP uses affine coupling layers to create invertible transformations that support exact density estimation, sampling, and latent inference without approximations.
- End-to-End Identifiable and Consistent Recurrent Switching Dynamical Systems
  Identifiability is proven for recurrent nonlinear switching dynamical systems under flexible assumptions, and ΩSDS is introduced as a flow-based estimator that improves disentanglement and forecasting over VAE-based methods.
- A renormalization-group inspired lattice-based framework for piecewise generalized linear models
  RG-inspired lattice models for piecewise GLMs provide explicit, interpretable partitions and a replica-analysis-derived scaling law for regularization that allows model complexity to grow without an expected rise in generalization loss.
- Learning to Theorize the World from Observation
  NEO induces compositional latent programs as world theories from observations and executes them to enable explanation-driven generalization.
- QHyer: Q-conditioned Hybrid Attention-mamba Transformer for Offline Goal-conditioned RL
  QHyer replaces return-to-go with a state-conditioned Q-estimator and adds a gated hybrid attention-Mamba backbone to achieve state-of-the-art performance in offline goal-conditioned RL on both Markovian and non-Markovian datasets.
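The first citing paper above, Real NVP, gets exact invertibility and exact likelihoods from affine coupling layers: half the input passes through unchanged, and the other half is scaled and shifted by functions of the unchanged half, so the Jacobian is triangular and its log-determinant is just the sum of the log-scales. A minimal NumPy sketch of one such layer (the toy fixed-weight scale/shift networks are hypothetical, for illustration only):

```python
import numpy as np

def affine_coupling_forward(x, scale_net, shift_net, d):
    """Keep x[:d] fixed; affinely transform x[d:] using nets that see only x[:d]."""
    x1, x2 = x[:d], x[d:]
    s = scale_net(x1)            # log-scales, a function of x1 only
    t = shift_net(x1)            # shifts, a function of x1 only
    y2 = x2 * np.exp(s) + t
    log_det = np.sum(s)          # triangular Jacobian: log|det J| = sum of log-scales
    return np.concatenate([x1, y2]), log_det

def affine_coupling_inverse(y, scale_net, shift_net, d):
    """Exact inverse in closed form: no iterative solve, no approximation."""
    y1, y2 = y[:d], y[d:]
    s = scale_net(y1)            # same s, t as forward, since y1 == x1
    t = shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

# Hypothetical toy scale/shift "networks" with fixed random weights.
rng = np.random.default_rng(0)
W_s, W_t = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
scale_net = lambda h: np.tanh(W_s @ h)   # tanh keeps log-scales bounded
shift_net = lambda h: W_t @ h

x = rng.normal(size=4)
y, log_det = affine_coupling_forward(x, scale_net, shift_net, d=2)
x_rec = affine_coupling_inverse(y, scale_net, shift_net, d=2)
assert np.allclose(x, x_rec)             # inversion is exact
```

In the full model, successive coupling layers swap (or permute) which half is held fixed so that every dimension is eventually transformed, and the exact log-density follows from the change-of-variables formula by summing the per-layer `log_det` terms.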