Catastrophic interference in connectionist networks: The sequential learning problem
2 Pith papers cite this work. Polarity classification is still indexing.
2 Pith papers citing it · years: 2026 · 2 verdicts (UNVERDICTED) · 2 representative citing papers
Citing papers explorer
- Stop Marginalizing My Dreams: Model Inversion via Laplace Kernel for Continual Learning
  REMIX uses a Laplace kernel parameterization to enable scalable full-covariance modeling in model inversion, improving synthetic-sample quality and performance in data-free continual learning.
- Characterizing and Correcting Effective Target Shift in Online Learning
  Online kernel regression equals offline regression with shifted targets; correcting the targets lets online learning match offline performance, and even outperform training on the true targets, in continual image classification.
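The REMIX summary above turns on one concrete construction: a full covariance matrix parameterized by a Laplace kernel, K(x, x') = exp(-||x - x'||_1 / sigma), so that a large covariance is controlled by a single bandwidth rather than O(d^2) free entries. A minimal sketch of that idea (illustrative only; the grid of points, sigma, and the jitter are assumptions, not REMIX's actual settings):

```python
import numpy as np

def laplace_gram(X, sigma=1.0):
    """Gram matrix of the Laplace kernel exp(-||x - x'||_1 / sigma)."""
    # Pairwise L1 distances between rows of X.
    d = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=-1)
    return np.exp(-d / sigma)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))        # 50 hypothetical feature locations
K = laplace_gram(X, sigma=2.0)      # full 50x50 covariance, one scalar parameter

# The Laplace kernel is positive definite, so K is a valid covariance;
# a small jitter keeps the Cholesky factorization numerically stable.
L = np.linalg.cholesky(K + 1e-6 * np.eye(50))
samples = L @ rng.normal(size=(50, 4))   # 4 correlated draws from N(0, K)
```

The point of the sketch is scalability: the entire 50x50 covariance is determined by sigma alone, rather than by 50*51/2 free parameters.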
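The second summary contrasts online with offline kernel regression. The paper's target-shift correction is not reproduced here; the sketch below only illustrates the baseline gap it addresses, comparing closed-form offline kernel ridge regression against a generic single-pass online rule (kernel SGD that adds one dual coefficient per sample). The data, kernel width, regularizer, and learning rate are all assumptions for illustration:

```python
import numpy as np

def laplace_kernel(a, b, sigma=1.0):
    # 1-D Laplace kernel exp(-|a - b| / sigma) between two point sets.
    return np.exp(-np.abs(a[:, None] - b[None, :]) / sigma)

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-3.0, 3.0, size=40))
y = np.sin(X) + 0.1 * rng.normal(size=40)

K = laplace_kernel(X, X)
lam = 1e-3

# Offline: closed-form kernel ridge regression, alpha = (K + lam*I)^{-1} y.
alpha_off = np.linalg.solve(K + lam * np.eye(40), y)
mse_off = np.mean((K @ alpha_off - y) ** 2)

# Naive online baseline: a single pass, adding one dual coefficient per
# sample from its instantaneous residual and never revisiting old ones.
alpha_on = np.zeros(40)
lr = 0.5
for t in range(40):
    pred = K[t, :t] @ alpha_on[:t]    # prediction from past samples only
    alpha_on[t] = lr * (y[t] - pred)  # new coefficient for sample t
mse_on = np.mean((K @ alpha_on - y) ** 2)
```

The single-pass fit generally lags the closed-form solution on the training targets; the cited paper's claim is that this gap is equivalent to an effective shift of the targets, so correcting the targets lets the online learner match the offline one.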