The variational fair autoencoder
2 Pith papers cite this work.
2 representative citing papers
-
Fair Dataset Distillation via Cross-Group Barycenter Alignment
Dataset distillation introduces fairness gaps from subgroup pattern mismatches rather than just imbalance; distilling to a group-agnostic barycenter of predictive information reduces these gaps.
-
Distributed Deep Variational Approach for Privacy-preserving Data Release
GPP trains local variational encoders in federated settings to release representations that keep utility within 1% of an autoencoder baseline while driving adversary AUC on sensitive attributes to near-random levels on MNIST, CelebA, and HAPT data.
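The barycenter idea in the first citing paper can be illustrated in one dimension, where the Wasserstein-2 barycenter of several distributions is obtained by averaging their quantile functions pointwise. The helper name and the synthetic two-group data below are assumptions for illustration only, not the paper's actual distillation pipeline:

```python
import numpy as np

def wasserstein_barycenter_1d(groups, n_quantiles=100):
    """Wasserstein-2 barycenter of 1-D sample sets via quantile averaging.

    For 1-D distributions, the W2 barycenter's quantile function is the
    pointwise mean of the input quantile functions (inverse CDFs).
    """
    qs = np.linspace(0.0, 1.0, n_quantiles)
    quantile_curves = [np.quantile(np.asarray(g), qs) for g in groups]
    return np.mean(quantile_curves, axis=0)  # group-agnostic target samples

# Two subgroups whose feature distributions differ in location,
# mimicking the subgroup pattern mismatch the paper targets.
rng = np.random.default_rng(0)
group_a = rng.normal(-1.0, 1.0, size=500)
group_b = rng.normal(+1.0, 1.0, size=500)
bary = wasserstein_barycenter_1d([group_a, group_b])
```

Distilling toward `bary` rather than toward either group's own distribution gives a single group-agnostic target, which is the intuition behind cross-group barycenter alignment.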
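The near-random adversary AUC claimed for GPP corresponds to a standard privacy probe: train a classifier to recover the sensitive attribute from the released representations and check that its AUC stays near 0.5 (chance level). A minimal numpy-only sketch, using a least-squares linear adversary and pure-noise representations as an assumed stand-in for a privacy-preserving encoder's output:

```python
import numpy as np

def auc_score(scores, labels):
    """Rank-based AUC: P(score of a random positive > score of a random negative)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = int(labels.sum())
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(0)
n, d = 2000, 16
s = rng.integers(0, 2, size=n)   # binary sensitive attribute
z = rng.normal(size=(n, d))      # "released" representations: noise, independent of s

# Linear adversary: least-squares probe fit on half the data, scored on the rest.
z_tr, s_tr, z_te, s_te = z[:n // 2], s[:n // 2], z[n // 2:], s[n // 2:]
w, *_ = np.linalg.lstsq(z_tr, s_tr, rcond=None)
adv_auc = auc_score(z_te @ w, s_te)  # near 0.5, since z carries no information about s
```

An AUC close to 0.5 means the adversary does no better than guessing; representations that leak the attribute would push it toward 1.0.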