Federated Learning with Personalization Layers
20 Pith papers cite this work.
citing papers explorer
-
Beyond Rigid Alignment: Graph Federated Learning via Dual Manifold Calibration
FedGMC introduces dual manifold calibration to balance global commonalities and local personalization in graph federated learning, outperforming rigid alignment baselines on eleven homophilic and heterophilic graphs.
-
FedOBP: Federated Optimal Brain Personalization through Cloud-Edge Element-wise Decoupling
FedOBP introduces a quantile-thresholded importance score based on a federated first-order Taylor approximation to select a small set of parameters for personalization, claiming better performance than prior PFL methods.
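The general shape of such a selection rule can be sketched independently of FedOBP's exact criterion. A minimal NumPy sketch, assuming the common first-order Taylor (saliency) score |weight × gradient| and a simple quantile cutoff; the function name, score, and threshold are illustrative, not the paper's precise formulation:

```python
import numpy as np

def select_personal_params(weights, grads, q=0.9):
    """Score each parameter with a first-order Taylor saliency |w * g|
    and keep those above the q-th quantile as the personalized subset."""
    scores = np.abs(weights * grads)       # element-wise importance
    threshold = np.quantile(scores, q)
    return scores >= threshold             # boolean personalization mask

# Toy example: 10 parameters with uniform gradients.
w = np.arange(10, dtype=float)
g = np.ones(10)
mask = select_personal_params(w, g, q=0.8)   # keeps only the top scores
```

Parameters inside the mask would stay local; the rest would be shared and aggregated as usual.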
-
Dynamic Free-Rider Detection in Federated Learning via Simulated Attack Patterns
S2-WEF detects dynamic free-riders in federated learning by simulating WEF attack patterns from prior global models, combining them with mutual deviation scores, and applying two-dimensional clustering without proxy data or pre-training.
-
Unlocking Multi-Site Clinical Data: A Federated Approach to Privacy-First Child Autism Behavior Analysis
A two-layer privacy system using skeletal abstraction and federated learning enables multi-site training for child autism behavior recognition and outperforms standard federated baselines on the MMASD benchmark.
-
On What We Can Learn from Low-Resolution Data
When high-resolution samples are limited, adding low-resolution data improves high-resolution model performance, a claim supported by KL-divergence bounds and experiments on vision transformers and CNNs.
-
MuCALD-SplitFed: Causal-Latent Diffusion for Privacy-Preserving Multi-Task Split-Federated Medical Image Segmentation
MuCALD-SplitFed adds causal-latent diffusion to multi-task split federated learning to raise segmentation accuracy and cut reconstruction and membership-inference leakage compared with standard SplitFed and personalized FL baselines.
-
When To Adapt? Adapting the Model or Data in Federated Medical Imaging
Harmonization works better than personalization for appearance-based domain shifts in federated medical imaging, while personalization is superior for structural shifts; both perform similarly when shifts are small.
-
Federated Cross-Modal Retrieval with Missing Modalities via Semantic Routing and Adapter Personalization
RCSR is a personalization-friendly federated framework that improves cross-modal retrieval accuracy and stability under missing modalities via semantic routing and adapters.
-
HierFedCEA: Hierarchical Federated Edge Learning for Privacy-Preserving Climate Control Optimization Across Heterogeneous Controlled Environment Agriculture Facilities
HierFedCEA delivers a hierarchical federated learning framework for privacy-preserving climate control optimization across heterogeneous CEA facilities, reaching 94% of centralized performance with under 1 MB communication.
-
Prototype-Regularized Federated Learning for Cross-Domain Aspect Sentiment Triplet Extraction
A prototype-regularized federated learning framework exchanges class-level prototypes and applies contrastive regularization to achieve better cross-domain ASTE performance while cutting communication costs.
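Prototype exchange is cheap because a prototype is just a per-class mean embedding. A minimal sketch of the generic mechanism, with invented shapes and names (not the paper's actual architecture):

```python
import numpy as np

def class_prototypes(embeddings, labels, num_classes):
    """Mean embedding per class: the only artifact a client uploads."""
    return np.stack([embeddings[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def aggregate_prototypes(client_protos):
    """Server averages prototypes across clients instead of model weights."""
    return np.mean(np.stack(client_protos), axis=0)

# One client's 2-D embeddings for two classes.
emb = np.array([[1., 0.], [3., 0.], [0., 2.], [0., 4.]])
lab = np.array([0, 0, 1, 1])
local = class_prototypes(emb, lab, num_classes=2)
global_protos = aggregate_prototypes([local, local])  # two identical clients
```

Communication per round is num_classes × embedding_dim floats, versus the full parameter vector for weight-based aggregation, which is where the claimed cost savings come from.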
-
FedMM: Federated Collaborative Signal Quantization for Multi-Market CTR Prediction
FedMM applies a residual quantized VAE with a global federated codebook and local market-specific codebooks to transmit discrete codes that capture shared and specific collaborative patterns for improved CTR prediction across markets.
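Residual quantization itself is a standard mechanism: quantize with one codebook, then quantize what remains with a second. A toy sketch with invented codebooks, assuming a global (federated) first stage and a market-local second stage; this is the generic idea, not FedMM's trained model:

```python
import numpy as np

def quantize(x, codebook):
    """Nearest-neighbour lookup: return code index and codeword."""
    idx = int(np.argmin(np.linalg.norm(codebook - x, axis=1)))
    return idx, codebook[idx]

def residual_quantize(x, global_cb, local_cb):
    """Stage 1 uses the shared codebook; stage 2 quantizes the residual
    with a market-specific codebook.  Only the two integer codes would
    need to cross the network."""
    i1, c1 = quantize(x, global_cb)
    i2, c2 = quantize(x - c1, local_cb)
    return (i1, i2), c1 + c2          # discrete codes and reconstruction

global_cb = np.array([[0., 0.], [4., 4.]])   # shared patterns
local_cb = np.array([[0., 0.], [1., -1.]])   # market-specific refinements
codes, recon = residual_quantize(np.array([5., 3.]), global_cb, local_cb)
```

Transmitting codes instead of raw embeddings keeps market-specific detail out of the shared channel while still letting the global codebook capture cross-market structure.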
-
On the Tradeoffs of On-Device Generative Models in Federated Predictive Maintenance Systems
Experiments on real industrial time series show that partial model sharing improves diffusion model performance in bandwidth-limited non-IID settings, while full sharing stabilizes GAN training but offers less robustness than VAE or DDPM alternatives.
-
FedFrozen: Two-Stage Federated Optimization via Attention Kernel Freezing
FedFrozen improves stability in heterogeneous federated Transformer training by warming up the full model then freezing the attention kernel (query/key) while optimizing the value block under a fixed kernel.
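The two-stage schedule reduces to toggling which parameter groups receive updates. A minimal sketch with a dict of toy attention projections (names Wq/Wk/Wv and the plain SGD step are illustrative, not FedFrozen's training loop):

```python
import numpy as np

rng = np.random.default_rng(0)
params = {"Wq": rng.normal(size=(4, 4)),   # query projection
          "Wk": rng.normal(size=(4, 4)),   # key projection
          "Wv": rng.normal(size=(4, 4))}   # value projection

def sgd_step(params, grads, lr, trainable):
    """Apply an SGD update only to the parameter groups in `trainable`."""
    return {name: (w - lr * grads[name] if name in trainable else w)
            for name, w in params.items()}

grads = {name: np.ones_like(w) for name, w in params.items()}

# Stage 1: warm-up -- every block trains.
params = sgd_step(params, grads, lr=0.1, trainable={"Wq", "Wk", "Wv"})
frozen_q = params["Wq"].copy()

# Stage 2: freeze the attention kernel (query/key); only the value block moves.
params = sgd_step(params, grads, lr=0.1, trainable={"Wv"})
```

Fixing the kernel means heterogeneous clients keep optimizing against the same attention geometry, which is the intuition behind the stability claim.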
-
Fine-Tuning Impairs the Balancedness of Foundation Models in Long-tailed Personalized Federated Learning
Fine-tuning impairs the class balance of foundation models in long-tailed personalized federated learning, a problem FedPuReL addresses through gradient purification using zero-shot predictions and residual-based personalization, achieving better global and local performance.
-
Representation-Aligned Multi-Scale Personalization for Federated Learning
FRAMP generates client-specific models from compact descriptors in federated learning, trains tailored submodels, and aligns representations to balance personalization with global consistency.
-
FedRio: Personalized Federated Social Bot Detection via Cooperative Reinforced Contrastive Adversarial Distillation
FedRio is a new federated framework that outperforms standard federated baselines in social bot detection accuracy and efficiency while staying competitive with centralized models under stronger privacy constraints.
-
PERFECT: Personalized Federated Learning for CBRS Radar Detection
PERFECT applies personalized federated learning to radar detection, achieving 99% recall that matches centralized performance in non-IID settings while preserving privacy.
-
FedKPer: Tackling Generalization and Personalization in Medical Federated Learning via Knowledge Personalization
FedKPer improves the generalization-personalization trade-off in medical federated learning via local knowledge personalization and selective aggregation that emphasizes reliable updates.
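"Selective aggregation that emphasizes reliable updates" can be sketched with a common robustness heuristic: score each client update by its distance to the coordinate-wise median and soften that into aggregation weights. This is an assumed, generic reliability score, not FedKPer's actual criterion:

```python
import numpy as np

def selective_aggregate(updates, temperature=1.0):
    """Down-weight outlying client updates: reliability is the (negative)
    distance to the coordinate-wise median, softmaxed into weights."""
    updates = np.stack(updates)
    median = np.median(updates, axis=0)
    dist = np.linalg.norm(updates - median, axis=1)
    w = np.exp(-dist / temperature)
    w /= w.sum()
    return w @ updates

honest = [np.array([1.0, 1.0]), np.array([1.1, 0.9]), np.array([0.9, 1.1])]
outlier = [np.array([10.0, -10.0])]          # unreliable update
agg = selective_aggregate(honest + outlier)  # stays close to [1, 1]
```

A plain mean over the same four updates would be pulled to roughly [3.25, -1.75]; the reliability weighting suppresses the outlier almost entirely.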
-
A Comparative Study of Federated Learning Aggregation Strategies under Homogeneous and Heterogeneous Data Distributions
Federated aggregation strategies show distinct performance trade-offs in accuracy, loss, and efficiency depending on whether client data distributions are homogeneous or heterogeneous.
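The simplest pair of strategies in such a comparison is size-weighted FedAvg versus uniform averaging; their difference only shows up under heterogeneity. A minimal sketch (scalar "models" for clarity):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg: average client models weighted by local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    w = sizes / sizes.sum()
    return np.tensordot(w, np.stack(client_weights), axes=1)

def uniform_avg(client_weights):
    """Uniform averaging ignores how much data each client holds."""
    return np.mean(np.stack(client_weights), axis=0)

models = [np.array([0.0]), np.array([1.0])]
sizes = [90, 10]                 # heterogeneous: one client dominates
weighted = fedavg(models, sizes)  # pulled toward the large client
uniform = uniform_avg(models)     # treats both clients equally
```

With homogeneous (equal-size, IID) clients the two coincide, which is why the trade-offs only emerge in the heterogeneous setting.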
-
Federated Weather Modeling on Sensor Data
A federated learning framework lets distributed weather sensors train shared deep learning models for forecasting and anomaly detection while keeping raw data private.