Towards building the federated GPT: Federated instruction tuning
3 Pith papers cite this work.
Citing papers
-
EdgeFlowerTune: Evaluating Federated LLM Fine-Tuning Under Realistic Edge System Constraints
EdgeFlowerTune is a real-device benchmark that jointly assesses model quality and system costs for federated LLM fine-tuning on edge hardware using three protocols: Quality-under-Budget, Cost-to-Target, and Robustness.
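The summary does not specify how the Cost-to-Target protocol is measured; as a hedged illustration only, such a protocol amounts to running federated rounds while accumulating system cost until model quality reaches a target. The names `train_round`, `evaluate`, and `target_quality` are illustrative, not from the paper.

```python
import time

def cost_to_target(train_round, evaluate, target_quality, max_rounds=100):
    """Hypothetical Cost-to-Target loop: run federated fine-tuning rounds,
    accumulating wall-clock cost, until evaluated quality reaches the target.
    Returns (rounds_used, total_cost), or (None, total_cost) on budget exhaustion."""
    cost = 0.0
    for r in range(1, max_rounds + 1):
        t0 = time.monotonic()
        train_round()                     # one federated round on-device
        cost += time.monotonic() - t0     # system cost proxy (time here; could be energy)
        if evaluate() >= target_quality:  # quality check after each round
            return r, cost
    return None, cost
```

A real benchmark would substitute measured energy or memory for the wall-clock proxy; the loop structure is the point.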
-
FedAttr: Towards Privacy-preserving Client-Level Attribution in Federated LLM Fine-tuning
FedAttr enables privacy-preserving client-level attribution of watermarked data in federated LLM fine-tuning via paired-subset differencing, differential scoring, and Stouffer combination, achieving 100% TPR and 0% FPR with bounded leakage.
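FedAttr's differencing and scoring steps are not detailed here, but the final Stouffer combination is a standard method for pooling independent z-scores: Z = (Σ wᵢzᵢ) / √(Σ wᵢ²), which with unit weights reduces to Σzᵢ / √k. A minimal sketch of that last step, under the assumption that each round yields one attribution z-score per client:

```python
import math

def stouffer_combine(z_scores, weights=None):
    """Combine independent z-scores into a single test statistic via
    Stouffer's method: Z = sum(w_i * z_i) / sqrt(sum(w_i^2)).
    With unit weights this is sum(z_i) / sqrt(k)."""
    if weights is None:
        weights = [1.0] * len(z_scores)
    num = sum(w * z for w, z in zip(weights, z_scores))
    den = math.sqrt(sum(w * w for w in weights))
    return num / den
```

Because the combined statistic grows with the number of rounds, weak per-round watermark evidence can accumulate into a confident attribution verdict.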
-
FedDetox: Robust Federated SLM Alignment via On-Device Data Sanitization
FedDetox uses on-device knowledge-distilled classifiers to sanitize toxic data in federated SLM training, achieving safety alignment comparable to centralized baselines.
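The distilled classifier and its decision rule are not specified in the summary; as a hedged sketch under assumed names (`toxicity_classifier`, `threshold` are illustrative), on-device sanitization amounts to filtering each client's local samples through a toxicity scorer before local fine-tuning:

```python
def sanitize(dataset, toxicity_classifier, threshold=0.5):
    """Hypothetical on-device sanitization step: keep only samples the
    local (knowledge-distilled) classifier scores below the toxicity
    threshold, so toxic data never enters federated training."""
    return [x for x in dataset if toxicity_classifier(x) < threshold]
```

Running the filter locally is what preserves privacy: raw data never leaves the device, yet the aggregated model trains only on sanitized samples.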