2 Pith papers cite this work; polarity classification is still indexing.
- Privatar: Scalable Privacy-preserving Multi-user VR via Secure Offloading. Privatar uses horizontal frequency partitioning and distribution-aware minimal perturbation to enable private offloading of VR avatar reconstruction, supporting 2.37x more users with modest overhead.
- Scaling Language Models: Methods, Analysis & Insights from Training Gopher. Gopher, a 280-billion-parameter language model, achieves state-of-the-art performance on the majority of 152 tasks, with the largest gains in reading comprehension, fact-checking, and toxic-language detection.