3 Pith papers cite this work.
Citing papers
-
Tibetan-TTS: Low-Resource Tibetan Speech Synthesis with Large Model Adaptation
Large-model adaptation with Tibetan text handling produces natural speech from limited data, outperforming commercial systems.
-
TSN-Affinity: Similarity-Driven Parameter Reuse for Continual Offline Reinforcement Learning
TSN-Affinity enables continual offline RL via similarity-guided parameter reuse in sparse subnetworks, showing better retention than replay baselines on Atari and robotic arm tasks.
-
Accent Conversion: A Problem-Driven Survey of Sociolinguistic and Technical Constraints
The survey traces the evolution of accent conversion from early DSP approaches to neural models, grounding them in their linguistic foundations and surveying constraints, datasets, evaluation methods, and future directions.