RAG-HAR: Retrieval Augmented Generation-based Human Activity Recognition
Abstract
Human Activity Recognition (HAR) underpins applications in healthcare, rehabilitation, fitness tracking, and smart environments, yet existing deep learning approaches demand dataset-specific training, large labeled corpora, and significant computational resources. We introduce RAG-HAR, a training-free retrieval-augmented framework that leverages large language models (LLMs) for HAR. RAG-HAR computes lightweight statistical descriptors, retrieves semantically similar samples from a vector database, and uses this contextual evidence to guide LLM-based activity identification. We further enhance RAG-HAR with prompt optimization and an LLM-based activity descriptor that generates context-enriched vector databases, delivering accurate and highly relevant contextual information. With these mechanisms, RAG-HAR achieves state-of-the-art performance across six diverse HAR benchmarks. Most importantly, it attains these improvements without model training or fine-tuning, underscoring its robustness and practical applicability. RAG-HAR also moves beyond known behaviors, enabling the recognition and meaningful labelling of multiple unseen human activities.
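The abstract describes a three-stage pipeline: statistical descriptors, vector-database retrieval, and LLM-based identification. Below is a minimal sketch of how such a pipeline could fit together. The descriptor set (mean/std/min/max), the cosine-similarity store, the prompt wording, and the `call_llm` placeholder are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a retrieval-augmented HAR pipeline in the spirit of
# RAG-HAR. Everything here is an assumption for illustration: the paper's
# actual descriptors, retrieval backend, and prompts may differ.
import numpy as np

def descriptor(window: np.ndarray) -> np.ndarray:
    """Lightweight statistical descriptor for a (timesteps, channels) window."""
    feats = [window.mean(0), window.std(0), window.min(0), window.max(0)]
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-8)  # unit-normalize for cosine sim

class VectorDB:
    """Tiny in-memory stand-in for a vector database of labeled samples."""
    def __init__(self):
        self.vecs, self.labels = [], []

    def add(self, window: np.ndarray, label: str) -> None:
        self.vecs.append(descriptor(window))
        self.labels.append(label)

    def retrieve(self, window: np.ndarray, k: int = 5):
        q = descriptor(window)
        sims = np.stack(self.vecs) @ q  # cosine similarity of unit vectors
        top = np.argsort(-sims)[:k]
        return [(self.labels[i], float(sims[i])) for i in top]

def classify(window: np.ndarray, db: VectorDB, call_llm) -> str:
    """Retrieve similar labeled samples, then ask an LLM to name the activity.

    `call_llm` is a hypothetical hook for any chat-completion API; no model
    is trained or fine-tuned anywhere in this pipeline.
    """
    evidence = "\n".join(f"- {lbl} (similarity {s:.2f})"
                         for lbl, s in db.retrieve(window))
    prompt = (f"Sensor window descriptor: {descriptor(window).round(3).tolist()}\n"
              f"Most similar labeled samples:\n{evidence}\n"
              "Name the activity; an unseen label is allowed.")
    return call_llm(prompt)
```

Because classification reduces to retrieval plus a prompt, new activity classes can be handled by adding labeled windows to the store or, as the abstract suggests, by letting the LLM propose a label for samples that match nothing well.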
Forward citations
Cited by 2 Pith papers
- KD-Judge: A Knowledge-Driven Automated Judge Framework for Functional Fitness Movements on Edge Devices
  KD-Judge structures fitness rules via LLM retrieval and chain-of-thought, then uses pose-guided kinematics for rule-based rep validation with caching for efficient edge deployment, achieving RTF < 1 and speedups up to...
- TRACE: Temporal Reasoning over Context and Evidence for Activity Recognition in Smart Homes
  TRACE improves activity recognition accuracy and temporal coherence in smart homes by integrating multi-source sensor evidence with contextual priors.