Attention Is All You Need
10 Pith papers cite this work. Polarity classification is still indexing.
10 representative citing papers (2026)
- CTQWformer: A CTQW-based Transformer for Graph Classification
CTQWformer fuses continuous-time quantum walks into a graph transformer and recurrent module to outperform standard GNNs and graph kernels on classification benchmarks.
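A minimal sketch of the continuous-time quantum walk idea behind such features (an illustration, not the paper's pipeline): the propagator U(t) = exp(-iAt) over the graph's adjacency matrix A gives, for a walker started at each node, the probability of finding it at every other node after time t.

```python
import numpy as np
from scipy.linalg import expm

def ctqw_features(adj: np.ndarray, t: float = 1.0) -> np.ndarray:
    """Continuous-time quantum walk probabilities for one graph.

    U(t) = exp(-i * A * t) evolves a walker placed on each node;
    row j of |U|^2 is the probability of observing a walker started
    at node j on each node after time t (rows sum to 1 since U is unitary).
    """
    U = expm(-1j * adj * t)   # unitary propagator
    return np.abs(U) ** 2     # (n, n) walk-probability features

# Toy example: a 3-node path graph.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
feats = ctqw_features(A, t=0.5)
```

How these probabilities are injected into the transformer's attention or recurrent module is specific to the paper; this only shows the walk computation itself.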
- Evolving Knowledge Distillation for Lightweight Neural Machine Translation
EKD trains lightweight NMT students progressively from a chain of teachers with rising capacity, achieving BLEU scores within 0.08 of the largest teacher on IWSLT-14.
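The progressive-chain idea can be sketched with temperature-softened KL distillation (a toy illustration; the teacher chain and the "update" here are hypothetical stand-ins for gradient training of real NMT models):

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_distill_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

# Progressive schedule: one pass per teacher, smallest capacity first.
teacher_chain = [np.array([1.0, 0.5, 0.0]),   # small teacher's logits
                 np.array([2.0, 0.5, -1.0])]  # larger teacher's logits
student = np.zeros(3)
for teacher in teacher_chain:
    # stand-in for a training pass: move the student toward this teacher
    student = 0.5 * student + 0.5 * teacher
```

After the chain, the student sits closer (in distillation loss) to the final, largest teacher than an untrained student would.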
- MAG-Net: Physics-Aware Multi-Modal Fusion of Geostationary Satellite and Radar for Severe Convective Precipitation Nowcasting
MAG-Net integrates radar dynamics with satellite IR, WV, and BTD channels via dual-stream encoding and uncertainty-weighted decoding to raise CSI40 by 0.083 over prior baselines for intense convective events.
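"Uncertainty-weighted decoding" plausibly resembles the standard learned-log-variance weighting of per-task losses; a sketch under that assumption (the s_i are learned alongside the network):

```python
import math

def uncertainty_weighted_loss(losses, log_vars):
    """Combine per-head losses L_i with learned log-variances s_i:

        total = sum_i( exp(-s_i) * L_i + s_i )

    Heads the model is uncertain about (large s_i) are down-weighted,
    with the +s_i term penalizing unbounded uncertainty.
    """
    return sum(math.exp(-s) * L + s for L, s in zip(losses, log_vars))
```

With all s_i = 0 this reduces to a plain sum of the per-head losses.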
- Light-ResKAN: A Parameter-Sharing Lightweight KAN with Gram Polynomials for Efficient SAR Image Recognition
Light-ResKAN reaches 99.09% accuracy on MSTAR SAR images with 82.9 times fewer FLOPs and 163.78 times fewer parameters than VGG16 by combining KAN convolutions, Gram polynomials, and channel-wise parameter sharing.
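The KAN ingredient can be sketched as a learnable per-edge function expressed in an orthogonal-polynomial basis. The three-term recurrence below is the Legendre one, used as a stand-in for the paper's Gram basis (assumption); channel-wise parameter sharing would reuse one coefficient vector across channels.

```python
import numpy as np

def poly_basis(x: np.ndarray, degree: int) -> np.ndarray:
    """Orthogonal-polynomial basis P_0..P_degree via a three-term
    recurrence (Legendre-style, standing in for the Gram basis)."""
    x = np.asarray(x, dtype=float)
    basis = [np.ones_like(x), x]
    for n in range(1, degree):
        basis.append(((2 * n + 1) * x * basis[-1] - n * basis[-2]) / (n + 1))
    return np.stack(basis[: degree + 1], axis=-1)  # (..., degree+1)

def kan_edge(x, coeffs):
    """One KAN edge function phi(x) = sum_k c_k * P_k(x); sharing one
    `coeffs` vector across channels is what cuts the parameter count."""
    return poly_basis(x, len(coeffs) - 1) @ np.asarray(coeffs, dtype=float)
```

With coefficients (0, 1, 0) the edge reduces to the identity map, since P_1(x) = x.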
- TSNN: A Non-parametric and Interpretable Framework for Traffic Time Series Forecasting
TSNN matches time series entries to a training-derived memory bank to forecast traffic without any trainable parameters and achieves competitive accuracy on four real-world datasets.
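The memory-bank matcher can be sketched as parameter-free nearest-neighbor lookup (an illustration; the window size and Euclidean metric are assumptions, not the paper's settings):

```python
import numpy as np

def build_memory(series: np.ndarray, window: int):
    """Slice a training series into (pattern, next-value) pairs."""
    keys = np.stack([series[i:i + window] for i in range(len(series) - window)])
    vals = series[window:]
    return keys, vals

def forecast(recent: np.ndarray, keys: np.ndarray, vals: np.ndarray) -> float:
    """Predict the next point as the continuation of the closest stored
    pattern -- no trainable parameters anywhere."""
    dists = np.linalg.norm(keys - recent, axis=1)
    return float(vals[np.argmin(dists)])

# Periodic toy series: the matcher recovers the repeating pattern.
train = np.array([0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3], dtype=float)
keys, vals = build_memory(train, window=3)
pred = forecast(np.array([1.0, 2.0, 3.0]), keys, vals)
```

The prediction is also interpretable by construction: the retrieved index points at exactly which training pattern produced it.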
- Efficient Emotion-Aware Iconic Gesture Prediction for Robot Co-Speech
Lightweight transformer predicts iconic gesture placement and intensity from text and emotion for robot co-speech, outperforming GPT-4o on BEAT2 without audio input.
- Spatial-Aware Conditioned Fusion for Audio-Visual Navigation
SACF discretizes target direction and distance from audio-visual cues, then applies conditioned fusion to improve navigation efficiency and generalization to unheard sounds.
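The discretization step might look like the following binning of an estimated goal offset (bin counts and distance edges are illustrative, not the paper's settings):

```python
import math

def discretize_goal(dx: float, dy: float, n_dirs: int = 8,
                    dist_edges=(1.0, 3.0, 6.0)):
    """Map a continuous goal offset (dx, dy) to a (direction bin, distance bin)
    pair that a conditioning module can consume as discrete tokens."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    dir_bin = int(angle / (2 * math.pi / n_dirs)) % n_dirs   # 0..n_dirs-1
    dist = math.hypot(dx, dy)
    dist_bin = sum(dist > edge for edge in dist_edges)       # 0..len(dist_edges)
    return dir_bin, dist_bin
```

Discrete direction/distance tokens generalize more readily than raw coordinates, since unheard sounds at similar bearings fall into the same bins.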
- Safe Decentralized Operation of EV Virtual Power Plant with Limited Network Visibility via Multi-Agent Reinforcement Learning
TL-MAPPO enables safe decentralized EV VPP operation under limited visibility, cutting voltage violations by 45% and costs by 10% versus baselines on a 33-bus network.
- PrismNet: Viewing Time Series Through a Multi-Modal Prism for Interpretable Power Load Forecasting
PrismNet combines text and image modalities with time series via a PID-guided contrastive learning module to boost few-shot power load forecasting accuracy and provide interpretability.
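A contrastive module aligning modality embeddings can be sketched with a plain InfoNCE loss (a generic stand-in; the paper's PID-guided variant adds structure beyond this):

```python
import numpy as np

def info_nce(anchors: np.ndarray, positives: np.ndarray,
             temperature: float = 0.1) -> float:
    """InfoNCE between paired embeddings: row i of `anchors` (e.g. a time-series
    embedding) should match row i of `positives` (e.g. its text/image embedding)
    against all other rows in the batch."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = a @ p.T / temperature                   # (n, n) cosine similarities
    sims = sims - sims.max(axis=1, keepdims=True)  # numerical stability
    logprob = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(logprob)))       # matched pairs on diagonal
```

When each anchor already matches its positive far better than the alternatives, the loss is near zero.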
- LoKA: Low-precision Kernel Applications for Recommendation Models At Scale