A pair selection strategy based on negative similarity dynamics strengthens contrastive supervision in gloss-free sign language translation by reducing noisy negatives.
citation dossier
Effective approaches to attention-based neural machine translation
why this work matters in Pith
Pith has found this work cited in 7 reviewed papers. Its strongest current cluster is cs.CL (4 papers). The largest review-status bucket among the citing papers is UNVERDICTED (6 papers). For highly cited works, this page shows a dossier first and a bounded explorer second; it never attempts to render every citing paper at once.
representative citing papers
Sequential machine learning on jet declustering history trees outperforms static models at identifying jet quenching in heavy-ion collision simulations.
Attention gates added to U-Net automatically focus on target organs in CT images and improve segmentation performance on abdominal datasets.
Pith review generated a malformed one-line summary for this paper.
A gated-fusion CSI predictor using GRU, attention, and DSLH reaches -13.84 dB NMSE with 26% fewer parameters and 2.3x higher throughput than a LinFormer baseline on 3GPP channels.
Sentence-level models outperform skeleton-based approaches for narrative coherence despite a new SSN network improving on cosine and Euclidean baselines.
Gemma 2 models achieve leading performance at their sizes by combining established Transformer modifications with knowledge distillation for the 2B and 9B variants.
citing papers explorer
Jet Quenching Identification via Supervised Learning in Simulated Heavy-Ion Collisions
Sequential machine learning on jet declustering history trees outperforms static models at identifying jet quenching in heavy-ion collision simulations.