citation dossier
Effective Approaches to Attention-based Neural Machine Translation. arXiv preprint arXiv:1508.04025 (2015)
why this work matters in Pith
Pith has found this work cited in 7 reviewed papers. Its strongest cluster is currently cs.CL (4 papers), and the largest review-status bucket among citing papers is UNVERDICTED (6 papers). For highly cited works, this page shows a dossier first and a bounded explorer second; it never tries to render every citing paper at once.
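The dossier-then-bounded-explorer behavior described above can be sketched as follows. This is a minimal illustration only: the `CitingPaper` fields, the `EXPLORER_CAP` value, and the function names are assumptions for the sketch, not Pith's actual data model or implementation.

```python
from collections import Counter
from dataclasses import dataclass

# Illustrative cap on how many citing papers the explorer renders at once;
# the real page size is an assumption, not taken from Pith.
EXPLORER_CAP = 25

@dataclass
class CitingPaper:
    title: str
    summary: str
    cluster: str        # e.g. "cs.CL"
    review_status: str  # e.g. "UNVERDICTED"

def render_dossier(papers):
    """Compute the aggregates shown in the 'why this work matters' block."""
    clusters = Counter(p.cluster for p in papers)
    statuses = Counter(p.review_status for p in papers)
    return {
        "total": len(papers),
        "top_cluster": clusters.most_common(1)[0],
        "top_status": statuses.most_common(1)[0],
    }

def bounded_explorer(papers, cap=EXPLORER_CAP):
    """Never render every citing paper at once: truncate to a fixed cap."""
    return papers[:cap]
```

The key design point is that the dossier is computed over all citing papers, while the explorer only ever materializes a bounded slice of them.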
representative citing papers
Sequential machine learning on jet declustering history trees outperforms static models at identifying jet quenching in heavy-ion collision simulations.
Attention gates added to U-Net automatically focus on target organs in CT images and improve segmentation performance on abdominal datasets.
Pith review generated a malformed one-line summary.
A gated-fusion CSI predictor using GRU, attention, and DSLH reaches -13.84 dB NMSE with 26% fewer parameters and 2.3x higher throughput than a LinFormer baseline on 3GPP channels.
Sentence-level models outperform skeleton-based approaches for narrative coherence, even though a new SSN network improves on cosine and Euclidean baselines.
Gemma 2 models achieve leading performance at their sizes by combining established Transformer modifications with knowledge distillation for the 2B and 9B variants.
citing papers explorer
- Selective Contrastive Learning for Gloss-Free Sign Language Translation
  A pair selection strategy based on negative similarity dynamics strengthens contrastive supervision in gloss-free sign language translation by reducing noisy negatives.
- Jet Quenching Identification via Supervised Learning in Simulated Heavy-Ion Collisions
  Sequential machine learning on jet declustering history trees outperforms static models at identifying jet quenching in heavy-ion collision simulations.
- Attention U-Net: Learning Where to Look for the Pancreas
  Attention gates added to U-Net automatically focus on target organs in CT images and improve segmentation performance on abdominal datasets.
- Attention Is All You Need
  Pith review generated a malformed one-line summary.
- Resource-Efficient CSI Prediction: A Gated Fusion and Factorized Projection Approach
  A gated-fusion CSI predictor using GRU, attention, and DSLH reaches -13.84 dB NMSE with 26% fewer parameters and 2.3x higher throughput than a LinFormer baseline on 3GPP channels.
- Skeleton-based Coherence Modeling in Narratives
  Sentence-level models outperform skeleton-based approaches for narrative coherence, even though a new SSN network improves on cosine and Euclidean baselines.
- Gemma 2: Improving Open Language Models at a Practical Size
  Gemma 2 models achieve leading performance at their sizes by combining established Transformer modifications with knowledge distillation for the 2B and 9B variants.