BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
2 Pith papers cite this work. Polarity classification is still indexing.
Fields: cs.LG (2) · Years: 2026 (2) · Verdicts: UNVERDICTED (2)
Representative citing papers: 2
- Steering Without Breaking: Mechanistically Informed Interventions for Discrete Diffusion Language Models
  Adaptive scheduling of interventions in discrete diffusion language models, timed to attribute-specific commitment schedules discovered with sparse autoencoders, delivers precise multi-attribute steering at up to 93% strength while preserving generation quality.
- Prototype Guided Post-pretraining for Single-Cell Representation Learning
  CellRefine adds a marker-gene-guided post-pretraining stage to single-cell models that refines the cell embedding manifold and improves downstream task performance by up to 15%.