Decoupling Endpoint and Semantic Transition Learning for Zero-Shot Composed Image Retrieval
DeCIR decouples endpoint alignment from semantic transition alignment in projection-based ZS-CIR via paired edit tuples, separate low-rank adapters, and LRDM merging, yielding consistent gains on CIRR, CIRCO, FashionIQ, and GeneCIS without added inference cost.
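The summary does not spell out LRDM's exact merging rule, but the claim of "no added inference cost" follows from a standard property of low-rank adapters: their updates can be folded into the frozen base weight before deployment. The sketch below is a generic, hypothetical illustration of merging two low-rank adapters (one per alignment objective) into a single dense projection matrix; all names, shapes, and the uniform 0.5 merge weights are assumptions, not DeCIR's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2  # hypothetical feature dimension and adapter rank
W = rng.standard_normal((d, d))  # frozen base projection weight

# Two independently trained low-rank adapters (hypothetical):
# one for endpoint alignment, one for semantic-transition alignment.
A_end, B_end = rng.standard_normal((d, r)), rng.standard_normal((r, d))
A_tr, B_tr = rng.standard_normal((d, r)), rng.standard_normal((r, d))

def merge(W, adapters, weights):
    """Fold weighted low-rank updates into the base matrix.

    After merging, inference uses one dense weight, so the adapters
    contribute no extra compute at retrieval time.
    """
    W_merged = W.copy()
    for (A, B), w in zip(adapters, weights):
        W_merged += w * (A @ B)  # each delta is rank-r
    return W_merged

W_merged = merge(W, [(A_end, B_end), (A_tr, B_tr)], weights=[0.5, 0.5])

# The merged weight equals the base plus the weighted low-rank deltas.
assert np.allclose(W_merged, W + 0.5 * (A_end @ B_end) + 0.5 * (A_tr @ B_tr))
```

Because the merge happens once, offline, a query at retrieval time costs exactly one matrix multiply with `W_merged`, the same as the unadapted baseline.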