MoSE: Modality split and ensemble for multimodal knowledge graph completion
1 Pith paper cites this work. Polarity classification is still indexing.

Citing papers explorer
Fields: cs.LG (1)
Years: 2026 (1)
Verdicts: CONDITIONAL (1)

Representative citing paper:
CMKL: Modality-Aware Continual Learning for Evolving Biomedical Knowledge Graphs
CMKL delivers a 60% gain in average precision on continual entity classification in a 129K-entity biomedical KG benchmark by fusing multimodal features and guarding against modality-specific forgetting, while its relationship-prediction performance stays comparable to baselines.