Divide by Question, Conquer by Agent: SPLIT-RAG with Question-Driven Graph Partitioning
2 Pith papers cite this work. Polarity classification is still indexing.
Citing papers

- MAGE: Multi-Agent Self-Evolution with Co-Evolutionary Knowledge Graphs
  MAGE uses a four-subgraph co-evolutionary knowledge graph plus dual bandits to externalize and retrieve experience, enabling stable self-evolution of frozen language-model agents, with gains across nine diverse benchmarks.
- LogosKG: Hardware-Optimized, Scalable, and Interpretable Knowledge Graph Retrieval
  LogosKG is a hardware-aligned system for efficient multi-hop retrieval over billion-edge knowledge graphs without sacrificing fidelity, demonstrated on biomedical KG-LLM applications.
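The "dual bandits" mentioned in the MAGE summary refer to a classic exploration-exploitation mechanism. As a rough illustration of how a bandit can choose among retrieval strategies, here is a minimal, generic UCB1 sketch; the arm names, reward signal, and `select`/`update` interface are illustrative assumptions only, not MAGE's actual algorithm.

```python
import math

class UCB1:
    """Generic UCB1 bandit: balances exploring candidate retrieval
    strategies (arms) against exploiting the best-performing one."""

    def __init__(self, arms):
        self.arms = list(arms)
        self.counts = {a: 0 for a in self.arms}   # plays per arm
        self.values = {a: 0.0 for a in self.arms} # running mean reward
        self.total = 0                            # total plays

    def select(self):
        # Play each arm once before applying the UCB rule.
        for a in self.arms:
            if self.counts[a] == 0:
                return a
        # UCB score: empirical mean + exploration bonus.
        return max(
            self.arms,
            key=lambda a: self.values[a]
            + math.sqrt(2 * math.log(self.total) / self.counts[a]),
        )

    def update(self, arm, reward):
        # Incremental mean update after observing a reward.
        self.counts[arm] += 1
        self.total += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Hypothetical retrieval strategies as arms (names are made up).
bandit = UCB1(["semantic", "structural", "temporal"])
arm = bandit.select()
bandit.update(arm, reward=1.0)
```

In a retrieval setting, the reward would typically be some downstream success signal (e.g. whether the retrieved experience helped the agent), but how MAGE defines it is not stated in the summary above.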