pith. machine review for the scientific record.

arxiv: 2605.10317 · v1 · submitted 2026-05-11 · 💻 cs.LG · cs.AI

Recognition: 1 theorem link · Lean Theorem

Relations Are Channels: Knowledge Graph Embedding via Kraus Decompositions

Authors on Pith: no claims yet

Pith reviewed 2026-05-12 03:41 UTC · model grok-4.3

classification: 💻 cs.LG · cs.AI
keywords: knowledge graph embedding · Kraus channels · relation operators · complete positivity · many-to-many relations · graph embeddings · quantum channels

The pith

Relation operators in knowledge graph embeddings must be linear, trace-preserving, and completely positive, which defines them as Kraus channels.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper shows that any principled way to turn a relation into an operator on entity vectors has to obey three rules: it must act linearly, preserve the trace of the matrices it acts on, and remain completely positive. These three rules are exactly the conditions under which the Kraus representation theorem characterizes a map as a quantum channel. Once relations are viewed this way, most older embedding methods appear as the simplest possible case, in which only one Kraus operator is needed. The same view produces a concrete new model that handles many-to-many links without extra path encoders and without forcing entity vectors to have unit length.

Core claim

Linearity, trace preservation, and complete positivity are necessary and jointly sufficient for any relation operator in the KGE setting; by the Kraus representation theorem these axioms are equivalent to the completeness constraint that defines the family of Kraus channels. Existing operator-based models are recovered as the rank-1 special case under particular embedding choices. The same characterization extends to arbitrary metric spaces via w-Kraus channels that satisfy completeness by construction.
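Written out in the standard quantum-information notation (the paper's exact conventions may differ), the channel assigned to a relation acts on an entity state $\rho$ as $\Phi(\rho) = \sum_{i=1}^{\kappa} K_i \rho K_i^{\dagger}$ with the completeness constraint $\sum_{i=1}^{\kappa} K_i^{\dagger} K_i = I$; here $\kappa$ is the Kraus rank listed as a free parameter in the ledger below, and the rank-1 special case is a single operator $K_1$ satisfying $K_1^{\dagger} K_1 = I$.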

What carries the argument

Kraus channels, which express any completely positive trace-preserving linear map as a sum of Kraus operators acting on the embedding space.
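To make the machinery concrete, a minimal sketch (not the paper's implementation: lifting entity vectors to trace-one matrices and using real-valued operators are simplifying assumptions made here) that builds Kraus operators satisfying the completeness constraint by construction and checks that the resulting channel preserves trace:

```python
import numpy as np

def random_kraus_ops(dim, kappa, seed=0):
    """Sample kappa real Kraus operators satisfying sum_i K_i^T K_i = I."""
    rng = np.random.default_rng(seed)
    # Orthonormal columns of a tall (kappa*dim x dim) block give the completeness constraint for free.
    q, _ = np.linalg.qr(rng.normal(size=(kappa * dim, dim)))
    return [q[i * dim:(i + 1) * dim] for i in range(kappa)]

def apply_channel(kraus_ops, rho):
    """Kraus channel: rho -> sum_i K_i rho K_i^T."""
    return sum(K @ rho @ K.T for K in kraus_ops)

dim, kappa = 4, 3
Ks = random_kraus_ops(dim, kappa)

# Toy "entity state": outer product of an entity vector, normalised to trace one
# (one plausible lifting of a vector embedding to a matrix; an assumption here).
e = np.random.default_rng(1).normal(size=dim)
rho = np.outer(e, e) / np.dot(e, e)

print(np.allclose(sum(K.T @ K for K in Ks), np.eye(dim)))  # completeness holds
print(np.isclose(np.trace(apply_channel(Ks, rho)), 1.0))   # trace is preserved
```

With $\kappa = 1$ the completeness constraint forces a single orthogonal operator, which is one way to see why rank-1 channels resemble the rotation- and orthogonal-matrix models the review says are recovered as special cases.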

If this is right

  • Existing operator-based KGE models become special cases with Kraus rank one under specific embedding choices.
  • The KrausKGE model handles one-to-many and many-to-many relations without explicit path encoders or norm constraints on entity vectors.
  • A per-relation complexity measure appears for the first time, with a provable lower bound given by the rank of the empirical relation matrix (see the sketch after this list).
  • Performance gains on N-to-N relations grow monotonically with relation fan-out, matching the theoretical prediction.
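On the complexity measure in the third bullet: the abstract only states a lower bound "in terms of the empirical relation matrix rank", so the sketch below assumes the most natural reading, a 0/1 head-by-tail matrix of observed triples per relation, and computes its rank; the matrix definition is this review's assumption, not the paper's.

```python
import numpy as np

def relation_rank_lower_bound(triples, relation, num_entities):
    """Rank of the (assumed) empirical relation matrix for one relation:
    M_r[h, t] = 1 iff (h, relation, t) is an observed triple."""
    M = np.zeros((num_entities, num_entities))
    for h, r, t in triples:
        if r == relation:
            M[h, t] = 1.0
    return np.linalg.matrix_rank(M)

# Toy graph with 5 entities: relation 0 is one-to-one, relation 1 is N-to-N.
triples = [
    (0, 0, 1), (2, 0, 3),
    (0, 1, 1), (0, 1, 2), (3, 1, 1), (4, 1, 2), (4, 1, 3),
]
print(relation_rank_lower_bound(triples, 0, 5))  # 2
print(relation_rank_lower_bound(triples, 1, 5))  # 3 (higher fan-out, higher bound)
```

One could imagine using this number to set a per-relation Kraus rank or regularization budget, as the extensions below suggest, but that is an editorial extrapolation rather than a procedure the paper describes.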

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The channel view may let researchers import design tools from quantum information to create embeddings that respect additional physical or geometric constraints.
  • The rank-based complexity number could be used at training time to decide how many parameters or how much regularization to allocate to each relation.
  • The same three axioms could be checked or enforced in other graph tasks such as node classification or temporal link prediction.

Load-bearing premise

That linearity, trace preservation, and complete positivity are the necessary and jointly sufficient conditions for any principled relation operator in knowledge-graph embedding.

What would settle it

An embedding method that violates at least one of the three axioms yet matches or exceeds KrausKGE performance on standard link-prediction benchmarks, or a dataset where accuracy on N-to-N relations does not rise with fan-out as predicted by the rank lower bound.

Figures

Figures reproduced from arXiv: 2605.10317 by Sayan Kumar Chaki.

Figure 1: Ablation studies on FB15k-237 isolating each design choice.
Original abstract

Knowledge graph embedding (KGE) models typically represent each relation as an operator on entity embeddings. In this work, we identify three structural axioms that any principled relation operator must satisfy, linearity, trace preservation, and complete positivity, and show that they characterize a Kraus channel structure via the Kraus representation theorem. The completeness constraint defining this family is equivalent to these axioms, providing a principled foundation rather than an externally imposed condition. Under this formulation, most existing operator-based KGE models are recoverable as special cases with Kraus rank $\kappa = 1$ under specific embedding choices. We further generalize this characterization to arbitrary metric geometries by introducing \mbox{w-Kraus} channels, which satisfy completeness by construction within their respective spaces. Building on this theory, we propose \textsc{KrausKGE}, a principled KGE model that naturally handles $1$-to-$N$ and $N$-to-$N$ relations, supports $k$-hop reasoning without requiring explicit path encoders, and eliminates the need for norm constraints on entity embeddings. Additionally, our framework yields the first theoretically grounded per-relation complexity measure in the KGE literature, with a provable lower bound in terms of the empirical relation matrix rank. Empirical evaluation demonstrates that \textsc{KrausKGE} consistently outperforms strong baselines on $N$-to-$N$ relations, with performance gains that increase monotonically with relation fan-out, in alignment with theoretical predictions.

Editorial analysis

A structured set of objections, weighed in public.

Referee report, simulated author's rebuttal, circularity audit, and an axiom and free-parameter ledger. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper claims that any principled relation operator in knowledge graph embeddings must satisfy linearity, trace preservation, and complete positivity; these axioms characterize Kraus channels via the representation theorem, with the completeness constraint equivalent to the axioms rather than externally imposed. Existing operator-based KGE models are recovered as rank-1 special cases. The work introduces w-Kraus channels for arbitrary metric geometries, proposes the KrausKGE model that natively handles 1-to-N and N-to-N relations, supports k-hop reasoning without path encoders or norm constraints on entities, and defines a per-relation complexity measure with a provable lower bound in terms of empirical relation matrix rank. Experiments report consistent outperformance on N-to-N relations, with gains increasing monotonically with fan-out.

Significance. If the necessity of the three axioms is established and the empirical results hold under proper controls, the framework unifies existing KGE operators under a single channel-theoretic umbrella and supplies the first theoretically grounded complexity measure in the literature. The native support for multi-arity relations and k-hop reasoning without auxiliary components, together with the monotonic fan-out scaling, would constitute a substantive advance over ad-hoc operator designs.

major comments (2)
  1. [Introduction and theoretical development] The central claim that linearity, trace preservation, and complete positivity are necessary (not merely sufficient) for any principled KGE relation operator is asserted without a derivation from KGE desiderata such as monotonic performance on high fan-out relations, elimination of norm constraints, or k-hop reasoning. No counterexamples are supplied showing that operators violating complete positivity produce concrete embedding or reasoning failures; this necessity step is load-bearing for the subsequent invocation of the Kraus theorem.
  2. [Theoretical development] The direct applicability of the Kraus representation theorem to real-vector embeddings and classical multi-arity relations is not justified; the theorem is stated to characterize the structure once the axioms are granted, but the paper does not address whether the Hilbert-space formulation or the definition of positivity requires domain-specific restrictions or modifications for the KGE setting.
minor comments (2)
  1. [Empirical evaluation] The empirical section should include an explicit table listing all baselines, datasets, and hyperparameter ranges with citations; the statement that KrausKGE 'consistently outperforms strong baselines' is difficult to assess without these details.
  2. [Generalization to arbitrary metric geometries] The definition and construction of w-Kraus channels would benefit from an explicit side-by-side comparison (equations or table) with standard Kraus channels to clarify how completeness is enforced by construction in non-Euclidean geometries.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their insightful comments. We address each of the major comments point by point below, and we will make revisions to the manuscript where indicated to strengthen the theoretical foundations.

read point-by-point responses
  1. Referee: [Introduction and theoretical development] The central claim that linearity, trace preservation, and complete positivity are necessary (not merely sufficient) for any principled KGE relation operator is asserted without a derivation from KGE desiderata such as monotonic performance on high fan-out relations, elimination of norm constraints, or k-hop reasoning. No counterexamples are supplied showing that operators violating complete positivity produce concrete embedding or reasoning failures; this necessity step is load-bearing for the subsequent invocation of the Kraus theorem.

    Authors: We acknowledge the validity of this observation. The manuscript presents the three axioms as necessary for principled relation operators based on their role in enabling the desired KGE properties, but does not provide an explicit derivation from those desiderata or counterexamples for violations. To address this, we will revise the introduction and theoretical development section to include a step-by-step motivation deriving the axioms from KGE requirements like handling high fan-out relations and supporting k-hop reasoning without additional components. We will also incorporate a new subsection with counterexamples and preliminary results demonstrating concrete failures (such as performance degradation or embedding instability) when complete positivity is not enforced. revision: yes

  2. Referee: [Theoretical development] The direct applicability of the Kraus representation theorem to real-vector embeddings and classical multi-arity relations is not justified; the theorem is stated to characterize the structure once the axioms are granted, but the paper does not address whether the Hilbert-space formulation or the definition of positivity requires domain-specific restrictions or modifications for the KGE setting.

    Authors: This is a fair point regarding the domain of applicability. The Kraus theorem characterizes CPTP maps on complex Hilbert spaces, and our work applies it to real-vector embeddings in KGE; the manuscript does not explicitly discuss the transition from complex to real spaces or adaptations for multi-arity relations. We will add a clarification in the theoretical development section explaining that the representation holds analogously for real Hilbert spaces with real Kraus operators, and that multi-arity relations are modeled via tensor-product spaces without requiring modifications to the positivity definition. This will be supported by references to real-valued channel theory. revision: yes

Circularity Check

0 steps flagged

No significant circularity; derivation relies on external theorem and posited axioms

full rationale

The paper posits linearity, trace preservation, and complete positivity as axioms for principled relation operators, then applies the external Kraus representation theorem to obtain the channel structure. The stated equivalence between these axioms and the completeness constraint follows directly from the theorem rather than internal redefinition or self-referential fitting. No load-bearing step reduces to a self-citation chain, a fitted parameter renamed as prediction, or an ansatz smuggled via prior work by the same authors. The per-relation complexity measure is defined in terms of empirical matrix rank (data-derived but not a model fit), and performance claims are presented as empirical validation aligned with theory, not as tautological outputs. The derivation chain is self-contained against external benchmarks.

Axiom & Free-Parameter Ledger

1 free parameter · 1 axiom · 1 invented entity

The central claim rests on three domain axioms for relation operators plus the Kraus representation theorem imported from quantum information. The model introduces w-Kraus channels as a generalization; no independent empirical evidence for the generalization is supplied in the abstract.

free parameters (1)
  • Kraus rank κ
    Set to 1 to recover most existing models; higher values used for general case. Value chosen per relation or globally.
axioms (1)
  • domain assumption: Any principled relation operator must be linear, trace-preserving, and completely positive.
    Stated as the three structural axioms that characterize the Kraus channel family.
invented entities (1)
  • w-Kraus channels (no independent evidence)
    purpose: Extend the Kraus structure to arbitrary metric geometries while satisfying completeness by construction.
    Introduced to generalize the framework beyond standard Euclidean embeddings.

pith-pipeline@v0.9.0 · 5550 in / 1473 out tokens · 43822 ms · 2026-05-12T03:41:09.884912+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

  • IndisputableMonolith/Cost/FunctionalEquation.lean · washburn_uniqueness_aczel · tagged unclear

    Relation between the paper passage and the cited Recognition theorem.

    We identify three structural axioms that any principled relation operator must satisfy, linearity, trace preservation, and complete positivity, and show that they characterize a Kraus channel structure via the Kraus representation theorem.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.
