pith. machine review for the scientific record.

arxiv: 1806.03417 · v2 · submitted 2018-06-09 · 💻 cs.AI · cs.LG · stat.ML

Recognition: unknown

Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry

Authors on Pith: no claims yet
classification 💻 cs.AI · cs.LG · stat.ML
keywords: model · embeddings · hyperbolic · hierarchies · learning · lorentz · poincare · relationships
Original abstract

We are concerned with the discovery of hierarchical relationships from large-scale unstructured similarity scores. For this purpose, we study different models of hyperbolic space and find that learning embeddings in the Lorentz model is substantially more efficient than in the Poincaré-ball model. We show that the proposed approach allows us to learn high-quality embeddings of large taxonomies which yield improvements over Poincaré embeddings, especially in low dimensions. Lastly, we apply our model to discover hierarchies in two real-world datasets: we show that an embedding in hyperbolic space can reveal important aspects of a company's organizational structure as well as reveal historical relationships between language families.
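The abstract's central object, the Lorentz (hyperboloid) model of hyperbolic space, can be sketched in a few lines. The snippet below is an illustrative sketch, not the paper's implementation: the helper names are my own, and it shows only the Lorentzian inner product, the induced geodesic distance, and one common way to lift a Euclidean vector onto the hyperboloid.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lorentz_distance(x, y):
    # Geodesic distance on the hyperboloid: d(x, y) = arccosh(-<x, y>_L).
    # Clip guards against values slightly below 1 from floating-point error.
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

def lift_to_hyperboloid(v):
    # Map a Euclidean vector v in R^n onto the hyperboloid H^n
    # by setting x0 = sqrt(1 + ||v||^2), so that <x, x>_L = -1.
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))
```

A point of the paper is that, unlike the Poincaré ball, this distance involves no division by norms near the boundary, which is part of why optimization in the Lorentz model is numerically better behaved.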

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. New non-Euclidean neural quantum states from additional types of hyperbolic recurrent neural networks

    quant-ph 2026-04 unverdicted novelty 7.0

    Hyperbolic RNN and GRU neural quantum states outperform Euclidean versions on Heisenberg J1J2 and J1J2J3 models with 100 spins.

  2. HypEHR: Hyperbolic Modeling of Electronic Health Records for Efficient Question Answering

    cs.AI 2026-04 unverdicted novelty 6.0

    HypEHR is a hyperbolic embedding model for EHR data that uses Lorentzian geometry and hierarchy-aware pretraining to answer clinical questions nearly as well as large language models, at a much smaller model size.