A Note on the Kullback-Leibler Divergence for the von Mises-Fisher distribution
classification
📊 stat.ML
keywords
distribution, divergence, derivation, dimensions, entropy, fisher, known, kullback
Abstract
We present a derivation of the Kullback-Leibler (KL) divergence (also known as relative entropy) for the von Mises-Fisher (vMF) distribution in $d$ dimensions.
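The note itself is not reproduced on this page, but the quantity it derives can be sketched from the standard vMF parameterization: with density $p(x) = C_d(\kappa)\exp(\kappa\,\mu^\top x)$ on the unit sphere, the KL divergence between two vMF distributions has the closed form $\mathrm{KL}(p_1\,\|\,p_2) = \log C_d(\kappa_1) - \log C_d(\kappa_2) + (\kappa_1 - \kappa_2\,\mu_2^\top\mu_1)\,A_d(\kappa_1)$, where $A_d(\kappa) = I_{d/2}(\kappa)/I_{d/2-1}(\kappa)$ is the mean resultant length. The sketch below is an illustrative implementation of that standard formula (not code from the paper); it uses SciPy's exponentially scaled Bessel function `ive` for numerical stability at large $\kappa$.

```python
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel: ive(v, k) = iv(v, k) * exp(-k)

def log_c(d, kappa):
    """Log normalizer of the vMF density: C_d(k) = k^(d/2-1) / ((2*pi)^(d/2) * I_{d/2-1}(k))."""
    v = d / 2.0 - 1.0
    # log I_v(k) = log ive(v, k) + k, recovering the unscaled Bessel value safely
    return v * np.log(kappa) - (d / 2.0) * np.log(2.0 * np.pi) - (np.log(ive(v, kappa)) + kappa)

def kl_vmf(mu1, kappa1, mu2, kappa2):
    """KL( vMF(mu1, kappa1) || vMF(mu2, kappa2) ) on the unit sphere in R^d."""
    d = len(mu1)
    # mean resultant length A_d(kappa1); the exp(-k) scaling cancels in the ratio
    a1 = ive(d / 2.0, kappa1) / ive(d / 2.0 - 1.0, kappa1)
    return log_c(d, kappa1) - log_c(d, kappa2) + (kappa1 - kappa2 * np.dot(mu1, mu2)) * a1
```

As a sanity check, the divergence of a vMF distribution from itself is zero, and it is non-negative for distinct parameters.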
Forward citations
Cited by 1 Pith paper
Polaris: Coupled Orbital Polar Embeddings for Hierarchical Concept Learning
Polaris learns hierarchical concepts via coupled orbital polar embeddings on hyperspheres that separate meaning from structure using tangent projections, exponential maps, and asymmetric objectives, yielding up to 19-...