pith. machine review for the scientific record.

arxiv: 2605.11133 · v1 · submitted 2026-05-11 · 💻 cs.LG · math.DG

Recognition: 2 theorem links

· Lean Theorem

Steerable Neural ODEs on Homogeneous Spaces

Authors on Pith no claims yet

Pith reviewed 2026-05-13 06:32 UTC · model grok-4.3

classification 💻 cs.LG math.DG
keywords steerable neural ODEs · homogeneous spaces · G-equivariance · parallel transport · associated vector bundles · equivariant dynamics · continuous normalizing flows

The pith

Steerable neural ODEs on homogeneous spaces are G-equivariant when the generating vector field and parallel transport connection are both G-invariant.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces steerable neural ordinary differential equations that operate on homogeneous spaces M = G/H. Features are treated as sections of associated vector bundles, and their evolution combines a base flow on the space with parallel transport governed by a connection. This produces a coupled ODE system whose solutions respect the symmetry group G under appropriate invariance conditions. The construction generalizes standard manifold NODEs and continuous normalizing flows on Lie groups to handle transforming vector features. A sympathetic reader would care because it supplies a geometric way to build continuous-time models that automatically preserve symmetries without additional constraints.

Core claim

Steerable NODEs on M = G/H interpret features as sections of associated vector bundles over M and describe their evolution as parallel transport. This yields a coupled system consisting of a flow equation on M and a steering equation on the features. The resulting models are G-equivariant whenever the vector field generating the flow and the connection governing parallel transport are both G-invariant. The framework also shows how existing NODE models and continuous normalizing flows on Lie groups arise as special cases within this geometric setting.
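In a local trivialization, the coupled system can be sketched as follows; the notation (base curve γ, G-invariant vector field φ, connection one-form ω acting on fiber components f) is an assumed convention for illustration, not lifted from the paper:

```latex
\dot{\gamma}(t) = \phi\bigl(\gamma(t)\bigr),
\qquad
\dot{f}(t) = -\,\omega_{\gamma(t)}\bigl(\dot{\gamma}(t)\bigr)\, f(t)
```

The second equation is the standard parallel-transport ODE along the flow; when φ and ω are both G-invariant, the solution map of the pair commutes with the G-action, which is exactly the equivariance claim.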

What carries the argument

The coupled flow and steering ODEs where features transform as sections of associated vector bundles via parallel transport under a G-invariant connection.

If this is right

  • Steerable NODEs provide a geometric foundation for learning continuous-time equivariant dynamics of vector-valued features on homogeneous spaces.
  • Existing manifold neural ODEs and continuous normalizing flows on Lie groups are incorporated as special cases of the framework.
  • G-equivariance is guaranteed by the invariance of the vector field and the connection.
  • The approach extends manifold NODEs by transporting associated feature vectors under the local symmetry group H.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • Designing networks where the connection is learned but constrained to be G-invariant could enforce equivariance automatically in continuous flows.
  • This suggests similar steerable constructions might apply to discrete neural networks on homogeneous spaces by discretizing the parallel transport.
  • Applications to physical systems with continuous symmetries, such as rigid body dynamics on rotation groups, could benefit from the built-in equivariance.
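For the discretization suggested in the second bullet, the textbook first-order scheme transports a feature across one step via the exponential of the connection evaluated on the displacement; this is a standard construction stated here for concreteness, not one taken from the paper:

```latex
f_{k+1} = \exp\!\bigl(-\,\omega_{x_k}(x_{k+1} - x_k)\bigr)\, f_k
```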

Load-bearing premise

That features can be consistently interpreted as sections of associated vector bundles over the homogeneous space M and that their evolution is governed by parallel transport in a neural network setting.
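That premise is the standard identification between sections of an associated bundle and equivariant vector-valued functions, stated here for reference (ρ is the H-representation on the fiber V):

```latex
E = G \times_{\rho} V,
\qquad
\Gamma(E) \;\cong\; \bigl\{\, f : G \to V \;\bigm|\; f(gh) = \rho(h^{-1})\, f(g) \ \text{for all } h \in H \,\bigr\}
```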

What would settle it

Integrate the coupled ODE system with an explicitly G-invariant vector field and connection on a test homogeneous space such as the sphere, then verify that the output features transform correctly when the group action is applied before versus after integration.
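On the simplest nontrivial example this test collapses to a few lines. The sketch below uses M = S^1 ≅ SO(2) with the rotation-invariant field dθ/dt = ω and a flat connection (so the steering equation is trivial); all names are illustrative, and this is a minimal instance of the equivariance check rather than the paper's experiment:

```python
import numpy as np

def flow(theta0, omega=0.7, t1=1.0, steps=2000):
    """Euler-integrate the SO(2)-invariant field dtheta/dt = omega on S^1."""
    theta, dt = theta0, t1 / steps
    for _ in range(steps):
        theta = theta + omega * dt
    return theta % (2 * np.pi)

g = 1.3            # group element of SO(2), acting by theta -> theta + g
theta0 = 0.4
act_then_flow = flow((theta0 + g) % (2 * np.pi))   # transform input, then flow
flow_then_act = (flow(theta0) + g) % (2 * np.pi)   # flow, then transform output
print(abs(act_then_flow - flow_then_act))          # ~0: the flow commutes with SO(2)
```

A genuine test on S^2 = SO(3)/SO(2) would additionally integrate the steering equation and compare ρ(h)-transported feature vectors, but the commutation pattern being checked is the same.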

Figures

Figures reproduced from arXiv: 2605.11133 by Daniel Persson, Emma Andersdotter, Fredrik Ohlsson.

Figure 1: In a neural ODE on a homogeneous space M = G/H (Fig. 1a), points p ∈ M are transported to outputs Φp(1) along the flow governed by a vector field ϕ. In a steerable NODE (Fig. 1b), this framework is extended to include features that are associated to the points p, defined by a feature field f : M → V, where the vector space V carries a representation of H. The transformation of the features along the NODE …
Figure 2: A steerable NODE consists of a coupled system of ODEs evolving on [PITH_FULL_IMAGE:figures/full_fig_p017_2.png]
Figure 3: An equivariant steerable NODE has the following commutative diagram. [PITH_FULL_IMAGE:figures/full_fig_p023_3.png]
read the original abstract

We introduce steerable neural ordinary differential equations on homogeneous spaces $M=G/H$. These models constitute a novel geometric extension of manifold neural ordinary differential equations (NODEs) that transport associated feature vectors transforming under the local symmetry group $H$. We interpret features as sections of associated vector bundles over $M$, and describe their evolution as parallel transport. This results in a coupled system of ODEs consisting of a flow equation on $M$ and a steering equation acting on features. We show that steerable NODEs are $G$-equivariant whenever the vector field generating the flow and the connection governing parallel transport are both $G$-invariant. Furthermore, we demonstrate how steerable NODEs incorporate existing NODE models and continuous normalizing flows on Lie groups. Our framework provides the geometric foundation for learning continuous-time equivariant dynamics of general vector-valued features on homogeneous spaces.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Circularity Check

0 steps flagged

No significant circularity; central equivariance follows from standard differential geometry

full rationale

The paper's derivation chain rests on the standard fact that a G-invariant vector field on M generates a G-equivariant flow and that parallel transport with a G-invariant connection commutes with the G-action on associated bundles. This is invoked directly in the abstract and full text without reducing to fitted parameters, self-definitions, or load-bearing self-citations. The coupled ODE system (flow on M plus steering on fibers) inherits equivariance under the stated hypotheses by construction from classical geometry, not from any internal renaming or ansatz smuggling. The bundle interpretation of features is the conventional one in geometric deep learning and introduces no self-referential loop. No equations equate a 'prediction' to its own inputs by definition.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 1 invented entity

The framework rests on standard concepts from differential geometry; the novelty lies in their application to neural ODEs rather than new postulates.

axioms (2)
  • domain assumption: M is a homogeneous space of the form G/H
    Standard setup for spaces with transitive group action.
  • domain assumption: Features transform as sections of associated vector bundles
    Standard construction in representation theory and differential geometry.
invented entities (1)
  • steerable neural ODE (no independent evidence)
    purpose: Coupled flow and feature transport model
    The model class is introduced in the paper.

pith-pipeline@v0.9.0 · 5442 in / 1215 out tokens · 82165 ms · 2026-05-13T06:32:29.433831+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

34 extracted references · 34 canonical work pages

  1. [1] Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, and David K. Duvenaud. Neural ordinary differential equations. In Advances in Neural Information Processing Systems, volume 31, 2018.
  2. [2] Jason Yim, Andrew Campbell, Andrew Y. K. Foong, Michael Gastegger, José Jiménez-Luna, Sarah Lewis, Victor Garcia Satorras, Bastiaan S. Veeling, Regina Barzilay, Tommi Jaakkola, and Frank Noé. Fast protein backbone generation with SE(3) flow matching. In Proceedings of the Machine Learning in Structural Biology Workshop (NeurIPS), 2023.
  3. [3] Avishek (Joey) Bose, Tara Akhound-Sadegh, Guillaume Huguet, Kilian Fatras, Jarrid Rector-Brooks, Cheng-Hao Liu, Andrei Cristian Nica, Maksym Korablyov, Michael Bronstein, and Alexander Tong. SE(3) stochastic flow matching for protein backbone generation. In Proceedings of the 12th International Conference on Learning Representations (ICLR), 2024.
  4. [4] Alex J. Li and Tanja Kortemme. ProteinZen: combining latent and SE(3) flow matching for all-atom protein generation. In Proceedings of the Machine Learning for Structural Biology Workshop (NeurIPS), 2024.
  5. [5] Tomas Geffner, Kieran Didi, Zhonglin Cao, Danny Reidenbach, Zuobai Zhang, Christian Dallago, Emine Kucukbenli, Karsten Kreis, and Arash Vahdat. La-Proteina: Atomistic protein generation via partially latent flow matching. arXiv e-print, arXiv:2507.09466 [cs.LG], 2025.
  6. [6] Ian Dunn and David R. Koes. FlowMol3: flow matching for 3d de novo small-molecule generation. Digital Discovery, 2026.
  7. [7] Mathis Gerdes, Pim de Haan, Roberto Bondesan, and Miranda C. N. Cheng. Nonperturbative trivializing flows for lattice gauge theories. Physical Review D, 112:094516, 2025.
  8. [8] Longde Huang, Oleksandr Balabanov, Hampus Linander, Mats Granath, Daniel Persson, and Jan E. Gerken. Learning Chern numbers of topological insulators with gauge equivariant neural networks. In Advances in Neural Information Processing Systems, volume 38, 2025.
  9. [9] Luca Falorsi and Patrick Forré. Neural ordinary differential equations on manifolds. In Proceedings of the INNF+ Workshop of the International Conference on Machine Learning (ICML), 2020.
  10. [10] Aaron Lou, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser Nam Lim, and Christopher M. De Sa. Neural manifold ordinary differential equations. In Advances in Neural Information Processing Systems, volume 33, 2020.
  11. [11] Emile Mathieu and Maximilian Nickel. Riemannian continuous normalizing flows. In Advances in Neural Information Processing Systems, volume 33, 2020.
  12. [12] Jonas Köhler, Leon Klein, and Frank Noé. Equivariant flows: Exact likelihood generative learning for symmetric densities. In Proceedings of the 37th International Conference on Machine Learning, pages 5361–5370. PMLR, 2020.
  13. [13] Isay Katsman, Aaron Lou, Derek Lim, Qingxuan Jiang, Ser-Nam Lim, and Christopher De Sa. Equivariant manifold flows. In Advances in Neural Information Processing Systems, volume 34, 2021.
  14. [14] Emma Andersdotter, Daniel Persson, and Fredrik Ohlsson. Equivariant manifold neural ODEs and differential invariants. Journal of Machine Learning Research, 26:1–30, 2025.
  15. [15] Taco Cohen, Mario Geiger, and Maurice Weiler. A general theory of equivariant CNNs on homogeneous spaces. In Advances in Neural Information Processing Systems, volume 32, 2019.
  16. [16] Miranda C. N. Cheng, Vassilis Anagiannis, Maurice Weiler, Pim de Haan, Taco S. Cohen, and Max Welling. Covariance in physics and convolutional neural networks. In Proceedings of the Theoretical Physics for Deep Learning Workshop of the International Conference on Machine Learning (ICML), 2019.
  17. [17] Jan E. Gerken, Jimmy Aronsson, Oscar Carlsson, Hampus Linander, Fredrik Ohlsson, Christoffer Petersson, and Daniel Persson. Geometric deep learning and equivariant neural networks. Artificial Intelligence Review, 56:14605–14662, 2023.
  18. [18] Risi Kondor and Shubhendu Trivedi. On the generalization of equivariance and convolution in neural networks to the action of compact groups. In Proceedings of the 35th International Conference on Machine Learning, volume 80, pages 2747–2755. PMLR, 2018.
  19. [19] Jimmy Aronsson. Homogeneous vector bundles and G-equivariant convolutional neural networks. Sampling Theory, Signal Processing, and Data Analysis, 20, 2022.
  20. [20] Elias Nyholm, Oscar Carlsson, Maurice Weiler, and Daniel Persson. Equivariant non-linear maps for neural networks on homogeneous spaces. Mathematical Foundations of Machine Learning, 2, 2026.
  21. [21] Hsien-Chung Wang. On invariant connections over a principal fibre bundle. Nagoya Mathematical Journal, 13:1–19, 1958.
  22. [22] Shoshichi Kobayashi and Katsumi Nomizu. Foundations of Differential Geometry I. In Interscience Tracts in Pure and Applied Mathematics, volume 15. Wiley, 1963.
  23. [23] Mikio Nakahara. Geometry, Topology and Physics. CRC Press, 2003.
  24. [24] John M. Lee. Introduction to Smooth Manifolds. Springer, 2012.
  25. [25] David Bleecker. Gauge Theory and Variational Principles. Addison-Wesley Publishing Company, Inc., 1981.
  26. [26] Ivan Kolar, Peter W. Michor, and Jan Slovak. Natural Operations in Differential Geometry. Springer Science & Business Media, 2013.
  27. [27] Taco S. Cohen and Max Welling. Steerable CNNs. In Proceedings of the 5th International Conference on Learning Representations (ICLR), 2017.
  28. [28] Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, and Matt Le. Flow matching for generative modeling. In Proceedings of the 11th International Conference on Learning Representations (ICLR), 2023.
  29. [29] Victor Garcia Satorras, Emiel Hoogeboom, Fabian B. Fuchs, Ingmar Posner, and Max Welling. E(n) equivariant normalizing flows. In Advances in Neural Information Processing Systems, volume 34, 2021.
  30. [30] Jonas Cassel, Fabio Schlindwein, Peter Albers, and Christoph Schnörr. Bundle scale spaces and local gauge symmetries for graph networks. In Scale Space and Variational Methods in Computer Vision, pages 245–257, 2025.
  31. [31] Jakob Hansen and Thomas Gebhart. Sheaf neural networks. In Proceedings of the Topological Data Analysis and Beyond Workshop (NeurIPS), 2020.
  32. [32] Federico Barbero, Cristian Bodnar, Haitz Sáez de Ocáriz Borde, Michael Bronstein, Petar Veličković, and Pietro Liò. Sheaf neural networks with connection laplacians. In Proceedings of the Topological, Algebraic, and Geometric Learning Workshops (ICML), 2022.
  33. [33] Hampus Linander, Christoffer Petersson, Daniel Persson, and Jan E. Gerken. PEAR: equal area weather forecasting on the sphere. In Proceedings of the AI for Science workshop (NeurIPS), 2025.
  34. [34] Matteo Favoni, Andreas Ipp, David I. Müller, and Daniel Schuh. Lattice gauge equivariant convolutional neural networks. Physical Review Letters, 128:032003, 2022.