pith. machine review for the scientific record.

arxiv: 2605.06140 · v1 · submitted 2026-05-07 · 💻 cs.LG · cs.AI

Recognition: unknown

SymDrift: One-Shot Generative Modeling under Symmetries

Authors on Pith: no claims yet

Pith reviewed 2026-05-08 13:46 UTC · model grok-4.3

classification: 💻 cs.LG · cs.AI
keywords: generative modeling · symmetries · drifting models · molecular conformers · one-shot generation · equivariant models · transition states

The pith

SymDrift lets drifting models generate symmetric structures like molecular conformers in one step by symmetrizing the drift field rather than the training data.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper seeks to make one-shot drifting models respect global symmetries such as rotations without paying the cost of fully symmetrizing the empirical distribution. A standard equivariant drifting model produces a different field from the one induced by the symmetrized target, so the authors replace that mismatch with two explicit fixes: a drift computed after optimal coordinate alignment, and an embedding that is invariant under the symmetry group by design. If these fixes work, single-step sampling becomes accurate for physical systems while cutting compute by up to a factor of forty relative to multi-step baselines. Readers who generate molecules or transition states at scale would then gain a practical route to high-throughput virtual screening and reaction exploration.

Core claim

SymDrift resolves the symmetry mismatch in drifting models by defining a symmetrized drift via optimal alignment in coordinate space and a G-invariant embedding that removes symmetry ambiguity, so that the generated drifting field matches the one induced by the symmetrized target distribution and enables accurate one-shot sampling.
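In symbols, the mismatch the claim targets can be written out. The following is reconstructed from the paper's figure caption and extracted appendix fragments, so the notation is approximate rather than verbatim: the kernel-weighted drifting field induced by a target distribution $p$, and the symmetrized target $p^G$, are

```latex
V^{+}_{p}(x) = \frac{\mathbb{E}_{y^{+}\sim p}\left[k(x, y^{+})\,(y^{+} - x)\right]}{\mathbb{E}_{y^{+}\sim p}\left[k(x, y^{+})\right]},
\qquad
p^{G}(y) = \mathbb{E}_{g\sim G}\left[p(g^{-1}y)\right].
```

The paper's observation is that an equivariant generator's field $\hat{V}^{+}_{p}(x)$ need not equal $V^{+}_{p^{G}}(x)$, and SymDrift is built to close exactly that gap.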

What carries the argument

Symmetrized drift obtained by optimal alignment together with a G-invariant embedding that removes symmetry ambiguity by construction.
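As a concrete illustration of the alignment half of this machinery, here is a minimal numpy sketch of a drift computed after optimal rotational alignment, assuming a Gaussian kernel and the Kabsch algorithm for the alignment step. The function names, kernel choice, and bandwidth parameter are illustrative stand-ins, not the paper's implementation.

```python
import numpy as np

def kabsch_rotation(x, y):
    """Optimal proper rotation R (det = +1) minimizing ||x - y @ R.T||,
    for centered point clouds x, y of shape (n_atoms, 3)."""
    h = y.T @ x                       # 3x3 covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    fix = np.diag([1.0, 1.0, d])      # correct an improper (reflected) solution
    return vt.T @ fix @ u.T

def aligned_drift(x, targets, bandwidth=1.0):
    """Kernel-weighted drift toward targets, each optimally aligned to x first.

    Mirrors the idea of a symmetrized drift via optimal alignment: both the
    kernel weight and the displacement are computed after rotating each
    target onto x, so rotation-equivalent targets are treated consistently.
    """
    x_c = x - x.mean(axis=0)          # centering removes translations
    num = np.zeros_like(x_c)
    den = 0.0
    for y in targets:
        y_c = y - y.mean(axis=0)
        r = kabsch_rotation(x_c, y_c)
        y_aligned = y_c @ r.T         # target expressed in x's frame
        w = np.exp(-np.sum((x_c - y_aligned) ** 2) / (2 * bandwidth ** 2))
        num += w * (y_aligned - x_c)
        den += w
    return num / den
```

One sanity property of this construction: a target that is merely a rotated and translated copy of `x` produces zero drift, because alignment maps it exactly onto `x`.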

If this is right

  • One-step sampling becomes practical for rotation-invariant molecular distributions.
  • Generation cost drops by up to a factor of forty while staying competitive with multi-step equivariant models.
  • Performance on standard conformer and transition-state benchmarks exceeds other one-shot methods.
  • High-throughput tasks such as virtual screening become feasible on modest hardware.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same alignment-plus-invariant-embedding pattern could apply to other finite or continuous symmetry groups.
  • Real-time molecular design loops could incorporate the method directly when latency matters more than marginal accuracy gains.
  • Integration with existing equivariant architectures might further reduce any remaining per-sample alignment overhead.

Load-bearing premise

That applying optimal alignment to the drift or using an invariant embedding produces a field equivalent to the fully symmetrized target without introducing bias or hidden per-sample costs that cancel the claimed speed gain.

What would settle it

Generate a large set of samples with SymDrift and with a baseline that explicitly symmetrizes every training example, then test whether their marginal distributions over rotation-invariant features differ beyond sampling noise.
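A hedged sketch of such a test, assuming samples arrive as `(n_atoms, 3)` coordinate arrays, using sorted pairwise distances as the rotation-invariant features and a permutation test on a scalar summary; the feature choice and test statistic are stand-ins for whatever the benchmark actually measures, not the paper's protocol.

```python
import numpy as np

def pairwise_distances(conf):
    """Sorted pairwise atom distances: a rotation-, translation-, and
    permutation-invariant descriptor of one conformer (n_atoms, 3)."""
    diff = conf[:, None, :] - conf[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(conf), k=1)
    return np.sort(d[iu])

def permutation_pvalue(a, b, n_perm=2000, seed=0):
    """Two-sample permutation test on the mean of a scalar feature.

    a, b: 1-D arrays of a rotation-invariant summary (e.g. mean pairwise
    distance per sample) from SymDrift and from a baseline trained on
    explicitly symmetrized data. Returns an approximate p-value for the
    hypothesis that the two marginals share a mean.
    """
    rng = np.random.default_rng(seed)
    obs = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = abs(pooled[: len(a)].mean() - pooled[len(a):].mean())
        count += stat >= obs
    return (count + 1) / (n_perm + 1)
```

In practice one would run this per invariant feature (or use a multivariate two-sample test) and correct for multiple comparisons before declaring the marginals indistinguishable.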

Figures

Figures reproduced from arXiv: 2605.06140 by Lluís Pastor-Pérez, Loay Mualem, Mathias Niepert, Samir Darouich, Tanja Bien, Vinh Tong.

Figure 1
Figure 1: Conceptual illustration of SymDrift. (a) The base drift assigns the highest kernel weights $\tilde{k}(x, y)$ to $y^{+}_{3}$, despite $y^{+}_{2}$ being the closest conformer under symmetry, leading to a biased drifting field $\hat{V}^{+}_{p}(x) \neq V^{+}_{p^{G}}(x)$. (b) SymDrift incorporates symmetries via (1) explicit alignment or (2) a $G$-invariant embedding space. (c) This leads to SymDrift correctly identifying $y^{+}_{2}$ in the symmetrized space, producing t… view at source ↗
Figure 2
Figure 2: An ablation study on the GEOM-QM9 dataset, assessing how different formulations of the drifting field, ranging from base-coordinate and iterative alignment to embedding-space representations, affect generative performance. Validating Theorem 1, we observe a complete training collapse when applying the base coordinate drift to an equivariant architecture. The model achieves 0% coverage at the defin… view at source ↗
read the original abstract

Generative modeling of physical systems, such as molecules, requires learning distributions that are invariant under global symmetries, such as rotations in three-dimensional space. Equivariant diffusion and flow matching models can incorporate such invariances effectively, even when trained on a non-invariant empirical distribution, but they typically rely on costly multi-step sampling. Recently, drifting models have emerged as an efficient alternative, enabling single-step generation and achieving state-of-the-art performance in generative modeling tasks. However, we show that drifting models face a symmetry-specific challenge, since an equivariant generator does not generally produce the same drifting field as the one obtained from the symmetrized target distribution. Addressing this issue would require expensive symmetrization of the empirical distribution. To avoid this cost, we propose SymDrift, a framework that makes the drifting field itself symmetry-aware. We introduce two complementary strategies: (i) a symmetrized drift in coordinate space based on optimal alignment, and (ii) a $G$-invariant embedding that removes symmetry ambiguity by construction. Empirically, SymDrift outperforms existing one-shot methods on standard benchmarks for conformer and transition state generation, while remaining competitive with significantly more expensive multi-step approaches. By enabling one-shot inference, SymDrift reduces computational overhead by up to 40$\times$ compared to existing baselines, making it promising for high-throughput applications such as virtual drug screening and large-scale reaction network exploration.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

1 major / 2 minor

Summary. The manuscript proposes SymDrift, a framework for one-shot generative modeling of symmetry-invariant distributions in physical systems like molecules. It identifies that standard drifting models do not produce the correct drifting field when the generator is equivariant but the target is symmetrized, and introduces two strategies to address this: symmetrized drift via optimal alignment in coordinate space and G-invariant embeddings. The approach is evaluated on conformer and transition state generation tasks, where it outperforms other one-shot methods and competes with multi-step approaches while offering substantial computational savings.

Significance. If the proposed constructions correctly yield unbiased symmetry-aware drifting fields, this work has the potential to significantly impact high-throughput generative tasks in chemistry and physics by providing an efficient one-shot alternative to multi-step equivariant diffusion models. The claimed 40× reduction in overhead is a notable practical advantage for applications such as virtual drug screening.

major comments (1)
  1. [The symmetrized drift construction (Section 3)] The central claim that the symmetrized drift based on optimal alignment produces a field equivalent to that from the fully symmetrized target distribution is not sufficiently supported. Optimal alignment selects a single best representative rather than integrating over the group orbit, and for groups involving continuous rotations and discrete permutations of atoms, this may not recover the same vector field. This equivalence is load-bearing for the assertion that SymDrift avoids bias without the explicit averaging cost.
minor comments (2)
  1. The paper would benefit from an explicit statement of the group G in the introduction, including whether it includes reflections or only proper rotations.
  2. [Experimental section] Include ablation studies separating the contributions of the two proposed strategies (optimal alignment vs. G-invariant embedding) to the performance gains.

Simulated Authors' Rebuttal

1 responses · 0 unresolved

We thank the referee for their careful reading and insightful feedback on our manuscript. We address the major comment point-by-point below and have revised the paper to strengthen the theoretical justification for the symmetrized drift construction.

read point-by-point responses
  1. Referee: [The symmetrized drift construction (Section 3)] The central claim that the symmetrized drift based on optimal alignment produces a field equivalent to that from the fully symmetrized target distribution is not sufficiently supported. Optimal alignment selects a single best representative rather than integrating over the group orbit, and for groups involving continuous rotations and discrete permutations of atoms, this may not recover the same vector field. This equivalence is load-bearing for the assertion that SymDrift avoids bias without the explicit averaging cost.

    Authors: We appreciate this precise observation on the distinction between selecting a single optimal representative and explicit integration over the group orbit. In Section 3, the symmetrized drift is defined by solving for the group element g* that minimizes the squared Euclidean distance between the generated point and the target after applying g, which is the standard optimal alignment procedure used in molecular conformation tasks. While this is a pointwise selection rather than an average, we note that the resulting vector field is the gradient of the aligned distance and, under the assumption that the optimal alignment is unique (which holds almost surely for generic configurations under continuous rotations), it coincides with the field induced by the symmetrized distribution. To make this rigorous, the revised manuscript now includes a formal proposition establishing the equivalence of the expected drift fields, along with a brief error analysis for the discrete permutation component when multiple alignments are possible. We have also added a small-scale numerical verification comparing the aligned drift to a Monte-Carlo averaged symmetrized field on toy examples. These additions directly address the load-bearing claim without changing the method or empirical results. revision: yes
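The kind of toy check the rebuttal describes can be sketched in a few lines. The example below uses a discrete rotation group ($C_4$ in 2D) and a single target point, and compares the drift toward the optimally aligned orbit element against the drift induced by explicitly averaging over the orbit; the group, kernel, and points are invented for illustration and are not the paper's experiment. With a narrow kernel the two fields nearly coincide, which is where the claimed equivalence is plausible; with a wide kernel they visibly diverge, which is exactly the regime the referee's concern lives in.

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Discrete symmetry group G = C4: rotations by multiples of 90 degrees in 2D.
GROUP = [rot(k * np.pi / 2) for k in range(4)]

def orbit_averaged_drift(x, y, bandwidth):
    """Drift induced by explicitly symmetrizing the target over G."""
    num = np.zeros(2)
    den = 0.0
    for g in GROUP:
        gy = g @ y
        w = np.exp(-np.sum((x - gy) ** 2) / (2 * bandwidth ** 2))
        num += w * (gy - x)
        den += w
    return num / den

def aligned_drift(x, y):
    """Drift toward the single closest orbit element (optimal alignment)."""
    gy_best = min((g @ y for g in GROUP),
                  key=lambda gy: np.sum((x - gy) ** 2))
    return gy_best - x

x = np.array([1.0, 0.1])
y = np.array([1.0, 0.0])
for bw in (0.1, 2.0):
    gap = np.linalg.norm(orbit_averaged_drift(x, y, bw) - aligned_drift(x, y))
    # narrow kernel: gap is negligible; wide kernel: gap is substantial
    print(f"bandwidth={bw}: |averaged - aligned| = {gap:.4f}")
```

Scaling this comparison from one point to Monte-Carlo samples over a continuous group is essentially the numerical verification the rebuttal promises.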

Circularity Check

0 steps flagged

No significant circularity; algorithmic constructions are independent of target result

full rationale

The abstract and context present SymDrift as a new framework introducing symmetrized drift via optimal alignment and G-invariant embedding to address a symmetry challenge in drifting models. No equations, derivations, or parameter-fitting steps are supplied that reduce by construction to the symmetrized target distribution or to self-citations. Claims rest on empirical outperformance on benchmarks rather than any self-definitional, fitted-input, or uniqueness-imported reduction. The derivation chain is therefore self-contained and does not exhibit any of the enumerated circularity patterns.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 2 invented entities

The central claim rests on the domain assumption that physical systems possess global symmetries (rotations, etc.) and on the technical assumption that optimal alignment or invariant embeddings can be computed efficiently enough to preserve the one-shot advantage. No free parameters or new physical entities are introduced; the two invented entities below are methodological constructs.

axioms (1)
  • domain assumption Physical systems such as molecules are subject to global symmetries (e.g., rotations in 3D space) that the generative distribution must respect.
    Stated in the first sentence of the abstract as a requirement for generative modeling of physical systems.
invented entities (2)
  • Symmetrized drift in coordinate space no independent evidence
    purpose: To produce a symmetry-aware drifting field via optimal alignment without explicit symmetrization of the empirical distribution.
    Introduced as strategy (i) to address the mismatch between equivariant generators and symmetrized targets.
  • G-invariant embedding no independent evidence
    purpose: To remove symmetry ambiguity by construction in the embedding space.
    Introduced as strategy (ii) complementary to the coordinate-space approach.

pith-pipeline@v0.9.0 · 5575 in / 1401 out tokens · 55436 ms · 2026-05-08T13:46:24.701547+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

63 extracted references · 7 canonical work pages · 2 internal anchors
