Directional message passing for molecular graphs. arXiv:2003.03123
5 Pith papers cite this work.
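The cited work's core idea is directional message passing: messages live on directed edges of the molecular graph and are updated using the angles between adjacent bonds. The sketch below is an illustrative toy version of that idea only, not the paper's actual (DimeNet) implementation; all names and the cosine "angle weighting" are simplifications.

```python
# Toy sketch of directional message passing: edge messages updated
# using angles between adjacent edges. Illustrative only.
import numpy as np

def angle(p_k, p_j, p_i):
    """Angle at atom j between bonds j->k and j->i."""
    a, b = p_k - p_j, p_i - p_j
    cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def update_messages(pos, edges, msgs, dim=4):
    """One directional message-passing step.

    pos:   (n_atoms, 3) coordinates
    edges: list of directed edges (j, i)
    msgs:  dict mapping edge (j, i) -> message vector of length dim
    """
    new = {}
    for (j, i) in edges:
        agg = np.zeros(dim)
        for (k, jj) in edges:
            if jj == j and k != i:  # incoming edges k->j, excluding i
                # weight each neighbor message by the angle k-j-i
                agg += np.cos(angle(pos[k], pos[j], pos[i])) * msgs[(k, j)]
        new[(j, i)] = np.tanh(msgs[(j, i)] + agg)
    return new

# toy bent triatomic molecule with both bond directions as edges
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.5, 0.9, 0.0]])
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
msgs = {e: np.full(4, 0.1) for e in edges}
msgs = update_messages(pos, edges, msgs)
```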
Citing papers
-
Chem-GMNet: A Sphere-Native Geometric Transformer for Molecular Property Prediction
Chem-GMNet uses sphere-native embeddings, DualSKA attention, and SH-FFN layers to match or outperform ChemBERTa-2 on MoleculeNet tasks with fewer parameters, in some cases without pretraining.
-
QT-Net: Rethinking Evaluation of AI Models in Atomic Chemical Space
QT-Net predicts atomic electron populations and multipoles via a new SOAP-cluster held-out test, improving molecular property prediction and recovering QM9 dipole moments from per-atom outputs.
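The dipole-recovery result mentioned above rests on a standard identity: for a neutral molecule, the dipole vector is the charge-weighted sum of atomic positions, mu = sum_i q_i * r_i. The snippet below is a toy illustration of that identity with made-up partial charges and an idealized geometry; it is not QT-Net's model.

```python
# Molecular dipole from per-atom outputs: mu = sum_i q_i * r_i.
# Charges and geometry below are illustrative, not real QM9 data.
import numpy as np

def dipole(charges, positions):
    """Dipole vector from per-atom partial charges (e) and positions."""
    return (np.asarray(charges)[:, None] * np.asarray(positions)).sum(axis=0)

# idealized water-like geometry: O at origin, two H atoms
q = [-0.8, 0.4, 0.4]
r = [[0.0, 0.0, 0.0],
     [0.76, 0.59, 0.0],
     [-0.76, 0.59, 0.0]]
mu = dipole(q, r)  # points along the molecule's symmetry axis
```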
-
Composition-Weighted Symbolic Regression for General-Purpose Property Prediction
A composition-weighted symbolic regression framework learns analytical expressions and elemental weightings from composition to predict materials properties with accuracy competitive to black-box models while producing explicit, constraint-enforcing formulas.
-
BiScale-GTR: Fragment-Aware Graph Transformers for Multi-Scale Molecular Representation Learning
BiScale-GTR reports state-of-the-art results on MoleculeNet, PharmaBench, and LRGB by combining improved fragment tokenization with a parallel GNN-Transformer architecture operating at both atom and fragment scales.
-
Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges
Geometric deep learning provides a unified mathematical framework based on grids, groups, graphs, geodesics, and gauges to explain and extend neural network architectures by incorporating physical regularities.
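A minimal illustration of the kind of regularity this framework formalizes: a sum-pooled graph readout is invariant to node permutations, one of the symmetries geometric deep learning builds into architectures by design. Toy code, not from the paper.

```python
# Permutation invariance of a sum-pooled graph readout.
import numpy as np

def readout(node_feats):
    """Permutation-invariant graph readout: sum over the node axis."""
    return node_feats.sum(axis=0)

rng = np.random.default_rng(0)
x = rng.standard_normal((6, 4))   # 6 nodes, 4 features each
perm = rng.permutation(6)
# reordering the nodes leaves the readout unchanged
same = np.allclose(readout(x), readout(x[perm]))
```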