Neural message passing for quantum chemistry
6 Pith papers cite this work. Polarity classification is still indexing.
Citation roles: background (1). Citation polarities: unclear (1).
Representative citing papers
-
Gauge-Equivariant Graph Neural Networks for Lattice Gauge Theories
Gauge-equivariant graph neural networks embed non-Abelian local symmetries directly into message passing for lattice gauge theories, enabling learning of nonlocal observables from local operations.
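The core idea can be sketched in a few lines: messages are parallel-transported along links before aggregation, so quantities built from them stay gauge invariant. This is a minimal two-site sketch (not the paper's architecture); the unitary link `U_ij`, site vectors `psi_i`/`psi_j`, and the inner-product observable are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    """Random n x n unitary via QR of a complex Gaussian matrix (phases fixed)."""
    q, r = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    d = np.diag(r)
    return q * (d / np.abs(d))

# Hypothetical two-site setup: "color" vectors at sites i, j and a link U_ij.
n = 2
psi_i = rng.normal(size=n) + 1j * rng.normal(size=n)
psi_j = rng.normal(size=n) + 1j * rng.normal(size=n)
U_ij = random_unitary(n)

def message(U, psi):
    """Gauge-covariant message: parallel-transport psi_j to site i via the link."""
    return U @ psi

# Invariant built from the transported message: psi_i^dagger U_ij psi_j.
inv_before = np.vdot(psi_i, message(U_ij, psi_j))

# Local gauge transformation: psi_k -> g_k psi_k, U_ij -> g_i U_ij g_j^dagger.
g_i, g_j = random_unitary(n), random_unitary(n)
inv_after = np.vdot(g_i @ psi_i, message(g_i @ U_ij @ g_j.conj().T, g_j @ psi_j))

print(np.allclose(inv_before, inv_after))  # True: the observable is gauge invariant
```

Because the `g_i` factors cancel pairwise, any network that aggregates only transported messages inherits this invariance layer by layer.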
-
Graph Neural Networks in the Wilson Loop Representation of Abelian Lattice Gauge Theories
A gauge-invariant GNN using Wilson loops as inputs accurately predicts observables and simulates dynamics in Z2 and U(1) lattice gauge models.
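For the U(1) case, the gauge-invariant inputs are plaquettes, the smallest Wilson loops. A minimal numpy sketch (lattice size, angle convention, and the transformation check are assumptions, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(1)
L = 4  # hypothetical 4x4 periodic lattice

# U(1) link angles theta[mu, x, y] for directions mu = 0 (x-hat) and 1 (y-hat).
theta = rng.uniform(0, 2 * np.pi, size=(2, L, L))

def plaquettes(theta):
    """Elementary Wilson loops: exp(i[th_x(n) + th_y(n+x) - th_x(n+y) - th_y(n)])."""
    tx, ty = theta[0], theta[1]
    return np.exp(1j * (tx + np.roll(ty, -1, axis=0)
                        - np.roll(tx, -1, axis=1) - ty))

w = plaquettes(theta)

# Gauge transformation alpha(n): theta_mu(n) -> theta_mu(n) + alpha(n) - alpha(n+mu).
alpha = rng.uniform(0, 2 * np.pi, size=(L, L))
theta2 = np.stack([theta[0] + alpha - np.roll(alpha, -1, axis=0),
                   theta[1] + alpha - np.roll(alpha, -1, axis=1)])

print(np.allclose(w, plaquettes(theta2)))  # True: plaquettes are gauge invariant
```

The `alpha` terms cancel around any closed loop, which is why feeding Wilson loops (rather than raw links) to a GNN makes its predictions gauge invariant by construction.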
-
Beyond Nodes vs. Edges: A Multi-View Fusion Framework for Provenance-Based Intrusion Detection
PROVFUSION fuses three complementary views of provenance data with lightweight schemes and voting to achieve higher detection accuracy and lower false positives than node- or edge-only baselines on nine benchmarks.
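The voting step can be sketched as follows. The three "views" and their per-event flags are placeholders standing in for PROVFUSION's actual detectors, not its implementation:

```python
# Hypothetical per-event anomaly flags from three provenance views
# (node-, edge-, and graph-level detectors are illustrative assumptions).
node_view  = [0, 1, 1, 0, 1]
edge_view  = [0, 1, 0, 0, 1]
graph_view = [1, 1, 0, 0, 1]

def majority_vote(*views):
    """Flag an event iff a strict majority of views flag it,
    damping single-view false positives."""
    return [int(sum(flags) > len(views) / 2) for flags in zip(*views)]

print(majority_vote(node_view, edge_view, graph_view))  # [0, 1, 0, 0, 1]
```

Events flagged by only one view (the first and third here) are suppressed, which is the intuition behind the lower false-positive rate relative to node- or edge-only baselines.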
-
Fast and Accurate Prediction of Lattice Thermal Conductivity via Machine Learning Surrogates
Machine learning models, especially certain deep neural networks, can predict lattice thermal conductivity with useful accuracy across different generalization tests while being orders of magnitude faster than first-principles calculations.
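The surrogate pattern itself is simple: fit a cheap model on descriptor/conductivity pairs from expensive calculations, then query the model instead. A toy sketch with a k-nearest-neighbour regressor; the linear "first-principles" target and the two-feature descriptors are fabricated illustrations, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_first_principles(X):
    """Stand-in for a costly calculation (illustrative linear target)."""
    return 3.0 * X[:, 0] + 2.0 * X[:, 1] + 5.0

# "Training set" of material descriptors and their computed conductivities.
X_train = rng.uniform(-2, 2, size=(200, 2))
y_train = expensive_first_principles(X_train)

def knn_surrogate(x_query, k=5):
    """Cheap surrogate: average the targets of the k nearest training points."""
    d = np.linalg.norm(X_train - x_query, axis=1)
    return y_train[np.argsort(d)[:k]].mean()

x = np.array([0.3, -0.5])
print(knn_surrogate(x), expensive_first_principles(x[None])[0])
```

Real surrogates in this setting use richer descriptors and deep networks, but the speedup argument is the same: each query is a lookup-scale computation rather than a full first-principles run.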
-
GRASP -- Graph-Based Anomaly Detection Through Self-Supervised Classification
GRASP detects anomalies in system provenance graphs via self-supervised executable prediction from two-hop neighborhoods, outperforming prior PIDS on DARPA datasets by identifying all documented attacks where behaviors are learnable plus additional unlabeled suspicious activity.
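The self-supervised objective can be sketched with frequency counts: learn which executable is expected given a neighborhood signature, and score events by how surprising the observed executable is. The context signatures and process names below are illustrative assumptions, not GRASP's actual features or model:

```python
from collections import Counter, defaultdict

# Toy provenance events: (two-hop context signature, executable) pairs.
events = ([("read:/etc/passwd|net:none", "sshd")] * 20
          + [("read:/var/log|net:none", "logrotate")] * 20
          + [("read:/etc/passwd|net:none", "curl")])  # rare pairing

# Self-supervised objective: predict the executable from its neighborhood.
ctx_counts = defaultdict(Counter)
for ctx, exe in events:
    ctx_counts[ctx][exe] += 1

def anomaly_score(ctx, exe):
    """1 - P(exe | ctx): high when the executable is unusual for its context."""
    c = ctx_counts[ctx]
    return 1.0 - c[exe] / sum(c.values())

print(anomaly_score("read:/etc/passwd|net:none", "sshd") < 0.1)   # True: common
print(anomaly_score("read:/etc/passwd|net:none", "curl") > 0.9)   # True: anomalous
```

No attack labels are needed: the "labels" are the executables themselves, which is what makes the approach self-supervised and lets it surface unlabeled suspicious activity.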
-
Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges
Geometric deep learning provides a unified mathematical framework based on grids, groups, graphs, geodesics, and gauges to explain and extend neural network architectures by incorporating physical regularities.
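A concrete instance of the framework's symmetry principle is permutation invariance in graph networks: relabelling the nodes must not change a graph-level output. A minimal numpy check (the one-layer network and sum-pooling readout are a generic sketch, not any specific architecture from the book):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy graph: node features X and a symmetric adjacency matrix A.
X = rng.normal(size=(5, 3))
A = np.triu((rng.uniform(size=(5, 5)) < 0.4).astype(float), 1)
A = A + A.T

def gnn_readout(X, A, W):
    """One sum-aggregation message-passing layer, then sum pooling over nodes."""
    H = np.tanh((A @ X) @ W)   # aggregate neighbours, then transform
    return H.sum(axis=0)       # permutation-invariant readout

W = rng.normal(size=(3, 3))
P = np.eye(5)[rng.permutation(5)]  # permutation matrix acting on nodes

out  = gnn_readout(X, A, W)
out2 = gnn_readout(P @ X, P @ A @ P.T, W)
print(np.allclose(out, out2))  # True: relabelling nodes leaves the output unchanged
```

The same recipe, with permutations replaced by translations, rotations, or gauge transformations, recovers the other architectures the five-G taxonomy organizes.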