Barroso-Luque, M
15 Pith papers cite this work.
2026: 15 representative citing papers
-
SLayerGen: a Crystal Generative Model for all Space and Layer Groups
SLayerGen generates crystals with the symmetry of any space or layer group via autoregressive lattice and Wyckoff sampling plus equivariant diffusion, achieving gains over bulk-only models on diperiodic materials after correcting a loss inconsistency for hexagonal groups in prior work.
-
Breaking the Training Barrier of Billion-Parameter Universal Machine Learning Interatomic Potentials
MatRIS-MoE and Janus enable efficient exascale training of billion-parameter universal interatomic potentials by addressing second-order derivative computation and communication overheads.
-
CrystalREPA: Transferring Physical Priors from Universal MLIPs to Crystal Generative Models
CrystalREPA closes the representation gap between crystal generators and universal MLIPs via contrastive alignment, yielding more stable and valid generated crystals while revealing that MLIP teacher quality is better predicted by representation distinguishability than by leaderboard accuracy.
-
Compact SO(3) Equivariant Atomistic Foundation Models via Structural Pruning
Structural pruning of SO(3) equivariant atomistic models from large checkpoints yields models with 1.5-4x fewer parameters and 2.5-4x less pre-training compute than small models trained from scratch, while outperforming them on most Matbench Discovery metrics and downstream tasks.
-
MatterSim-MT: A multi-task foundation model for in silico materials characterization
MatterSim-MT is a foundation model pretrained on over 35 million first-principles structures that predicts material structure, dynamics, and thermodynamics while enabling multi-task simulations of phonon splitting, ferroelectric hysteresis, and redox transitions.
-
Density diversity in training data governs thermodynamic transferability of machine learning interatomic potentials
Density diversity in training data is the key factor for making machine learning interatomic potentials transferable across thermodynamic states, outperforming temperature diversity.
-
VibroML: an automated toolkit for high-throughput vibrational analysis and dynamic instability remediation of crystalline materials using machine-learned potentials
VibroML automates remediation of dynamic instabilities in crystalline materials by combining MLIPs with genetic algorithms for polymorph search, finite-temperature MD validation, and compositional alloying to yield stable structures from databases like Alexandria.
-
Errors that matter: Uncertainty-aware universal machine-learning potentials calibrated on experiments
The PET-UAFD ensemble of ML potentials, calibrated on experimental cohesive energies and moduli, matches experimental accuracy on liquid properties and supplies uncertainty estimates via the PET-EXP protocol.
-
Agentic Fusion of Large Atomic and Language Models to Accelerate Superconductor Discovery
An agentic framework fusing large atomic and language models rediscovers 66 known superconductors and guides experimental verification of four new ones with transition temperatures from 2.5 K to 6.5 K.
-
Systematic Fine-Tuning of MACE Interatomic Potentials for Catalysis
Fine-tuned MACE MLIPs achieve lower mean absolute errors on catalytic reaction energies and barriers than from-scratch models, with a large fine-tuned model performing best on both metallic and oxide systems including out-of-distribution cases.
-
OptiMat Alloys: a FAIR, living database of multi-principal element alloys enabled by a conversational agent
OptiMat Alloys is a conversational AI system that maintains a living FAIR database of multi-principal element alloy calculations and enables natural-language, on-demand computations with built-in uncertainty checks.
-
Assessing foundational atomistic models for iron alloys under Earth's core conditions
Foundational atomistic models reproduce some structural and dynamical properties of iron alloys under core conditions but none consistently match first-principles benchmarks due to missing explicit treatment of thermal electronic excitations.
-
Accurate and Efficient Interatomic Potentials for Dislocations in InP
New ACE and MACE potentials for InP achieve at most 4% error on partial dislocation formation energies versus DFT, outperforming literature models by factors of 4-12 while being computationally faster.
-
Machine Learning Interatomic Potentials for Million-Atom Simulations of Multicomponent Alloys
GRACE MLIPs train faster and predict alloy properties more accurately than NEP, but NEP's 60-fold speed advantage enables reliable million-atom simulations of shock propagation when paired with ensemble uncertainty quantification.
-
Inverse Design of Inorganic Compounds with Generative AI
A review of generative AI for inverse design of inorganic compounds, analyzing how generative models are adapted to the compounds' complexity in composition, geometry, symmetry, and electronic structure, with discussion of future benchmarks and synthesizability metrics.