The 6th International Verification of Neural Networks Competition (VNN-COMP 2025): Summary and Results
5 representative papers cite this work:
- VNN-LIB 2.0: Rigorous Foundations for Neural Network Verification
VNN-LIB 2.0 defines a network theory abstraction, formal query syntax, type system over numeric domains, and Agda-mechanized semantics to provide rigorous foundations for neural network verification independent of evolving model formats.
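For context, VNN-LIB queries use an SMT-LIB-based syntax; the 2.0 work layers a formal type system and mechanized semantics on top of this core. A minimal, hypothetical robustness query in the established VNN-LIB style (input box constraints, plus the negated property so that any satisfying assignment is a counterexample) might look like:

```
; Hypothetical query: inputs X_0, X_1 lie in the unit box,
; and we assert the NEGATION of "output Y_0 dominates Y_1".
(declare-const X_0 Real)
(declare-const X_1 Real)
(declare-const Y_0 Real)
(declare-const Y_1 Real)
(assert (>= X_0 0.0))
(assert (<= X_0 1.0))
(assert (>= X_1 0.0))
(assert (<= X_1 1.0))
; UNSAT means the property "Y_0 > Y_1" holds on the box.
(assert (<= Y_0 Y_1))
```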
- Precise Verification of Transformers through ReLU-Catalyzed Abstraction Refinement
A ReLU-catalyzed abstraction method yields tighter bounds for transformer verification by converting dot-product constraints into ReLU forms that leverage standard convex relaxations.
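The "standard convex relaxations" the summary refers to include the classic triangle relaxation of ReLU. A minimal sketch (not the paper's method, just the underlying relaxation it builds on): for pre-activation bounds l <= x <= u with l < 0 < u, ReLU(x) is sandwiched between two linear functions.

```python
def relu_triangle_relaxation(l: float, u: float):
    """Return (lower_slope, lower_bias, upper_slope, upper_bias) such that
    lower_slope*x + lower_bias <= ReLU(x) <= upper_slope*x + upper_bias on [l, u].

    Upper bound (triangle relaxation): y <= u*(x - l)/(u - l).
    Lower bound: the tighter of y >= 0 and y >= x, chosen by which
    halves of the interval dominate (a common heuristic)."""
    if u <= 0.0:          # ReLU is identically zero on [l, u]
        return 0.0, 0.0, 0.0, 0.0
    if l >= 0.0:          # ReLU is the identity on [l, u]
        return 1.0, 0.0, 1.0, 0.0
    upper_slope = u / (u - l)
    upper_bias = -l * upper_slope
    lower_slope = 1.0 if u >= -l else 0.0
    return lower_slope, 0.0, upper_slope, upper_bias
```

For example, on the symmetric interval [-1, 1] this yields the bounds 0 <= ReLU(x) and x <= ReLU(x) <= 0.5*x + 0.5.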
- Formally Verifying Analog Neural Networks Under Process Variations Using Polynomial Zonotopes
A polynomial-based circuit model combined with polynomial zonotope reachability analysis verifies analog neural networks under process variations, reducing verification time from days to seconds while enclosing 99% of variation samples across three datasets.
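To illustrate the flavor of zonotope reachability (the paper uses the more expressive polynomial zonotopes; this sketch uses plain zonotopes, and all names here are illustrative): a zonotope is a center plus generator directions, and it propagates exactly through affine maps.

```python
import numpy as np

class Zonotope:
    """Set {c + G @ e : e in [-1, 1]^k} - a simplified, non-polynomial
    zonotope for illustration only."""

    def __init__(self, center, generators):
        self.c = np.asarray(center, dtype=float)
        self.G = np.asarray(generators, dtype=float)  # shape (dim, k)

    def affine(self, W, b):
        """Exact image under x -> W @ x + b (affine maps preserve zonotopes)."""
        W = np.asarray(W, dtype=float)
        return Zonotope(W @ self.c + np.asarray(b, dtype=float), W @ self.G)

    def interval_hull(self):
        """Tightest axis-aligned box: center +/- sum of |generators|."""
        r = np.abs(self.G).sum(axis=1)
        return self.c - r, self.c + r
```

Propagating the box [-0.1, 0.1]^2 (two generators) through the matrix [[1, 2], [3, 4]] gives the enclosing box [-0.3, 0.3] x [-0.7, 0.7].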
- The Luna Bound Propagator for Formal Analysis of Neural Networks
Luna is a C++ bound propagator supporting interval, DeepPoly/CROWN, and alpha-CROWN analyses; it achieves tighter bounds and faster runtimes than the leading Python alpha-CROWN implementation on VNN-COMP 2025 benchmarks.
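Interval analysis is the simplest member of the family such tools support. A minimal sketch (not Luna's implementation) of interval bound propagation through one linear layer and a ReLU, splitting the weight matrix into its positive and negative parts:

```python
import numpy as np

def interval_linear(lo, hi, W, b):
    """Propagate the box [lo, hi] through x -> W @ x + b.
    Positive weights pick up the matching bound; negative weights flip it."""
    W = np.asarray(W, dtype=float)
    b = np.asarray(b, dtype=float)
    Wp = np.clip(W, 0.0, None)   # positive part of W
    Wn = np.clip(W, None, 0.0)   # negative part of W
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

def interval_relu(lo, hi):
    """ReLU is monotone, so it applies elementwise to both bounds."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)
```

For x in [-1, 1]^2 and the single-row layer W = [[1, -1]], b = [0], this yields pre-activation bounds [-2, 2] and post-ReLU bounds [0, 2]. Tighter analyses like DeepPoly/CROWN refine exactly this step by tracking linear relations instead of boxes.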
- Quantitative Linear Logic for Neuro-Symbolic Learning and Verification