Existence of a high-arity sample compression scheme of non-trivial quality implies high-arity PAC learnability.
2 Pith papers cite this work. Polarity classification is still indexing.
Fields: cs.LG
Year: 2026
Verdicts: 2 (unverdicted)
Representative citing papers: 2
Citing papers explorer:

- High-arity Sample Compression
  Existence of a high-arity sample compression scheme of non-trivial quality implies high-arity PAC learnability.
- Functional Similarity Metric for Neural Networks: Overcoming Parametric Ambiguity via Activation Region Analysis
  A functional similarity metric for ReLU networks uses normalized activation region signatures and MinHash to overcome parametric symmetries like neuron permutation and scaling.
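The second entry's one-line summary can be made concrete with a small sketch. This is an illustrative guess at the general idea, not the paper's actual construction: ReLU activation sign patterns are recorded over a shared set of probe inputs, neuron columns are canonicalized by lexicographic sorting (a crude stand-in for the paper's normalization, which makes the signature invariant to neuron permutation), the canonical patterns are hashed into a set, and two networks are compared by MinHash-estimated Jaccard similarity. The probe data, layer sizes, and hash family below are all assumptions.

```python
import hashlib
import numpy as np

P = (1 << 61) - 1  # Mersenne prime for the a*x+b mod P hash family


def activation_patterns(weights, biases, X):
    """Per-probe binary ReLU activation patterns, one column per neuron.

    Sign patterns are also unchanged by positively rescaling a neuron's
    incoming weights while rescaling its outgoing weights inversely.
    """
    cols, h = [], X
    for W, b in zip(weights, biases):
        z = h @ W + b
        mask = z > 0
        cols.append(mask)
        h = z * mask
    return np.concatenate(cols, axis=1)


def canonicalize(patterns):
    # Sort neuron columns lexicographically: permuting neurons permutes
    # columns, so the sorted matrix is permutation-invariant.
    return patterns[:, np.lexsort(patterns)]


def pattern_set(patterns):
    # Hash each probe's canonical pattern to a stable 64-bit set element.
    return {int.from_bytes(hashlib.blake2b(row.tobytes(), digest_size=8).digest(), "big")
            for row in patterns}


def minhash(elements, num_perm=64, seed=0):
    rng = np.random.default_rng(seed)
    a = rng.integers(1, P, size=num_perm)
    b = rng.integers(0, P, size=num_perm)
    return [min((int(ai) * e + int(bi)) % P for e in elements)
            for ai, bi in zip(a, b)]


def similarity(sig1, sig2):
    # Fraction of matching MinHash slots estimates Jaccard similarity.
    return sum(x == y for x, y in zip(sig1, sig2)) / len(sig1)


rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))                       # shared probe inputs
W1, b1 = rng.normal(size=(4, 16)), rng.normal(size=16)
W2, b2 = rng.normal(size=(16, 3)), rng.normal(size=3)

perm = rng.permutation(16)                          # permute hidden neurons
net_a = ([W1, W2], [b1, b2])
net_b = ([W1[:, perm], W2[perm]], [b1[perm], b2])   # same function, new params
net_c = ([rng.normal(size=(4, 16)), rng.normal(size=(16, 3))],
         [rng.normal(size=16), rng.normal(size=3)])  # unrelated network


def signature(net):
    W, b = net
    return minhash(pattern_set(canonicalize(activation_patterns(W, b, X))))


s_a, s_b, s_c = map(signature, (net_a, net_b, net_c))
print(similarity(s_a, s_b))   # 1.0: the permuted copy computes the same function
print(similarity(s_a, s_c))   # much lower for the unrelated network
```

The permuted network `net_b` realizes exactly the same function as `net_a`, so its canonicalized pattern set, and hence its MinHash signature, is identical; a parameter-space metric would see the two as far apart.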