arXiv preprint arXiv:1704.00805
5 Pith papers (years: 2026) cite this work. Polarity classification is still indexing, so all 5 verdicts are currently unverdicted. 5 representative citing papers are listed below.
citing papers explorer
-
On Bayesian Softmax-Gated Mixture-of-Experts Models
Bayesian softmax-gated mixture-of-experts models achieve posterior contraction rates for density estimation and, under Voronoi losses, for parameter recovery; the paper also proposes two strategies for selecting the number of experts.
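For context, one standard construction of a Voronoi loss between a fitted mixing measure and the true one (following the general recipe used in this line of work; the exact exponents and norms in the paper may differ) is:

```latex
% Fitted mixing measure G = \sum_{i=1}^{k} p_i \delta_{\theta_i},
% true mixing measure G_* = \sum_{j=1}^{k_*} p^*_j \delta_{\theta^*_j}.
% Voronoi cell of the j-th true atom: the fitted atoms closest to it.
\mathcal{A}_j = \bigl\{\, i : \|\theta_i - \theta^*_j\| \le \|\theta_i - \theta^*_\ell\| \ \ \forall \ell \,\bigr\}
% Voronoi loss: within-cell parameter error plus a mass-matching error.
\mathcal{D}(G, G_*) \;=\; \sum_{j=1}^{k_*} \sum_{i \in \mathcal{A}_j} p_i \,\|\theta_i - \theta^*_j\|^{\bar r_j}
  \;+\; \sum_{j=1}^{k_*} \Bigl|\, \sum_{i \in \mathcal{A}_j} p_i - p^*_j \,\Bigr|
```

Here the exponent $\bar r_j \ge 1$ is larger for cells fitted by more than one atom (over-fitted cells), which is what produces slower contraction rates for the parameters of those components.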
-
Optimizing Server Placement for Vertical Federated Learning in Dynamic Edge/Fog Networks
SC-DN establishes a global first-order stationary point per round and solves a mixed-integer signomial program to jointly optimize four control variables for vertical federated learning (VFL), yielding better classification performance and lower resource use than greedy baselines on image and multi-modal data.
-
Rethinking Intrinsic Dimension Estimation in Neural Representations
Common intrinsic-dimension (ID) estimators fail to track the true intrinsic dimension of neural representations and are instead driven by other factors.
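As a concrete example of the kind of estimator under scrutiny, here is a minimal numpy-only sketch of the TwoNN estimator (Facco et al. 2017), a common choice in this literature; this is my own simplified implementation, applied to synthetic data with a known latent dimension, not code from the paper:

```python
import numpy as np

def twonn_id(X):
    """TwoNN intrinsic-dimension estimate: d_hat = N / sum(log(r2/r1)),
    where r1, r2 are each point's two nearest-neighbor distances."""
    G = X @ X.T                                   # Gram matrix
    sq = np.diag(G)
    D2 = np.clip(sq[:, None] + sq[None, :] - 2.0 * G, 0.0, None)
    np.fill_diagonal(D2, np.inf)                  # exclude self-distance
    D2.sort(axis=1)                               # row-wise ascending
    r1, r2 = np.sqrt(D2[:, 0]), np.sqrt(D2[:, 1])
    mu = r2 / r1
    mu = mu[mu > 1.0]                             # drop exact ties/duplicates
    return mu.size / np.log(mu).sum()

rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 3))                    # 3-D latent cloud
X = Z @ rng.normal(size=(3, 20))                  # linearly embedded in 20-D
print(round(twonn_id(X), 2))                      # typically close to 3, not 20
```

On clean linear embeddings like this the estimate recovers the latent dimension; the paper's point is that on real neural representations such estimators can instead respond to curvature, noise, and density variation.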
-
Structure-Centric Graph Foundation Model via Geometric Bases
SCGFM creates transferable graph representations by aligning heterogeneous topologies to shared learnable geometric bases via Gromov-Wasserstein distances and re-encoding features accordingly.
-
Learning Cut Distributions with Quantum Optimization
A QAOA ansatz with finitely many layers can capture any bitstring distribution, and the approach solves the Fair Cut Cover problem with provable and empirical advantages over classical approximation algorithms on certain graph families.
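To make the ansatz concrete, here is a minimal numpy statevector sketch of a depth-p QAOA circuit for standard MaxCut (not the paper's Fair Cut Cover objective); the graph and the angle grid are illustrative choices of mine:

```python
import numpy as np
from itertools import product

def cut_values(n, edges):
    """Number of cut edges for every bitstring z in 0..2^n-1."""
    z = np.arange(2 ** n)
    bits = (z[:, None] >> np.arange(n)) & 1
    return sum(bits[:, u] ^ bits[:, v] for u, v in edges).astype(float)

def qaoa_probs(n, edges, gammas, betas):
    """Output distribution of a depth-p QAOA circuit (statevector sim):
    alternate the diagonal phase separator exp(-i*g*C) with an
    RX(2b) transverse-field mixer on every qubit."""
    costs = cut_values(n, edges)
    psi = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)   # |+>^n start state
    for g, b in zip(gammas, betas):
        psi = np.exp(-1j * g * costs) * psi               # phase separator
        c, s = np.cos(b), -1j * np.sin(b)
        for q in range(n):                                # mixer, qubit by qubit
            psi = psi.reshape(2 ** (n - q - 1), 2, 2 ** q)
            psi = np.stack([c * psi[:, 0] + s * psi[:, 1],
                            c * psi[:, 1] + s * psi[:, 0]], axis=1).reshape(-1)
    return np.abs(psi) ** 2

# 3-node path graph; uniform sampling cuts 1.0 edge in expectation
edges, n = [(0, 1), (1, 2)], 3
best = max(qaoa_probs(n, edges, [g], [b]) @ cut_values(n, edges)
           for g, b in product(np.linspace(0, np.pi, 12), repeat=2))
print(best)   # even depth 1 beats the uniform-sampling baseline here
```

The expressivity claim in the paper concerns what distributions such circuits can represent as the depth grows; this sketch only shows the mechanics of preparing and scoring one such distribution.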