10 Pith papers cite this work.
-
Raising the Ceiling: Better Empirical Fixation Densities for Saliency Benchmarking
A mixture model with adaptive KDE and per-image cross-validation raises the estimated ceiling of human fixation consistency by 5-15% in median log-likelihood and up to 2 AUC points over fixed-bandwidth Gaussian baselines.
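A minimal sketch of one ingredient of this approach: per-image bandwidth selection by leave-one-out cross-validated log-likelihood for a 2-D Gaussian KDE over fixation points. The data, the candidate bandwidth grid, and all function names are illustrative assumptions; the paper's full mixture and adaptive-bandwidth machinery is not reproduced here.

```python
import math

def gauss2d(dx, dy, h):
    """Isotropic 2-D Gaussian kernel with bandwidth h."""
    return math.exp(-(dx * dx + dy * dy) / (2 * h * h)) / (2 * math.pi * h * h)

def loo_log_likelihood(points, h):
    """Leave-one-out log-likelihood of a fixed-bandwidth KDE on `points`."""
    n = len(points)
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        dens = sum(gauss2d(xi - xj, yi - yj, h)
                   for j, (xj, yj) in enumerate(points) if j != i) / (n - 1)
        total += math.log(max(dens, 1e-300))
    return total

def select_bandwidth(points, grid):
    """Per-image bandwidth: maximize the LOO log-likelihood over a grid."""
    return max(grid, key=lambda h: loo_log_likelihood(points, h))

# Toy "fixations" in two clusters, in normalized image coordinates.
fixations = [(0.0, 0.0), (0.1, 0.05), (-0.05, 0.1), (1.0, 1.0), (1.1, 0.95)]
best_h = select_bandwidth(fixations, [0.05, 0.1, 0.2, 0.5, 1.0])
```

Choosing the bandwidth per image, rather than one global value, is what lets the empirical density adapt to images with tight versus dispersed fixation patterns.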
-
Profile Likelihood Inference for Anisotropic Hyperbolic Wrapped Normal Models on Hyperbolic Space
The profile maximum likelihood estimator for the location in anisotropic hyperbolic wrapped normal models is strongly consistent, asymptotically normal, and attains the Hájek-Le Cam minimax lower bound under squared geodesic loss.
-
Newton's Algorithm as a Gradient Flow: A Geometric Framework for Recursive Mixture Estimation
Newton's recursive mixture estimator is a discrete gradient flow on the Fisher-Rao manifold of probability measures.
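Newton's recursive estimator of a mixing distribution can be sketched on a fixed grid of component parameters: each observation moves the current estimate a step toward its posterior over components, with a decaying step size. The Gaussian component kernel, the grid, and the 1/(i+1) weights are illustrative assumptions, not the paper's exact setup.

```python
import math
import random

def newton_recursive_mixture(data, grid, sigma=1.0):
    """Newton's recursive mixture estimator on a discrete parameter grid.

    Update: p_i = (1 - w_i) * p_{i-1} + w_i * (posterior of p_{i-1} given x_i),
    with step size w_i = 1 / (i + 1); the gradient-flow view interprets this
    step as discrete-time descent on the Fisher-Rao manifold.
    """
    p = [1.0 / len(grid)] * len(grid)   # uniform initial estimate
    for i, x in enumerate(data, start=1):
        lik = [math.exp(-(x - t) ** 2 / (2 * sigma ** 2)) for t in grid]
        norm = sum(pj * lj for pj, lj in zip(p, lik))
        w = 1.0 / (i + 1)
        p = [(1 - w) * pj + w * pj * lj / norm for pj, lj in zip(p, lik)]
    return p

random.seed(0)
grid = [-2.0, 0.0, 2.0]
data = [random.gauss(2.0, 1.0) for _ in range(500)]  # all mass near theta = 2
p_hat = newton_recursive_mixture(data, grid)
```

Because each update is a convex combination of the previous estimate and a posterior, the iterate stays a probability vector at every step.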
-
Fast and accurate noise removal by curve fitting using orthogonal polynomials
Reformulating local polynomial fitting with orthogonal Chebyshev polynomials yields two algorithms that cut memory use, improve scalability, and deliver orders-of-magnitude better numerical accuracy than Vandermonde-based methods for Savitzky-Golay filters.
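The conditioning gain is easy to demonstrate: fitting the same local window with raw powers (a Vandermonde basis) versus Chebyshev polynomials mapped to [-1, 1] gives identical fitted values but very different normal-equation conditioning. The window size, degree, and toy signal below are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Degree-3 local fit over a window of 2m+1 samples, once via raw powers
# (Vandermonde) and once via Chebyshev polynomials on [-1, 1].
m, deg = 7, 3
x = np.arange(-m, m + 1, dtype=float)
rng = np.random.default_rng(0)
y = np.sin(0.3 * x) + 0.05 * rng.standard_normal(x.size)

V = np.vander(x, deg + 1, increasing=True)   # columns 1, x, x^2, x^3
T = C.chebvander(x / m, deg)                 # columns T_0..T_3 on [-1, 1]

cond_V = np.linalg.cond(V.T @ V)             # normal-equation conditioning
cond_T = np.linalg.cond(T.T @ T)

coef_V, *_ = np.linalg.lstsq(V, y, rcond=None)
coef_T, *_ = np.linalg.lstsq(T, y, rcond=None)
center_V = V[m] @ coef_V                     # smoothed value at window center
center_T = T[m] @ coef_T
```

Both bases span the same cubic space, so the fitted (smoothed) values agree; the orthogonal basis simply reaches them through a far better-conditioned system, which is where the accuracy advantage over Vandermonde-based Savitzky-Golay implementations comes from.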
-
Supercharging Bayesian Inference with Reliable AI-Informed Priors
Rectified AI priors, obtained by correcting AI-induced data laws before embedding them in techniques like Dirichlet process priors, reduce bias, improve credible interval coverage, and boost performance in tasks like skin disease classification.
-
Scale selection for geometric medians on product manifolds
Joint location-scale minimization for geometric medians on product manifolds degenerates to marginal medians, and three new scale-selection methods restore identifiability with asymptotic guarantees.
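For context, the (Euclidean) geometric median that these product-manifold constructions generalize can be computed by the classical Weiszfeld iteration; this sketch is the flat special case only and says nothing about the paper's scale-selection methods. The data points and iteration budget are illustrative assumptions.

```python
import math

def geometric_median(points, iters=200, eps=1e-12):
    """Weiszfeld iteration for the geometric median in R^d.

    Starting from the centroid, repeatedly move to the inverse-distance
    weighted mean of the data; each step does not increase the sum of
    distances to the points.
    """
    d = len(points[0])
    m = [sum(p[k] for p in points) / len(points) for k in range(d)]
    for _ in range(iters):
        num = [0.0] * d
        den = 0.0
        for p in points:
            dist = math.dist(p, m)
            if dist < eps:          # iterate landed exactly on a data point
                return list(p)
            w = 1.0 / dist
            den += w
            for k in range(d):
                num[k] += w * p[k]
        m = [nk / den for nk in num]
    return m

# Three clustered points plus one outlier: the median stays near the cluster.
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (10.0, 10.0)]
med = geometric_median(pts)
```

The robustness visible here (the outlier barely moves the median) is exactly the property one wants to preserve on manifolds, which is why the degeneracy of the joint location-scale formulation matters.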
-
Bayesian Modeling and Prediction of Generalized Contact Matrices
A Bayesian model for multi-feature contact matrices uses tensor structures and contingency-table theory to satisfy structural constraints and impute missing contact features; it is validated on simulations and on US and German survey data.
-
Divisible sandpiles via random walks in random scenery
On infinite bounded-degree graphs, divisible sandpiles with i.i.d. initial masses of mean μ stabilize almost surely if μ < 1 and the masses have a finite pth moment for some p > 3, but explode if μ ≥ 1; counterexamples on other graphs show these conditions are nearly sharp.
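The toppling rule itself is simple to state in code. This sketch runs the divisible sandpile on a small cycle (a finite toy, not the infinite graphs the theorem concerns): any site with mass above 1 keeps 1 and splits its excess equally between its two neighbours. The initial mass profiles and sweep budget are illustrative assumptions.

```python
def stabilize_cycle(mass, max_sweeps=10000, tol=1e-9):
    """Divisible sandpile on a cycle of len(mass) sites.

    Sweep over sites; each site with mass > 1 retains 1 and sends half of
    its excess to each neighbour. Mass is conserved by every toppling.
    Returns the stable configuration, or None if the sweep budget runs out.
    """
    s = list(mass)
    n = len(s)
    for _ in range(max_sweeps):
        unstable = False
        for i in range(n):
            excess = s[i] - 1.0
            if excess > tol:
                unstable = True
                s[i] = 1.0
                s[(i - 1) % n] += excess / 2
                s[(i + 1) % n] += excess / 2
        if not unstable:
            return s
    return None

sub = stabilize_cycle([0.5, 1.8, 0.2, 0.9])   # mean < 1: settles
sup = stabilize_cycle([1.2, 1.3, 1.1, 1.4])   # mean > 1: never settles
```

On a finite graph the dichotomy is driven by conservation of mass: with mean above 1 some site must always exceed 1, so stabilization is impossible, mirroring the μ ≥ 1 explosion on infinite graphs.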
-
Distributionally Robust K-Means Clustering
Distributionally robust k-means minimizes worst-case squared distance over a Wasserstein-2 ball around the empirical distribution, yielding a tractable soft-clustering algorithm with monotonic block coordinate descent and local linear convergence.
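For orientation, here is the non-robust soft-clustering skeleton that such a block coordinate descent extends: alternate a softmax assignment block with a weighted-centroid block. The Wasserstein worst-case reweighting block that makes the paper's method distributionally robust is deliberately omitted; the temperature `beta`, the deterministic initialization, and the toy data are all assumptions.

```python
import math

def soft_kmeans(points, k, beta=5.0, iters=50):
    """Soft k-means by block coordinate descent.

    Alternates: (1) responsibilities r_c(x) proportional to
    exp(-beta * ||x - m_c||^2), (2) centers move to their
    responsibility-weighted means. Neither block increases the
    free-energy objective, so the iteration is monotone.
    """
    centers = [tuple(p) for p in points[:k]]   # deterministic init: first k points
    for _ in range(iters):
        resp = []
        for x in points:
            logits = [-beta * sum((xi - mi) ** 2 for xi, mi in zip(x, m))
                      for m in centers]
            mx = max(logits)                    # stabilize the softmax
            e = [math.exp(v - mx) for v in logits]
            z = sum(e)
            resp.append([v / z for v in e])
        for c in range(k):
            w = sum(r[c] for r in resp)
            centers[c] = tuple(
                sum(r[c] * x[d] for r, x in zip(resp, points)) / w
                for d in range(len(points[0])))
    return centers

pts = [(0.0, 0.0), (0.2, 0.1), (-0.1, 0.2), (5.0, 5.0), (5.2, 4.9), (4.9, 5.1)]
centers = soft_kmeans(pts, k=2)
```

In the robust formulation, an extra block reweights the empirical points adversarially within a Wasserstein-2 ball before the centroid update, which is what yields the worst-case guarantee while keeping each block tractable.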
-
fastml: Guarded Resampling Workflows for Safer Automated Machine Learning in R
fastml is an R package that enforces leakage-free preprocessing through guarded resampling and provides a unified interface for safer automated ML, including survival analysis.
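fastml itself is an R package; the guarded-resampling idea it enforces is language-agnostic and can be illustrated in a few lines of Python: preprocessing statistics must be fitted on the training fold only, never on data that includes the held-out fold. The toy feature and the `standardize` helper are assumptions for illustration, not fastml's API.

```python
import statistics

def standardize(train, test):
    """Guarded scaling: fit mean/sd on the training fold only, apply to both."""
    mu = statistics.mean(train)
    sd = statistics.pstdev(train) or 1.0
    return [(v - mu) / sd for v in train], [(v - mu) / sd for v in test]

data = list(range(10))           # toy 1-D feature
train, test = data[:8], data[8:]

# Guarded: parameters come from the training fold alone.
tr_ok, te_ok = standardize(train, test)

# Leaky: parameters computed on all data, including the held-out fold.
mu_all, sd_all = statistics.mean(data), statistics.pstdev(data)
te_leaky = [(v - mu_all) / sd_all for v in test]
```

The leaky version quietly shrinks the held-out points toward the training distribution, which is exactly the optimistic bias that guarding the preprocessing inside each resample prevents.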