Adapting the ABC distance function
Abstract
Approximate Bayesian computation performs approximate inference for models where likelihood computations are expensive or impossible. Instead, simulations from the model are performed for various parameter values and accepted if they are close enough to the observations. There has been much progress on deciding which summary statistics of the data should be used to judge closeness, but less work on how to weight them. Typically, weights are chosen at the start of the algorithm to normalise the summary statistics so that they vary on similar scales. However, these weights may not be appropriate in iterative ABC algorithms, where the distribution from which the parameters are proposed is updated. This can substantially alter the resulting distribution of summary statistics, so that different weights are needed for normalisation. This paper presents two iterative ABC algorithms which adaptively update their weights and demonstrates improved results on test applications.
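To illustrate the weighting problem the abstract describes, here is a minimal sketch of ABC rejection sampling in which each summary statistic is scaled by the reciprocal of its median absolute deviation (MAD) before distances are computed. The toy model (a normal distribution with unknown mean), the summary statistics, and all function names are illustrative assumptions, not the paper's algorithms; the paper's contribution is updating such weights adaptively across iterations, which this one-shot sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Toy model (an assumption, not from the paper):
    # normal data with unknown mean theta and unit variance.
    return rng.normal(theta, 1.0, size=n)

def summaries(x):
    # Two summary statistics that naturally vary on different scales.
    return np.array([np.mean(x), np.var(x)])

def abc_rejection(obs, prior_draws, n_keep):
    s_obs = summaries(obs)
    sims = np.array([summaries(simulate(th)) for th in prior_draws])
    # Weight each summary by 1 / MAD so the statistics contribute
    # on comparable scales to the Euclidean distance.
    mad = np.median(np.abs(sims - np.median(sims, axis=0)), axis=0)
    dist = np.sqrt((((sims - s_obs) / mad) ** 2).sum(axis=1))
    # Accept the n_keep parameter values closest to the observations.
    keep = np.argsort(dist)[:n_keep]
    return prior_draws[keep]

obs = simulate(2.0)                          # "observed" data, true mean 2
prior = rng.uniform(-5, 5, size=2000)        # draws from a uniform prior
post = abc_rejection(obs, prior, n_keep=100)
```

In an iterative (e.g. population Monte Carlo) ABC scheme, the proposal distribution concentrates over iterations, changing the spread of the simulated summaries; the paper's point is that the MAD-style weights above should then be recomputed rather than fixed once at the start.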
Forward citations
Cited by 2 Pith papers
- Linked-Tucker Factorized Individualized Regression for Paired Multivariate Categorical Outcomes: A linked Tucker tensor factorization enables a joint individualized hurdle-ordinal regression model that uncovers spatially heterogeneous effects of fluoride and diet on paired caries and fluorosis outcomes.
- Linked-Tucker Factorized Individualized Regression for Paired Multivariate Categorical Outcomes: A linked-Tucker factorized hurdle-ordinal regression model is developed for paired zero-inflated ordinal dental outcomes with individualized effects, applied to the Iowa Fluoride Study to identify spatially heterogene...