Statistical Decisions Using Likelihood Information Without Prior Probabilities
Abstract
This paper presents a decision-theoretic approach to statistical inference that satisfies the likelihood principle (LP) without using prior information. Unlike the Bayesian approach, which also satisfies LP, we do not assume knowledge of the prior distribution of the unknown parameter. With respect to information that can be obtained from an experiment, our solution is more efficient than Wald's minimax solution. However, with respect to information assumed to be known before the experiment, our solution demands less input than the Bayesian solution.
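To make the trade-off in the abstract concrete, here is a minimal, hypothetical sketch (not the paper's actual method) of binary hypothesis testing under 0-1 loss. A maximum-likelihood rule uses only the likelihood function, so it satisfies LP and needs no prior; the Bayes rule requires a prior over the parameter as additional input. All names and the Bernoulli model are illustrative assumptions.

```python
# Illustrative sketch: X ~ Bernoulli(p_theta) with theta in {0, 1}.
# These probabilities are arbitrary choices for the example.
p = {0: 0.2, 1: 0.8}  # P(X = 1 | theta)

def likelihood(theta, x):
    """Likelihood of observing x under hypothesis theta."""
    return p[theta] if x == 1 else 1 - p[theta]

def ml_rule(x):
    """Decide using the likelihood alone -- no prior required."""
    return max((0, 1), key=lambda t: likelihood(t, x))

def bayes_rule(x, prior1):
    """Decide via the posterior -- needs the extra input P(theta=1)=prior1."""
    post1 = likelihood(1, x) * prior1
    post0 = likelihood(0, x) * (1 - prior1)
    return 1 if post1 >= post0 else 0

def risk(rule, theta):
    """Frequentist risk under 0-1 loss: probability of a wrong decision."""
    return sum(likelihood(theta, x) * (rule(x) != theta) for x in (0, 1))

# The worst-case (minimax-style) risk of the likelihood-only rule:
worst_case = max(risk(ml_rule, t) for t in (0, 1))
```

With a flat prior (`prior1 = 0.5`) the Bayes rule coincides with the likelihood rule here; the point of the sketch is only that the likelihood rule is computable without that prior as input.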
Forward citations
Cited by 1 Pith paper
- Possibilistic Predictive Uncertainty for Deep Learning: DAPPr introduces a possibilistic framework that projects parameter posteriors to predictions via supremum and approximates them with Dirichlet possibility functions to yield efficient, closed-form epistemic uncertainty...