Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2026

4 Pith papers cite this work.
Representative citing papers
-
Pandora's Regret: A Proper Scoring Rule for Evaluating Sequential Search
Pandora's Regret is a closed-form pairwise scoring rule derived from expected optimal search costs that elicits true probabilities and outperforms log loss, accuracy, and F1 at predicting diagnostic costs on MedMNIST models.
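The paper's closed-form scoring rule itself is not reproduced here. As background for the "expected optimal search costs" it builds on, here is a minimal sketch of the classical Pandora's-box model, assuming two-point reward distributions; the function names and parameters are illustrative, not from the paper:

```python
import random

def reservation_value(v, p, c):
    """Weitzman reservation value for a box that pays v with probability p
    (else 0) at inspection cost c: solves p * max(v - z, 0) = c."""
    return v - c / p

def expected_search_cost(boxes, trials=20000, seed=0):
    """Monte-Carlo estimate of the expected inspection cost under the
    optimal policy: open boxes in decreasing reservation value and stop
    once the best observed reward beats every remaining reservation value."""
    rng = random.Random(seed)
    order = sorted(boxes, key=lambda b: reservation_value(*b), reverse=True)
    total = 0.0
    for _ in range(trials):
        best = 0.0
        for v, p, c in order:
            if best >= reservation_value(v, p, c):
                break  # stop: best seen beats every remaining index
            total += c  # pay to inspect this box
            if rng.random() < p:
                best = max(best, v)
    return total / trials
```

For boxes `[(10, 0.5, 1.0), (8, 0.8, 0.5)]` the reservation values are 8.0 and 7.375, so the optimal policy always opens the first box and opens the second only when the first pays nothing, for an expected cost of 1 + 0.5 × 0.5 = 1.25.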
-
Fairness Testing for Algorithmic Pricing
Standard OLS fairness tests for deterministic pricing algorithms use invalid standard errors; corrected estimators reveal that all 34 tested Illinois auto insurers discriminate against minority zip codes.
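The paper's corrected estimators are specific to deterministic pricing rules and are not reproduced here. As a generic illustration of why classical OLS standard errors mislead when the error structure is non-classical, a sketch comparing classical and heteroskedasticity-robust (HC0) standard errors on synthetic data:

```python
import math
import random

def simple_ols(x, y):
    """Slope of y on x (with intercept), plus classical and
    heteroskedasticity-robust (HC0) standard errors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    alpha = my - beta * mx
    resid = [yi - alpha - beta * xi for xi, yi in zip(x, y)]
    s2 = sum(e * e for e in resid) / (n - 2)        # assumes homoskedasticity
    se_classical = math.sqrt(s2 / sxx)
    se_robust = math.sqrt(sum(((xi - mx) * e) ** 2   # White / HC0 sandwich
                              for xi, e in zip(x, resid)) / sxx ** 2)
    return beta, se_classical, se_robust

# Synthetic heteroskedastic data: error variance grows with (x - mean)^2,
# exactly the case where the classical formula understates uncertainty.
rng = random.Random(42)
x = [2 * rng.random() for _ in range(4000)]
y = [1 + 2 * xi + 3 * (xi - 1) * rng.gauss(0, 1) for xi in x]
beta, se_c, se_r = simple_ols(x, y)
```

With this error structure the robust standard error comes out larger than the classical one, so a test built on the classical error would over-reject.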
-
Fairness vs Performance: Characterizing the Pareto Frontier of Algorithmic Decision Systems
The Pareto frontier of fair algorithmic decisions consists of deterministic group-specific threshold rules on predicted success probabilities (with upper bounds under some fairness metrics), and this characterization holds independently of the model-training approach.
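Under the characterization in the summary, the frontier can be traced by sweeping per-group thresholds. A minimal sketch on synthetic calibrated scores; the utility and disparity measures below are illustrative choices, not the paper's:

```python
import itertools
import random

def evaluate(data, t0, t1):
    """data: (group, predicted_prob, outcome) triples. Apply threshold t0
    to group 0 and t1 to group 1; return (utility, disparity), where
    utility is net successes among accepted individuals and disparity is
    the acceptance-rate gap between groups."""
    util, rates = 0.0, []
    for g, t in ((0, t0), (1, t1)):
        members = [(p, y) for grp, p, y in data if grp == g]
        accepted = [(p, y) for p, y in members if p >= t]
        rates.append(len(accepted) / len(members))
        util += sum(1.0 if y else -1.0 for _, y in accepted)
    return util, abs(rates[0] - rates[1])

def pareto_frontier(data, grid):
    """Threshold pairs whose (utility, disparity) point is not dominated by
    another pair with utility at least as high and disparity no larger."""
    pts = [(*evaluate(data, t0, t1), (t0, t1))
           for t0, t1 in itertools.product(grid, grid)]
    return [p for p in pts
            if not any(q[0] >= p[0] and q[1] <= p[1] and q[:2] != p[:2]
                       for q in pts)]

# Synthetic calibrated scores: each outcome is drawn with its predicted
# probability, so thresholds act on well-calibrated predictions.
rng = random.Random(0)
data = [(g, p, rng.random() < p)
        for g in (0, 1)
        for p in (rng.random() for _ in range(500))]
frontier = pareto_frontier(data, [i / 10 for i in range(11)])
```

Each frontier point is a pair of deterministic group-specific thresholds; tightening the disparity bound walks along the frontier at the cost of utility.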
-
The Paradox of Prioritization in Public Sector Algorithms
Prioritization algorithms in public services generate relative disparities among intersectional groups as resources become scarce, intensifying perceptions of inequality.
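The mechanism described can be reproduced in a few lines: rank a pooled population by score and shrink the budget; with even a modest score gap between groups, the ratio of allocation rates widens as resources tighten. A toy simulation, with groups, score distributions, and budget levels all illustrative:

```python
import random

def allocation_rate_ratio(scores_a, scores_b, budget_frac):
    """Serve the top budget_frac of the pooled population by score; return
    group A's allocation rate divided by group B's."""
    pooled = sorted(scores_a + scores_b, reverse=True)
    cutoff = pooled[int(budget_frac * len(pooled)) - 1]
    rate_a = sum(s >= cutoff for s in scores_a) / len(scores_a)
    rate_b = sum(s >= cutoff for s in scores_b) / len(scores_b)
    return rate_a / rate_b

# Two groups whose score distributions differ by half a standard deviation;
# the relative disparity in who gets served grows as the budget shrinks.
rng = random.Random(7)
a = [rng.gauss(0.5, 1.0) for _ in range(20000)]
b = [rng.gauss(0.0, 1.0) for _ in range(20000)]
ratios = [allocation_rate_ratio(a, b, f) for f in (0.5, 0.2, 0.05)]
```

`ratios` is increasing across the three budget levels: the scarcer the resource, the larger the relative disparity, even though the underlying score gap never changes.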