pith · machine review for the scientific record

arxiv: 2605.02773 · v1 · submitted 2026-05-04 · 💻 cs.CY

Recognition: unknown

A Critical Pragmatism Approach for Algorithmic Fairness: Lessons from Urban Planning Theory


Pith reviewed 2026-05-08 17:25 UTC · model grok-4.3

classification 💻 cs.CY
keywords algorithmic fairness · critical pragmatism · urban planning · wicked problems · machine learning ethics · stakeholder engagement · algorithm design · governance

The pith

Algorithmic fairness problems can be addressed by applying critical pragmatism from urban planning theory to manage value conflicts and power dynamics.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper tries to establish that algorithmic fairness issues mirror the wicked problems of urban planning, which are complex, value-laden, and involve conflicts over power and resources. A sympathetic reader would care because traditional approaches in machine learning, such as formal fairness definitions, often fail to account for the practical realities of stakeholder engagement and governance that arise in deployment. By adapting critical pragmatism from planning theory, the authors offer a flexible framework that emphasizes reflection on what practitioners actually do in the face of conflict. This is shown through specific recommendations applied to cases like mortgage lending algorithms, school choice systems, and data collection on feminicide. If the parallel holds, it could help data scientists and policymakers create algorithms that better navigate real-world ethical challenges.

Core claim

We draw a parallel between algorithmic fairness problems and urban planning, framing them as wicked problems. We argue that algorithmic fairness can learn from urban planning theory, specifically critical pragmatism, a reflective and deliberative approach to addressing such problems that considers what practitioners actually do in the face of conflict and power. We provide specific recommendations and apply them to case studies in ML and algorithm design: automated mortgage lending, school choice, and feminicide counterdata collection. Researchers and practitioners can incorporate these recommendations derived from urban planning into their ongoing work to more holistically address practical problems arising in fair algorithm design.

What carries the argument

critical pragmatism, a reflective and deliberative approach to addressing wicked problems that considers what practitioners actually do in the face of conflict and power

If this is right

  • Existing fairness frameworks overlook practical challenges of governance, resource allocation, and stakeholder engagement that the new approach addresses.
  • Researchers and practitioners can incorporate specific recommendations from urban planning into their work on fair algorithm design.
  • The framework applies directly to automated mortgage lending, school choice systems, and feminicide counterdata collection.
  • It supports a more holistic response to complex ethical decisions in machine learning by accounting for power dynamics.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • This method could extend to other AI ethics areas like content moderation where technical rules clash with social values.
  • Algorithm design processes might benefit from structured deliberative consultations with affected communities, similar to urban planning meetings.
  • A testable extension would be to run pilot comparisons measuring whether the approach reduces deployment harms more than standard fairness audits alone.
  • Similar parallels may exist with wicked problems in fields like environmental policy or public health resource allocation.

Load-bearing premise

The structural similarities between urban planning conflicts and algorithmic fairness decisions are strong enough for direct transfer of critical pragmatism methods without substantial loss of applicability or need for unstated domain-specific modifications.

What would settle it

A real-world test in which applying the critical pragmatism recommendations to an algorithmic fairness case produced no improvement in stakeholder conflict resolution or perceived fairness outcomes, relative to using only mathematical fairness definitions, would challenge the transferability of the approach.

Figures

Figures reproduced from arXiv: 2605.02773 by Allison Koenecke, Jennah Gosciak, Karen Levy.

Figure 1
Figure 1: Connections are depicted between urban planning theory and algorithmic fairness. Rational-comprehensive planning models and formal fairness definitions are more apt for “tame” or “benign” problems, as discussed in Rittel and Webber [133], while pragmatic planning models and holistic fairness definitions address “wicked problems” [133]. These terms are defined in Sections 2, 3, and 5.2.
Figure 2
Figure 2: Shown is an abstract representation of the pragmatic influence on planning theory and ideas since Rittel and Webber’s [133] seminal work on wicked problems. The selected theories are based on discussions in [83, 81, 62]. The arrow represents the influence of these ideas over time. While algorithmic fairness has analogously taken a “participatory turn,” principles like reflective and deliberative practice…
Figure 3
Figure 3: Presented here are the four elements of a critical pragmatism approach for algorithmic fairness: (1) reflect-in-action, (2) practice deliberation through public meetings, (3) listen critically and negotiate creatively, and (4) share practice stories.
Original abstract

As data scientists grapple with increasingly complex ethical decisions in machine learning (ML) and data science, the field of algorithmic fairness has offered multiple solutions, from formal mathematical definitions to holistic notions of fairness drawn from various academic disciplines. However, navigating and implementing these fairness approaches in practice remains an ongoing challenge. In this paper, we draw a parallel between the types of problems arising in algorithmic fairness and urban planning. We frame algorithmic fairness problems as `wicked problems,' a term originating from the planning and policy space to describe the intractable, value-laden, and complex nature of this work. As such, we argue that the field of algorithmic fairness can learn from theoretical work in urban planning in ameliorating its own set of wicked problems. Urban planning is typically concerned with practical issues of governance, resource allocation, stakeholder engagement, and conflicts involving deep-seated differences. These are challenges that existing fairness frameworks can easily overlook. We present a flexible framework for designing fairer algorithms based on the urban planning theory approach of critical pragmatism -- a reflective and deliberative approach to addressing wicked problems that considers what practitioners actually do in the face of conflict and power. We provide specific recommendations and apply them to several case studies in ML and algorithm design: automated mortgage lending, school choice, and feminicide counterdata collection. Researchers and practitioners can incorporate these recommendations derived from urban planning into their ongoing work to more holistically address practical problems arising in fair algorithm design.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 1 minor

Summary. The paper claims that algorithmic fairness problems are 'wicked problems' similar to those in urban planning, and that critical pragmatism from urban planning theory offers a reflective, deliberative framework for designing fairer algorithms. It derives specific recommendations from this approach and applies them to case studies in automated mortgage lending, school choice, and feminicide counterdata collection, arguing that this can help address practical challenges like stakeholder conflicts and power dynamics overlooked by existing fairness frameworks.

Significance. If the analogy and transfer hold, the work could meaningfully broaden algorithmic fairness research by importing practice-oriented insights on governance and deliberation from urban planning, potentially yielding more context-sensitive and implementable fairness strategies that complement mathematical definitions. The emphasis on what practitioners actually do in the face of conflict is a strength, but the significance is tempered by the absence of demonstrated technical integration or empirical outcomes.

major comments (2)
  1. The central transfer of critical pragmatism assumes structural equivalence between urban planning conflicts and algorithmic design decisions, but the manuscript provides no explicit mechanism showing how deliberative processes would alter concrete technical choices such as loss functions, fairness constraints, or evaluation metrics. This is load-bearing for the claim that the framework yields actionable recommendations for ML pipelines.
  2. In the case studies (automated mortgage lending, school choice, feminicide counterdata), the applications describe high-level recommendations but do not demonstrate how stakeholder deliberation or reflective practice would produce specific modifications to the algorithms or data practices, leaving the operationalization of the framework untested and potentially ad-hoc.
minor comments (1)
  1. The abstract states that the recommendations are 'specific' and applied to case studies, but the level of detail in the provided text remains high-level; clarifying the granularity expected in the full recommendations section would improve reader expectations.
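
To make the first major comment concrete, here is a minimal sketch of the kind of evaluation-metric choice the framework would need to influence. The two metrics shown (demographic parity and equal-opportunity gaps) are standard formal fairness definitions of the sort the abstract contrasts with holistic approaches; the toy lending data are invented for illustration and do not come from the paper.

```python
# Minimal sketch of the formal fairness metrics the referee contrasts with
# deliberative process. Toy data invented for illustration; not from the paper.

def positive_rate(y_pred, group, g):
    """Share of positive predictions (e.g., loan approvals) within group g."""
    idx = [i for i, gr in enumerate(group) if gr == g]
    return sum(y_pred[i] for i in idx) / len(idx)

def demographic_parity_gap(y_pred, group):
    """Absolute difference in approval rates between groups 0 and 1."""
    return abs(positive_rate(y_pred, group, 0) - positive_rate(y_pred, group, 1))

def equal_opportunity_gap(y_pred, y_true, group):
    """Absolute difference in true-positive rates (approval rates among
    genuinely qualified applicants) between groups 0 and 1."""
    def tpr(g):
        idx = [i for i, gr in enumerate(group) if gr == g and y_true[i] == 1]
        return sum(y_pred[i] for i in idx) / len(idx)
    return abs(tpr(0) - tpr(1))

# Hypothetical lending decisions for six applicants in two groups.
approved = [1, 0, 1, 1, 0, 1]
truly_qualified = [1, 0, 1, 1, 1, 1]
group = [0, 0, 0, 1, 1, 1]

print(demographic_parity_gap(approved, group))                   # 0.0
print(equal_opportunity_gap(approved, truly_qualified, group))   # ~0.333
```

The referee's point, restated through this sketch: the two gaps can disagree (here one is zero while the other is not), and nothing in a purely deliberative framework, as described, says which gap to minimize or at what threshold. That selection is exactly where stakeholder deliberation would have to bind to a technical choice.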

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for these constructive comments, which identify key areas where the manuscript can more clearly articulate the link between the proposed framework and technical practice. We address each point below and will revise the paper to strengthen the demonstration of operationalization.

Point-by-point responses
  1. Referee: The central transfer of critical pragmatism assumes structural equivalence between urban planning conflicts and algorithmic design decisions, but the manuscript provides no explicit mechanism showing how deliberative processes would alter concrete technical choices such as loss functions, fairness constraints, or evaluation metrics. This is load-bearing for the claim that the framework yields actionable recommendations for ML pipelines.

    Authors: We agree that the manuscript would benefit from a more explicit illustration of how deliberative outcomes can inform specific technical decisions. The critical pragmatism approach is intended to shape the upstream process of identifying relevant fairness criteria and power dynamics through stakeholder reflection, which then guides downstream choices such as which fairness constraints to impose or which metrics to prioritize. While the paper does not claim a one-to-one mapping, we will add a new subsection with worked examples showing how deliberation in one case study could lead to adjustments in model constraints or evaluation criteria, thereby making the actionability more concrete. revision: partial

  2. Referee: In the case studies (automated mortgage lending, school choice, feminicide counterdata), the applications describe high-level recommendations but do not demonstrate how stakeholder deliberation or reflective practice would produce specific modifications to the algorithms or data practices, leaving the operationalization of the framework untested and potentially ad-hoc.

    Authors: The case studies are presented as illustrative applications of the recommendations rather than as empirical tests of deliberation in action. We acknowledge that they currently remain at a high level and do not simulate the step-by-step effects of reflective practice on algorithm or data modifications. In revision we will expand each case study with additional detail on plausible deliberation outcomes and the resulting concrete changes to data practices or model specifications, while clarifying that these remain hypothetical illustrations given the theoretical nature of the contribution. revision: yes
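
One shape the promised worked examples could take is a mapping from deliberation outcomes to downstream constraint choices, sketched below. The concern phrasings, the mapping, and the constraint names are hypothetical illustrations of the upstream-to-downstream link the rebuttal describes, not recommendations from the paper.

```python
# Hypothetical sketch: deliberation outcomes -> downstream constraint choices.
# Concern names and the mapping are illustrative assumptions, not from the paper.

CONCERN_TO_CONSTRAINT = {
    "unequal approval rates across neighborhoods": "demographic_parity",
    "qualified applicants denied at different rates": "equal_opportunity",
    "similar applicants treated differently": "individual_fairness",
}

def select_constraints(deliberation_concerns):
    """Map concerns raised in a public meeting to the fairness constraints
    a practitioner would then impose during model training or evaluation."""
    return sorted({CONCERN_TO_CONSTRAINT[c]
                   for c in deliberation_concerns
                   if c in CONCERN_TO_CONSTRAINT})

# A mortgage-lending deliberation might surface two of the three concerns:
concerns = {"qualified applicants denied at different rates",
            "similar applicants treated differently"}
print(select_constraints(concerns))
```

Even a toy table like this makes the operationalization question testable: one can ask whether different deliberations would have produced different constraint sets, which is the ad-hocness worry the referee raises.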

Circularity Check

0 steps flagged

No circularity: framework imported from external urban planning theory

full rationale

The paper's derivation consists of framing algorithmic fairness issues as 'wicked problems' (a concept originating in planning literature) and proposing to adapt critical pragmatism methods from urban planning. This is a direct conceptual transfer from an external discipline rather than a derivation from the authors' prior fairness results, fitted parameters, or self-referential definitions. No equations, statistical predictions, or load-bearing self-citations appear in the provided text; the case-study applications are illustrative rather than reductions of the core claim to its inputs. The transferability assumption is an empirical or analogical claim open to external evaluation, not a circularity.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The paper rests on two domain assumptions about problem similarity and transferability; no free parameters or invented entities are introduced.

axioms (2)
  • domain assumption Algorithmic fairness problems qualify as wicked problems in the sense used in urban planning and policy.
    This is the opening framing that justifies importing planning theory.
  • ad hoc to paper Critical pragmatism developed for urban planning can be adapted into actionable recommendations for algorithm design.
    This is the load-bearing transfer step that produces the proposed framework.

pith-pipeline@v0.9.0 · 5562 in / 1268 out tokens · 29456 ms · 2026-05-08T17:25:13.060288+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

176 extracted references · 36 canonical work pages

  1. [1] Atila Abdulkadiroğlu, Parag A. Pathak, and Alvin E. Roth. 2005. The New York City high school match. American Economic Review 95, 2 (2005), 364–367.
  2. [2] Michael Akinwumi, John Merrill, Lisa Rice, Kareem Saleh, and Maureen Yap. 2021. An AI fair lending policy agenda for the federal financial regulators.
  3. [3] Nil-Jana Akpinar, Manish Nagireddy, Logan Stapleton, Hao-Fei Cheng, Haiyi Zhu, Steven Wu, and Hoda Heidari. 2022. A sandbox tool to bias (stress)-test fairness algorithms.
  4. [4] Robert W. Aldridge, Alistair Story, Stephen W. Hwang, Merete Nordentoft, Serena A. Luchenski, Greg Hartwell, Emily J. Tweed, Dan Lewer, Srinivasa Vittal Katikireddi, and Andrew C. Hayward. 2018. Morbidity and mortality in homeless individuals, prisoners, sex workers, and individuals with substance use disorders in high-income countries: a systematic review...
  5. [5] Ernest R. Alexander. 1981. If planning isn't everything, maybe it's something. The Town Planning Review 52, 2 (1981), 131–142.
  6. [6] Ernest R. Alexander. 2022. On planning, planning theories, and practices: A critical reflection. Planning Theory 21, 2 (2022), 181–211.
  7. [7] Philip Allmendinger and Mark Tewdwr-Jones. 2002. The communicative turn in urban planning: Unravelling paradigmatic, imperialistic and moralistic dimensions. Space and Polity 6, 1 (2002), 5–24.
  8. [8] Elizabeth Anderson. 2020. How to be a pragmatist. In The Routledge Handbook of Practical Reason, Ruth Chang and Kurt Sylvan (Eds.). Routledge, Oxford, 83–94.
  9. [9] M. Anderson. 1964. The Federal Bulldozer: A Critical Analysis of Urban Renewal, 1949–1962. M.I.T. Press. https://books.google.com/books?id=Zy8zAAAAMAAJ
  10. [10] Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. 2016. Machine Bias. Retrieved January 21, 2024 from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
  11. [11] Falaah Arif Khan, Eleni Manis, and Julia Stoyanovich. 2022. Towards Substantive Conceptions of Algorithmic Fairness: Normative Guidance from Equal Opportunity Doctrines. In Proceedings of the 2nd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (Arlington, VA, USA) (EAAMO '22). Association for Computing Machinery, New York, ...
  12. [12] Sherry R. Arnstein. 1969. A ladder of citizen participation. Journal of the American Institute of Planners 35, 4 (1969), 216–224.
  13. [13] American Planning Association. [n. d.]. Planning Specializations. https://www.planning.org/choosingplanning/specializations/
  14. [14] Mohammad Javad Azizi, Phebe Vayanos, Bryan Wilder, Eric Rice, and Milind Tambe.
  15. [15] Designing Fair, Efficient, and Interpretable Policies for Prioritizing Homeless Youth for Housing Resources. In Integration of Constraint Programming, Artificial Intelligence, and Operations Research, Willem-Jan van Hoeve (Ed.). Springer International Publishing, Cham, 35–51.
  16. [16] Michael Bacon. 2012. Pragmatism: An Introduction. Polity.
  17. [17] Peggy Bailey. 2022. Addressing the Affordable Housing Crisis Requires Expanding Rental Assistance and Adding Housing Units. https://www.cbpp.org/research/housing/addressing-the-affordable-housing-crisis-requires-expanding-rental-assistance-and
  18. [18] Solon Barocas, Moritz Hardt, and Arvind Narayanan. 2023. Fairness and Machine Learning: Limitations and Opportunities. MIT Press.
  19. [19] Reuben Binns. 2018. Fairness in Machine Learning: Lessons from Political Philosophy. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency (Proceedings of Machine Learning Research, Vol. 81), Sorelle A. Friedler and Christo Wilson (Eds.). PMLR, 149–159. https://proceedings.mlr.press/v81/binns18a.html
  20. [20] Abeba Birhane, William Isaac, Vinodkumar Prabhakaran, Mark Diaz, Madeleine Clare Elish, Iason Gabriel, and Shakir Mohamed. 2022. Power to the People? Opportunities and Challenges for Participatory AI. Article 6 (2022), 8 pages. doi:10.1145/3551624.3555290
  21. [21] Alan Black. 1990. The Chicago area transportation study: A case study of rational planning. Journal of Planning Education and Research 10, 1 (1990), 27–37.
  22. [22] Emily Black, Rakshit Naidu, Rayid Ghani, Kit Rodolfa, Daniel Ho, and Hoda Heidari.
  23. [23] Toward Operationalizing Pipeline-aware ML Fairness: A Research Agenda for Developing Practical Guidelines and Tools. In Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (EAAMO '23). Association for Computing Machinery, New York, NY, USA, 1–11. doi:10.1145/3617694.3623259
  24. [24] Maya Brennan, Patrick Reed, and Lisa A. Sturtevant. 2011. The impacts of affordable housing on education: A research summary. In Washington, DC: Center for Housing Policy and National Housing Conference.
  25. [25] Alexander Buhmann and Christian Fieseler. 2023. Deep learning meets deep democracy: Deliberative governance and responsible innovation in artificial intelligence. Business Ethics Quarterly 33, 1 (2023), 146–179.
  26. [26] Gene Callahan and Sanford Ikeda. 2014. Jane Jacobs' critique of rationalism in urban planning. Cosmos Taxis 1, 3 (2014), 10–19.
  27. [27] Alex Chohlas-Wood, Madison Coots, Sharad Goel, and Julian Nyarko. 2023. Designing equitable algorithms. Nature Computational Science 3, 7 (2023), 601–610.
  28. [28] Alexandra Chouldechova. 2017. Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big Data 5, 2 (2017), 153–163.
  29. [29] Alexandra Chouldechova. 2017. Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments. Big Data 5, 2 (2017), 153–163. doi:10.1089/big.2016.0047. PMID: 28632438.
  30. [30] C. West Churchman, Jean-Pierre Protzen, Melvin M. Webber, and David Krogh. 2007. In Memoriam: Horst W. J. Rittel. Design Issues 23, 1 (2007), 89–91.
  31. [31] Troy Closson. 2024. What Happened When Brooklyn Tried to Integrate Its Middle Schools. https://www.nytimes.com/2024/06/20/nyregion/brooklyn-middle-schools-integration.html
  32. [32] Sarah R. Cohodes, Sean P. Corcoran, Jennifer L. Jennings, and Carolyn Sattin-Bajaj.
  33. [33] When do informational interventions work? Experimental evidence from New York City high school choice. Educational Evaluation and Policy Analysis 47, 1 (2025), 208–236.
  34. [34] Colin Lecher and Maddy Varner. [n. d.]. NYC's School Algorithms Cement Segregation. This Data Shows How. https://themarkup.org/machine-learning/2021/05/26/nycs-school-algorithms-cement-segregation-this-data-shows-how
  35. [35] Consumer Financial Protection Bureau. 2024. Notice and opportunities to comment. Retrieved January 21, 2024 from https://www.consumerfinance.gov/rules-policy/notice-opportunities-comment/
  36. [36] Eric Corbett, Remi Denton, and Sheena Erete. 2023. Power and Public Participation in AI. In Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (Boston, MA, USA) (EAAMO '23). Association for Computing Machinery, New York, NY, USA, Article 8, 13 pages. doi:10.1145/3617694.3623228
  37. [37] Sasha Costanza-Chock. 2020. Design Justice: Community-Led Practices to Build the Worlds We Need. The MIT Press.
  38. [38] Amanda Coston, Ashesh Rambachan, and Alexandra Chouldechova. 2021. Characterizing Fairness Over the Set of Good Models Under Selective Labels. In Proceedings of the 38th International Conference on Machine Learning (Proceedings of Machine Learning Research, Vol. 139), Marina Meila and Tong Zhang (Eds.). PMLR, 2144–2155. https://proceedings.mlr.press/v139...
  39. [39] Paul Davidoff. 1965. Advocacy and pluralism in planning. Journal of the American Institute of Planners 31, 4 (1965), 331–338.
  40. [40] Jenna Davis. 2021. The double-edged sword of upzoning. https://www.brookings.edu/articles/the-double-edged-sword-of-upzoning/
  41. [41] Jenny L. Davis, Apryl Williams, and Michael W. Yang. 2021. Algorithmic reparation. Big Data & Society 8, 2 (2021), 20539517211044808.
  42. [42] Fernando Delgado, Stephen Yang, Michael Madaio, and Qian Yang. 2023. The Participatory Turn in AI Design: Theoretical Foundations and the Current State of Practice. In Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization (Boston, MA, USA) (EAAMO '23). Association for Computing Machinery, New York, NY, U...
  43. [43] Wesley Hanwen Deng, Solon Barocas, and Jennifer Wortman Vaughan. 2025. Supporting Industry Computing Researchers in Assessing, Articulating, and Addressing the Potential Negative Societal Impact of Their Work. Proceedings of the ACM on Human-Computer Interaction 9, 2 (2025), 1–37.
  44. [44] Wesley Hanwen Deng, Manish Nagireddy, Michelle Seng Ah Lee, Jatinder Singh, Zhiwei Steven Wu, Kenneth Holstein, and Haiyi Zhu. 2022. Exploring How Machine Learning Practitioners (Try To) Use Fairness Toolkits. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (Seoul, Republic of Korea) (FAccT '22). Association for ...
  45. [45] Matthew Desmond and Carl Gershenson. 2016. Housing and employment insecurity among the working poor. Social Problems 63, 1 (2016), 46–67.
  46. [46] John Dewey. 1927. The Public and Its Problems. Project Gutenberg.
  47. [47] Catherine D'Ignazio. 2022. Co-Designing for Counterdata Science. Counting Feminicide: Data Feminism in Action (2022).
  48. [48] Catherine D'Ignazio, Isadora Cruxên, Helena Suárez Val, Angeles Martinez Cuba, Mariel García-Montes, Silvana Fumega, Harini Suresh, and Wonyoung So. 2022. Feminicide and counterdata production: Activist efforts to monitor and challenge gender-related violence. Patterns 3, 7 (2022).
  49. [49] Catherine D'Ignazio and Lauren F. Klein. 2023. Data Feminism. MIT Press.
  50. [50] Roel Dobbe, Thomas Krendl Gilbert, and Yonatan Mintz. 2021. Hard choices in artificial intelligence. Artificial Intelligence 300 (2021), 103555.
  51. [51] Cynthia Dwork, Moritz Hardt, Toniann Pitassi, Omer Reingold, and Richard Zemel.
  52. [52] Fairness through awareness. In Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (Cambridge, Massachusetts) (ITCS '12). Association for Computing Machinery, New York, NY, USA, 214–226. doi:10.1145/2090236.2090255
  53. [53] Michael Elsen-Rooney. 2024. How the NYC high school admissions process sorts kids by race, poverty, disability. https://www.chalkbeat.org/newyork/2024/10/16/nyc-high-school-admissions-sorts-students-by-race-poverty-and-disability/
  54. [54] Michael Elsen-Rooney. 2024. NYC reformed high school admissions 20 years ago. Did it make things better? https://www.chalkbeat.org/newyork/2024/12/16/high-school-admissions-reforms-segregation-inequality-conference/
  55. [55] Alessandro Fabris, Nina Baranowska, Matthew J. Dennis, David Graus, Philipp Hacker, Jorge Saldivar, Frederik Zuiderveen Borgesius, and Asia J. Biega. 2025. Fairness and bias in algorithmic hiring: A multidisciplinary survey. ACM Transactions on Intelligent Systems and Technology 16, 1 (2025), 1–54.
  56. [56] Sina Fazelpour and Zachary C. Lipton. 2020. Algorithmic Fairness from a Non-ideal Perspective. In Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society (AIES '20). Association for Computing Machinery, New York, NY, USA, 57–63. doi:10.1145/3375627.3375828
  57. [57] Federal Housing Finance Agency. 2023. FHFA Announces Next Phase of Public Engagement Process for Updated Credit Score Requirements. Retrieved January 21, 2024 from https://www.fhfa.gov/Media/PublicAffairs/Pages/FHFA-Announces-Next-Phase-of-Public-Engagement-Process-for-Updated-Credit-Score-Requirements.aspx
  58. [58] Federal Housing Finance Agency. 2023. Provide Input. Retrieved January 21, 2024 from https://www.fhfa.gov/AboutUs/Contact/Pages/Request-for-Information-Form.aspx
  59. [59] Michael Feffer, Michael Skirpan, Zachary Lipton, and Hoda Heidari. 2023. From Preference Elicitation to Participatory ML: A Critical Survey & Guidelines for Future Research. In Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society. 38–48.
  60. [60] Will Fischer, Sonya Acosta, and Anna Bailey. 2021. An Agenda for the Future of Public Housing. https://www.cbpp.org/research/housing/an-agenda-for-the-future-of-public-housing
  61. [61] John Forester. 1980. Critical theory and planning practice. Journal of the American Planning Association 46, 3 (1980), 275–286.
  62. [62] John Forester. 1982. Planning in the Face of Power. Journal of the American Planning Association 48, 1 (1982), 67–80.
  63. [63] John Forester. 1999. The Deliberative Practitioner: Encouraging Participatory Planning Processes. MIT Press.
  64. [64] John Forester. 2012. Learning to improve practice: Lessons from practice stories and practitioners' own discourse analyses (or why only the loons show up). Planning Theory & Practice 13, 1 (2012), 11–26.
  65. [65] John Forester. 2013. On the theory and practice of critical pragmatism: Deliberative practice and creative negotiations. Planning Theory 12, 1 (2013), 5–22. https://www.jstor.org/stable/26165914
  66. [66] John Forester. 2017. On the evolution of a critical pragmatism. Encounters with Planning Thought (2017), 280–296.
  67. [67] John Forester. 2020. Five generations of theory–practice tensions: enriching socio-ecological practice research. Socio-Ecological Practice Research 2 (2020), 111–119.
  68. [68] John F. Forester. 1988. Planning in the Face of Power. University of California Press.
  69. [69] John F. Forester. 1993. Critical Theory, Public Policy, and Planning Practice. State University of New York Press.
  70. [70] Batya Friedman and Helen Nissenbaum. 1996. Bias in computer systems. ACM Transactions on Information Systems (TOIS) 14, 3 (1996), 330–347.
  71. [71] John Friedmann. 1973. The transactive style of planning. Retracking America: A Theory of Transactive Planning (1973), 171–193.
  72. [72] J. Friedmann. 1987. Planning in the Public Domain. Princeton University Press, NJ (1987).
  73. [73] Herbert J. Gans. 1959. The human implications of current redevelopment and relocation planning. Journal of the American Institute of Planners 25, 1 (1959), 15–26.
  74. [74] Nikhil Garg. 2025. Heterogeneous participation and allocation skews: when is choice "worth it"? arXiv preprint arXiv:2507.03600 (2025).
  75. [75] Marissa Kumar Gerchick, Ro Encarnación, Cole Tanigawa-Lau, Lena Armstrong, Ana Gutiérrez, and Danaé Metaxa. 2025. Auditing the Audits: Lessons for Algorithmic Accountability from Local Law 144's Bias Audits. In Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (FAccT '25). Association for Computing Machinery, New Yo...
  76. [76] Laurie S. Goodman and Christopher Mayer. 2018. Homeownership and the American dream. Journal of Economic Perspectives 32, 1 (2018), 31–58.
  77. [77] Ben Green. 2019. The Smart Enough City: Putting Technology in Its Place to Reclaim Our Urban Future. MIT Press.
  78. [78] Ben Green. 2020. The false promise of risk assessments: epistemic reform and the limits of fairness. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (Barcelona, Spain) (FAT* '20). Association for Computing Machinery, New York, NY, USA, 594–606. doi:10.1145/3351095.3372869
  79. [79] Ben Green. 2022. Escaping the impossibility of fairness: From formal to substantive algorithmic fairness. Philosophy & Technology 35, 4 (2022), 90.
  80. [80] Ben Green and Salomé Viljoen. 2020. Algorithmic realism: expanding the boundaries of algorithmic thought. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (Barcelona, Spain) (FAT* '20). Association for Computing Machinery, New York, NY, USA, 19–31. doi:10.1145/3351095.3372840
Showing first 80 references.