pith. machine review for the scientific record.

arxiv: 1707.00061 · v1 · submitted 2017-06-30 · 💻 cs.CY · cs.CL

Recognition: unknown

Racial Disparity in Natural Language Processing: A Case Study of Social Media African-American English

Authors on Pith: no claims yet
classification 💻 cs.CY · cs.CL
keywords african-american english · natural language processing · racial disparity · social
read the original abstract

We highlight an important frontier in algorithmic fairness: disparity in the quality of natural language processing algorithms when applied to language from authors of different social groups. For example, current systems sometimes analyze the language of females and minorities more poorly than they do that of whites and males. We conduct an empirical analysis of racial disparity in language identification for tweets written in African-American English, and discuss implications of disparity in NLP.
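The kind of measurement the abstract describes — comparing a language identifier's accuracy on tweets from different demographic groups — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the group labels, toy records, and the `per_group_accuracy` helper are hypothetical.

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Compute language-ID accuracy per demographic group.

    records: iterable of (group, predicted_lang, gold_lang) tuples.
    Returns {group: fraction of tweets whose predicted language
    matches the gold label}.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, pred, gold in records:
        total[group] += 1
        correct[group] += int(pred == gold)
    return {g: correct[g] / total[g] for g in total}

# Hypothetical toy data: each tuple is (author group, the
# classifier's predicted language, the gold label). All tweets
# are English; misclassifications show up as non-"en" predictions.
records = [
    ("aae", "en", "en"), ("aae", "da", "en"), ("aae", "en", "en"),
    ("white-aligned", "en", "en"), ("white-aligned", "en", "en"),
    ("white-aligned", "en", "en"), ("white-aligned", "en", "en"),
]

acc = per_group_accuracy(records)
# A positive gap means the classifier does worse on AAE tweets.
disparity = acc["white-aligned"] - acc["aae"]
```

On this toy data the classifier recognizes 4/4 white-aligned tweets but only 2/3 AAE tweets as English, so `disparity` is positive; the paper's contribution is measuring a gap of this shape at scale on real tweets.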

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Ethical and social risks of harm from Language Models

    cs.CL 2021-12 accept novelty 6.0

    The authors provide a detailed taxonomy of 21 risks associated with language models, covering discrimination, information leaks, misinformation, malicious applications, interaction harms, and societal impacts like job...