Daniel E. Ho, Kristen M. Altenburger

When Algorithms Import Private Bias into Public Enforcement: The Promise and Limitations of Statistical Debiasing Solutions

Section: Articles
Volume 175 (2019) / Issue 1, pp. 98-122 (25)
Published 19.12.2018
DOI 10.1628/jite-2019-0001
Summary
We make two contributions to understanding the role of algorithms in regulatory enforcement. First, we illustrate how big-data analytics can inadvertently import private biases into public policy. We show that a much-hyped use of predictive analytics – using consumer data to target food-safety enforcement – can disproportionately harm Asian establishments. Second, we study a solution by Pope and Sydnor (2011), which aims to debias predictors via marginalization while still using information from contested predictors. We find that the solution may be limited when protected groups have distinct predictor distributions, because predictions then rely on model extrapolation. Common machine-learning techniques heighten these problems.
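The marginalization idea studied here can be sketched as follows: estimate the model *including* the protected attribute (so its effect does not load onto correlated predictors), then predict with each unit's protected attribute replaced by its population mean. This is a minimal illustration on hypothetical synthetic data, not the paper's actual implementation; all variable names and data-generating parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: x is a legitimate predictor, g is a
# protected-group indicator, y is the outcome. Group membership shifts
# the distribution of x, mimicking distinct predictor distributions.
n = 5000
g = rng.binomial(1, 0.3, size=n)                 # protected-group indicator
x = rng.normal(loc=1.0 * g, scale=1.0, size=n)   # x correlated with g
y = 2.0 * x + 1.5 * g + rng.normal(size=n)       # true outcome model

# Step 1: fit the full model including g, so the coefficient on x is
# not contaminated by the omitted protected attribute.
X_full = np.column_stack([np.ones(n), x, g])
beta, *_ = np.linalg.lstsq(X_full, y, rcond=None)

# Step 2 (marginalization): predict with g replaced by its population
# mean, so predictions no longer vary with group membership directly.
X_marg = np.column_stack([np.ones(n), x, np.full(n, g.mean())])
y_hat = X_marg @ beta

# Any remaining group gap in predictions flows only through x. When the
# groups' x-distributions differ sharply, however, the fitted model must
# extrapolate across regions where one group has little support.
gap = y_hat[g == 1].mean() - y_hat[g == 0].mean()
```

Under these assumptions the remaining prediction gap is roughly the coefficient on x times the between-group difference in mean x, which illustrates why distinct predictor distributions limit the approach: the debiased predictions still depend on x in regions where the model extrapolates.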