Washington, D.C., is home to 690,000 people and 29 opaque algorithms that influence their daily lives. City agencies use automation for a variety of purposes, including screening housing applicants, predicting criminal recidivism, spotting food-assistance fraud, predicting which high school students will drop out, and informing sentences for young offenders.
That picture of semi-automated urban life comes from a report released by the Electronic Privacy Information Center (EPIC). The nonprofit's 14-month investigation into the city's use of algorithms found that 20 agencies used them, with more than a third deployed in law enforcement or criminal justice. Many city entities declined to fully explain how their technology worked or was used, and the project team concluded that the city is likely running still more algorithms that they were unable to uncover.
The findings are significant beyond DC because they add to evidence that many cities have quietly put bureaucratic algorithms to work across their departments, where they can shape decisions that affect residents' lives.
Government agencies often turn to automation hoping to make bureaucratic processes more efficient or objective, but it is frequently hard for the public to know when such systems are in use, and some have been found to discriminate and produce results that endanger people's lives. In Michigan, an unemployment-fraud detection system with a 93 percent error rate generated 40,000 false fraud accusations. A 2020 analysis by Stanford University and New York University found that nearly half of federal agencies use automated decision-making tools in some capacity.
EPIC dug deep into one city's use of algorithms to give a sense of the many ways they can affect citizens' lives and to encourage people elsewhere to undertake similar exercises. About half of Washington's population identifies as Black, notes Ben Winters, who leads the nonprofit's work on AI and human rights.
“More often than not, automated decision-making systems have disproportionate impacts on Black communities,” Winters says. The project found evidence that automated traffic-enforcement cameras are disproportionately placed in neighborhoods with more Black residents.
Recent pushback against municipal algorithms, particularly in policing, has centered on cities with sizable Black populations. Detroit became a focal point of debates over face recognition after the wrongful arrests of Robert Williams and Michael Oliver in 2019, the result of algorithms misidentifying them. In Baltimore, law enforcement's use of face recognition after Freddie Gray died in police custody in 2015 sparked some of the first legislative probes into the technology.
EPIC hunted for algorithms by searching public disclosures from local agencies and by filing public records requests for contracts, data-sharing agreements, privacy impact assessments, and other documents. Six of the twelve city agencies responded, providing information such as a $295,000 contract with Pondera Systems, a Thomson Reuters company that makes the FraudCaster fraud-detection tool used to screen applicants for food assistance. Earlier this year, California officials found that more than half of the 1.1 million claims by state residents that Pondera's software had flagged as suspicious were in fact legitimate.
However, agencies generally refused to disclose details about their systems, citing trade secrets and confidentiality, which made it practically impossible to catalog every algorithm used in DC. An earlier attempt this year by Yale Law School researchers to count the algorithms used by Connecticut state agencies was similarly thwarted by trade-secrecy claims.
EPIC says governments can help citizens understand how they use algorithms by requiring disclosure whenever a system makes an important decision about a person's life. Some elected officials have also backed the idea of mandatory public registries of government automated decision-making systems. Last month, lawmakers in Pennsylvania proposed an algorithm-registry bill after a screening algorithm flagged low-income parents as risks for neglecting their children.