
Why Meaningful Algorithm Auditing is Key to Protecting Civil Rights in the Digital Age


Employers today rely on various kinds of artificial intelligence (AI) and other automated tools in their hiring processes, including to advertise job opportunities, screen applications, assess candidates, and conduct interviews. Many of these tools carry well-documented risks of discrimination that can exacerbate existing inequities in the workplace. Employers should altogether avoid tools that carry a high risk of discrimination based on disability, race, sex, and other protected characteristics, such as personality assessments and AI-analyzed interviews. But where an employer is considering using, or is already using, an AI tool, robust auditing for discrimination and other harms is one critical step to address the dangers these tools pose and to ensure the employer is not violating civil rights laws.

But as usual, the devil is in the details.

A rigorous and holistic discrimination audit of an automated tool — both before and periodically after deployment — can provide employers information to help them determine whether to adopt a tool at all, what mitigation measures may be needed, and whether they need to abandon a tool after adoption. Auditing can also bring much needed transparency when audits are shared with the public, providing critical information for job applicants, researchers, and regulators. On the other hand, algorithm audits that are not carefully crafted can be gamed to present a misleading picture of the system in question or can serve as a cursory box-checking exercise, potentially legitimizing systems that may be discriminatory.

As regulators and legislators are increasingly focused on addressing the impacts of automated systems in critical areas like hiring and employment, including creating requirements for auditing, these efforts must be carefully crafted to ensure that the audits increase accountability in practice. While there is no one-size-fits-all approach to algorithm auditing, audits for bias and discrimination should:

  • Evaluate the system’s performance using carefully selected metrics — metrics that consider both when the system works and when it fails.
  • Break down performance for people in different groups, including but not limited to race, sex, age, and disability status, and the intersections of those groups.
  • Use data that faithfully represents how the system is used in practice.
  • Be conducted by auditors who are independent from the entity that built or deployed the algorithm.
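To make the disaggregation point concrete, here is a minimal sketch of how an audit might break performance down per group and per intersection of groups. All names, records, and numbers below are invented for illustration; they are not drawn from any real audit or tool:

```python
from collections import defaultdict

# Hypothetical toy records: (race, sex, model_selected, truly_qualified).
# Entirely invented data, for illustration only.
records = [
    ("Black", "F", 0, 1),
    ("Black", "F", 1, 1),
    ("Black", "M", 1, 1),
    ("White", "F", 1, 0),
    ("White", "M", 1, 1),
    ("White", "M", 1, 1),
]

def rates_by_group(records, key):
    """Selection rate and false negative rate for each group defined by `key`."""
    groups = defaultdict(list)
    for r in records:
        groups[key(r)].append(r)
    out = {}
    for g, rows in groups.items():
        selected = sum(sel for _, _, sel, _ in rows)
        # False negatives: qualified candidates the tool screened out.
        qualified = [sel for _, _, sel, qual in rows if qual == 1]
        fn = sum(1 for sel in qualified if sel == 0)
        out[g] = {
            "selection_rate": selected / len(rows),
            "false_negative_rate": fn / len(qualified) if qualified else None,
        }
    return out

# Break performance down by race alone, and by race x sex intersections --
# a tool can look fine on each axis separately yet fail at an intersection.
by_race = rates_by_group(records, key=lambda r: r[0])
by_race_sex = rates_by_group(records, key=lambda r: (r[0], r[1]))
```

Note that the sketch reports failure rates (false negatives among qualified candidates), not just selection rates: a metric that only counts who advances says nothing about whether the system works when it matters.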

In many cases, audits can and should be conducted by interdisciplinary teams of subject matter experts, including social scientists, lawyers, and policy researchers, who consult with the people who will be impacted by these tools as well as with the users of the system itself. Researchers and practitioners have created many different resources describing how these kinds of audits can be operationalized.

Why the details of algorithm audits are so critical

Examining the emerging “bias audits” produced under a recently enacted New York City law (Local Law 144) helps demonstrate why these details are so critical. Under this law, employers using some of these kinds of technologies are required to publish “bias audits” with statistics about how often job applicants advance in the hiring process when an automated tool is used, broken down for people of different races and sexes.
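As a rough illustration of the kind of statistic these audits report, the sketch below computes selection rates and impact ratios, each category’s selection rate divided by that of the most-selected category. The counts and category labels here are hypothetical, not taken from any published audit:

```python
# Invented selection counts by sex: (candidates selected, candidates evaluated).
counts = {"Female": (40, 100), "Male": (60, 100)}

# Selection rate: fraction of each category's candidates the tool advances.
selection_rate = {g: sel / n for g, (sel, n) in counts.items()}

# Impact ratio: each category's rate relative to the most-selected category.
# A value of 1.0 means parity with that category; lower values mean the
# category advances less often.
best = max(selection_rate.values())
impact_ratio = {g: r / best for g, r in selection_rate.items()}
```

A single pair of numbers like this can flag a disparity, but as the article argues, it says nothing about whether the tool is valid, whom it excludes, or how it behaves at the intersections of categories.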

Some news coverage has described this law as requiring employers to “prove their AI hiring software isn’t sexist or racist.” But a closer look at these “bias audits” shows that they are incomplete evaluations of bias and discrimination. First, the auditing requirement applies only to a limited subset of the automated tools used in hiring processes today. So far, we’ve been able to locate only around a dozen bias audits, even though 99 percent of Fortune 500 companies reportedly use some type of automated system in their hiring processes. The law also doesn’t require the audits to assess possible biases related to many characteristics where discrimination in hiring and employment has long been a concern, including disability, age, and pregnancy.

When it comes to what’s in the audits, the required statistics can provide some basic information about which automated tools employers are using in their hiring processes and how many job applications those tools evaluate. But these audits fall short of meaningful transparency in several ways. For example, some of the audits we’ve seen so far don’t even name the tool being audited or its vendor. The audits also don’t examine whether the tools work as advertised or whether they accurately assess the skills or capabilities a job actually requires. In addition, these bias audits may not fully portray the experiences of candidates or the practices of employers, for multiple reasons. Several of the audits, including this one of an AI-driven candidate screening tool and this one of an AI-driven applicant scoring tool, are missing a lot of data on candidates who were evaluated by the automated tool in question.

The published audits also frequently rely on data that is pooled together from multiple employers that use the same tool, even though they may be using the tool in very different ways. Companies characterize these audits as designed to “ensure non-discrimination against protected groups,” when in fact this data pooling may mask stark disparities or discriminatory practices by employers.
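A small, deliberately contrived example shows how pooling can hide disparity: two hypothetical employers using the same tool each select candidates in starkly skewed, opposite ways, yet the pooled data looks perfectly balanced. All figures are invented:

```python
# Invented counts per employer: group -> (candidates selected, candidates evaluated).
employers = {
    "A": {"Male": (90, 100), "Female": (10, 100)},
    "B": {"Male": (10, 100), "Female": (90, 100)},
}

def impact_ratios(counts):
    """Each group's selection rate relative to the most-selected group."""
    rates = {g: sel / n for g, (sel, n) in counts.items()}
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Audited separately, each employer shows a stark disparity...
per_employer = {e: impact_ratios(c) for e, c in employers.items()}

# ...but pooling the same data across employers shows perfect parity.
pooled = {}
for c in employers.values():
    for g, (sel, n) in c.items():
        s, t = pooled.get(g, (0, 0))
        pooled[g] = (s + sel, t + n)
pooled_ratios = impact_ratios(pooled)
```

Here each employer's own impact ratio for the disfavored group is about 0.11, while the pooled ratios are exactly 1.0 for both groups, which is why an audit computed only on pooled data can certify "non-discrimination" that no individual employer's practices actually support.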

More generally, algorithm audits should be publicly available and easy to access as a matter of transparency. Even though employers are required to publish the audits on their websites, so far, we’ve found it quite difficult to locate these bias audits. That’s why we worked with the New York Civil Liberties Union to create a public tracker of all the ones we’ve seen so far (if you know of Local Law 144 bias audits that employers have posted that we missed, let us know by emailing analytics_inquiry@aclu.org).

As automated systems become more entrenched in every part of our lives, audits of these systems can be crucial to identifying and preventing their harms. But for that to be the case, algorithm audits must be holistic, ongoing, and reflective of the ways automated systems are used in practice. Technologists, civil rights advocates, policymakers, and interdisciplinary researchers should work together to ensure that algorithm audits live up to their potential.
