Crime-Predicting AI Fares Worse than Humans in Repeat Offender Study, and It’s Racist Too


The 21st century has seen AI (artificial intelligence) accomplish tasks such as handily defeating humans at chess and quickly teaching them foreign languages.

A more advanced task for a computer would be predicting an offender’s likelihood of committing another crime. That’s the job of an AI system called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). But it turns out the tool is no better than the average bloke, and it can be racist too. That’s exactly what a research team discovered after extensively studying the AI system, which is widely used by judicial institutions.

According to a research paper published in Science Advances, COMPAS was pitted against a group of human participants to test its efficiency and to check how well it fares against the predictions made by ordinary people. The machine and the human participants were given 1,000 case descriptions containing each offender’s age, gender, previous crimes, etc., and were asked to predict the probability that the offender would commit another crime.

With considerably less information than COMPAS (only 7 features compared to COMPAS’s 137), a small crowd of nonexperts is as accurate as COMPAS at predicting recidivism.


COMPAS clocked an overall accuracy of 65.4% in predicting recidivism (the tendency of a convicted criminal to re-offend), which is lower than the human participants’ collective prediction accuracy of 67%.
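To make those percentages concrete, here is a minimal sketch of how an overall accuracy figure like COMPAS’s 65.4% is computed. The data below is hypothetical for illustration only; it is not from the study.

```python
# Illustrative sketch (hypothetical data, not the study's): an overall
# accuracy figure is simply the fraction of predictions that match outcomes.
def accuracy(predictions, outcomes):
    """Return the fraction of cases where the predicted label equals the real outcome."""
    matches = sum(p == o for p, o in zip(predictions, outcomes))
    return matches / len(outcomes)

# Hypothetical labels: 1 = predicted/observed to re-offend, 0 = did not.
predicted = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
actual    = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]
print(f"accuracy: {accuracy(predicted, actual):.0%}")  # → 70%
```

Note that overall accuracy alone hides how errors are distributed across groups, which is exactly the fairness concern raised later in the article.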

Take a moment to reflect on that: an AI system that fares no better than the average person was used by courts to predict recidivism.

Commercial software that is widely used to predict recidivism is no more accurate or fair than the predictions of people with little to no criminal justice expertise who responded to an online survey.


What’s worse, the system was found to be just as susceptible to racial prejudice as its human counterparts when both were asked to predict the probability of recidivism from descriptions that also contained the offenders’ racial information. You shouldn’t be too surprised there, as AI is known to pick up whatever patterns, biases included, are present in the data its human creators train it on.

Although neither side comes close to an acceptable accuracy score, the use of an AI tool that is no better than an average human being raises many questions.

VIA TechCrunch
SOURCE Science Advances