US Schools Are Using AI To Monitor Students

This AI software can find any student on a school campus.

Over the last 10 years, the US has witnessed 180 school shootings with a staggering 356 victims. With gun control laws remaining weak, schools are turning to Artificial Intelligence (AI) to protect students. But an AI system recently installed in about nine US public schools, according to Recode, has raised privacy concerns.

The AI-based software, called Appearance Search, was developed by a company named Avigilon. It can find a student on a school campus based on their appearance: age, gender, clothing and certain facial features.

The software lets school administrators locate a student and trace where they have been on campus, wherever surveillance cameras are installed. It works from physical descriptions such as the person's age, gender and hair colour; by analysing these characteristics, it searches for matches in footage from other cameras. The tool therefore cannot say exactly who a person is, but it can suggest where they might have been while on school grounds.

How Appearance Search Works

The software is integrated into the school's surveillance system. If a school safety officer notices suspicious student activity, they can click on that student in the footage to pull up more information. The software then analyses the student's physical attributes, such as clothing, gender, hair colour and what Avigilon describes as certain facial characteristics, and matches them against people with similar characteristics in other camera feeds.
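To make that matching step concrete, here is a minimal, hypothetical sketch of how appearance-based matching across camera feeds can work in general: each detected person is reduced to a feature vector (an "appearance embedding"), and the selected student's vector is compared against vectors recorded by other cameras. Everything below, from the Sighting class to the similarity threshold, is an illustrative assumption, not Avigilon's actual implementation.

```python
# Hypothetical sketch of appearance-based matching across camera feeds.
# The attribute vectors and the 0.8 threshold are assumptions for illustration only.
from dataclasses import dataclass
import numpy as np


@dataclass
class Sighting:
    camera_id: str
    timestamp: str
    embedding: np.ndarray  # appearance features (clothing colour, build, etc.)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def find_possible_locations(query: np.ndarray,
                            sightings: list[Sighting],
                            threshold: float = 0.8) -> list[Sighting]:
    """Return sightings whose appearance features resemble the selected person.
    The match is probabilistic: it says "someone who looks like this",
    not "this specific person"."""
    matches = [s for s in sightings if cosine_similarity(query, s.embedding) >= threshold]
    return sorted(matches, key=lambda s: s.timestamp)


# Usage: the query vector would come from the frame the safety officer clicked on.
rng = np.random.default_rng(0)
query_vector = rng.standard_normal(128)
feed = [Sighting("cafeteria-cam-2", "2020-02-10T10:03:00", rng.standard_normal(128)),
        Sighting("hallway-cam-5", "2020-02-10T10:07:00", query_vector + 0.01)]
print([s.camera_id for s in find_possible_locations(query_vector, feed)])
```

In a real system the embedding would come from a neural network trained for person re-identification, but the matching logic, comparing vectors and ranking sightings by similarity, is the same basic idea.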

The Privacy Concerns

As with most AI surveillance software, Appearance Search comes with obvious privacy concerns, and many security and privacy experts have questioned it.

“People don’t behave the same when they’re being watched,” reminds Brenda Leong, the director of AI and ethics at the Future of Privacy Forum. “Do we really want both young students and high schoolers, and anybody else, feeling like they’re operating in that environment all the time?”

Although the software is meant to increase security and reduce violence in schools, it is also being used in other ways: school administrators use it to monitor bullying and code-of-conduct violations by students, and to aid investigations into school employees.

The technology is not fully facial-recognition based; rather, it is software that can recognise certain features of the face. Varoon Mathur, a scholar studying machine learning at the AI Now Institute, considers Appearance Search a form of “object detection”, while Logan Koepke of the digital rights nonprofit Upturn calls it “person detection” software, since it analyses features of the whole person rather than the face alone.
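For readers unfamiliar with the distinction, "person detection" locates whole human figures in a frame without saying who they are, which is different from facial recognition. The sketch below uses OpenCV's stock HOG pedestrian detector purely as a generic illustration of person detection; it is not the detector Avigilon uses.

```python
# Generic person detection with OpenCV's built-in HOG pedestrian detector.
# This answers "where are the people in this frame?", not "who are they?",
# which is the distinction Koepke draws. It is NOT Avigilon's detector.
import cv2
import numpy as np

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

# A synthetic frame stands in for a real camera image (no people, so no boxes).
frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)

# Each box is (x, y, width, height) around a detected person; no identity is returned.
boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8), padding=(8, 8), scale=1.05)

for (x, y, w, h) in boxes:
    print(f"person-shaped region at ({x}, {y}), size {w}x{h}")
```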

With this software already in use in US schools, students can hardly expect privacy on campus. School administrators say, however, that there have been no complaints of privacy invasion related to the software.

But Koerber, a former student at Marjory Stoneman Douglas High School, where the software has already been installed, said: “A school shooting could happen at any time and any place. We know that. But do we need to invade the privacy of every person who enrols in a particular school to prevent that? I don’t think that’s the case.”

Source: Vox