Facial recognition (FR) cameras are increasingly used in schools around the world to prevent violence and crime. Instead, they exacerbate racism, commodify data, and normalize surveillance, a recent study shows.

“We found that in schools where students of color predominate, more surveillance technologies are used,” Shobita Parthasarathy, professor at the University of Michigan and lead author of the study, told CyberNews.

From a year-long research project with fellow University of Michigan Ford School of Public Policy researchers Claire Galligan, Hannah Rosenfeld, and Molly Kleinman, she concluded that FR cameras erode privacy, narrow the notion of an “acceptable” student, incite racism, commodify data, and institutionalize inaccuracy. The researchers therefore strongly recommend banning FR technology in schools.
The global FR industry is currently valued at $3.2 billion. You can read the full study here.
Surveillance merely displaces violent behavior

Schools, colleges, and universities in the United States and around the world are increasingly adopting FR, explained Shobita Parthasarathy.

“Perhaps the most well-known case is in the city of Lockport, New York, near Niagara Falls. In 2018, the school district began to use the technology to enhance school security and specifically to identify outsiders,” the professor said.
The technology is also being used in a number of other states, including Texas and Oklahoma: “The technology is primarily used for school security, but in some cases it’s used for identification purposes, including for taking attendance, checking books out from the library, or paying for lunch.”
For the time being, there is not enough data to support the hypothesis that FR technology is deployed more often in dangerous neighborhoods, districts with more minorities, and so on. However, the report discusses how similar technologies have been used in the past.

“We found that in schools where students of color predominate, more surveillance technologies are used (more closed-circuit TV, more metal detectors, more ‘school resource’ officers that have law enforcement powers),” said Shobita Parthasarathy.
The group of researchers found that this has both psychological and educational consequences for these students, creating feelings of anxiety and uncertainty and a worry about being constantly surveilled.
“It also punishes non-conformity, as these kinds of surveillance technologies tend to reinforce narrow ideas of ‘normal’ behaviors and increasingly define non-conforming behavior as criminal or pathological. And yes, I think that given the history, we can expect that data from these students is more likely to be collected and commodified, and these students will be more likely to be subject to privacy and security breaches,” said Shobita Parthasarathy.
She also believes this data could be fed into the law enforcement system, which could then be used to further criminalize these young people of color.
There is no solid evidence that FR cameras reduce violence or crime rates.

“And it would be hard to tell whether they really have a significant effect. School intruders and significant violence (e.g., school shootings) are not frequent events, so it would be hard to find a statistically significant effect. But there is some evidence that suggests that the use of surveillance technology in schools merely displaces any kind of violent behavior to off-campus settings,” Shobita Parthasarathy told CyberNews.
Get used to it: Big Brother is always watching

“FR is poised to expand surveillance beyond even the scope of CCTV, fingerprinting, and Aadhaar, because it will collect biometric data and be able to track all student actions throughout the school day: arrival and departure times, where the student goes and when, library books checked out, and even what the student eats,” the new study reads.
As it combines more data, researchers expect that FR will significantly normalize and entrench surveillance among young people.
“We expect that such surveillance of our children will teach them that it is normal to have little autonomy over their personal data. In an environment in which students have no control over their biometric data, they are likely to leave school with a sense of powerlessness and a distorted understanding of whether and how they can and should protect their data privacy,” the researchers report.
Teaching students that they are distrusted, criminalized, and powerless in school will likely have harmful impacts on their education and development.
Studies have shown that FR algorithms are less accurate at identifying people of color than white people. The technology performs best for lighter-skinned individuals, particularly men, and worst for darker-skinned individuals, particularly women.
The higher error rates of FR for non-white subjects mean that the technology will malfunction for them more often than for their white counterparts.
“This means that black and brown students will be misidentified more often, or not identified at all. The consequences of this will range from inconvenient, such as barring students of color from checking out library books, to extremely damaging, such as criminal investigations. In part because of this, we strongly urge against the use of facial recognition in schools,” the study reads.
FR defines what is normal
FR could create a narrow definition of a normal, acceptable student, researchers argue. Those who don’t fit the standard could be excluded and punished.

“Like CCTV, Aadhaar, and predictive policing, FR privileges some subjects over others. It is more likely to accurately identify white, cisgender, abled students than non-white, gender non-conforming, or disabled students,” the report states.
School administrators could use the technology to punish students who are, for example, not “acceptably” dressed. “Unacceptable” students might face barriers when paying for lunch, checking into class, or gaining access to certain rooms or resources.
“These students might either be unable or unwilling to participate in school activities as a result, which could degrade their educational experiences and opportunities,” researchers say. “Unacceptable” students, they believe, would more often be minorities, gender non-conforming people, or disabled people.
There is also a fear that FR data from schools could become a commodity.
“Their data can be either sold or stolen without their knowledge and consent. Not only does this invade student privacy and compromise their sensitive data, but it also creates a culture that teaches students that it is normal and unremarkable to give away sensitive information. For these reasons, we strongly advise against the implementation of FR in schools,” the study reads.