School Monitoring Software Sacrifices Student Privacy for Unproven Promises of Safety

Deeplinks 2024-09-06

Summary:

Imagine your search terms, keystrokes, private chats, and photographs being monitored every time they are sent. Millions of students across the country don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued devices and accounts. As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority, and LGBTQ youth.

The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, given that suicide is the second leading cause of death among American youth aged 10-14. But no comprehensive or independent study has shown that use of this software improves student safety. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.

That study also found that how to respond to alerts is left to the discretion of school districts themselves. Lacking the resources to address mental health in-house, schools often refer these alerts to law enforcement officers who are untrained and ill-equipped to handle youth mental health crises. When police respond to young people in the midst of such episodes, the encounters can end in disaster. So why are schools still using the software when a congressional investigation found a need for “federal action to protect students’ civil rights, safety, and privacy”? Why are they trading away their students’ privacy for a dubious-at-best marketing claim of safety?

Experts suggest it’s because these supposed technical solutions are easier to adopt than the effective social measures schools often lack the resources to implement. I spoke with Isabelle Barbour, a public health consultant with experience helping schools implement mental health supports. She pointed out that there are considerable barriers to families, kids, and youth accessing health care and mental health supports at the community level, as well as a lack of investment in helping schools effectively address student health and well-being. As a result, many students come to school with unmet needs, and those needs impair their ability to learn. Although there are clear, proven measures that address the burdens youth face, schools often need support (time, mental health expertise, community partners, and a budget) to implement them. Edtech companies market largely unproven plug-and-play products to educational professionals who are stretched thin and seeking a path forward to help kids. Is it any wonder that schools sign contracts they can easily point to when questioned about what they are doing about the youth mental health epidemic?

One example: Gaggle, in its marketing to school districts, claims to have saved 5,790 student lives between 2018 and 2023, according to shaky metrics of its own design. All the while, the company keeps the inner workings of its AI monitoring secret.

Link:

https://www.eff.org/deeplinks/2024/09/school-monitoring-software-sacrifices-student-privacy-unproven-promises-safety

From feeds:

Fair Use Tracker » Deeplinks
CLS / ROC » Deeplinks

Tags:

privacy, student

Authors:

Bill Budington

Date tagged:

09/06/2024, 19:13

Date published:

09/06/2024, 18:12