
Student Monitoring Tools Should Not Flag LGBTQ+ Keywords

One of the more dangerous features of student monitoring tools like GoGuardian, Gaggle, and Bark is their “flagging” functionality. The tools can scan web pages, documents in students’ cloud drives, emails, video content, and more for keywords about topics like sex, drugs, and violence. They then either block or flag this content for review by school administrators. 

But in practice, these flags don’t work well: many of the terms flagged by these student monitoring applications are ambiguous, implicating whole swathes of the net that contain benign content. Worse still, these tools can alert teachers or parents to content that indicates something highly personal about the student, such as the phrase “am I gay,” in a context that implies such a phrase is dangerous. Numerous reports show that the regular flagging of LGBTQ+ content creates a harmful atmosphere for students, some of whom have been outed because of it. This is particularly dangerous when school personnel or family members do not understand or support a queer identity.

We call on all student monitoring tools to remove LGBTQ+ terms from their blocking and flagging lists

Thankfully, some student monitoring software companies have heard the concerns of students and civil liberties groups. Gaggle recently removed LGBTQ+ terms from their keyword list, and GoGuardian has done the same, per our correspondence with the company. We commend these companies for improving their keyword lists—it’s a good step forward, though not nearly enough to solve the general problem of over-flagging by the apps. In our research, LGBTQ+ resources are still commonly flagged for containing words like ‘sex,’ ‘breasts,’ or ‘vagina.’ 

Though these tools are intended to detect signs that a student may be at risk of depression, self-harm, suicide, bullying, or violence, their primary use is disciplinary. Eighty-nine percent of teachers say their schools use some form of student monitoring software, which has a greater negative impact on students from low-income families, Black students, Hispanic students, and students with learning differences. And as we’ve written before, when combined with the criminalization of reproductive rights and an increasing number of anti-trans laws, software that produces and forwards data about what young people type into their laptops, sometimes to police, is a perfect storm of human rights abuses.

The improvements by Gaggle and GoGuardian don’t solve all the problems with student monitoring apps. But these companies have rightly recognized that flagging young people for online activity related to sexual preference or gender identity creates more danger for students than allowing them to research such topics. We call on all student monitoring tools to remove LGBTQ+ terms from their blocking and flagging lists to ensure that young people’s privacy isn’t violated, and to ensure that sexual and gender identity is not penalized. 

This article is part of our EFF Pride series. Read other articles highlighting this year’s work at the intersection of digital rights and LGBTQ+ rights on our issue page.


Published June 22, 2023 at 01:34PM
Read more on eff.org
