
Imagine your search terms, keystrokes, private chats, and photographs being monitored every time they are sent. Millions of students across the U.S. don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued machines and accounts.

"As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth," the Electronic Software Foundation (EFF) says.

The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. That is a noble goal, given that suicide is the second leading cause of death among American youth aged 10 to 14, but no comprehensive or independent study has shown any increase in student safety linked to the use of this software. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.

[-] TexMexBazooka@lemm.ee 2 points 2 months ago

Sooo schools should just provide devices to kids with no monitoring at all?

There shouldn’t be an expectation of privacy on school- or company-provided devices; that isn’t how it works literally anywhere. It’s on the parents to teach their children not to use the device for personal reasons.

Ideally the school machines should be limited to coursework and limited messaging between classmates and teachers; it’s a tool, not a toy.

Idk I just can’t get upset about this. Kids and privacy is kind of a tough one to begin with, I personally think kids shouldn’t have unregulated access to communication devices at all until like 14-15, maybe.

[-] PotentiallyApricots@beehaw.org 12 points 2 months ago

An issue here for me is that the kids can't opt out. Their guardians aren't the ones checking up on their digital behavior; it's an AI system owned by a company, running on a device they are forced or heavily pressured to use by a school district. That's just too much of a power imbalance for an informed decision, to my mind, even if the user in question were an adult. Kids are even more vulnerable. I do not think it is a binary choice between no supervision and complete surveillance. We have to find ways to address potential issues that uphold the humanity of all the humans involved. This seems to me like a bad and also very ineffective way to meet either goal.

[-] TexMexBazooka@lemm.ee 3 points 2 months ago* (last edited 2 months ago)

We just fundamentally disagree on what rights someone is afforded on a company-provided device. Of course they can’t opt out; you don’t get to just opt out of information security policies.

It would be a different beast if the school didn’t allow you to access coursework on a personal machine without installing their bullshit; that’s a huge issue.

[-] hazelnoot@beehaw.org 5 points 2 months ago

It would be a different beast if the school didn’t allow you to access coursework on a personal machine without installing their bullshit; that’s a huge issue.

That's exactly how it works at many places. Students can only use a personal device if it's enrolled in the school's MDM, which grants them just as much control.
