Bad bills and invasive monitoring systems, though sometimes well-meaning, hurt students rather than protect them from the perceived dangers of the internet and social media.
We saw many efforts to bar young people, including students, from digital spaces, to censor what they are allowed to see and share online, and to monitor and control when and how they can do it.
In response, we doubled down on exposing faulty surveillance software, long a problem in many schools across the country.
We launched a new project called the Red Flag Machine, an interactive quiz and report demonstrating the absurd inefficiency, and potential dangers, of the student surveillance software that schools across the country use and that routinely invades the privacy of millions of children.
The project grew out of our investigation of GoGuardian, computer monitoring software used in about 11,500 schools to surveil about 27 million students (mostly in middle and high school), according to the company.
Our investigation showed that the software inaccurately flags massive amounts of useful material.
The software flagged sites about Black authors and artists, the Holocaust, and the LGBTQ+ rights movement.
It also flagged the official Marine Corps fitness guide and the bios of the cast of Shark Tank.
In addition to reading our research about the software, you can take the quiz yourself: it presents websites flagged by the software and asks you to guess which of five possible words triggered the flag.
Congress this year resurrected the Kids Online Safety Act, a bill that would increase surveillance and restrict access to information in the name of protecting children online-including students.
We also called out the brazen Eyes on the Board Act, which aims to end social media use entirely in schools.
We can understand the desire to ensure students are focusing on schoolwork when in class, but this bill dictates to teachers and school officials how to do their jobs, and it imposes unnecessary censorship.
Many schools already don't allow device use in the classroom, and many block social media sites and other content on school-issued devices.
We've seen a slew of state bills that also seek to control what students and young people can access online.
Finally, teachers and school administrators are grappling with whether to allow generative AI use, and whether to deploy detection tools to identify students who have used it.
AI detection tools are highly inaccurate and carry a significant risk of falsely flagging students for plagiarism.
AI use is growing exponentially and will likely have significant impact on students' lives and futures.
Demonizing it only deprives students of knowledge about a technology that may change the world around us.
We'll continue to fight student surveillance and censorship, and we're heartened to see students pushing back against efforts that claim to protect children but actually hand the government control over who gets to see what content.
If you're interested in learning more about protecting your privacy at school, take a look at our Surveillance Self-Defense guide on privacy for students.
This Cyber News was published on www.eff.org. Publication date: Thu, 28 Dec 2023 16:43:04 +0000