
Unsettling observation: Reimagining digital school safety


With the transition to online learning during the COVID-19 pandemic, schools sought a way to maintain student safety through technological means. Districts around the country, including Minneapolis Public Schools (MPS), contracted with software companies such as Gaggle, which monitor and report on student activity and claim to keep students safe. However, the role of these technologies is increasingly being questioned by parents, teachers, and lawmakers, who have raised concerns about student privacy, mental health, and disproportionate discipline connected to surveillance software. The aims of this project are 1) to engage University of Minnesota students in a participatory design process; 2) to generate actionable solutions that address concerns about student privacy, possible mental health impacts, and disproportionate school discipline related to student surveillance; and 3) to determine what additional data, research, or resources are needed to continue addressing the issues related to student surveillance.

On October 12, 2021, The Guardian published “A boy wrote about his suicide attempt. He didn’t realize his school’s Gaggle software was watching,” a story about Teeth Logsdon-Wallace, a 13-year-old from Minneapolis who was “flagged” and whose family received calls from school representatives for writing the word “suicide” in a homework assignment (Keierleber, 2021). This story is one of many highlighting concerns about the surveillance technologies becoming increasingly embedded in our educational system. With the move to online learning ushered in by COVID-19, schools have quickly adopted technologies that efficiently track, monitor, store, and share information about students. Software such as Gaggle, Bark, and GoGuardian is used by school districts across the country to monitor students’ internet browsing activity, documents, and even private chat messages. The software works by scanning student content for a list of flagged keywords such as “drunk,” “hit,” “gay,” and “suicide,” and generating a report when a match is found (Desai-Hunt, 2021). Once a report is received, school administrators decide what action to take (Desai-Hunt, 2021).
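
To make the mechanism described above concrete, the sketch below shows how simple keyword-based flagging works in principle. This is a minimal illustration, not Gaggle’s proprietary implementation: the keyword list, report format, and function names are assumptions chosen only to mirror the behavior described in the reporting cited above.

import re

# Hypothetical keyword list of the kind described in news coverage; the real
# lists used by commercial monitoring software are proprietary.
FLAGGED_KEYWORDS = {"drunk", "hit", "gay", "suicide"}

def flag_document(student_id: str, text: str) -> list[dict]:
    """Return one report entry per flagged keyword found in a student's text."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    matches = words & FLAGGED_KEYWORDS
    # In practice, reports like these are forwarded to school administrators,
    # who decide what action to take.
    return [{"student_id": student_id, "keyword": kw} for kw in sorted(matches)]

# A reflective homework assignment mentioning "suicide" is flagged exactly as a
# genuine crisis would be, because keyword matching carries no sense of context.
print(flag_document("student-123", "I wrote about my suicide attempt and my recovery."))
# [{'student_id': 'student-123', 'keyword': 'suicide'}]

The point of the sketch is that this style of matching has no notion of context, which is why an assignment reflecting on a past experience can trigger the same report as an active crisis.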

According to a recent report by the Center for Democracy and Technology, parents and teachers have expressed several concerns with student monitoring software (Hankerson et al., 2021). Students and parents are typically unaware of the extent of the monitoring or with whom the information is shared, and the majority of students reported that they alter their online behavior when they know they are being surveilled. These concerns have drawn the attention of the federal government, with a group of Democratic leaders recently demanding increased transparency from student surveillance companies and calling the practice a clear invasion of student privacy (Keierleber, 2021c). Beyond violating student privacy, these systems risk “outing” LGBTQIA students to unsupportive family members or causing other trauma. For example, while companies like Gaggle claim to use their proprietary software to protect LGBTQIA students from bullying (CBSN AM, 2021), at least one MPS student has been unintentionally outed to their parents (Desai-Hunt, 2021). Experts and parents have expressed concern that, instead of identifying situations where students need help, surveillance technology could add to feelings of shame, increase suffering, and decrease feelings of safety (Keierleber, 2021b). Another concern is that this software will contribute to the already disproportionate disciplining of students of color (Hankerson et al., 2021).

This project centers on an ideathon, a type of hackathon that will serve as a generative event in which participants radically imagine futures of digital schooling that are inclusive, representative, and truly safe for all students. Research on the effectiveness of Gaggle and similar software is all but nonexistent (Fedders, 2019). Hackathons are established venues for technological innovation, yet little is known about how they can support data justice efforts in the current age of increased digital control, algorithmic bias, and virtual surveillance of young people. They offer a promising avenue for exploring and generating solutions to the digital surveillance and criminalization of young people in an age of digitized learning, working, and socializing. Using the five principles for a participatory design process outlined by Hope et al. (2019), along with Moses’ process of anti-racist design, participants will ideate for two days as part of a larger project seeking to add to the data justice literature.
