Experts on Kids & Social Media Weigh the Pros and Cons of ‘Growing Up in Public’
https://www.the74million.org/article/experts-on-kids-social-media-weigh-the-pros-and-cons-of-growing-up-in-public/
Wed, 17 Jan 2024

Parents are more concerned than ever about their kids’ social media habits, worried about everything from oversharing and cyberbullying to anxiety, depression, sleep and study time.

Recent surveys of young people show that parents’ concerns may be justified: More than half of U.S. teens spend at least four hours a day on these apps. Girls, who are particularly vulnerable, spend an average of nearly an hour more on them per day than boys. Many parents are searching for support. 

Perhaps more than anyone, Carla Engelbrecht and Devorah Heitner are qualified to offer it. They’ve spent years puzzling over how families can help kids understand media from the inside out, and how schools both help and hurt kids’ ability to cope.




Engelbrecht is a longtime children’s media developer. A veteran of Sesame Workshop and PBS Kids Interactive, she spent seven years at Netflix, most recently as its director of product innovation. Engelbrecht was one of the minds behind the streaming service’s interactive Black Mirror film “Bandersnatch” in 2018, which allowed viewers to choose among five possible endings.

Carla Engelbrecht (second from right) appears onstage with colleagues during a Netflix event on Black Mirror’s “Bandersnatch” episode in 2019. Engelbrecht, who was director of product innovation for the streaming service, is now testing a social media platform for children under 13. (Charley Gallay/Getty Images for Netflix)

Engelbrecht is now in public beta testing for Betweened, a new social media platform for kids under 13. She calls it a “course correction” for young people’s social media, aiming to teach them to be more mindful, thoughtful and responsible online.

Heitner is an author and speaker who specializes in helping parents and educators understand how digital technology, especially social media and interactive gaming, shapes kids’ realities. Her books include 2016’s Screenwise: Helping Kids Thrive and Survive in Their Digital World and her new work, Growing Up in Public: Coming of Age in a Digital World.

Speaking to either one would be enlightening, but we decided to facilitate a broader conversation by inviting them to come together (virtually) to share insights and offer a bit of advice for both parents and schools. 

Their conversation with The 74’s Greg Toppo was wide-ranging, covering the effects of the pandemic, the pressures kids feel online and the women’s experiences communicating with their own children.

Devorah Heitner spoke in 2017 at the Roads to Respect Conference in Los Angeles. Heitner’s new book explores the impact of modern technology on childhood, including the effects of increased adult supervision of kids through tracking devices. (Joshua Blanchard/Getty Images for Rape Treatment Center)

The solutions they offer aren’t simple. In Heitner’s words, parents seeking to learn more about their kids’ media usage should pull back their surveillance and “lead with curiosity.”

The conversation has been edited for length and clarity.

The 74: Devorah, tell us a little bit about your new book.

Devorah Heitner: I wrote Growing Up in Public because I was speaking for years about Screenwise in schools and all these other environments, and people said, “O.K., I get that we want to think about quality over quantity with screen time. But we also want to understand what kids’ subjective experience is and not just focus on how many minutes are good or bad.”

People lie about that anyway. People are sort of oblivious to their own screen use sometimes and get over-focused on their kids’. A lot of adults are recognizing: If I could have had a Tumblr or a Twitter or Instagram as a kid, I could have really done a lot of damage to my prospects and opportunities by so openly sharing.

What are we doing to our reputations?

As I started digging into that question, I recognized that parents are really part of the surveillance culture with kids. So are schools with their grading apps, and so are monitoring tools like Life360 or Bark [which keep track of kids’ location, among other functions]. I really started understanding in a fuller way how kids are scrutinized. Kids are growing up very searchable, very public, and some of that is awesome. They have a platform, they can be activists. Some of it is problematic.

The title of your book, Growing Up in Public, says so much about kids’ lives these days. I saw this term the other day: not FOMO, “Fear of Missing Out,” but FOMU, “Fear of Messing Up.” Are those competing interests for young people?

Heitner: Well, there’s definitely a fear of messing up and especially being called out. There’s a lot of “gotcha” culture going on, and kids documenting each other’s screw-ups. And as much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you’re on a group text with somebody for long enough, both of you have probably said a few things you don’t want repeated outside of that context.

I think it’s modeled by adults, but this kind of “gotcha” culture is very insidious and terrifying. And it should be terrifying. 

Carla, tell us a little bit about yourself.

Carla Engelbrecht: I’m a longtime product developer and researcher in the kids’ space. I’ve spent a lot of time making products for kids. I’ve seen for years kids wanting access to Twitter and Facebook and MySpace and TikTok, all through the generations of social media. And they always want what is not made for them. They’re aspirational.

Kids are just plopped into this. And just as you wouldn’t give a new driver the keys to the car and just say, “Go!” — you need to teach them how to drive — there’s the same concept for me with media use. We need to teach our kids. Parents don’t know what they’re doing, because none of us have really been through this before, and they abstain. They need support in learning how to do this. Where Devorah talks about things from that guidance perspective, I’m looking at: How can we build a product for kids that helps them learn? 

It seems to me like Betweened is a site for parents as much as anybody. 

Engelbrecht: There’s definitely two audiences here. There’s absolutely a path where I could build a product for kids and launch them onto it. But I wouldn’t be addressing all the pain points.

Kids want short-form content. They want to create. They want to connect with their peers. In order to successfully set kids up to do that, parents need tools, too. And so it is really a product for both kids and parents.

Carla mentioned all these different apps coming down the road. Devorah, I’m thinking about you saying to someone recently how you’ve been working on this book for five years. A lot has changed in five years. We didn’t have TikTok five years ago. 

Heitner: Screenwise came out in the fall of 2016, which was a memorable time for many reasons: a lot of social forces happening in our world with Trump’s election. 

And then you have the pandemic in 2020. That’s around the time I had sold the book and was trying to interview people. Suddenly, I’m not in schools anymore. I’m on Zoom with kids, which is a whole research problem: How do you get a wider range of kids, not just the super-compliant kids who show up to a Zoom? And the pandemic was an accelerant to a lot of things happening already with kids in tech.


It was certainly not the beginning of kids using apps they were too young for under COPPA [the federal Children’s Online Privacy Protection Act, which gives parents control over what information websites can collect from their kids]. But it accelerated, and there was kind of a push toward things like Messenger Kids [on Facebook] and other things that I even experimented with at the time.

The pandemic started when my son was 10. We were like, “Oh, what can we do to help him communicate with friends?” We experimented with Messenger. It was a fail for us, but I also talked to the people at Pinwheel and Gabb [two mobile phone companies marketed for children]. There are people, in different ways, trying to come up with solutions because they have understood that both the adult apps and the adult devices, like a smartphone that does all the things, might not be the ideal thing to give a 10-year-old. 

What’s changed since 2016 is there used to be more worry about one-to-one computing in schools. Now, every school pretty much is one-to-one. It’s really the outlier schools that don’t have tech or aren’t giving kids individual tech. Even as late as 2015, 2016, I was helping schools negotiate that with parents. And parents were like, “I don’t know. I’m not sure about screen time. I don’t know if I want my kid getting a Chromebook.”

Try to find a school now that doesn’t give kids iPads or Chromebooks or something. That’s probably one of the bigger differences. And then just the explosion in server-based gaming like Roblox and Minecraft and the ways kids interact in those digital communities. You see a lot of very complicated, weird ideas among adults who care about children. Like “I’ll wait until eighth grade to give a kid a phone. Meanwhile, my third-grader plays Roblox on a server with strangers.”

Engelbrecht: Or has access to text messaging through their iPad.

Heitner: Exactly. And they’re very smugly waiting till eighth grade and I’m like, “For what? For your kid to make voice calls?” That’s the one thing they don’t want to do.

Carla, you come from a game design background. People have lots of terrible takes about video games, which I’m sure you’re used to. How has that background informed what you’re doing and what Betweened looks like?

Engelbrecht: A lot of people come to video games and they’re just like, “They’re evil,” or “They’re awful,” or “They’re violent.” And you can say the same thing about television. You can also say the same thing if you only eat broccoli. Anything in excess is not good for you — like running a marathon every day. I take a very pragmatic approach: in most things, we can actually find good.

When I look at video games, I can’t classify them as evil. I instead look for the good things. And it’s the same with social media. Social media as part of a balanced media diet gives parents a lot of opportunities to connect, gives kids a lot of opportunity to express creativity and develop skills. 


I’ll give you an example on the games side of things: Years ago, I did a South by Southwest talk called “What Left 4 Dead Can Teach Us About Parenting.” Left 4 Dead is not a game that kids should ever play. It’s a violent, first-person zombie apocalyptic shooter. It’s also one of the most beautifully designed cooperative games ever. I’m terrible with thumb sticks on video game controllers. I can’t walk in a straight line in a video game. I’m not great at the actual zombie-killing side of things. But I’m really good at running around and picking up health packs and checking in on people who have been damaged by zombies.

So there are different roles that people can play. I can still participate in the game, even though the primary way of playing Left 4 Dead is not what works for me. 

Also, if I’m playing with people, it fosters communication. I have to talk to people and someone needs to say, “Hey, I need help,” and I can come over. That’s what I’m looking for in games and social media: What are those underlying skills that, with a thoughtful perspective, you can leverage for good?

I wanted to switch gears a little bit and talk about something you mentioned earlier, Devorah: casual surveillance. I think about the stories we hear about parents not even just surveilling their kids — tracking their phones or their cars — but just keeping up in a way that we never even dreamed of. I wonder: Where did this come from? And how do you think a site like Betweened is going to help? 

Engelbrecht: I wish I knew exactly where it came from, but it certainly seems it’s symptomatic of the same thing: Everything has just kind of crept up on us. It’s like, as phones started to be introduced, we just thought, “Oh, well, I need to charge my phone, so I’ll charge it next to my bed.” And then the next thing you know, you’re checking it first thing when you wake up. It’s this slippery slope without the mindfulness of what it’s doing. Something has to happen to stop you, to make you take a step back and think, “How far have I gone? What boundaries have I crossed or what new boundary do I need to establish?” And to Devorah’s earlier point, the pandemic accelerated a lot of this.

Heitner: Part of it is we do it because we can. Even in relationships. I’ve known my husband since before we each had cell phones, but we didn’t use to check in as often because we didn’t have cell phones. It had to really rise to the level of an emergency before I would call him at work.


Remember the days of 9-to-5 office jobs? He left in the morning and was at his job. I was a grad student then and I would go up to Northwestern and not even really have any reachability by phone. Now we have phones, and the expectation is pretty much down-to-the-minute: If I’m 11 minutes late, I’ll probably text and say, “I’m 11 minutes late.” There’s just so much expectation for contact and communication and knowing where other people are. We don’t use location surveillance for that, but a lot of families do, and a lot of people have watches and will check into each other’s location on watches.

Because it’s there, people do it. And then there’s also just tremendous worry right now about kids. Given that we as a society think it’s a good idea for everyone to have assault weapons, parents are a little nervous. That anxiety creeps into everything.

My older daughter is 31, and I remember getting her first cell phone when she was 12 or 13. I remember the intense peer pressure she felt to have a phone. And I really didn’t like it at all. But I kind of justified it by saying to myself, “This is going to keep her safe.” And I remember thinking to myself, “You’re so full of shit. You’re just really trying to smooth things over.” And I guess I wonder: As parents, do we have an overextended sense of peril about our kids these days?

Heitner: There’s a sense of peril. Also, the Internet and online news and targeted algorithms just fuel that worry and outrage. It’s a bit of a vicious cycle.

Engelbrecht: In some ways, it’s almost like there are more risks that could stick with you. There wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.

I think about my daughter and I don’t want something to chase her for her entire life. That part of it feels very real. And then it feels out of control. I don’t have the tools or know exactly how I can best help her except for having hard conversations and trying to put some bumpers around her. But there’s not a lot of tools to put the bumpers around her.

Devorah, one of the things you have said is that the kind of surveillance a lot of parents are undertaking is really undermining the trust their kids feel, and backfiring because kids won’t open up to them when they really need to. Can you talk a little bit more about that?

Heitner: You just see kids really getting focused on going deeper underground. If their parents are like, “I’m going to get Bark and read every single thing they text,” then you see some kids who are like, “O.K., I need to go deeper underground, I need a VPN or to only text on Snapchat, or I need to do something where I can be more evasive.” And that concerns me, because then there’s no way to make use of the parent when the parent might be useful.

Engelbrecht: I think about how to create space to allow the kid to have a second chance at telling me the truth. For example, if there’s an empty bag of gummies and the kid is the only one who could have eaten it but says they didn’t, how can I create space to talk about making mistakes versus lying or intentionally hiding the truth? Saying, “I’m going to ask what happened to the gummies again, but first I want you to take a moment to think about your answer — it’s OK to change your answer, because I want to understand the truth. We all make mistakes and we can talk about it. But intentionally hiding the truth has consequences.”

If I later find out that the child lied, then there’s consequences. The hope is that eventually, a parent can say, “If you end up at a party where there’s alcohol, don’t drive home. Call me for a ride home. If you try to hide that there was alcohol and make poor decisions, then there’s additional consequences.”


It’s important to be able to say, “I made a mistake” and talk about what to do from there. Hopefully, that provides an alternative to the arms race of increasingly sneaky strategies that Devorah described.

Heitner: That makes a lot of sense. I was just going to say: The surveillance — schools just push it really hard. Every time I go to a school, they’re like, “Are you logged into Canvas?” or “Are you logged into PowerSchool?” They’re just really pushing it so hard.

Are schools culpable in this? Sounds like you’d say, “Yes.” I don’t know if you’d call it surveillance, though. One of the functions of schools is to keep track of things, right?

Heitner: But what about the location tracking? My kid has to scan a QR code to get into the cafeteria. I skipped lunch every day of high school and ate with my drama club friends in the theater. Was that so bad? They have 3,500 kids QR-coding themselves into study hall. It’s pretty locked down. It’s pretty Big Brother, or Little Brother if you read Cory Doctorow. 

Engelbrecht: Homework tracking means having full visibility of my daughter when part of what she needs to learn is the executive function skills to actually be able to plan and follow through and do her homework. I don’t want to be in the place where I’m policing her homework. Now that she’s in seventh grade, it’s time for her to be learning those skills before there’s the consequences of missing your homework in high school or college.

So to me, it’s kind of that same thing: The information is there. Should it be provided? How do you use it? And, for me it’s: How do we better equip administrators, teachers or parents to stop and think about how to leverage this information? So maybe a kid who’s consistently missing their homework, yes, the parents should have more visibility as part of a support program to get the kid back on track and help them learn the skills. But to Devorah’s point, it doesn’t mean everyone needs to be badging into lunch.

Devorah, your message to parents is: There are all these things happening. There are all these things you have to keep track of. There are lots and lots of risks to kids being on social media, especially teenagers. But you shouldn’t panic. And I wanted to just throw this out to both of you: Instead of panicking, what should parents do? 

Heitner: Carla, you’re talking about creating a new community space for kids that’s more of a learning space, and that’s one alternative. Another alternative, in addition to, or potentially instead of, for parents who don’t have access to that, is just leaning into one or two spaces they really want to mentor their kids in.

Maybe their kid’s really involved in Minecraft. And if they want to join Discord [a free voice, chat, gaming and communications app], the parents are waiting and saying, “O.K. You can join your library Discord with Minecraft or your school Minecraft club on Discord, but not general Discord.”

Two 9-year-olds play the open world computer game Minecraft. Parenting expert Devorah Heitner urges parents to know more about what their kids are doing online without resorting to surveillance. (Getty Images)

Parents will tell me their kids are playing Roblox or they’re on YouTube. But I’m like, “What channels?” It’s just like if somebody says, “I’m watching TV.” Well, what are you watching? Because that really is a big differentiator in terms of the experience.

Engelbrecht: It goes back to your “Fear of Messing Up.” I think so much about how it’s important for parents to wade in and get involved with their kids. This has been the advice for decades, whatever the newfangled thing was. I was just doing some writing about encouraging parents to actually do dance challenges with their kids. It’s an opportunity to bond. It actually requires some planning and practice. It’s physical activity. I assume most parents are like me, that they’re not a great dancer and it’s uncomfortable and you don’t want to mess up.

But modeling that I’ll do something that’s out of my comfort zone and connect with you over something that I know you enjoy can be very simple. It doesn’t mean a parent has to suddenly learn all aspects of Roblox or Discord, because they can be intimidating. But just find an entry point and connect with the child and participate with them. It just has so many benefits. It’s true whether they’re into Tonka trucks or Roblox. Parenting means, “Get in there with your kid.”

Devorah, you use the phrase, “Lead with curiosity.”

Engelbrecht: Oh, I love that.

Heitner: You want to be curious and have your kid share it with you: their expertise and experience, as well as their discernment — what do they like or not like about this app? How would they change it if they could? Staying curious is an alternative to spying — being curious and asking kids to be curious even about their own experience. Do I actually feel less stressed when I scroll this app? That’s maybe a lot of mindfulness to expect of kids, who have a lot going on and a lot coming at them. But it’s important for all of us to be curious about how our experience is going.

Engelbrecht: That’s one of the ways I’ve been thinking about it from a product perspective: just how to help build in some scaffolds for mindfulness — things like when you start an app, actually having a timer that’s like, “How long do you want to spend on it right now?”

I set a timer for myself when I use TikTok because I spend a very long time on it. So being able to put that in there as a scaffold, to start being mindful and thoughtful about it. We’re posting content, but we’re actually not posting endless scrolls where you could spend all day.

I don’t want to prioritize the traditional tech metric of “time on task.” To me, success is like, “You can come and use Betweened for 20 minutes and then know you can come back another day and there’s lots of interesting stuff for you.” But it’s not all-consuming, must-do-this-all-the-time. And that’s a different perspective on tech products. It’s not how most products are developed.
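Engelbrecht’s timer idea is simple to prototype. The sketch below is a hypothetical illustration (the class name, the 20-minute budget and the stand-in clip list are invented for the example; this is not Betweened’s actual code), but it shows the basic shape of the scaffold she describes: ask for a time budget when the app opens, then gate the feed on it rather than serving an endless scroll.

```python
import time

class SessionTimer:
    """Mindfulness scaffold: the user picks a time budget when the app
    opens, and the feed checks it before serving each new item."""

    def __init__(self, budget_minutes: int):
        self.budget_minutes = budget_minutes
        self.deadline = time.monotonic() + budget_minutes * 60

    def expired(self) -> bool:
        return time.monotonic() >= self.deadline


# "How long do you want to spend on it right now?"
timer = SessionTimer(budget_minutes=20)

for clip in ["clip-1", "clip-2", "clip-3"]:  # stand-in for a finite feed
    if timer.expired():
        print(f"That's the {timer.budget_minutes} minutes you asked for. "
              "Come back tomorrow!")
        break
    print(f"Playing {clip}")
```

The design choice worth noticing is that the feed here is a bounded list rather than an infinite generator, which matches the stated goal: a satisfying 20-minute visit instead of maximized time on task.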

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns
https://www.the74million.org/article/gaggle-drops-lgbtq-keywords-from-student-surveillance-tool-following-bias-concerns/
Fri, 27 Jan 2023

Digital monitoring company Gaggle says it will no longer flag students who use words like “gay” and “lesbian” in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination against LGBTQ teens in a quest to keep them safe.

A spokesperson for the company, which describes itself as supporting student safety and well-being, cited a societal shift toward greater acceptance of LGBTQ youth — rather than criticism of its product — as the impetus for the change as part of a “continuous evaluation and updating process.”

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they’re significantly more likely to consider suicide than their straight and cisgender classmates. 




But in practice, Gaggle’s critics argued, the keywords put LGBTQ students at a heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of digital activity monitoring, according to a national survey released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark.

Gaggle’s decision to remove several LGBTQ-specific keywords, including “queer” and “bisexual,” from its dictionary of words that trigger alerts was first reported in a recent VICE News documentary. It follows extensive reporting by The 74 into the company’s business practices and sometimes negative effects on students who are caught in its surveillance dragnet. 

Though Gaggle’s software is generally limited to monitoring school-issued accounts, including those by Google and Microsoft, the company recently acknowledged it can scan through photos on students’ personal cell phones if they plug them into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states push bills targeting LGBTQ youth. Legislation has looked to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. Such a hostile political climate and pandemic-era disruptions, a recent youth survey by The Trevor Project revealed, have contributed to an uptick in LGBTQ youth who have seriously considered suicide.

The U.S. Education Department received 453 discrimination complaints involving students’ sexual orientation or gender identity last year, according to data provided to The 74 by its civil rights office. That’s a significant increase from previous years, including in 2021 when federal officials received 249 such complaints. The Trump administration took a less aggressive tack on civil rights enforcement and complaints dwindled. In 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a record spike in civil rights complaints overall, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year.

In September, The 74 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle’s website as a collaboration to “improve mental health outcomes for LGBTQ young people.” 

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle “having a role in negatively impacting LGBTQ students.” 

The Trevor Project didn’t respond to requests for comment on Gaggle’s decision to pull certain LGBTQ-specific keywords from its systems. 

In a statement to The 74, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students’ digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company’s efforts to protect students from abuse and were purged late last year.

“At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations,” Hetherington said. “Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list.”

Hetherington said Gaggle will continue to monitor students’ use of the words “faggot,” “lesbo,” and others that are “commonly used as slurs.” A previous review by The 74 found that Gaggle regularly flagged students for harmless speech, like profanity in fiction submitted to a school’s literary magazine and writing in students’ private journals.
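To see why a keyword dictionary over-flags, consider a minimal sketch of the match-then-review step such tools describe. Everything in it is invented for illustration (the trigger list, the function name, the sample message); it is not Gaggle’s dictionary or code. The point is structural: a bare keyword match carries no context, so a line of fiction looks identical to a genuine threat.

```python
import re

# Illustrative trigger list only; not Gaggle's actual dictionary.
TRIGGER_WORDS = {"kill", "hurt myself", "gun"}

def flag_for_review(message: str) -> list[str]:
    """Return the trigger words found in a message, destined for a human
    review queue. Matching is context-blind: fiction, journaling and
    genuine threats all look the same to it."""
    text = message.lower()
    return [word for word in TRIGGER_WORDS
            if re.search(r"\b" + re.escape(word) + r"\b", text)]

# A quoted line from a short story still matches, which is how harmless
# speech ends up in front of a human reviewer.
print(flag_for_review('The villain snarled, "I could kill for a coffee."'))
# prints ['kill']
```

Adding or removing entries in the trigger list is the kind of change Gaggle describes making; the limitation the critics point to (no sense of who is speaking, or why) stays the same regardless of which words are on the list.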

Anti-LGBTQ activists have used surveillance to target their opponents for generations, and privacy advocates warn that in the era of “Don’t Say Gay” laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn’t been independently verified and there’s a dearth of research to suggest digital monitoring is an effective school-safety tool. A recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. The VICE News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled “Essay on the Reasons Why I Want to Kill Myself but Can’t/Didn’t.” Adults wouldn’t have known she was struggling without Gaggle, she said.

“I do think that it’s helpful in some ways,” the student said, “but I also kind of think that it’s — I wouldn’t say an invasion of privacy — but if obviously something gets flagged and a person who it wasn’t intended for reads through that, I think that’s kind of uncomfortable.” 

Student surveillance critic Evan Greer, director of the nonprofit digital rights group Fight for the Future, said the tweaks to Gaggle’s keyword dictionary are unlikely to have a significant effect on LGBTQ teens and blasted the company’s stated justification for the move as being “out of touch” with the state of anti-LGBTQ harassment in schools. Meanwhile, Greer said that LGBTQ youth frequently refer to each other using “reclaimed slurs,” reappropriating words that are generally considered derogatory and remain in Gaggle’s dictionary. 

“This is just like lipstick on a pig — no offense to pigs — but I don’t see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators,” Greer said. “I don’t see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances.”

Gaggle and its competitors — including GoGuardian, Bark and Securly — have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth. 


In a letter to the lawmakers, Gaggle founder and CEO Jeff Patterson said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has “no context or background on students,” including their race or sexual orientation. He also said the company’s monitoring services are not meant to be used as a disciplinary tool.

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students. Black and Hispanic students reported being far more likely than white students to get into trouble because of online monitoring. 

In October, the White House cautioned school districts against the “continuous surveillance” of students if monitoring tools are likely to trample students’ rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.


As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

“There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers,” Greer said. “If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago — and that’s why this surveillance is so dangerous.”

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project’s toll-free support line at 866-488-7386.

DHS Sec. Mayorkas: Relationships, Not Tech, Central to Creating Safe Schools
https://www.the74million.org/article/dhs-sec-mayorkas-relationships-not-tech-central-to-creating-safe-schools/
Fri, 11 Nov 2022

Homeland Security Secretary Alejandro Mayorkas leads an agency — born in the aftermath of the Sept. 11, 2001, terrorist attacks — perhaps best known for mass surveillance and rigid airport security checkpoints. But to Mayorkas, the key to keeping students safe at school rests with strong relationships.

Time and again, gunmen have displayed a range of warning signs before opening fire in schools, including fascinations with violence and a history of trauma. As cryptic — and at times explicit — social media posts emerge post-attacks, conversations often center on missed opportunities to intervene. It takes a vigilant community, Mayorkas said, to break the cycle. 

“We’re seeing individuals potentially with mental health problems, grievances, and they have manifested their challenges outwardly, they have spoken about violence,” he told The 74. “What we’ve seen is expressions of an interest in violence and an expression of a planning or plotting to conduct an attack. And we need to educate people on identifying those signs, those expressions and also what to do about it to seek help for those individuals.”

Amid a surge in mass school shootings, districts nationwide have pumped more than $3 billion into school security. Campus police have become commonplace, active-shooter drills have grown routine and, for students across the U.S., digital surveillance has been normalized. The Department of Homeland Security has endorsed “threat assessment,” a process where educators, mental health professionals and the police analyze a student’s behaviors and statements to determine if they, as Mayorkas put it, are “descending down a path towards violence.”

The environment has created a balancing act for school leaders who are charged with keeping schools safe while protecting students’ civil liberties. 

The department recently invited The 74 to interview Mayorkas about this complicated landscape ahead of its first-ever National Summit on K-12 School Safety and Security. Mayorkas fielded questions about the sharp uptick in mass school shootings, the botched police response in Uvalde, Texas, and a massive ransomware attack that targeted the Los Angeles Unified School District.

The conversation has been edited for length and clarity.

We’re seeing an uptick in active mass shootings, including those that are targeting schools. What are some of the trends that you’re seeing within these campus attacks and what are some of the key strategies that your agency and other federal agencies are using to combat this increase in violence? 

So, Mark, tragically 2022 saw the greatest number of school shootings in our nation’s history. I think it was just over 250. And we have a multifaceted approach to it, of course, to educate and empower schools to understand how they can be safe environments. 

Every child, every person in this country and frankly around the world, deserves a safe, secure, supportive environment in which to be educated. And so we have our Cybersecurity and Infrastructure Security Agency, CISA as it is known, that has a website schoolsafety.gov that is dedicated to this critical mission set. 




We have the United States Secret Service’s National Threat Assessment Center, the NTAC, that provides resources to schools about how they can maintain a safe environment. We have critical grant programs that fund innovative efforts to really build resilience, and to help prevention models, as well as our Center for Prevention Programs and Partnerships, CP3, which is developing a one-stop shop that identifies federal resources for schools to access.

We have a lot of different efforts underway throughout our department and throughout the administration.

Absolutely. So let’s jump into the threat assessment one. You mentioned the Secret Service. They’ve done this study basically finding that mass school shooters almost always have observable traits before the attack. And it’s basically a “See Something, Say Something” kind of mantra. Can you talk a little bit about threat assessments, and identifying people who might present a serious risk, but doing so in a way that doesn’t trample on people’s civil rights?

That’s right. So it’s very important, Mark, the last part of your question. We have a statutorily created Office for Civil Rights and Civil Liberties and a statutorily created Office of Privacy. It’s very important that we keep those fundamental rights well protected and do not in any way infringe upon them. 

Indeed, if we take a look at recent events, the assailants in Uvalde, Texas, in Buffalo … and Highland Park, these individuals exhibited signs that were observable to individuals around them. And the key is to empower people to educate people about how to identify those characteristics when somebody’s descending down a path that has a connectivity to violence, and really intervene. And to intervene not in a way that delivers accountability, but rather assistance, support. 

We’re seeing individuals potentially with mental health problems, grievances, and they have manifested their challenges outwardly, they have spoken about violence. More generally what we’ve seen is expressions of an interest in violence and an expression of a planning or plotting to conduct an attack. And we need to educate people on identifying those signs, those expressions and also what to do about it to seek help for those individuals.

You mentioned Uvalde and I’m really curious on your thoughts about the law enforcement response to the tragedy. More than 350 officers from local, state and federal agencies descended on the school. And ultimately, officers under your watch were the ones who were able to stop the gunman. But I’m curious about the delay. It took more than an hour for law enforcement officers to ultimately confront the gunman. I’m curious if you have any insight into the factors that led to that delay, and what lessons educators, law enforcement officials and anybody in the security space can take from that police response?

I think there are going to be a lot of lessons learned from the response in Uvalde. That response has been the subject of a number of investigations and some of those investigations are, in fact, ongoing. So I think I’m going to refrain from commenting upon the reported delays in the response. 

That was an unspeakable tragedy and I think there are different responses in different situations. There is a great body of training, including active shooter training, on how law enforcement should respond. I think the critical part is to take a look at every incident — unfortunately they occur all too often — and to learn from them to refine those best practices, to make sure that we’re disseminating those best practices throughout the law enforcement community. And not just the law enforcement community, but the health care community and the like.

One of the things that we’ve focused on in this administration is an all-of-government and all-of-community response to this threat. So we are engaged with the Department of Education, we’re engaged with the Department of Health and Human Services, we’re looking at local community groups, parent associations, school systems, local health, mental health networks and providers. This really requires an all-of-community response to the fact that individuals are expressing their infirmities, their challenges, through acts of violence and through acts of violence targeted at children.

One of the interesting things about the response to the shooting has been a lot of concern about law enforcement officers in schools. The federal government has put a lot of money over the last several decades into putting police officers in schools. I’m curious what your response is to advocates who’ve been calling for police-free schools? 

This is a very difficult issue and it’s an issue that we do encounter not only in the school system but also in other contexts as well. This is a conversation I’ve had with faith leaders about how to make places of congregation, of learning, of worship, welcoming, open and the like, and also safe and secure, to not be foreboding. 

I don’t think it’s a one-size-fits-all. I think we have to take a look at the safety imperative. I myself am not opposed to having security guards in schools. But how they are deployed, how they are integrated into the fabric of the school community, I think is vitally important.

We’re going to talk a little bit about cybersecurity and dark corners of the internet. In Uvalde the school district used a company called Social Sentinel, basically to monitor social media and try to identify potentially threatening social media posts. School districts across the country use a large range of different surveillance tools to basically monitor how kids interact on the internet and to try to identify violence before it happens. But the White House recently came out with what they’re calling a Bill of Rights for AI, and it basically says to schools, ‘limit the continuous surveillance of students if it has a potential to infringe on their civil rights.’ I’m curious on your thoughts on this idea of monitoring students’ behaviors on social media and other internet platforms to identify threats of violence?

The key is to create with one’s children an open line of communication so that one can learn what type of online activity one’s child is engaging in. So an open, communicative environment is absolutely critical, as is digital literacy so children can understand what is credible and what is not credible. 

We can employ privacy settings — parents, not the government — the parents can employ privacy settings and understand what their children are doing and communicate about it. It’s really important that children who are online are educated with respect to their own behavior and the behavior of others. I think that is what is key, that open, communicative environment, an environment of digital literacy and an environment where if children see something, they understand what it is they are seeing and know how to respond to it. And also, for parents, friends, relatives, school teachers and the like to pick up on the signs when a child is descending down a path towards violence.

If we’re talking to parents here for a second, what do you think are some of the most critical signs that folks should be looking out for?

It gets very difficult and I would really defer to mental health professionals and the like but let me give you a few examples. If we are dealing with an individual who expresses an intent to commit violence, who expresses a fascination with violence and begins to withdraw from societal communications with friends and the like, I think it is time to communicate, to ask questions, to engage with that child to learn more.

Many communities in the last few months haven’t even experienced shootings — but have been told that they have. A bunch of schools across the country in the last few months have been subjected to swatting calls.

Swatting is a very dangerous phenomenon that we’re seeing an increase of: a prank call that prompts emergency personnel to deploy when, in fact, they’re not needed. That’s a criminal activity and it really puts innocent people at risk.

I’m curious what you can tell us about the surge right now. It appears that many of these are connected. Can you give us any insight into what’s going on and why schools are suddenly experiencing a surge in these kinds of calls?

One of the things that’s of concern when it comes to swatting, and it’s also applicable to malicious cyber activity, is the ease of replication. That if a swatting incident occurs in one geography, others may be motivated, unfortunately, to do the very same thing in a very different venue. We seek to prevent it. We work with state, local, tribal and territorial partners, campus law enforcement, to educate students, to educate people about the danger of swatting. It’s not an innocent prank call. It’s the deployment of precious law enforcement resources and could have unintended consequences. Education and prevention are key here.

Speaking of cybersecurity, the Los Angeles school district, America’s second-largest school district, was just the victim of a ransomware attack. They ultimately did not pay the ransom and as a result had some of their data posted on the dark web. I’m curious what you can tell us about the threat actors who were holding LAUSD ransom and in general the threat actors who are targeting schools?

We’ve seen a tremendous rise in ransomware over the last several years by criminal actors. They target not only schools, they target hospitals, law enforcement organizations, businesses, the range of victims is quite wide. We caution, we recommend that victim entities not pay the ransom. We are very well aware of the precarious situation in which they find themselves when they’re held hostage to a ransomware actor. But we have only increased our defenses, really only enhanced our defenses, and also strengthened law enforcement’s response to it.

Now, if I’m a school leader and I’m the victim of a ransomware attack or some sort of cyber threat, what kind of assistance can I receive from the federal government? What role do you play in helping school districts respond to this?

Our Cybersecurity and Infrastructure Security Agency, CISA, is very well equipped to assist a ransomware victim as is the Federal Bureau of Investigation, the United States Secret Service. We have a whole suite of capable agencies that can assist in identifying the intrusion, assisting in expelling the intruder, helping in patching the vulnerability that the intruder exploited and, of course, holding hopefully the intruder accountable. And the FBI has done an extraordinary job in investigating and identifying bad actors.

A recent Pew poll found that about a third of parents are very or extremely worried about a school shooting occurring at their child’s school. You’re a parent. I’m curious, have you had these similar concerns from a parental perspective? And to what degree do you think that parents should be concerned about a shooting unfolding at their school?

It’s a tragic state of affairs when parents are concerned about sending their children to school because of a potential attack that impairs the safety and security of their children. 

It is important for schools to train their personnel and their students on how to respond in the case of an active shooter. When I was a child in Los Angeles, California, where I spent much of my youth, we were trained on responding to fires, to earthquakes, even to a bomb. A school shooting was not in the panoply of threats to which we were trained to respond. Now, tragically, it is, and schools need to train and parents need to communicate in an informed way with their children — not in a way to create hysteria — but in a way to create vigilance and alertness.

Online platforms like forums have been used over the last several years to radicalize young people, whether that be to become mass school shooters, or to go down a path of white nationalism. I’m curious if you can elaborate a little bit on the landscape of these online forums and ways that we can combat that without stepping on the First Amendment? 

So the threats, the diversity of the threats is much broader than what you identified, of course. And this is where I spoke earlier about the need to communicate with children, with youth, who are impressionable, to be able to create a safe environment where they feel comfortable communicating about what they’re seeing. For parents to be vigilant in terms of privacy settings, to really develop digital literacy amongst our youth so that they can understand what is credible, what is not credible, what is threatening, and what is innocent.

We really have to do that, and we’re working in partnership with industry, with the private sector, with think tanks about how to best build that digital literacy. This also requires an all-of-community response, it is not for the government exclusively to engage in this. 

We are working with online gaming companies to really build a safe environment to really instruct children about the perils of the online environment, to really guard against cyberbullying as well as extremism that seeks to draw people to violence.

What is it about the gaming community? It’s interesting that you’re specifically reaching out to people in that space. Why? 

Well, we’re reaching out much more broadly. We engage with social media companies, we engage with thought leaders that are important voices. The gaming community reaches so many children, they’re a critical partner in developing a safe and secure ecosystem so people can understand the benefits of, as well as the perils of, the online environment. 

Our increased connectivity is a tremendous tool for achieving prosperity. It also brings risks with it.

Thank you so much for taking the time to field these questions and talk about this really important topic. Is there anything else that I haven’t asked, that you think is important? 

I want to return to a point of sadness and a point of vigilance. The point of sadness is, of course, we’re speaking about school safety and the fact that it is such a phenomenon right now. 

On the other hand, the community — and the federal government is a member of that community, but the community is much broader — is very, very alert to this phenomenon, and very vigilant in addressing it in a really productive and constructive way.

Can Educators and Police Predict the Next School Shooter?
https://www.the74million.org/article/can-educators-and-police-predict-the-next-school-shooter/
Wed, 02 Nov 2022

Every school shooting can be stopped — but educators and police must identify youth with an affinity for violence and spring into action before a single shot is fired.

That’s the message that federal law enforcement officials touted Tuesday during a first-ever National Summit on K-12 School Safety and Security hosted by the Cybersecurity and Infrastructure Security Agency, a division of the Department of Homeland Security. While a demographic profile of school shooters doesn’t exist, according to Secret Service research, soon-to-be gunmen exhibit signs that can be identified prior to attacks — such as a fixation on violence or a history of depression. 

Officials endorsed “threat assessment,” an approach pioneered by the Secret Service that’s become a common but controversial strategy in schools to predict future perpetrators and prevent targeted campus violence. The Secret Service is part of Homeland Security.

“We’ve seen the tragedies that have happened when that information, on behavior that objectively elicits concern, was not acted on,” said Lina Alathari, chief of the Secret Service National Threat Assessment Center. “But we also need to make sure we’re setting a lower threshold for what we want to intervene with — such as being bullied, depression, suicidality — because we’ve also seen those in the background of these students that resorted to violence.”


Following the mass school shooting in May at a Uvalde, Texas, elementary school, districts statewide are set to receive nearly $8 million in federal funding for campus security, including for the creation of threat assessment teams. 

Yet the deployment of such teams, which generally include school administrators, mental health officials and police officers, has civil rights groups on edge. Critics warn the approach could misidentify struggling students as future gunmen and unnecessarily push them into the juvenile justice system. While school shootings remain statistically rare, student behaviors that are factors in threat assessments — like alcohol use and a history of mental health issues — are exceedingly common.

Such concerns were largely downplayed at this week’s summit, a three-day virtual event where law enforcement officials, educators and other experts gathered to offer recommendations in responding to a range of campus security risks, including mass school shootings, cyber attacks and online extremism. 

Steven Driscoll, the threat assessment center’s assistant chief, stressed that the approach is not “based on profiles or identifying types of students” but rather a focus on identifying threatening behaviors and intervening early. 

“Schools need training not only on the behavioral threat assessment process best practices but also on things like implicit biases which have historically permeated a variety of school-based programs,” Driscoll said. 

In a letter to the Education Department last year, a coalition of 50 student civil rights groups warned that the adoption of threat assessment in schools is “likely pushing many children of color and children with disabilities out of school, into the school-to-prison pipeline.” 

“These ‘threat assessments’ are likely to target large numbers of children who aren’t actual threats — including disproportionate numbers of children of color and children with disabilities — and cause them significant and lasting harm, while doing little or nothing to increase safety in schools,” according to the letter, which was signed by groups including the National Center for Youth Law, the National Disability Rights Network and the Council of Parent Attorneys and Advocates. “In addition, they may refer children to services that do not exist.”

Last year, a Secret Service report analyzed 67 school violence plots that were thwarted between 2006 and 2018, finding that plotters in each case were met with criminal charges or arrests. Yet the “primary objective” of threat assessments is not to administer discipline, the report notes, but to “identify students in crisis or distress and provide robust interventions, before their behavior escalates to the point of criminality.” 

Amy Lowder, the director of student safety and well-being at a suburban Charlotte, North Carolina, school district, acknowledged during the summit that threat assessments conducted improperly can have detrimental effects on youth, including unnecessary expulsions and juvenile justice referrals. That’s why it’s important, she said, for threat assessment teams to take “a whole-child approach in gathering the necessary information” about students causing concern. 

Meanwhile, Greg Johnson, a high school principal from West Liberty, Ohio, said that school leaders must balance students’ civil rights against the need to keep campuses secure. Johnson was principal of West Liberty High School in 2017 when a student shot and injured a classmate. 

“You’ve got that balance because you want to support student rights and individual rights but you also want to keep people safe and that’s a huge responsibility,” Johnson said. “That’s a huge responsibility to keep your students safe.” 

In an interview with The 74 that opened the summit, Homeland Security Secretary Alejandro Mayorkas noted that there have been more than 250 reports of gunfire at schools in 2022, more than any other year on record. 

Given the reality that school shooters often leak their plans to friends or online, summit panelists also endorsed a need to monitor students on the internet — a practice that has raised a separate set of civil rights and digital privacy concerns. That’s why it’s important for districts to employ experts in digital analysis, said Colton Easton, the project and training manager at Safer Schools Together, a Canada-based, for-profit company that offers threat-assessment training and a team of threat analysts to assist districts in investigations. 

“Maybe a student made a threat involving a gun and we see that gun posted on TikTok, we would consider that behaviors consistent with the threat and law enforcement could obtain a search warrant and remove access to the means,” Easton said. “Today, digital leakage is that golden ticket for school safety and threat assessment teams.”

The School (in)Security Newsletter Spotlights Student Safety and Civil Rights https://www.the74million.org/article/the-school-insecurity-newsletter-coming-soon-to-an-inbox-near-you/ Thu, 22 Sep 2022 16:45:00 +0000 https://www.the74million.org/?post_type=article&p=696812 Following the tragic shooting in Uvalde, Texas, this spring, the back-to-school season has been particularly fraught for parents, whose fears for their children’s safety on campus have surged to their highest point in more than two decades.

That’s why we’ve launched the School (in)Security newsletter to highlight how educators and lawmakers are responding to heightened parental anxieties in starkly divergent ways: from new mental health check-ins for students to militarized campus police with collapsible rifles strapped to their chests.

Edited by Investigative Reporter Mark Keierleber, the newsletter is a twice-monthly hub for the most critical news and information about the rights, safety and well-being of students in K-12 schools nationally. 

Keeping kids safe at school while safeguarding their individual rights is complex terrain. Keierleber, who has covered the school security industry, online student surveillance and student civil rights issues for years, will elevate the best journalism, research and advocacy devoted to exploring the tension. From the surge in book bans to new anti-LGBTQ rules, this is a moment when student thought and expression are often on a collision course with adult interests in their schools and communities. We want to be at the center of that conversation. 

Sign up for the School (in)Security newsletter.

Get the most critical news and information about students' rights, safety and well-being delivered straight to your inbox.

By providing your email address, you consent to receiving newsletters from The 74. We will not sell your information to third parties. Read our Privacy Policy.

Sign up here to subscribe. Have a tip? Click here to send an email to Mark.

With ‘Don’t Say Gay’ Laws & Abortion Bans, Student Surveillance Raises New Risks https://www.the74million.org/article/with-dont-say-gay-laws-abortion-bans-student-surveillance-raises-new-risks/ Thu, 08 Sep 2022 10:30:00 +0000 https://www.the74million.org/?post_type=article&p=696150 While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn’t explain who they were as a person. 

“It was very confusing trying to navigate understanding who I am and my identity,” said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. “I was able to find the words to understand who I am — words that I wouldn’t be able to piece together in a sentence if the internet wasn’t there.” 

But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details gleaned from students’ conversations with friends, diary entries and search histories. Meanwhile, student information collected by surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities concern Elizabeth Laird, the center’s director of equity in civic technology. Following the Supreme Court’s reversal of Roe v. Wade in June, she said, information about youth sexuality could be weaponized. 

“Right now — without doing anything — schools may be getting alerts about students who are searching the internet for resources related to reproductive health,” Laird said. “If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws.”

Teens across the country are already organizing and disseminating information to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the reversal of Roe has led some to take extra privacy precautions. 

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted instant-messaging app. “The fear, even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled, it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for breaking abortion laws, including a recent case in Nebraska where police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy. 

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills and about a dozen have become law. They include so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender. 

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to research by the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance companies should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told The 74. 

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, federal law has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering software has been programmed to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling.  

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 
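
The scanning these companies describe reduces, at its simplest, to matching text against a watchlist of terms and routing any hits to an alert queue. The Python sketch below is a minimal illustration of that idea; the watchlist, names and alert routing are assumptions for illustration, not any vendor’s actual implementation.

    import re

    WATCH_TERMS = {"suicide", "gun", "abortion"}  # hypothetical watchlist

    def scan_document(text: str) -> set[str]:
        """Return any watchlist terms found in a document's text."""
        words = set(re.findall(r"[a-z']+", text.lower()))
        return words & WATCH_TERMS

    def route_alert(doc_id: str, hits: set[str]) -> None:
        """Forward non-empty matches for review, as deployed tools do."""
        if hits:
            print(f"ALERT {doc_id}: matched {sorted(hits)}")

    route_alert("journal-entry", scan_document("I searched where to get an abortion"))

Even this toy version shows why diary entries and classroom files get swept up: the matcher sees words, not intent.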

The tools have also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that their widespread adoption in schools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In a follow-up letter in July, Warren and Markey cautioned that the tools could pose new risks following the reversal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued a data broker and accused the company of selling the location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the government wrote in a complaint, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.” 

School surveillance companies have acknowledged their tools track student references to sex but have sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began immediately purging all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s reversal was imminent – despite maintaining a 30-day retention period for most other data. 

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a response letter, which the company shared with The 74. 

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year, a Vice News investigation found its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote in a letter to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.” 

Yet tracking conversations about sex is a primary part of Gaggle’s business — more than references to suicide, violence or drug use, according to nearly 1,300 incident reports generated by the company for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by The 74, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape,” and, simply, “sex.” 

Patterson, the Gaggle CEO, has acknowledged that a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told The 74 his company uncovered the girl’s diary entry, where she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what they’d learned from her diary, Patterson said. 

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools. 

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to a recent report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a Mississippi woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. By working in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which will determine if the material “should be forwarded to the school or district.” 

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.” 
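
What Gaggle and Hetherington describe is, in effect, a two-stage pipeline: an automated filter flags activity, then a human team decides what reaches the district. A hedged sketch of that flow, with invented stage names and an invented severity rule:

    def web_filter(searches: list[str], watchlist: set[str]) -> list[str]:
        """Stage 1: the automated filter flags any watched search term."""
        return [s for s in searches if s in watchlist]

    def safety_team_review(flags: list[str], critical: set[str]) -> list[str]:
        """Stage 2: human reviewers forward only items judged critical,
        dropping false positives before the district sees them."""
        return [f for f in flags if f in critical]

    flags = web_filter(["chemistry homework", "buy a gun"], {"buy a gun"})
    print(safety_team_review(flags, critical={"buy a gun"}))  # reaches the district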

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots after New York City police raided the Stonewall Inn gay bar, officers routinely raided LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.” 

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott issued a February directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.” 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self. 

Her teacher said that no one, not even he, would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning,” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.” 

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man has been indicted and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.” 

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

Survey Reveals Extent that Cops Surveil Students Online — in School and at Home https://www.the74million.org/article/survey-reveals-extent-that-cops-surveil-students-online-in-school-and-at-home/ Wed, 03 Aug 2022 04:01:00 +0000 https://www.the74million.org/?post_type=article&p=694119 When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have tracked students’ online lives with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.

Such partnerships between schools and police appear startlingly widespread across the country with significant implications for youth, according to the results of a national survey released Wednesday by the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real-time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent reversal of Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, civil rights groups have warned that data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to an October 2021 investigative report by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. On multiple occasions, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In a letter to Warren and Markey, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In blog posts on the company’s website, school officials in Wichita Falls, Texas, Cincinnati, Ohio, and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company wrote in its letter to lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but they cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” the company acknowledged in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter on Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime. 

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a practice that company founder and CEO Jeff Patterson has said is meant to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said that if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination. 

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination. 

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states. 

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” 

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said. 

Last week, the Senate approved two bipartisan bills designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after a whistleblower disclosed research showing that the social media app Instagram had a harmful effect on youth mental well-being, especially teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the group argued in a report. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.” 

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards. 

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

FTC Targets Ed Tech Companies that ‘Illegally Surveil Children’ https://www.the74million.org/article/ftc-announces-plan-to-target-ed-tech-tools-that-illegally-surveil-children/ Fri, 20 May 2022 21:53:00 +0000 https://www.the74million.org/?post_type=article&p=589724 The Federal Trade Commission announced ramped-up enforcement against education technology companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn,” in violation of federal student privacy rules.

“It is against the law for companies to force parents and schools to surrender their children’s privacy rights in order to do schoolwork online or attend class remotely,” the federal agency said in a media release Thursday. “Under the federal Children’s Online Privacy Protection Act (COPPA), companies cannot deny children access to educational technologies when their parents or school refuse to sign up for commercial surveillance.” 

Through a new policy statement, the commission signaled its intent to “scrutinize compliance” with COPPA, the federal law that limits the data that technology companies can collect on children under 13 without parental consent. The statement, approved through a unanimous bipartisan vote by the five commissioners, reminds education technology companies that they are prohibited from using student data for commercial purposes, including for marketing and advertising, should not retain student data for a period longer than what’s deemed “reasonably necessary,” and must have sufficient security to ensure data remain confidential. Additionally, tech companies must not exclude students who do not disclose more personal information “than is reasonably necessary for the child to participate in that activity.” 
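
In engineering terms, the retention requirement reads like a routine purge job: delete student records once they are older than whatever period is actually necessary. A minimal sketch, assuming a 30-day window and an invented record shape:

    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(days=30)  # hypothetical "reasonably necessary" window

    def purge_expired(records: list[dict]) -> list[dict]:
        """Keep only student records newer than the retention window."""
        cutoff = datetime.now(timezone.utc) - RETENTION
        return [r for r in records if r["created_at"] >= cutoff]

    now = datetime.now(timezone.utc)
    records = [
        {"id": 1, "created_at": now - timedelta(days=3)},   # kept
        {"id": 2, "created_at": now - timedelta(days=90)},  # purged
    ]
    print(purge_expired(records))

Holding data past that point for undefined future uses is the practice the commission says it will scrutinize.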

The policy statement comes at a critical moment for education technology companies. When the pandemic shuttered schools nationally and forced children into remote learning, their place in the education landscape grew exponentially as educators relied more heavily on their services. But they’ve also faced scrutiny for their data collection practices, particularly in the wake of high-profile breaches. School districts in at least four states recently notified students that their personal data was compromised in a breach at the company Illuminate Education. The hack exposed the personal information of some 820,000 current and former students in New York City, the nation’s largest school district.

The FTC statement does not introduce any new rules, yet it makes clear that education technology and student privacy are an enforcement priority. Weak enforcement of student privacy rules has been a longstanding problem, said Cody Venzke, senior counsel at the nonprofit Center for Democracy and Technology.

Suggesting that the federal government had gone too easy on ed tech companies in the past, President Joe Biden criticized student surveillance practices on Thursday and signaled his support for greater student privacy protections. 

“When children and parents access online educational products, they shouldn’t be forced to accept tracking and surveillance to do so,” Biden said in a statement. The FTC, he said, “will be cracking down on companies that persist in exploiting our children to make money.” 

Among the services and applications that saw significant growth during the pandemic are those that monitor students’ online activities on school-issued devices and technology. Company executives say their digital products are critical to identify youth who are at risk of harming themselves or others, but critics argue the surveillance violates students’ privacy rights. 

The 74 has reported extensively on the expanding presence of such student surveillance companies, including Gaggle, which sifts through billions of student communications on school-issued Google and Microsoft accounts each year in search of references to violence and self-harm. Company executives say the tools save lives, but critics argue they could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

In one recent story, former content moderators on the front lines of Gaggle’s student monitoring efforts raised significant questions about the company’s efficacy and its effects on students’ civil rights. The former moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, limited training and frequent exposure to explicit content that left some traumatized. 

In remarks on Thursday, FTC Chair Lina Khan said that “commercial surveillance cannot be a condition of doing schoolwork.” 

“Though widespread tracking, surveillance and expansive use of data across contexts have become increasingly common practices across the broader economy,” Khan said, the policy makes clear that federal law “forbids companies from wholesale extending these practices into the context of schools and learning.” 

The FTC’s comments on surveillance, Venzke said in an email, suggest that the agency will scrutinize the practices of education technology vendors that collect “troves of sensitive information about students’ lives, including student activity monitoring software vendors.” 

“Student activity monitoring companies must ensure they are taking appropriate steps to not only secure the sensitive data they collect on students, but also to ensure that they are collecting only the absolute minimum data that they need to achieve a legitimate educational purpose — and then that they delete the data when it is no longer needed,” Venzke said.

A Gaggle spokesperson didn’t immediately respond to a request for comment. In a blog post on Thursday, the company noted that it takes “data security very seriously,” only uses student information for educational purposes, has a strict data retention policy and has comprehensive security standards. The post said the company does not sell student data or engage in targeted advertising. 

Numerous companies have faced fines in recent years for violating the federal privacy law. In 2019, for example, YouTube paid a record $170 million FTC fine to settle allegations it collected children’s data without parental consent and used it for targeted advertising. TikTok paid a $5.7 million fine that same year to settle similar allegations. 

Despite the commission’s harsh critique of surveillance, the enforcement of student privacy rules will likely go beyond companies that monitor students online, said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. She interpreted the FTC announcement to broadly encompass “surveillance capitalism,” where personal data are collected and sold for profit. However, she noted that Gaggle and other monitoring companies could have particular problems. In its announcement, the FTC said it is unreasonable for education technology companies to retain student data “for speculative future potential purposes.”

“So much of the monitoring information collected and kept, especially when it comes to tracking the mental health of students, it could easily, arguably be speculative,” she said. “That could cause confusion from companies about what obligations they have to either collect certain data or not collect certain data or not retain certain data even when the school has asked for it.” 

The FTC announcement follows a recent investigation into student monitoring companies by Democratic Sens. Elizabeth Warren and Ed Markey, which warned of surveillance companies’ potential harms and called on the Federal Communications Commission to clarify the provisions of another federal law, the Children’s Internet Protection Act, which requires schools to monitor students’ online activities.

In response to the FTC statement, a bipartisan group of senators cautioned that threats to online privacy have reached “a crisis point.” 

“We applaud the FTC’s attention to this urgent problem and its acknowledgment that a child’s education should never come at the expense of their privacy,” said a statement released by Markey, fellow Democratic Sen. Richard Blumenthal and Republican Sens. Bill Cassidy and Cynthia Lummis. “The FTC’s policy statement is an important step in the right direction, but it is not a replacement for legislative action.”

Schools Bought Security Cameras to Fight COVID. Did it Work? https://www.the74million.org/article/from-face-mask-detection-to-temperature-checks-districts-bought-ai-surveillance-cameras-to-fight-covid-why-critics-call-them-smoke-and-mirrors/ Wed, 30 Mar 2022 11:01:00 +0000 https://www.the74million.org/?post_type=article&p=587174 This story is part of a series produced in partnership with The Guardian exploring the increasing role of artificial intelligence and surveillance in our everyday lives during the pandemic, including in schools.

When students in suburban Atlanta returned to school for in-person classes amid the pandemic, they were required to cover their faces with cloth masks, as in many places across the U.S. Yet in this 95,000-student district, officials took mask compliance a step further than most. 

Through a network of security cameras, officials harnessed artificial intelligence to identify students whose masks drooped below their noses. 

“If they say a picture is worth a thousand words, if I send you a piece of video — it’s probably worth a million,” said Paul Hildreth, the district’s emergency operations coordinator. “You really can’t deny, ‘Oh yeah, that’s me, I took my mask off.’”

The school district in Fulton County had installed the surveillance network, by Motorola-owned Avigilon, years before the pandemic shuttered schools nationwide in 2020. Under a constant fear of mass school shootings, districts in recent years have increasingly deployed controversial surveillance networks like cameras with facial recognition and gun detection.

With the pandemic, security vendors switched directions and began marketing their wares as a solution to stop the latest threat. In Fulton County, the district used Avigilon’s “No Face Mask Detection” technology to identify students with their faces exposed. 

During remote learning, the pandemic ushered in a new era of digital student surveillance as schools turned to AI-powered services like remote proctoring and digital tools that sift through billions of students’ emails and classroom assignments in search of threats and mental health warning signs. Back on campus, districts have rolled out tools like badges that track students’ every move.

But one of the most significant developments has been in AI-enabled cameras. Twenty years ago, security cameras were present in 19 percent of schools, according to the National Center for Education Statistics. Today, that number exceeds 80 percent. Powering those cameras with artificial intelligence makes automated surveillance possible, enabling things like temperature checks and the collection of other biometric data.

Districts across the country have said they’ve bought AI-powered cameras to fight the pandemic. But as pandemic-era protocols like mask mandates end, experts said the technology will remain. Some educators have stated plans to leverage pandemic-era surveillance tech for student discipline while others hope AI cameras will help them identify youth carrying guns. 

The cameras have faced sharp resistance from civil rights advocates who question their effectiveness and argue they trample students’ privacy rights.

Noa Young, a 16-year-old junior in Fulton County, said she knew that cameras monitored her school but wasn’t aware of their high-tech features like mask detection. She agreed with the district’s now-expired mask mandate but felt that educators should have been more transparent about the technology in place.

“I think it’s helpful for COVID stuff but it seems a little intrusive,” Young said in an interview. “I think it’s strange that we were not aware of that.”

‘Smoke and mirrors’

Outside of Fulton County, educators have used AI cameras to fight COVID on multiple fronts. 

In Rockland, Maine’s Regional School Unit 13, officials used federal pandemic relief money to procure a network of cameras with “Face Match” technology for contact tracing. Through advanced surveillance, the cameras by California-based security company Verkada allow the 1,600-student district to identify students who came in close contact with classmates who tested positive for COVID-19. In its marketing materials, Verkada explains how districts could use federal funds tied to the public health crisis to buy its cameras for contact tracing and crowd control. 
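
Contact tracing of this kind reduces to a join over sighting logs: find everyone the cameras placed in the same room as a positive case within some exposure window. The sketch below assumes a 15-minute window and a made-up log format; Verkada’s actual pipeline is not public.

    from datetime import datetime, timedelta

    WINDOW = timedelta(minutes=15)  # assumed exposure window

    def close_contacts(sightings: list[tuple], case_id: str) -> set[str]:
        """Return students sighted near any sighting of the positive case."""
        case_spots = [(room, t) for sid, room, t in sightings if sid == case_id]
        return {
            sid for sid, room, t in sightings
            if sid != case_id
            and any(room == cr and abs(t - ct) <= WINDOW for cr, ct in case_spots)
        }

    log = [
        ("s1", "cafeteria", datetime(2021, 9, 1, 12, 0)),   # positive case
        ("s2", "cafeteria", datetime(2021, 9, 1, 12, 10)),  # same room, in window
        ("s3", "library",   datetime(2021, 9, 1, 12, 5)),   # different room
    ]
    print(close_contacts(log, "s1"))  # -> {'s2'}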

At a district in suburban Houston, officials spent nearly $75,000 on AI-enabled cameras from Hikvision, a surveillance company owned in part by the Chinese government, and deployed thermal imaging and facial detection to identify students with elevated temperatures and those without masks. 

The cameras can screen as many as 30 people at a time and are therefore “less intrusive” than slower processes, said Ty Morrow, the Brazosport Independent School District’s head of security. The checkpoints have helped the district identify students who later tested positive for COVID-19, Morrow said, although a surveillance testing company has argued that accurately scanning 30 people at once, as Hikvision claims, is not possible. 

“That was just one more tool that we had in the toolbox to show parents that we were doing our due diligence to make sure that we weren’t allowing kids or staff with COVID into the facilities,” he said.  

Yet it’s this mentality that worries consultant Kenneth Trump, the president of Cleveland-based National School Safety and Security Services. Security hardware for the sake of public perception, the industry expert said, is simply “smoke and mirrors.”

“It’s creating a façade,” he said. “Parents think that all the bells and whistles are going to keep their kids safer and that’s not necessarily the case. With cameras, in the vast majority of schools, nobody is monitoring them.”

‘You don’t have to like something’

When the Fulton County district upgraded its surveillance camera network in 2018, officials were wooed by Avigilon’s AI-powered “Appearance Search,” which allows security officials to sift through a mountain of video footage and identify students based on characteristics like their hairstyle or the color of their shirt. When the pandemic hit, the company’s mask detection became an attractive add-on, Hildreth said.
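
Functionally, an appearance search is a filter over an index of camera detections: each event carries machine-estimated descriptors, and a query keeps only the events that match all of them. The field names below are invented for illustration and are not Avigilon’s actual API.

    detections = [
        {"camera": "hall-2", "time": "09:14", "shirt": "red",  "hair": "short"},
        {"camera": "gym-1",  "time": "09:20", "shirt": "blue", "hair": "long"},
        {"camera": "hall-2", "time": "09:31", "shirt": "red",  "hair": "long"},
    ]

    def appearance_search(index: list[dict], **attrs) -> list[dict]:
        """Return detections whose descriptors match every query attribute."""
        return [d for d in index if all(d.get(k) == v for k, v in attrs.items())]

    print(appearance_search(detections, shirt="red", hair="short"))

The privacy stakes follow directly: once such an index exists, retracing any student’s movements is a one-line query.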

Hildreth said the district didn’t actively advertise the technology to students, but they likely became aware of it quickly after students got called out for breaking the rules. He doesn’t know students’ opinions about the cameras — and didn’t seem to care. 

“I wasn’t probably as much interested in their reaction as much as their compliance,” Hildreth said. “You don’t have to like something that’s good for you, but you still need to do it.”

A Fulton County district spokesman said they weren’t aware of any instances where students were disciplined because the cameras caught them without masks. 

After the 2018 mass school shooting in Parkland, Florida, the company Athena Security pitched its cameras with AI-powered “gun detection” as a promising school safety strategy. Similar to facial recognition, the gun detection system uses artificial intelligence to spot when a weapon enters a camera’s field of view. By identifying people with guns before shots are fired, the service is “like Minority Report but in real life,” a company spokesperson wrote in an email at the time, referring to the 2002 science-fiction thriller that predicts a dystopian future of mass surveillance. During the pandemic, the company rolled out thermal cameras that a company spokesperson wrote in an email could “accurately pre-screen 2,000 people per hour.”

The spokesperson declined an interview request but said in an email that Athena is “not a surveillance company” and did not want to be portrayed as “spying on” students. 

Among the school security industry’s staunchest critics is Sneha Revanur, a 17-year-old high school student from San Jose, California, who founded the youth-led group Encode Justice to highlight the dangers of artificial intelligence on civil liberties. 

Revanur said she’s concerned by districts’ decisions to implement surveillance cameras as a public health strategy and that the technology in schools could result in harsher discipline for students, particularly youth of color. 

Verkada offers a cautionary tale about the potential harms of pervasive school surveillance and student data collection. Last year, the company suffered a massive data breach when a hack exposed the live feeds of 150,000 surveillance cameras, including those inside Tesla factories, jails and at Sandy Hook Elementary School in Newtown, Connecticut. The Newtown district, which suffered a mass school shooting in 2012, said the breach didn’t expose compromising information about students. The vulnerability hasn’t deterred some educators from contracting with the California-based company. 

After a back-and-forth with the Verkada spokesperson, the company would not grant an interview or respond to a list of written questions. 

Revanur called the Verkada hack at Sandy Hook Elementary a “staggering indictment” of educators’ rush for “dragnet surveillance systems that treat everyone as a constant suspect” at the expense of student privacy. Constant monitoring, she argued, “creates this culture of fear and paranoia that truly isn’t the most proactive response to gun violence and safety concerns.” 

In Fayette County, Georgia, the district spent about $500,000 to purchase 70 Hikvision cameras with thermal imaging to detect students with fevers. But it ultimately backtracked and disabled them after community uproar over their efficacy and Hikvision’s ties to the Chinese government. In 2019, the U.S. government imposed a trade blacklist on Hikvision, alleging the company was implicated in China’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities.

 The school district declined to comment. In a statement, a Hikvision spokesperson said the company “takes all reports regarding human rights very seriously” and has engaged governments globally “to clarify misunderstandings about the company.” The company is “committed to upholding the right to privacy,” the spokesperson said. 

Meanwhile, Regional School Unit 13’s decision to use Verkada security cameras as a contact tracing tool could run afoul of a 2021 law that bans the use of facial recognition in Maine schools. The district didn’t respond to requests for comment. 

Michael Kebede, the ACLU of Maine’s policy counsel, cited recent studies on facial recognition’s flaws in identifying children and people of color and called on the district to reconsider its approach. 

“We fundamentally disagree that using a tool of mass surveillance is a way to promote the health and safety of students,” Kebede said in a statement. “It is a civil liberties nightmare for everyone, and it perpetuates the surveillance of already marginalized communities.”

Security officials at the Brazosport Independent School District in suburban Houston use AI-enabled security cameras to screen educators for elevated temperatures. District leaders mounted the cameras to carts so they could be used in various locations across campus. (Courtesy Ty Morrow)

White faces

In Fulton County, school officials wound up disabling the face mask detection feature in cafeterias because it was triggered by people eating lunch. Other times, it identified students who pulled their masks down briefly to take a drink of water. 

In suburban Houston, Morrow ran into similar hurdles. When white students wore light-colored masks, for example, the face detection sounded alarms. And if students rode bikes to school, the cameras flagged their elevated temperatures. 

“We’ve got some false positives but it was not a failure of the technology,” Hildreth said. “We just had to take a look and adapt what we were looking at to match our needs.”

With those lessons learned, Hildreth said he hopes to soon equip Fulton County campuses with AI-enabled cameras that identify students who bring guns to school. He sees a future where algorithms identify armed students “in the same exact manner” as Avigilon’s mask detection. 

In a post-pandemic world, Albert Fox Cahn, founder of the nonprofit Surveillance Technology Oversight Project, worries the entire school security industry will take a similar approach. In February, educators in Waterbury, Connecticut, spurred controversy when they proposed a new network of campus surveillance cameras with weapons detection. 

“With the pandemic hopefully waning, we’ll see a lot of security vendors pivoting back to school shooting rhetoric as justification for the camera systems,” he said. Due to the potential for errors, Cahn called the embrace of AI gun detection “really alarming.” 

Disclosure: This story was produced in partnership with The Guardian. It is part of a reporting series that is supported by the Open Society Foundations, which works to build vibrant and inclusive democracies whose governments are accountable to their citizens. All content is editorially independent and overseen by Guardian and 74 editors.

Opinion: School Surveillance of Students Via Laptops May Do More Harm Than Good
https://www.the74million.org/article/school-surveillance-of-students-via-laptops-may-do-more-harm-than-good/ (Jan. 19, 2022)

Ever since the start of the pandemic, more and more public school students have been using laptops, tablets or similar devices issued by their schools.

The percentage of teachers who reported their schools had provided their students with such devices doubled from 43% before the pandemic to 86% during the pandemic, a September 2021 report shows.


In one sense, it might be tempting to celebrate how schools are doing more to keep their students digitally connected during the pandemic. The problem is, schools are not just providing kids with computers to keep up with their schoolwork. Instead – in a trend that could easily be described as Orwellian – the vast majority of schools are also using those devices to keep tabs on what students are doing in their personal lives.

Indeed, 80% of teachers and 77% of high school students reported that their schools had installed artificial intelligence-based surveillance software on these devices to monitor students’ online activities and what is stored on the devices themselves.

This student surveillance is taking place – at taxpayer expense – in cities and school communities throughout the United States.

For instance, in the Minneapolis school district, school officials paid over $355,000 to use tools provided by student surveillance company Gaggle until 2023. Three-quarters of incidents reported – that is, cases where the system flagged students’ online activity – took place outside school hours.

In Baltimore, where the public school system uses the GoGuardian surveillance app, police officers are sent to children’s homes when the system detects students typing keywords related to self-harm.

Safety versus privacy

Vendors claim these tools keep students safe from self-harm or online activities that could lead to trouble. However, privacy groups and news outlets have raised questions about those claims.

Vendors often refuse to reveal how their artificial intelligence programs were trained and the type of data used to train them.

Privacy advocates fear these tools may harm students by criminalizing mental health problems and deterring free expression.

As a researcher who studies privacy and security issues in various settings, I know that intrusive surveillance techniques cause emotional and psychological harm to students, disproportionately penalize minority students and weaken online security.

Artificial intelligence not intelligent enough

Even the most advanced artificial intelligence lacks the ability to understand human language and context. This is why student surveillance systems pick up a lot of false positives instead of real problems.

In some cases, these surveillance programs have flagged students discussing music deemed suspicious and even students talking about the novel “To Kill a Mockingbird.”
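To see why, consider a minimal sketch of context-free keyword matching, the basic approach these scanners are built on. The keyword list and messages below are invented for illustration and are not taken from any vendor’s actual dictionary:

```python
# Minimal sketch of context-free keyword flagging.
# The keyword list and messages are hypothetical illustrations.
KEYWORDS = {"kill", "gun", "suicide"}

def flag(message: str) -> set[str]:
    """Return any watched keywords found in the message, ignoring context."""
    words = {word.strip('.,!?"\'').lower() for word in message.split()}
    return KEYWORDS & words

# A book report trips the same wire as a genuine threat would:
print(flag("My favorite novel is To Kill a Mockingbird."))     # {'kill'}
print(flag("My essay argues for stricter gun control laws."))  # {'gun'}
```

Because the match is on words rather than meaning, a literature assignment and a credible threat look identical to the scanner.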

Harm to students

When students know they are being monitored, they are less likely to share true thoughts online and are more careful about what they search. This can discourage vulnerable groups, such as students with mental health issues, from getting needed services.

When students know that everything they read, write and do is being watched, they are also less likely to develop into adults with a high level of self-confidence. In general, surveillance has a negative impact on students’ ability to act and use analytical reasoning. It also hinders the development of the skills and mindset needed to exercise their rights.

More adverse impact on minorities

U.S. schools disproportionately discipline minority students. African American students’ chances of being suspended are more than three times higher than those of their white peers.

After evaluating flagged content, vendors report any concerns to school officials, who take disciplinary actions on a case-by-case basis. The lack of oversight in schools’ use of these tools could lead to further harm for minority students.

The situation is worsened by the fact that Black and Hispanic students rely more on school devices than their white peers do. This in turn makes minority students more likely to be monitored and exposes them to greater risk of some sort of intervention.

When both minority students and their white peers are monitored, the former group is more likely to be penalized: artificial intelligence programs more readily flag the language that minority students write and speak, because that language is underrepresented in the datasets used to train the programs and because the people who build them lack diversity.

Leading AI models are 50% more likely to flag tweets written by African Americans as “offensive” than those written by others. They are 2.2 times more likely to flag tweets written in African American slang.

These tools also affect sexual and gender minorities more adversely. Gaggle has reportedly flagged “gay,” “lesbian” and other LGBTQ-related terms because they are associated with pornography, even though the terms are often used to describe one’s identity.

Increased security risk

These surveillance systems also increase students’ cybersecurity risks. First, to comprehensively monitor students’ activities, surveillance vendors compel students to install a set of certificates known as root certificates. A root certificate sits at the top of a device’s chain of trust and functions as a “master certificate”: the device will accept any credential signed by it. Installing an extra root certificate therefore compromises the cybersecurity checks built into these devices, because whoever controls that certificate can intercept and read the device’s encrypted traffic.

Gaggle, which scans digital files of more than 5 million students each year, installs such certificates. This tactic of installing certificates is similar to the approach that authoritarian regimes, such as the Kazakhstani government, use to monitor and control their citizens and that cybercriminals use to lure victims to infected websites.
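The underlying risk is easy to inspect. The sketch below, a minimal illustration rather than a test of any vendor’s product, uses Python’s standard library to list the root certificates a machine currently trusts; exact output varies by platform. Any root that is not a recognizable public certificate authority deserves scrutiny, because whoever holds its private key can mint credentials the device will accept for any website:

```python
import ssl

# Load the certificate authorities this machine trusts by default.
context = ssl.create_default_context()

for cert in context.get_ca_certs():
    # Each subject is a tuple of relative distinguished names, e.g.
    # ((('organizationName', 'Example CA'),), (('commonName', 'Example Root'),))
    subject = dict(rdn[0] for rdn in cert["subject"])
    org = subject.get("organizationName", "?")
    common_name = subject.get("commonName", "?")
    print(f"{org} | {common_name}")
```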

Second, surveillance system vendors use insecure systems that hackers can exploit. In March 2021, computer security software company McAfee found several vulnerabilities in student monitoring system vendor Netop’s Vision Pro Education software. For instance, Netop did not encrypt communications between teachers and students, leaving them open to unauthorized access.

The software was used by over 9,000 schools worldwide to monitor millions of students. The vulnerability allowed hackers to gain control over webcams and microphones in students’ computers.

Finally, personal information of students that is stored by the vendors is susceptible to breaches. In July 2020, criminals stole 444,000 students’ personal data – including names, email addresses, home addresses, phone numbers and passwords – by hacking online proctoring service ProctorU. This data was then leaked online.

Schools would do well to look more closely at the harm being caused by their surveillance of students and to question whether such monitoring actually makes students more safe – or less.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue It’s ‘Not That Smart’
https://www.the74million.org/article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/ (Oct. 12, 2021)

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, emotional distress that occurs when someone’s gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised things would soon “get better.”

Eventually they did. 


Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since “graduated” from weekly therapy sessions and has found a better headspace, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security. 

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song “Your Heart is a Muscle the Size of Your Fist” helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was “a reminder to keep on loving, keep on fighting and hold on for your life.” (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company’s digital algorithm and human content moderators. 

But technology experts and families with first-hand experience with Gaggle’s surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective. 

While the system flagged Logsdon-Wallace for referencing the word “suicide,” context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed. 

“I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counselor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context,” and while some districts have requested to be notified about references to previous suicide attempts, it’s ultimately up to administrators to “decide the proper response, if any.”  

‘A crisis on our hands’

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and its content moderator team, Gaggle tracks students’ online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saves lives, including those of more than 1,400 youth during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic’s effects on suicide rates remain fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow by more than 20 percent during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists.

“Before the pandemic, we had a crisis on our hands,” he said. “I believe there’s a tsunami of youth suicide headed our way that we are not prepared for.” 

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s a dearth of independent research to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace’s mother Alexis Logsdon didn’t know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled. 

“That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart — it’s just like ‘Woop, we saw the word.’” 

‘Random and capricious’

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology  in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as “really scary.”

On one occasion, Gaggle sent her an email notification of “Inappropriate Use” while she was walking to her first high school biology midterm; her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school’s literary journal and, according to her, Gaggle had ultimately flagged profanity in students’ fictional article submissions.

“The link at the bottom of this email is for something that was identified as inappropriate,” Gaggle warned in its email while pointing to one of the fictional articles. “Please refrain from storing or sharing inappropriate content in your files.” 

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn’t catch everything. Even as she got flagged when students shared documents with her, the articles’ authors weren’t receiving similar alerts, she said. Nor did Gaggle’s AI pick up on it when she wrote about the discrepancy in a student newspaper article in which she included a four-letter swear word to make a point. In the article, which Dockter wrote with Google Docs, she argued that Gaggle’s monitoring system is “random and capricious,” and could be dangerous if school officials rely on its findings to protect students.

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

“With such a seemingly random service, that doesn’t seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats,” she said in an interview. “If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”

Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times “does not properly indicate the author of a document and assigns a random collaborator.”

“We are hoping Google will improve this functionality so we can better protect students,” Patterson said. 

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn’t notify her that Gaggle had identified a threat. 

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she’d shoot her “puny little brain with my grandpa’s rifle.”

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter’s teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts could be held liable if they fail to act on credible threats.

“I didn’t hear a word from Gaggle about it,” she said. “If I hadn’t brought it to the teacher’s attention, I don’t think that anything would have been done.” 

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it “does not have any insight into the steps the district took to address this particular matter.” 

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials “would never discuss with a community member any communication flagged by Gaggle.” 

“That unrelated but concerned parent would not have been provided that information nor should she have been,” she wrote in an email. “That is private.” 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

‘The big scary algorithm’

When identifying potential trouble, Gaggle’s algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they’re delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York, according to the company. Gaggle prevented the file from ever reaching its intended recipient.  

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 

That’s where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.
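Jordan’s point is easy to demonstrate. In a dictionary-driven scan, a phrase that is missing from the list produces no signal at all, so a district that keeps the stock dictionary inherits someone else’s vocabulary. The sketch below is hypothetical; the phrases are invented and no real wordlist is reproduced:

```python
# Hypothetical sketch: one scanner, two phrase dictionaries.
DEFAULT_PHRASES = {"want to die", "end it all"}

def scan(text: str, phrases: set[str]) -> list[str]:
    """Return dictionary phrases that appear verbatim in the text."""
    lowered = text.lower()
    return [phrase for phrase in phrases if phrase in lowered]

message = "i'm fixing to be done with everything"  # regional phrasing

print(scan(message, DEFAULT_PHRASES))  # [] -- nothing is flagged

# A district-tuned dictionary catches what the stock list misses:
LOCAL_PHRASES = DEFAULT_PHRASES | {"done with everything"}
print(scan(message, LOCAL_PHRASES))    # ['done with everything']
```

Unlike a false positive, which at least generates a report a moderator can dismiss, a miss like the first scan is invisible: nothing is logged, and no one learns the dictionary fell short.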

On the other hand, she noted that false-positives are highly likely, especially when the system flags common swear words and fails to understand context. 

“You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat,” she said. “What’s the utility of that? That seems pretty low.” 

She said that Gaggle’s utility could be impaired because it doesn’t adjust to students’ behaviors over time, comparing it to Netflix, which recommends television shows based on users’ ever-evolving viewing patterns. “Something that doesn’t learn isn’t going to be accurate,” she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle’s marketing materials appear to overhype the tool’s sophistication to schools, she said. 

“We’re using the big scary algorithm term here when I don’t think it applies,” she said. “This is not Netflix’s recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location.” 

Patterson said Gaggle’s proprietary algorithm is updated regularly “to adjust to student behaviors over time and improve accuracy and speed.” The tool monitors “thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work.” 

Ultimately, the algorithm to identify keywords is used to “narrow down the haystack as much as possible,” Patterson said, and Gaggle content moderators review materials to gauge their risk levels. 

“Artificial intelligence without human intelligence ain’t that smart,” he said. 

In Minneapolis, officials denied that Gaggle infringes on students’ privacy and noted that the tool only operates within school-issued accounts. The district’s internet use policy states that students should “expect only limited privacy,” and that the misuse of school equipment could result in discipline and “civil or criminal liability.” District leaders have also cited compliance with the Clinton-era Children’s Internet Protection Act, which became law in 2000 and requires schools to monitor “the online activities of minors.” 

Patterson suggested that teachers aren’t paying close enough attention to keep students safe on their own and “sometimes they forget that they’re mandated reporters.” On the Gaggle website, Patterson says he launched the company in 1999 to provide teachers with “an easy way to watch over their gaggle of students.” Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company’s role in meeting it. As technology becomes a key facet of American education, Patterson said that schools “have a moral obligation to protect the kids on their digital playground.” 

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student “tracking” through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn’t be “construed to require the tracking of internet use by any identifiable minor or adult user.” In a recent letter to federal lawmakers, her group urged the government to clarify the Children’s Internet Protection Act’s requirements and distinguish monitoring from tracking individual student behaviors. 

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they’re concerned the tools “may extend beyond” the law’s intent “to surveil student activity or reinforce biases.” Around-the-clock surveillance, they wrote, demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.” 

“Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students’ mental health due to stigmatization and differential treatment following even a false report,” the senators wrote. “Flagging students as ‘high-risk’ may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color.”

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd’s murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful. 

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Algorithms have long been shown to encode biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that racial bias could be baked into Gaggle’s algorithm. Absent adequate context or nuance,  he worried the tool could lead to misunderstandings. 

Data obtained by The 74 offer a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data. 

Gaggle and Minneapolis district leaders acknowledged that students’ digital communications are forwarded to police in rare circumstances. The Minneapolis district’s internet use policy explains that educators could contact the police if students use technology to break the law and a document given to teachers about the district’s Gaggle contract further highlights the possibility of law enforcement involvement. 

Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that law enforcement is not a “regular partner,” when responding to incidents flagged by Gaggle. It doesn’t deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by The 74.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

“Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them,” Matlock said, though it’s unclear if any students have faced legal consequences. “It’s the question as to why they’re doing it,” and to raise the issue with their parents.

Gaggle’s keywords could also have a disproportionate impact on LGBTQ children. In three-dozen incident reports, Gaggle flagged keywords related to sexual orientation including “gay” and “lesbian.” On at least one occasion, school officials outed an LGBTQ student to their parents, according to a Minneapolis high school student newspaper article.

Logsdon-Wallace, the 13-year-old student, called the incident “disgusting and horribly messed up.” 

“They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it’s going to be false-positive because they are acting as if the word gay is inherently sexual,” he said. “When people are just talking about being gay, anything they’re writing would be flagged.” 

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74’s data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

“That’s definitely really messed up, especially when the school is like ‘Oh no, no, no, please keep these Chromebooks over the summer,’” an invitation that gave students “the go-ahead to use them” for personal reasons, he said.

“Especially when it’s during a pandemic when you can’t really go anywhere and the only way to talk to your friends is through the internet.”

An Inside Look at Spy Tech Used on Students During Remote Classes — and Beyond
https://www.the74million.org/article/gaggle-spy-tech-minneapolis-students-remote-learning/ (Sept. 14, 2021)

A week after the pandemic forced Minneapolis students to attend classes online, the city school district’s top security chief got an urgent email, its subject line in all caps, alerting him to potential trouble. Just 12 seconds later, he got a second ping. And two minutes after that, a third.

In each instance, the emails warning Jason Matlock of “QUESTIONABLE CONTENT” pointed to a single culprit: Kids were watching cartoon porn.

Over the next six months, Matlock got nearly 1,300 similar emails from Gaggle, a surveillance company that monitors students’ school-issued Google and Microsoft accounts. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day. The sheer volume of reports was overwhelming at first, Matlock acknowledged, and many incidents were utterly harmless. About 100 were related to animated pornography and, on one occasion, a member of Gaggle’s remote surveillance team flagged a fictional story that referenced “underwear.”

Hundreds of others, however, suggested imminent danger.

In emails and chat messages, students discussed violent impulses, eating disorders, abuse at home, bouts of depression and, as one student put it, “ending my life.” At a moment of heightened social isolation and elevated concern over students’ mental health, references to self-harm stood out, accounting for nearly a third of incident reports over a six-month period. In a document titled “My Educational Autobiography,” students at Roosevelt High School on the south side of Minneapolis discussed bullying, drug overdoses and suicide. “Kill me,” one student wrote in a document titled “goodbye.”

Nearly a year after The 74 submitted public records requests to understand the Minneapolis district’s use of Gaggle during the pandemic, a trove of documents offers an unprecedented look into how one school system deploys a controversial security tool that grew rapidly during COVID-19, but carries significant civil rights and privacy implications.

The data, gleaned from those 1,300 incident reports in the first six months of the crisis, highlight how Gaggle’s team of content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In fact, only about a quarter of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., bringing into sharp relief how the service extends schools’ authority far beyond their traditional powers to regulate student speech and behavior, including at home.

Now, as COVID-era restrictions subside and Minneapolis students return to in-person learning this fall, a tool that was pitched as a remote learning necessity isn’t going away anytime soon. Minneapolis officials reacted swiftly when the pandemic engulfed the nation and forced students to learn from the confines of their bedrooms, paying more than $355,000 — including nearly $64,000 in federal emergency relief money — to partner with Gaggle until 2023. Faced with a public health emergency, the district circumvented normal procurement rules, a reality that prevented concerned parents from raising objections until after it was too late.

A mental health dilemma

With each alert, Matlock and other district officials were given a vivid look into students’ most intimate thoughts and online behaviors, raising significant privacy concerns. It’s unclear, however, if any of them made kids safer. Independent research on the efficacy of Gaggle and similar services is all but nonexistent.

When students’ mental health comes into play, a complicated equation emerges. In recent years, schools have ramped up efforts to identify and provide interventions to children at risk of harming themselves or others. Gaggle executives see their tool as key to identifying youth who are lamenting hardships or discussing violent plans. On average, Gaggle notifies school officials within 17 minutes after zeroing in on student content related to suicide and self-harm, according to the company, and officials claim they saved more than 1,400 lives during the 2020-21 school year.

“As a parent you have no idea what’s going on in your kid’s head, but if you don’t know you can’t help them,” said Jeff Patterson, Gaggle’s founder and CEO. “And I would always want to err on trying to identify kids who need help.”

Critics, however, have questioned Gaggle’s effectiveness and worry that rummaging through students’ personal files and conversations — and in some cases outing students for exhibiting signs of mental health issues including depression — could backfire.

Using surveillance to identify children in distress could exacerbate feelings of stigma and shame and could ultimately make students less likely to ask for help, said Jennifer Mathis, the director of policy and legal advocacy at The Bazelon Center for Mental Health Law in Washington, D.C.

“Most kids in that situation are not going to share anything anymore and are going to suffer for that,” she said. “It suggests that anything you write or say or do in school — or out of school — may be found and held against you and used in ways that you had not envisioned.”

Minneapolis parent Holly Kragthorpe-Shirley had a similar concern and questioned whether kids “actually have a safe space to raise some of their issues in a safe way” if they’re stifled by surveillance.

In Minneapolis, for instance, Gaggle flagged the keywords “feel depressed” in a document titled “SEL Journal,” a reference to social-emotional learning. In another instance, Gaggle flagged “suicidal” in a document titled “mental health problems workbook.”

District officials acknowledged that Gaggle had captured student assignments and other personal files, an issue that civil rights groups have long been warning about. The documents obtained by The 74 put hard evidence behind those concerns, said Amelia Vance, the director of Youth and Education Privacy at The Future of Privacy Forum, a Washington-based think tank.

“The hypotheticals we’ve been talking about for a few years have come to fruition,” she said. “It is highly likely to undercut the trust of students not only in their school generally but in their teacher, in their counselor — in the mental health problems workbook.” 

Patterson shook off any privacy reservations, including those related to monitoring sensitive materials like journal entries, which he characterized as “cries for help.”

“Sometimes when we intervene we might cause some challenges, but more often than not the kids want to be helped,” he said. Though Gaggle only monitors student files tied to school accounts, he cited a middle school girl’s private journal in a success story. He said the girl wrote in a digital journal that she suffered from self-esteem issues and guilt after getting raped.

“No one in her life knew about this incident and because she journaled about it,” Gaggle was able to notify school officials about what they’d learned, he said. “They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own.”

‘Needles in haystacks’

Tools like Gaggle have become ubiquitous in classrooms across the country, according to forthcoming research by the D.C.-based Center for Democracy & Technology. In a recent survey, 81 percent of teachers reported having such software in place in their schools. Though most students said they’re comfortable being monitored, 58 percent said they don’t share their “true thoughts or ideas” as a result and 80 percent said they’re more careful about what they search online.

Such data suggest that youth are being primed to accept surveillance as an inevitable reality, said Elizabeth Laird, the center’s director of equity in civic technology. In return, she said, they’re giving up the ability to explore new ideas and learn from mistakes.

Gaggle, in business since 1999 and recently relocated to Dallas, monitors the digital files of more than 5 million students across the country each year with the pandemic being very good for its bottom line. Since the onset of the crisis, the number of students surveilled by the privately held company, which does not report its yearly revenue, has grown by more than 20 percent. Through artificial intelligence, Gaggle scans students’ emails, chat messages and other materials uploaded to students’ Google or Microsoft accounts in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. Moderators evaluate flagged material and notify school officials about content they find troubling — a bar that Matlock acknowledged is quite low as “the system is always going to err on the side of caution” and requires district administrators to evaluate materials’ context.

In Minneapolis, more than half of the offenses Gaggle discovered originated in files within students’ Google Drive, including word documents and spreadsheets. Meanwhile, 22 percent originated in emails and 23 percent came from Google Hangouts, the chat feature.

School officials are alerted to only a tiny fraction of student communications caught up in Gaggle’s dragnet. Last school year, Gaggle collected more than 10 billion items nationally but just 360,000 incidents resulted in notifications to district officials, according to the company. Nationally, 41 percent of incidents during the 2020-21 school year related to suicide and self-harm, according to Gaggle, and a quarter centered on violence.

“We are looking for needles in haystacks to basically save kids,” Patterson said.

‘A really slippery slope’

It was Google Hangouts that had Matt Shaver on edge. When the pandemic hit, classrooms were replaced by video conferences and casual student interactions in hallways and cafeterias were relegated to Hangouts. For Shaver, who taught at a Minneapolis elementary school during the pandemic, students’ Hangouts use became overwhelming.

Students were so busy chatting with each other, he said, that many had lost focus on classroom instruction. So he proposed a blunt solution to district technology officials: Shut it down.

“The thing I wanted was ‘Take the temptation away, take the opportunity away for them to use that,’” said Shaver, who has since left teaching and is now policy director at the education reform group EdAllies. “And I actually got pushback from IT saying ‘No we’re not going to do that, this is a good social aspect that we’re trying to replicate.’”

But unlike those hallway interactions, nobody was watching. Matlock, the district’s security head, said he was initially in the market for a new anonymous reporting tool, which allows students to flag their friends for behaviors they find troubling. He turned to Gaggle, which operates the anonymous reporting system SpeakUp for Safety, and saw the company’s AI-powered digital surveillance tool, which goes well beyond SpeakUp’s powers to ferret out potentially alarming student behavior, as a possibility to “enhance the supports for students online.”

“We wanted to get something in place quickly, as we were moving quickly with the lockdown,” he said, adding that going through traditional procurement hoops could take months. “Gaggle had a strong national presence and a reputation.”

The district signed an initial six-month, $99,603 contract with Gaggle just a week after the virus shuttered schools in Minneapolis. Board of Education Chair Kim Ellison signed a second, three-year contract at an annual rate of $255,750 in September 2020.

The move came with steep consequences. Though SpeakUp was used just three times during the six-month window included in The 74’s data, Gaggle’s surveillance tool flagged students nearly 1,300 times.

During that time, which coincided with the switch to remote learning, the largest share of incidents — 38 percent — were pornographic or sexual in nature, including references to “sexual activity involving a student,” professional videos and explicit, student-produced selfies which trigger alerts to the National Center for Missing and Exploited Children.

An additional 30 percent were related to suicide and self-harm, including incidents triggered by keywords such as “cutting,” “feeling depressed,” “want to die,” and “end it all.” Another 18 percent were related to violence, including threats, physical altercations, references to weapons and suspected child abuse. Such incidents were triggered by keywords including “bomb,” “Glock,” “going to fight,” and “beat her.” About a fifth of incidents were triggered by profanity.

Concerns over Gaggle’s reach during the pandemic weren’t limited to Minneapolis. In December 2020, a group of civil rights organizations including the American Civil Liberties Union of Northern California argued in a letter that by using Gaggle, the Fresno Unified School District had violated the California Electronic Communications Privacy Act, which requires officials to obtain search warrants before accessing electronic information. Such monitoring, the groups contend, infringes on students’ free-speech and privacy rights with little ability to opt out.

Shaver, whose students used Google Hangouts to the point of it becoming a distraction, was alarmed to learn that those communications were being analyzed by artificial intelligence and pored over by a remote team of people he didn’t even know.

“I’m trying to imagine finding out about this as a high schooler, that every single word I’ve written on a Google Hangout or whatever is being monitored,” he said. “There is, of course, some lesson in this, obviously like, ‘Be careful of what you put online.’ But we live in a country with laws around unreasonable search and seizure — and surveillance is just a really slippery slope.”

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

The potential to save lives

To Matlock, Gaggle is a lifesaver — literally. When the tool flagged a Minneapolis student’s suicide note in the middle of the night, Matlock said he rushed to intervene. In a late-night phone call, the security chief said he warned the unnamed parents, who knew their child was struggling but didn’t fully recognize how bad things had become. Because of Gaggle, school officials were able to get the student help. To Matlock, the possibility that he saved a student’s life offers a feeling he “can’t even measure in words.”

“If it saved one kid, if it supported one caregiver, if it supported one family, I’ll take it,” he said. “That’s the bottom line.”

Despite heightened concern over youth mental health issues during the pandemic, its effect on youth suicide rates remains fuzzy. Preliminary data from the Minnesota health department show a significant decline in suicides statewide during the pandemic. Between 2019 and 2020, suicides among people 24 years old and younger decreased by more than 20 percent statewide. Nationally, the proportion of youth emergency room visits related to suspected suicide attempts has surged during the pandemic, according to the Centers for Disease Control and Prevention, but preliminary mortality data for people of all ages show a 5.6 percent decline in self-inflicted fatalities in 2020 compared to 2019.

Meanwhile, Gaggle reported that it identified a significant increase in threats related to suicide, self-harm and violence nationwide between March 2020 and March 2021. During that period, Gaggle observed a 31 percent increase in flagged content overall, including a 35 percent increase in materials related to suicide and self-harm. Gaggle officials said the data highlight a mental health crisis among youth during the pandemic. But other factors could be at play. Among them is a 50 percent surge in students’ screen-time during the pandemic, creating additional opportunities for Gaggle to tag youth behavior. At the same time, the number of students monitored by Gaggle nationally grew markedly during the pandemic.

But that hasn’t stopped Gaggle from citing pandemic-era mental illness in sales pitches as it markets a new service: Gaggle Therapy. In school districts that sign up for the service, students who are flagged by Gaggle’s digital monitoring tool are matched with counselors for weekly teletherapy sessions. Therapists available through the service are independent contractors for Gaggle and districts can either pay Gaggle for “blanket coverage,” which makes all students eligible, or a “retainer” fee, which allows them to “use the service as you need it,” according to the company. Under the second scenario, Gaggle would have a financial incentive to identify more students in need of teletherapy.

In Minneapolis, Matlock said that school-based social workers and counselors lead intervention efforts when students are identified for materials related to self-harm. “The initial moment may be a shock” when students are confronted by school staff about their online behaviors, he said, but providing them with help “is much better in the long run.”

A presentation sent to Minneapolis teachers explains how the district responds after Gaggle flags a “possible student situation” that officials say present an imminent threat. (Photo obtained by The 74)

As the district rolled out the service, many parents and students were out of the loop. Among them was Nathaniel Genene, a recent graduate who served as the Minneapolis school board’s student representative at the time. He said that classmates contacted him after initial news of the Gaggle contract was released.

“I had a couple of friends texting me like ‘Nathaniel, is this true?’” he said. “It was kind of interesting because I had no idea it was even a thing.”

Yet as students gained a greater awareness that their communications were being monitored, Matlock said they began to test Gaggle’s parameters using potential keywords “and then say ‘Hi’ to us while they put it in there.”

As students became conditioned to Gaggle, “the shock is probably a little bit less,” said Rochelle Cox, an associate superintendent at the Minneapolis school district. Now, she said students have an outlet to get help without having to explicitly ask. Instead, they can express their concerns online with an understanding that school officials are listening. As a result, school-based mental health professionals are able to provide the care students need, she said.

Mathis, with The Bazelon Center for Mental Health Law, called that argument “ridiculous.” Officials should make sure that students know about available mental health services and ensure that they feel comfortable reaching out for help, she said.

“That’s very different than deciding that we’re going to catch people by having them write into the ether and that’s how we’re going to find the students who need help,” she said. “We can be a lot more direct in communicating than that, and we should be a lot more direct and a lot more positive.”

In fact, subjecting students to surveillance could push them further into isolation and condition them to lie when officials reach out to inquire about their digital communications, argued Vance of the Future of Privacy Forum.

“Effective interventions are rarely going to be built on that, you know, ‘I saw what you were typing into a Google search last night’ or ‘writing a journal entry for your English class,’” Vance said. “That doesn’t feel like it builds a trusting relationship. It feels creepy.”

‘Don’t Get Gaggled’: Minneapolis School District Spends Big on Student Surveillance Tool, Raising Ire After Terminating Its Police Contract
https://www.the74million.org/article/dont-get-gaggled-minneapolis-school-district-spends-big-on-student-surveillance-tool-raising-ire-after-terminating-its-police-contract/ (Oct. 18, 2020)

Minneapolis education leaders have spent hundreds of thousands of dollars this year to surveil children online, even after the district ended its police department contract and launched school safety reforms that officials said would build trust between adults and students.

The district terminated its longstanding relationship with the city’s police department after George Floyd died at the hands of a Minneapolis officer in May. But since the pandemic closed campuses in March and required students to attend online classes from home, the district has shelled out more than $355,000 for a digital surveillance tool called Gaggle, according to contracts obtained by The 74 through a public records request.

Gaggle is currently used in hundreds of districts across the U.S., relying on artificial intelligence and a team of moderators paid as little as $10 an hour to scan billions of student emails, chat messages and files each year in search of references to sex, drugs and violence.

Even while the police-free schools movement has garnered momentum in the wake of Floyd’s death, with districts nationwide reexamining the role of cops on campus, it has not appeared to slow the recent growth of the nearly $3 billion-a-year school security industry. A Gaggle executive said their service is key to student safety, and the company saw a sales surge with more than 100 school districts becoming new customers since schools went virtual in March.

“With school now taking place in our students’ living rooms and bedrooms, safety is more important than ever,” Jeff Patterson, Gaggle’s founder and CEO, said in a media release. “Many educators are concerned that without in-person school, they may not be able to identify students in abusive situations or those suffering from mental illness.”

But there’s little research to back up the company’s claims and critics argue that Gaggle and similar products could be detrimental to child development and amount to pervasive government surveillance. Civil rights groups and racial justice advocates are especially concerned about online surveillance tools during the pandemic as students across the country spend the majority of their academic lives in front of screens.

In Minneapolis, the latest revelation further outraged activists who cheered the district’s decision to terminate the police contract but grew wary after officials sought to substitute campus cops with “public safety support specialists” with law enforcement backgrounds.

“My concern was that they would replace physical policing with technological policing, which appears to be something like Gaggle,” said Marika Pfefferkorn, executive director of the Midwest Center for School Transformation and a proponent of the police-free schools movement. Pfefferkorn pushed the district to split with the cops but said the move was just the first step in curtailing the policing and surveillance of students — particularly those of color. Instead, Gaggle “has the potential to further criminalize students.”

No such thing as confidentiality online

An initial six-month district contract with Gaggle, signed by Chief Operations Officer Karen DeVet just a week after the virus shuttered city schools, totaled $99,603 and was in place through the end of September. A second, three-year contract was signed months after Floyd’s death and went into effect this month at an annual rate of $255,750. School Board Chair Kim Ellison signed the second contract on Sept. 18. District and school board officials didn’t respond to multiple requests for comment.

In the contract, Gaggle notes that it “cannot guarantee security and confidentiality through its services” and “may choose to turn over” student messages to the police. However, the company said it “shall not be responsible for contacting, notifying or alerting” law enforcement and cannot guarantee that “all unsafe communications can or will be detected while monitoring your student communications or website content.”

Through the contracts, Gaggle helps the district monitor student activities on a range of Google services, including email, Docs, a video platform, the chat service Google Hangouts and other Google Classroom tools. Through artificial intelligence, the company scans students’ emails, chat messages and other materials for specific words and phrases that may indicate harm. Moderators evaluate flagged content and notify school officials about references to self-harm, depression, drug use and violent threats. Gaggle’s algorithm scans student content for trigger words including “bomb,” “drunk,” “gun” and “kill me,” according to a 2019 Buzzfeed News investigation. But it also scans for LGBTQ-specific words like “gay” and “lesbian,” which are often flagged as potential bullying.

Such keywords could lead Gaggle to disproportionately subject LGBTQ students to school surveillance, Pfefferkorn said.

“Over and over again, we continue to see with algorithms that bias is often baked in,” she said.


In a brief message buried on one Minneapolis high school’s website — with the headline “Don’t Get Gaggled” — district staff noted that distance learning presents new challenges in supporting students’ mental and emotional health needs and offers a reminder that “there is no such thing as confidentiality online.” The webpage links to a video featuring counseling services manager Derek Francis, who notes that the district “will be monitoring chats and postings for inappropriate content and will follow up as is appropriate.”

“Make sure you’re not saying things online that you would never say to someone’s face,” Francis warns students. “We don’t want you to end up regretting something that you post.”

Prior to the pandemic, the Minneapolis district didn’t believe Gaggle’s services were necessary, said Bill McCullough, the company’s vice president of sales. But when the virus closed buildings, “they wanted us to start the service as quickly as possible,” he said. After an initial six-month pilot, the district “realized that this service is extremely valuable and moved to a full contract this fall.”

“Any time you have a service turned on, you see pornography, you can see drugs and alcohol use or use being talked about,” he said. “You, of course, have anxiety, depression and suicide being talked about.”

Ben Feist, the chief programs officer at the American Civil Liberties Union of Minnesota, has urged the state to adopt student data privacy protections for years, due in part to surveillance concerns with companies like Gaggle. In an interview, he said the Minneapolis district’s partnership with Gaggle is “massively intrusive” at a time when students’ use of technology for school has reached “complete saturation.”

By terminating the police contract, district leaders have said they’re working to dismantle what they called a “white supremacist culture.” But Feist said that Gaggle could perpetuate racial disparities in student discipline. The Minneapolis district educates about 35,000 students, roughly 65 percent of whom are youth of color.

“There’s every reason to believe that the implementation of this type of surveillance is going to have a disproportionate impact on students of color and bring more people into a surveillance net that could have been avoided,” he said. “As far as I can tell, nobody has really thought this through, at least from any type of privacy lens. It’s hugely troubling.”

Searching for ‘sad kids’

Gaggle and similar student surveillance platforms have long marketed themselves as crucial to preventing school violence. After the 2018 mass school shooting in Parkland, Florida, for example, companies bombarded education leaders with sales pitches touting their wares as the key to preventing more carnage. In the pandemic era, Gaggle is marketing itself as a tool for mental health intervention.

“People are using our product to identify, largely, who the sad kids are in the school district,” said McCullough, who noted concerns that the pandemic has taken a toll on students’ emotional wellbeing and could lead to a spike in youth suicides. Such a trend has emerged in students’ emails and other digital communications, he said, with an uptick in student comments about depression, suicide and domestic abuse. “But thankfully kids are still talking about it and we’re able to go and identify those kids who are in crisis” to connect them with mental health services. McCullough declined to detail how his service has been used in Minneapolis, citing student privacy concerns.

Last school year, Gaggle monitored more than 4.5 million students’ online activities across the U.S., efforts it claims saved 927 lives, according to a company media release. In total, the company scanned 6.25 billion items within school accounts for content deemed harmful, including 64,000 references to suicide or self-harm, 38,000 references to violence toward others and 18,000 instances of nudity or sexual content.

School surveillance doesn’t stop when classes end for the day. Prior to the pandemic, about 40 percent of incidents occurred after school hours, according to company data; since March, that share has grown to 55 percent. While threats of violence decreased by 43 percent after the pandemic closed campuses, the platform observed an uptick in students sending each other nude selfies.

Several years ago, concerning material was most often found in student emails, McCullough said. But now, messages are most often flagged in Google Docs, which students have used as makeshift chat rooms. In this context, students are often “their most authentic self” and typically share documents “with just a few friends,” he said.

Minneapolis and other districts have also paid Gaggle to monitor student communications in the chat tool Google Hangouts, which has taken on a new role in education during the pandemic. With face-to-face interaction off the table, students are using Hangouts to collaborate on science projects and other assignments, McCullough said.

But critics argue that schools’ use of tools like Gaggle could discourage students from expressing themselves. Elizabeth Laird, senior fellow for student privacy at the Center for Democracy and Technology, questioned whether such surveillance runs counter to schools’ mission of providing supportive environments where students can speak freely and learn from mistakes.

“When people are surveilled in this way, it really limits that kind of free expression and can have a chilling effect on what they’re comfortable saying and doing,” she said. Laird also raised concerns about the accuracy of the algorithms, which often struggle to distinguish true threats from slang or humor. Such noisy data could flag some students unnecessarily while missing signs that are genuinely concerning, she said.
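Laird’s concern about noisy data is easy to make concrete with a toy evaluation of the naive filter sketched earlier. The messages and labels below are invented for illustration, not drawn from any real student data.

```python
import re

# Same illustrative trigger list as the earlier sketch.
TRIGGER_WORDS = {"bomb", "drunk", "gun", "kill me", "gay", "lesbian"}

def flag_message(text: str) -> list[str]:
    lowered = text.lower()
    return [term for term in TRIGGER_WORDS
            if re.search(r"\b" + re.escape(term) + r"\b", lowered)]

# (message, is it genuinely concerning?) -- hand-labeled toy examples
SAMPLES = [
    ("ugh, kill me, this quiz is brutal", False),       # hyperbolic slang
    ("i'm gay, finally told my parents", False),        # identity disclosure
    ("i've been cutting again and can't stop", True),   # self-harm signal
    ("thinking about bringing a gun to school", True),  # violent threat
]

for text, concerning in SAMPLES:
    flagged = bool(flag_message(text))
    if flagged and not concerning:
        verdict = "false positive"    # chills speech, burdens staff
    elif not flagged and concerning:
        verdict = "missed signal"     # the failure nobody sees
    else:
        verdict = "handled correctly"
    print(f"{verdict:18} {text!r}")
```

Half of these toy messages are mishandled, in opposite directions: the slang and the coming-out message are flagged, while the self-harm disclosure slips through because it never uses a listed word. That is the pattern Laird describes, over-flagging that chills expression alongside under-flagging that misses real risk.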

School interactions with police are also a concern. In a recent parent survey, the Center for Democracy and Technology found that while most parents support the use of education technology, they’re also concerned about protecting their children’s digital privacy from vulnerabilities like hacks. Of the 1,200 participants who completed the online survey in May and June, 55 percent of parents — and 61 percent of those who are Black — said they’re concerned that student data could be shared with the police.

Though Gaggle rarely contacts the police directly, McCullough said, the district hasn’t said how it’s responding to tips generated from the surveillance service.

The district completed a chaotic candidate search last month and hired 11 “public safety support specialists” to replace the school-based police. The district has refused to disclose the names and qualifications of the 11 people who filled the openings, but documents obtained by The 74 suggest that more than half bring experience in policing, security or corrections, bolstering critics’ fears that the district ended the police contract only to create an internal security force. According to an August school board agenda, the specialists’ training is supposed to encompass “school security 101,” de-escalating conflicts, dismantling the “school-to-prison pipeline” and Gaggle.

Pfefferkorn, the local activist, blasted the Minneapolis district for a lack of transparency in its student surveillance practices and demanded that officials answer hard questions.

“It’s an opportunity for the district to hold a meeting where they share” how they’re using Gaggle to monitor students, she said. “Although you’re in a contract, contracts have been broken.”
