FETC '19: 'Alexa, tell me what you know about FERPA'
Personal assistant devices raise a number of student privacy questions, but careful planning can make them valuable classroom tools.
Personal assistant devices including Google Home and Amazon Echo are proliferating rapidly, with an estimated 20% of U.S. adults (about 47.3 million) now owning one. With that spread, it was only a matter of time until they found their way into classrooms.
Artificial intelligence itself is no stranger to schools. Platforms such as TurnItIn are already used to grade students' work, detect plagiarism or provide other guidance. But these programs are all used under district licenses with student privacy safeguards in place, unlike consumer-focused personal assistant devices.
During a Sunday morning session at the Future of Education Technology Conference (FETC) in Orlando, two Massachusetts educators detailed the instructional and ethical challenges these devices present in the classroom, as well as their potential best uses.
The legal concerns
In September, the FBI issued a PSA to warn parents about cybersecurity issues surrounding the increased use of tech and student data collection in K-12 classrooms. The agency advised parents to be aware and ask questions. But despite the massive number of people who have these devices at home, many don’t know they are constantly being recorded. And the recent hacking of a California family's Google Nest device, which resulted in an intercontinental ballistic missile warning scare for the victims, demonstrates just how little security these devices can have.
Once these tools enter the classroom, schools and districts are responsible for adhering to the Children's Internet Protection Act, the Children's Online Privacy Protection Act, and the Family Educational Rights and Privacy Act (FERPA) in collecting and using student data.
Rayna Freedman, a 5th-grade teacher at Jordan/Jackson Elementary School in Mansfield and president of MassCUE, the Massachusetts chapter of CUE (formerly Computer-Using Educators), stressed that administrators must explain these regulations to staff, along with the importance of keeping students' personally identifiable information secure. Educators must also tell parents about all devices being used and get their approval.
Under FERPA, for example, schools are designated authorities and generally cannot disclose parents' or students' personally identifiable information from education records to a third party without consent. When teachers use personal assistant devices in classrooms, they act as those designated authorities, and because this technology is always listening and recording, they're potentially disclosing that personal information.
School and district tech directors ultimately have to realize, however, that these devices aren’t just coming to the classroom — they're appearing there already. And without understanding the issues they could cause, acquiring these machines can bring risks to both students and the teachers themselves.
What Google, Amazon and Apple are doing with the information gathered is a big question, since these devices aren't currently used in the classroom under a district or school license, as a tool from a standard vendor would be. An FETC attendee from Arizona noted this can create issues in the event of a public records request, as the device in use may need to be confiscated for any ensuing investigation.
Freedman said she has met with parents concerned about the personal assistant device in her classroom, sitting down with them to look at research and talk it through. One mother asked for a single good reason why her child should use this technology. Freedman had her imagine her daughter in a law office one day, at a desk without a keyboard: Who is teaching her to effectively use the voice-activated tech that has replaced that keyboard to her advantage?
Professional development on student privacy is key, added Eileen Belastock, the director of academic technology for the Mount Greylock Regional School District. During a professional development session she conducted, 90% of participants said that as long as a student is using an educational app, their data is protected. But this isn’t exactly true, highlighting the need for that training.
Understanding terms of service is also essential. Does Amazon or Google, for example, have different terms of service for the educational use of a device? If a vendor is vague about how it mines data, that can become a coaching or teaching moment. Being transparent with educators about these terms also tends to earn more positive reactions from them.
Google’s G-Suite terms of service, for instance, don’t cover Google Home. Noting those differences is key.
That's not to say you can't make a product that isn't designed for the classroom compliant for educational use, but educators have to be very specific about those uses and other components. Tech administrators can also avoid headaches by convincing superintendents and school boards of the urgency of having an “approved” list of apps and tech for classroom use.
A use case for personal assistant devices in the classroom
Beyond privacy concerns, these devices face another major hurdle: They're often used not as strong instructional tools, but to play games as an amusement in the last 10 minutes of class.
As a tech director, Belastock said, she often finds these uses aren't tied to curriculum or student learning experiences. On this front, she describes Freedman as a "model teacher."
When you search "personal assistant devices in classrooms" on Google, you get 31 million results. There are 184 million for "how to use personal assistant devices in classrooms." Many of these results, Belastock said, are more about playing a game or checking the weather — things where there are probably plenty of other options.
When Freedman began using Google Home in her classroom, the district had an acceptable use policy for tech that still talked about MySpace and other platforms that have since faded. She and a colleague worked to replace it with an improved responsible use policy (RUP).
A RUP improves on an acceptable use policy because it is a set of regulations developed by teachers and administrators and adopted by the whole school community. It involves all stakeholders and is a living document that evolves as necessary.
The path to a RUP included passing through a negotiating committee, realizing there were "too many cooks in the kitchen," and creating a knowledgeable subgroup.
"Our tech director wasn’t even involved in the process," Freedman said.
They brought a draft back to the negotiating committee, incorporated revisions, and gathered feedback from students, then from staff and coaches. The policy was ratified by the union and presented for school committee approval, and a facilitator ran a summer institute for professional development.
But educators must also develop a student-friendly RUP. Freedman's district, Mansfield Public Schools, created K-7 and 8-12 versions of the RUP that educators discuss with students before using any tech with them.
Even with the RUP, struggles remain when it comes to the privacy laws, social media use, third-party tech debates, changes to past practices, questions of who owns the work, digital citizenship, parent permission and the policy rollout process.
“I teach, so I can’t just come out and do everything they want me to do,” Freedman said of her role.
'10-year-olds can do this'
But all that said, how can you hit a home run with Google Home in the classroom?
Freedman first met with her principal about why she wanted to bring the device into the classroom. She saw students using the built-in Google assistant on their Chromebooks instead of typing things out. They found they were able to do things more quickly that way.
When it's available for use, the Google Home saves students time when they have questions: answers come back in 30 seconds or less, as opposed to as much as 10 minutes otherwise.
But bringing new tech into the classroom doesn’t mean immediately using it — you have to think about what you’re going to do, why, and how it impacts students. Freedman developed a slideshow for her class to go over how the device could be used and what it was there for, and parents contributed their thoughts to a Padlet digital bulletin board.
Freedman went over the district's K-7 RUP with students, asking them to go through and define how the personal assistant would be used, developing a classroom-specific RUP for Google Home.
“Sometimes I have to remind myself that my kids are 10 and that 10-year-olds can do this,” Freedman said.
One rule in the RUP: When the device isn't in use, it must be unplugged so it’s no longer listening. Turning it “off” is not enough because "it’s always listening," Belastock said. This becomes a privacy lesson for students.
This is a lesson even some parents learn from the process, the duo said, as many have never known their Amazon Echo or Google Home, or even Siri, was recording them all day.
Freedman must also know why the students are using Google Home before they use it. To get them acclimated, she had every student come up with their own question and practice speaking to the device as a mini-lesson.
Beyond being a tool for quick answers, the device prompted discussions with students about what they're agreeing to when they click "OK" on an app's terms of service without reading it first. It also removed murkiness from classroom discussions of sensitive topics such as lynching, which would surface troubling images and videos if students searched for them on a computer. Google Home instead provides students a one-sentence answer with a source, leading into further class discussion without the distraction of that other media.
Despite the challenges, careful planning with these tools can indeed reap valuable rewards.
Follow Roger Riddell on Twitter