A Pedagogy of Facial Recognition: Access, Privacy, and Practice at DPL 2018

Apr 19, 2018

From July 31 to August 3, 2018, Digital Pedagogy Lab will offer a new course, Access, Privacy, and Practice, collaboratively taught by Bill Fitzgerald and Chris Gilliard. The course addresses pedagogical practices geared toward building an ethical framework for digital pedagogy. Participants will focus on building ethical classrooms committed to student agency and privacy by addressing issues such as race, ethnicity, gender identity, and sexual orientation.

In a recent Inside Higher Ed story, Mark Lieberman featured a professor who plans to use facial recognition and sentiment analysis technology in his classroom to help instructors tailor their teaching to levels of student interest and to address areas of concern, confusion, and apathy among students: “If most students drift into negative emotions midway through the session, an instructor could enliven that section with an active assignment.”

Sounds expensive, right? You might wonder: how would a professor afford this kind of tech? Well, Microsoft, in its benevolence, is willing to loan the professor “the facial recognition and emotion interfaces, free of charge (emphasis mine), as well as the PowerBI software that provides a platform for those applications.”

For me, this article brings up a host of issues, some of which I explored briefly on Twitter.

There’s a lot to unpack in the article, but I’ll note just a few things:

The article doesn’t discuss who will profit. Microsoft likely will, as might the professor, and perhaps even his team. The students who will be the subjects of the testing, the source of the training data who make these experiments possible, certainly won’t see any profits. The article also doesn’t ask whether students will be able to opt out, whether there are any third-party restrictions, or how this data might be combined with other data sets to make assumptions about who students are and what they are capable of. All we get is an article cheerfully reporting that the school is “still deciding” how much info to provide the students. So much for “informed consent.”

Nor is there any acknowledgement of the current context: national discussions about what it means to be safe in school. It’s a very short leap from “let’s use facial recognition and sentiment analysis to see who is paying attention” to “let’s use facial recognition and sentiment analysis to see which students are threats.” Given what we know about facial recognition being notoriously less accurate on people of color, and that Black girls are disproportionately disciplined in schools, there are many ways technology like this might be deployed that go beyond what one careless story imagines as a good way to automate teaching.

What does this have to do with Access, Privacy, and Practice at Digital Pedagogy Lab 2018? These are the types of cases and questions we will explore, with an eye towards creating pedagogical spaces and practices, especially in regard to digital technologies, that better model fairness, consent, and equity.

Register for Access, Privacy, and Practice

Bill Fitzgerald

With Chris Gilliard, Digital Pedagogy Lab

Bill Fitzgerald is a technologist and an advocate for privacy and open education.

Great! You've successfully subscribed.
Great! Next, complete checkout for full access.
Welcome back! You've successfully signed in.
Success! Your account is fully activated, you now have access to all content.