Focus on Pedagogy, Not Just Privacy

This is the fifth of ten questions presented as a Trans-Atlantic dialogue between me and UK blogger Privatising Schools. A condensed version that pulls together content from several of the responses for UK audiences can be read on the Local Schools Network website.

Privatising Schools: Question Five

Is fixing data security / privacy the answer?

The Cambridge Analytica scandal was a reminder that the business model of the big tech firms is based on extracting and exploiting our personal data. Here in the UK, Ben Williamson has made the link between Cambridge Analytica and the tracking, profiling, and ‘data mining’ of children which education technology makes possible (see here). One of the strengths of your work, I think, is your very clear view that the problems with tech-driven schooling – what we might call the platform model of education – go beyond issues of privacy and data security. It seems that, in this model, human relationships – between teachers and students, or amongst students – are far less important than in more traditional kinds of education. In an interview last year, you warned about the possible ‘feedback loops’ created by tech-driven schooling. Could you say more?

My Response

Does the technology used in your child’s public school empower students to share their own insights and creativity with the larger world, or does it transform them into consumers of corporate content through algorithmic profiling? If a program requires a child to log in and cannot function without access to that child’s previous interactions with it, I have serious reservations about it, even if the vendor promises the data is secure.

Some reformers envision a time within the next two decades when Artificial Intelligence (AI) learning assistants largely supplant human teachers. These futurists imagine AI “guides on the side” functioning as child minders, delivering “just-in-time” content for in-demand workforce placements. This dystopian vision includes students outfitted with biometric wearable devices that extract real-time data to guide the delivery of online content. BrainCo, a startup incubated at the Harvard Innovation Lab, has already developed a brain-wave monitoring device for classroom use and is now selling it in Chinese markets.

The information fed to students will, of course, be determined by profiles tracked on data dashboards, reinforcing the position each child is expected to occupy in society. God forbid a student stumbles over multiplication tables in third grade and is shunted onto the prison-labor track. In this model, AI, and the racially biased training data that often underpins it, becomes a de facto gatekeeper to knowledge. Information is restricted, made available only on a need-to-know basis. Your profile says you’ll never need physics. Your data says you have no talent for languages. The dashboard says you’re behaviorally non-compliant, which is too bad given your high intelligence. Education systems have always profiled students as a means of social control, but developments in Big Data, machine learning, and predictive analytics have the potential to make existing systems considerably more oppressive.
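To make that feedback loop concrete, here is a deliberately crude Python sketch. Every name, score, and threshold is invented for illustration; it depicts no real vendor’s system, only the general shape of the mechanism: a dashboard score gates what a student is even shown, and quiz results feed back into that same score.

```python
# A hypothetical sketch of the profile-driven feedback loop described above.
# All names, scores, and thresholds are invented; no real product is depicted.

GATE = 0.6  # arbitrary cutoff: subjects scoring below it are withheld

# "Aptitude" scores a dashboard might infer from past clicks and quizzes.
profile = {"math": 0.65, "physics": 0.65, "languages": 0.65}

def offered_subjects(profile):
    """Gatekeeping step: only 'promising' subjects are ever delivered."""
    return [subject for subject, score in profile.items() if score >= GATE]

def record_quiz(profile, subject, result):
    """Feedback step: blend the latest quiz result into the inferred score."""
    profile[subject] = 0.8 * profile[subject] + 0.2 * result

# One bad quiz on multiplication tables in third grade...
record_quiz(profile, "math", 0.2)   # math: 0.8*0.65 + 0.2*0.2 ≈ 0.56

# ...and math falls below the gate. The student stops receiving math
# content, stops generating math data, and the score never recovers.
print(offered_subjects(profile))    # ['physics', 'languages']
```

The point of the sketch is the closed loop: once a subject drops below the gate, the student stops producing data in it, so the inferred score can never climb back and the profile becomes self-fulfilling.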

The “personalized learning” model conditions students to view themselves as independent operators, free agents attempting to navigate a precarious gig economy alone. Screen-based isolation and an emphasis on data-driven metrics steadily erode children’s innate tendencies toward creative cooperation. Which is ultimately better for society: an algorithm that profiles each student in a classroom and delivers a pre-determined reading selection to be reviewed and quizzed on online, or a human teacher who selects a whole-class reading that sparks lively debate? The first scenario forecloses creative thought in service of data generation and reinforces the idea that there is but one correct answer. The second opens up chances for students to gain new insights while limiting opportunities for digital surveillance.

As a parent, I place my trust in teachers and want them to have the resources and support they need to really know the children in their care and guide them on their educational journeys. Learning is not a linear process but an organic one, with occasional doldrums followed by great leaps of understanding. A human teacher does not view their students as data points subject to precision engineering; they see them as contributing members of a classroom community, each with unique talents, strengths, and weaknesses. A good teacher deftly navigates the waters of collective learning, and their students are better prepared to face the world for having had the experience of co-created knowledge.

I am fortunate that my child attends a magnet school that hasn’t embraced online adaptive learning programs. The students use technology to write papers, create presentations, and coordinate group projects. They enjoy freedom in their learning and have a great deal of human contact, even though their class sizes are quite large. They still undertake projects, skits, and non-digital art, which cannot be taken for granted these days. I am glad for it, but it saddens me immensely that children in schools with low test scores, children who have been subjected to the “turnaround” process, are often compelled to use online learning systems for in-school and at-home instruction. It is a travesty that the children with the greatest need don’t have access to non-surveilled education as my child does. The children who most require personal connection are denied that right; data-driven “fixes” are substituted for human care instead.

While student data privacy is important, I caution activists that we don’t want to “win” on privacy only to end up with “secure” algorithmic learning. I’ve been led down the wrong path before. We thought opt-out would be an effective tool against privatization, but instead it ended up reinforcing reform arguments for all-the-time testing. We did not realize that until it was too late. I worry that student data privacy may be leading us down a similar road. We must not be fooled twice. Many reformers are eager to talk about privacy and the appropriate use of data. Let us not allow them to frame the discussion in this way. We must ground ourselves in the importance of good pedagogy, one that respects the humanity and personal agency of both student and teacher. Privacy concerns can play a role in educating the public, but to win this war we have to root our strategy in the rights of parents and teachers to unplug students from adaptive learning systems altogether, not simply to secure the data in those systems of oppression.

Part One: Talking Across the Pond Here

Part Two: Virtual Reality and Globalized Workforce Here

Part Three: “Personalized Learning” Driven By Data Here

Part Four: We Haven’t Won, We’re Testing All The Time Here