Featured image from OpenPTrack, body-based cyberlearning tools, developed as part of an NSF grant “Promoting Learning Through Annotation of Embodiment (PLAE),” Dr. Noel Enyedy, co-principal investigator.
On December 11, 2018, the National Education Policy Center’s e-newsletter confronted the growing backlash against “personalized learning” in general and Mark Zuckerberg’s Summit Learning in particular. Unfortunately, instead of fundamentally opposing the hostile corporate takeover of schools through digital media and learning, the Center fell back on four data-friendly policy recommendations drawn from Dr. Noel Enyedy’s 2014 brief, which are summarized here:
1) technology investments should not overstep research
2) software developers, researchers, and teachers should partner to determine “what works”
3) dedicate resources to professional development, and
4) be open to new models of technology integration in classrooms
Recent developments around ed-tech and social impact investing make it clear that these recommendations could in fact lead to increased datafication of students, especially in light of the passage of Pay for Success legislation embedded in the Every Student Succeeds Act. This post is written as a response to Dr. Enyedy’s recommendations. It also provides additional context around student data privacy concerns and the importance of understanding opposition to online education as a global struggle.
“What Works” Research
Enyedy’s first two recommendations imply technology investments can be made based on research that provides “rigorous evidence of what works and what doesn’t.” It is significant that his brief predates the Every Student Succeeds Act, which mandates tech interventions based on that same concept. Passage of this legislation the following year opened the door to pay for success (PFS) financial investment schemes. Broad adoption of “personalized learning” is key to scaling the PFS model of privatization, because it normalizes digital instruction, a precondition for cheaply documenting “impacts” associated with outcomes-based contracts.
Narrowly defined “success” metrics are embedded in PFS government contracts to determine how much profit will be paid out to investors. The data collection then dictates how services are delivered and requires intrusive tracking and predictive analytics. Black and brown students enrolled in no-excuses charters and “turnaround” schools are disproportionately subject to this type of digital instruction. Student “success” must be captured as quantifiable data to evaluate PFS contracts, hence the rise of digital interfaces and data dashboards.
Teachers cannot possibly enter the amount of data required, especially given large class sizes, which is why they must become “guides on the side.” Screen interactions gather considerably more metadata, so they have taken precedence over face-to-face instruction. If you’ve wondered why students must take demoralizing pre-tests, the answer lies in PFS. The model only works when there is a baseline from which to assess “impact.” ESSA moved districts away from the big year-end test to a regimen of online testing all the time. Efficient, scalable, automated “solutions” are preferred, which is why PFS and ed-tech are a match made in heaven.
In a PFS world, researchers and deal evaluators play critical roles. They are the ones who set the standards for “what works” and determine whether interventions meet expectations. In 2017, three years after Enyedy’s brief, the University of Virginia’s Curry School of Education and Jefferson Accelerator, an ed-tech incubator, partnered with Digital Promise to host an invitation-only academic symposium. The “EdTech Efficacy Research Academic Symposium” concluded a year-long collaboration of “approximately 150 researchers, teachers, entrepreneurs, professors, administrators, and philanthropists” exploring “edtech efficacy.”
Their work advanced a very specific goal: to “develop, fund, pilot, procure, and implement” edtech according to efficacy research, which meshes perfectly with PFS. Among the initiative’s ten working groups were “Investors and Entrepreneurs” and “Education Philanthropies.” The symposium was underwritten by proponents of social impact investing, online learning, and competency-based education, including Gates and Zuckerberg. The discussions took place behind closed doors. Teachers were excluded. Parents were excluded. Community stakeholders were excluded. It was not meant to be an open discussion, but rather a tactical session to advance the primacy of tech and impact investing in educational spaces.
Unless we understand how edtech is intertwined with privatizing, speculative financial schemes, many of the policy changes taking place make little sense. Why have tablets replaced blocks and kitchen sets in kindergarten? Not because it is good for children, but rather because students must generate data to prove the “success” of online programs, justifying the redirection of public funds into private hands. The parameters for these investments have been largely established by University of Chicago economist Jim Heckman, who, with financial support from the Pritzker family, created an equation and toolkit promising a 7-13% return on investment in pre-K to 3rd grade interventions. Advances in cloud-based computing, deeply subsidized broadband installation, and a significant drop in the cost of devices have coalesced. Now, after years of planning, schools are rapidly being transformed into data factories, the role of the student evolving into that of unpaid digital laborer, raw material offered up for online intervention.
With bipartisan political support, PFS finance is now being applied not only to public education but to workforce development, housing, mental health, addiction treatment, and post-prison reentry. It has been embraced not only in the United States, but also in the United Kingdom, Canada, Australia, and international development aid circles. PFS is a tool of neoliberal austerity, one that attempts to mask privatization with data-driven, “evidence-based” platitudes. It uses research to advance a “what works” narrative, and that research is often underwritten by the same interests that stand to benefit from the findings.
Equal Partners in the Ecosystem?
Enyedy’s second recommendation presupposes a mutually beneficial partnership among teachers, software developers, and researchers is possible. But is it prudent to make that assumption? If you look at Global Silicon Valley’s 2012 report, “American Revolution 2.0: How Education Innovation is Going to Revitalize America and Transform the US Economy;” the materials associated with GSV’s partnership with Arizona State University going back to 2010; and the “Theory of Change” put forth by Ridge Lane LP’s education division, it is clear the deck is stacked in favor of privatization and global venture capital. The fact that a tremendous power imbalance exists cannot be overstated. With global edtech revenue topping $17 billion in 2017 and anticipated growth projected to be over $40 billion by 2022, it’s hard to imagine teachers entering into such a partnership on an equal footing.
What about the robots?
The third recommendation addresses the importance of professional development for educators in implementing ed-tech. This raises the question: why is so little funding being allocated for teacher training in ed-tech now? Could it be because venture capitalists don’t actually see a meaningful role for teachers over the long haul? If you take the forecasts of Knowledgeworks and the Global Education Futures Forum at face value, the ultimate goal of technology, telecommunication, and venture capital interests is to replace brick-and-mortar neighborhood schools with digitally-mediated learning ecosystems. Many elements are being piloted now: standards-aligned competencies, badges, digital vouchers, out-of-school-time learning credit, elimination of seat-time requirements, and career pathways linked to alternative credentials. As the Knowledgeworks white paper “Exploring the Future Education Workforce” notes, the ecosystem model leaves room for few career teachers. Those stable, unionized jobs with pensions and benefits will be replaced with “flexible” (i.e., precarious) employment options like competency trackers, data stewards, and pop-up reality producers.
Meanwhile, Fourth Industrial Revolution teaching is being digitally platformed and, where possible, automated. Besides growth in online course offerings in traditional schools as well as virtual ones, we have the VIPKid model, in which a globalized teaching workforce competes for apple ratings and gig-economy pay. Growing numbers of US teachers, unable to get by on their meager salaries, wake early to log a few hours tutoring kids in China. In some not-too-distant future, lean production could leverage instruction delivered by virtual pedagogical agents, much of that research being carried out at USC’s Institute for Creative Technologies with US Army support, or by humanoid robots like Bina48, which served as a co-lecturer at West Point this fall.
The Saudi sovereign wealth fund, looking to diversify its holdings, is now making significant investments in AI and robotics. It has teamed up with SoftBank, known for the Pepper robot. Pepper has an academic edition that delivers curriculum from a screen in its chest for $24,990. In Philadelphia classrooms, Milo the robot provides autistic support services. This is not some distant future; this is happening now. So what does professional development mean in this context? Teachers are on the front lines of the reskilling agenda, expected to model micro-credential and badge acquisition for students. Is it reasonable to expect them to engage willingly in ed-tech professional development if the end game for Zuckerberg and Gates is instruction delivered by Pepper or Jett?
Enyedy’s final recommendation suggests other forms of technology implementation be considered beyond traditional adaptive learning systems. For instance, when he published in 2014, an open source “Experience Application Programming Interface” (xAPI) was in early development for the Advanced Distributed Learning Initiative (ADL), a project of the US Department of Defense. xAPI advances ADL’s goal of scaling mobile personalized learning systems that incorporate emerging technologies, including gamification and simulations. It is meant to go beyond the limitations of laptop-based adaptive learning management systems permitted under SCORM (Sharable Content Object Reference Model), its predecessor. Here, too, Zuckerberg exerts his influence. iFest, an ADL-sponsored e-learning symposium held August 26-29, 2018 in Alexandria, VA, featured Bror Saxberg as keynote speaker. Dr. Saxberg, formerly an executive with K12 Inc. and Kaplan, currently serves as Vice President of Learning Science for the Chan Zuckerberg Initiative. His responsibilities include developing “good learning measurement practice at scale” for pre-K, K-16, and beyond, as well as new offline and online learning products.
Using xAPI, proprietary algorithms can track designated “learning” across platforms including wearable technology, iBeacons, immersive simulations, and phone apps. Thus, in an xAPI world, learning is reduced to a trackable actor, verb, and object statement that can be uploaded to a digital locker or learning record store. This tracking is not just for K-12 or P-20; it is tied to “lifelong learning” as workers are designated as skilled and re-skilled. Of course there will be an expectation that workers self-finance their reskilling, likely through for-profit credentialing companies using innovative data-capturing technologies. This will open up vast new markets for educational debt, likely presented as “micro-debt” or income-sharing arrangements so as to tap the poorest workers. It should come as no surprise that the Nellie Mae Foundation and the Lumina Foundation, both with origins tracing back to the student loan behemoth Sallie Mae (now operating as Navient), would be leaders in the campaign to deconstruct public education and turn it into a skilling-reskilling, badging, workforce-pathway, speculative human capital enterprise.
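To make the reduction concrete, the actor-verb-object shape of an xAPI statement can be sketched in a few lines of Python. The student name, email, and activity IDs below are invented for illustration; only the “completed” verb IRI comes from ADL’s published verb vocabulary.

```python
import json

# A minimal xAPI statement: who (actor) did what (verb) to what (object).
# All names and IDs here are hypothetical, except the verb IRI, which is
# drawn from ADL's standard verb vocabulary.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Student 4271",  # hypothetical learner
        "mbox": "mailto:student4271@example-district.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example-vendor.com/activities/fractions-module-3",
        "definition": {"name": {"en-US": "Fractions Module 3"}},
    },
}

# A learning record store ingests statements like this as JSON; any app,
# wearable, or simulation that emits the format can feed the same store.
payload = json.dumps(statement)
```

Because every platform emits the same statement shape, a learning record store can stitch a learner’s activity across devices and contexts into one longitudinal profile, which is precisely the interoperability described above.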
Researchers are working to connect xAPI with IMS Global, a leader in education data analytics and badging. IMS Global, a non-profit with close to 500 members in 22 countries, began in 1995 as a program of Educause with the goal of using technology to affordably scale “learning impact.” Educause has received over $87 million in funding from the Gates Foundation. IMS Global facilitates collaboration between educational institutions and edtech suppliers on the development of innovative products and digital strategies. When school administrators and teachers are presented with opportunities to try new technologies like “executive function enhancing” video games or Google Cardboard activities, they don’t connect them with xAPI or IMS Global. They don’t realize many of these technologies came out of Defense Department research and are paving the way for cloud-based education that puts data collection before human connection.
While tech initiatives like virtual field trips or work-based learning apps may sound intriguing, it is crucial to recognize that Internet of Things tracking captures not only cognitive performance data but also social-emotional and biometric data. Dr. Enyedy is surely aware of this, given his position as co-principal investigator on an NSF-funded study, “Promoting Learning through Annotation of Embodiment (PLAE),” awarded in 2015 as part of the agency’s Cyberlearning Exploration Project initiative. The study involved gathering and analyzing student motion and location data in “embodied play simulations” to assess improvement in understanding of science and math concepts. Research on augmented learning environments is advancing quickly. So why are we not yet engaging in robust public discussions of classroom surveillance and student profiling?
Those who see schools as data factories value the prospect of alternative technology deployment. Non-screen applications using wearable technologies actually yield more data points than Chromebooks or tablets. That data, particularly soft-skills data, is in high demand among employers. Emotion-sensing software and even brainwave-monitoring devices have been developed for classroom use; BrainCo, for example, was developed with support from Harvard and is now available in Chinese markets. It may sound far-fetched, but the data dust collected in makerspaces or on VR field trips that students take in middle school could one day influence the career pathways available to them.
Framing the Narrative
NEPC’s newsletter noted the lack of a clear definition of “personalized learning”; that vagueness is intentional. It is difficult to challenge a vague concept backed by carefully orchestrated branding. In 2010, the MacArthur Foundation hired the FrameWorks Institute to create a digital media and learning (DML) toolkit that lobbyists and hired “experts” could use to overcome “problematic” ways of thinking about educational technology on the part of the general public. The goal was to enact policies and laws mandating tech-mediated learning environments promoting 21st-century skills while breaking down the “classroom bubble” and “the caring teacher model,” which were identified as impediments to education reform. To gain clarity about what is happening with ed-tech, we must break free from the framing that industry, in partnership with venture philanthropy, has put in place. We must reject their narrative and develop a language that can speak to the profound crisis we face, not just here in the US, but around the world.
The protest against Summit Learning at the Secondary School for Journalism in Brooklyn, featured in the NEPC email, needs to be understood within the context of campaigns of resistance already underway. The Global South has been fighting colonial ICT education exports for a number of years. The Chan Zuckerberg Initiative is heavily invested not only in Summit Learning but also in Bridge International Academies, a private education company serving 500 schools in Africa and India. Heated opposition to the company’s scripted online curriculum and substandard facilities began in 2016 in Kenya and Uganda. In the spring of 2018, 174 signatories from 50 countries issued a formal letter of concern over the company’s policies, calling for divestment. Africa has been fighting back hard; we should look to their example.
Data Profiling for “Risk” and “Impact”
Artificial intelligence (AI), machine learning, and data science are being woven into instruction and school operations to predictively profile and even “threat score” students. Tech’s currency is data. Market logic dictates that data-mining must increase, because that is where the profit lies. Case in point: Clever’s single sign-on feature aggregates data across 300+ online learning programs. Are parents and teachers supposed to feel good if research, often funded by venture philanthropy, endorses a fraction of these programs as “evidence-based”? Should we tolerate 5-year-olds wearing oversize headsets that slip down over their eyes as they use QR-coded badges to facilitate the ongoing harvest of their data, as depicted in this video from Rocketship Academy charter schools? Should we accept high school students being put on Achieve3000 when they could be reading books of their choice that might open their eyes to new possibilities beyond the institutional framing offered by an adaptive system?
And how can we respond to the creation of “Datazone,” a massive online warehousing initiative that holds 300 metrics on 900,000+ students across 23 school districts in Silicon Valley? Datazone is connected to the Silicon Valley Regional Data Trust (SVRDT), funded by the Chan Zuckerberg Initiative, which links education data with data about foster care, court involvement, and health and human services. SVRDT is a founding member of the National Interoperability Collaborative, which has ties to the National Fusion Center Association and the National Council on Crime and Delinquency. SVRDT is being promoted as a national model to scale data interoperability.
The most vulnerable are being targeted for massive data exploitation. Few of us have a clear understanding of the vast quantities of data being collected in schools, where it is going, how it is being used now, and how it may be used in the future. This data-driven profiling will not benefit “at risk” children; rather, their lifelong data will be the fuel that runs the impact investing machine, like an inescapable treadmill.
The “what works” narrative being put forward is about “what works” for investors looking to privatize public services and demonstrate “impact.” Ed-tech is, at its core, a system of social control. Billionaires like Zuckerberg, Gates and Hastings hope we won’t notice in time. The online systems and dashboards they fund are designed to isolate and track us, making it harder to mobilize.
We must recognize that edtech in classrooms can never be neutral. Microsoft, Alphabet (Google), Amazon, Apple, and Facebook are reengineering how we are allowed to access information, how we are expected to relate to one another, how we are forced to interface with digital economic systems, and how we are evaluated for “risk.” Cloud-based computing has been inextricably woven into our lives. It will, admittedly, be difficult to disentangle from it. But Amazon Web Services’ close ties to state surveillance and the existence of Bluffdale make it clear that education for liberation cannot take place in the cloud.
Those of us who are determined not to submit to algorithmically assigned life pathways, for ourselves, for our children, and for our communities, must dedicate ourselves to defending public space and building offline communities where we can develop alternatives grounded in humanity and justice. We cannot swap Summit Learning for some slightly less predatory system. Unsurveilled knowledge sharing will survive only if we fight for it. Our first, most urgent obligation is to mobilize our popular, legal, and scholarly organizations in defense of the most vulnerable. We must not only ask questions, we must demand answers when institutions dip into the DML toolkit for talking points. We can make space to think otherwise.
What is the power dynamic at play?
Who is being monitored?
What is the profit motive?
What data are being collected?
What profiles are being built?
What is being automated?
Who stands with us in opposing data-mining and resisting the dehumanization of education and social services?
Let us unite here while reaching out to those struggling abroad. It is our work that can link Brooklyn’s Summit Learning student protestors to their counterparts in Kenya and Uganda resisting Bridge International. Our voices are in a position to change the course of history, and turn it away from their orchestrated narrative.