I published an opinion piece in the Globe and Mail. The link is here and I’m pasting the entire article below.
The pandemic has made us even more dependent on a highly invasive technological ecosystem
RONALD J. DEIBERT
SPECIAL TO THE GLOBE AND MAIL
UPDATED NOVEMBER 20, 2020
Ronald J. Deibert is director of the Citizen Lab at the University of Toronto’s Munk School, the 2020 CBC Massey lecturer and the author of Reset: Reclaiming the Internet for Civil Society.
My son is an undergraduate student at the University of British Columbia. Like many of his peers, he has seen his classes move online – and so have their exams.
Students in his program were recently required to consent to a remote exam invigilation software platform manufactured by a company called Proctorio. As with most tech companies, work-from-home measures and social isolation have been a boon to Proctorio: more than 2.5 million exams were proctored by the company in April, 2020, alone, a stunning 900-per-cent increase compared with April, 2019. Other companies in this space – such as ExamSoft, Examity and ProctorU – are enjoying similar surges in demand.
Once installed on a student’s device, applications like Proctorio can monitor students’ keystrokes, capture and record anything on their screens, track their web browsing, and even turn on cameras and microphones to record students’ faces, their surroundings and ambient sounds for evidence of cheating. Proctorio’s proprietary algorithms flag what the software detects as “suspicious behaviour” to faculty or teaching assistants (TAs) for follow-up.
My son said using Proctorio made him feel “creeped out” and uncomfortable. Who can blame him?
It’s one thing to have a TA strolling up and down the aisles of an exam room. It’s quite another to force students to install spyware that tracks everything from their keystrokes to retina movements, sending that data down a mysterious black hole. Imagine having an omniscient, invisible robot looking over your shoulder, staring into your eyeballs, scrutinizing every movement, and scanning your bedroom – the entire time you’re taking an exam. Who could concentrate under those conditions? And yet, he had no choice: The course made it mandatory.
As it turns out, my son is relatively fortunate. He is white, male, has a good WiFi connection, has no disabilities and lives alone. Many other students are not so fortunate, and pay a high price for it. As I dug deeper into Proctorio and other remote exam surveillance platforms, I unearthed a litany of horror stories – most of them affecting students who were already disadvantaged or marginalized.
For example, Black students and other students of colour have reported authentication delays and even outright rejection by remote proctoring applications because of “poor lighting” or “camera position.” Flaws in the software’s facial recognition systems – technology that is notoriously bad at recognizing dark skin tones – are the more likely explanation.
Students who experience facial tics have reported anxiety about being flagged for cheating, having spent the entire exam trying to suppress involuntary movements. Even those without disabilities have trouble with tools like Proctorio. On social media, students – some in tears – describe failing exams because the software flagged them for mouthing the questions or looking around their room while thinking, behaviour the system interprets as talking to someone offscreen.
Low-income students with shared living spaces and caregiving responsibilities recount similar feelings of anxiety, worried that they’ll be flagged because of nearby noise. One student spent the duration of her exam trying to tune out her 12- and eight-year-old siblings – for whom she is the primary caregiver – banging on the door.
Beyond these discriminatory effects are even more serious concerns. A system that spies on students’ homes and bedrooms will almost certainly amplify risks of stalking and sexual harassment. It would never be appropriate for a professor or TA to roam around a student’s bedroom, but Proctorio and the like invite them in … by design.
Digital proctoring platforms (as with so much else in our big tech world) seem to presume their users are mostly white, affluent people with high-speed internet, free from crowded houses or caregiving responsibilities. They assume they can protect students from abuse by making users subscribe to their terms and conditions – all assumptions that are proving highly questionable.
When it comes to digital technologies and COVID-19, the vast majority of discussion has focused on contact-tracing applications. Although important, this narrow focus has obscured more fundamental and far-reaching effects at the intersection of digital technology, surveillance and pandemic response. While we fixate on the merits of this or that app, we’ve missed an entire landscape shifting beneath our feet.
Largely without public debate – and absent any new safeguards – we’ve become even more dependent on a technological ecosystem that is notoriously insecure, poorly regulated, highly invasive and prone to serial abuse. It’s like building a second-floor addition to our homes without fixing the rotting foundation. Eventually, it will all come crumbling down – or slowly make us sick.
Consider Amazon. The company’s success amid COVID-19 becomes obvious just by staring out the window at the invasion of delivery vans perched on sidewalks and parked in bike lanes, delivering packages to makeshift home offices.
Once a startup reseller of books and DVDs, Amazon has become the corporate embodiment of globalization and surveillance capitalism. With shops and malls mostly shuttered, Amazon’s online services have exploded. Chief executive Jeff Bezos, the world’s richest person, saw his personal wealth grow by US$50-billion in the first six months of the pandemic alone.
But that sudden growth only exacerbates Amazon’s existing pathologies. They include the company’s predatory pricing, monopolistic practices, and dismal labour conditions for its warehouse and delivery personnel. These so-called “flex” workers lack health care or other benefits and are subject to extensive surveillance, including navigation software, wristbands, security and thermal cameras (likely as much for the prevention of union organizing as for “security”). The company is also responsible for the proliferation of Ring, a notorious home security system associated with racial profiling and warrantless data-sharing with law enforcement, as well as for its vast, wasteful and highly polluting “data farms” (which power streaming video services such as Prime and Netflix) – graded “F” for energy transparency by Greenpeace in 2017.
Meanwhile, tech startups of all shapes and sizes – often with dubious qualifications – are trying to capitalize on the growing demand for technology to spy on workers at home (dubbed “bossware”), bust unions, enforce productivity, monitor symptoms, police physical distancing, detect emotions and, yes, invigilate exams remotely. This bewildering array of new digital bracelets, electronic ankle monitors, immunity passport apps, fever-detection goggles, drones, thermal cameras, and video- and audio-capture systems is not subject to extensive auditing or built to protect privacy. Instead, many of these products are the digital equivalent of “snake oil,” looking to make quick cash in the rapidly expanding surveillance economy.
This explosion of pandemic-era applications will invariably amplify the defects of the mobile marketing and location tracking industry – a sector made up mostly of bottom-feeder companies whose business model relies on collecting billions of user-generated data points, later sold and repackaged to advertisers, law enforcement, the military, customs and border agencies, and private security services (not to mention bounty hunters and other dubious characters). A shocking number of entrepreneurs and policy makers are nonetheless turning to this cesspool of parasitic firms – poorly regulated and highly prone to abuses – as a proposed pandemic solution.
The entire ecosystem presents a bonanza for petty criminals, ransomware opportunists, spyware firms and highly sophisticated nation-state spies alike. Meanwhile, law enforcement and other state agencies – already growing accustomed to reaping a harvest of digital data with weak judicial oversight – will enjoy a bounty of new and revealing information about citizens without any new safeguards to prevent abuse of that power.
Some argue that this COVID-19-era innovation cycle will pass once there is a vaccine. But the more we embrace and habituate to these new applications, the deeper their tentacles reach into our everyday lives and the harder it will be to walk it all back. The “new normal” that will emerge after COVID-19 is not a one-off, bespoke contact-tracing app. Rather, it is a world that normalizes remote surveillance tools such as Proctorio, where private homes are transformed into ubiquitously monitored workplaces and where shady biometric startups and data analytics companies feed off the footloose biosurveillance economy.
There undoubtedly are positive aspects of digital technologies. But, as the ecosystem is presently constituted, those benefits are far outweighed by the many long-term negative consequences we are risking without serious public debate – consequences that will almost certainly come back to haunt us as this “new normal” settles in.
What’s the alternative? A different approach would treat the historic crisis of the pandemic as an opportunity for a reset – a chance to rethink the entire technological ecosystem from the ground up. For far too long, data surveillance companies have been largely insulated from government regulation. But this has come at a major social cost and with numerous unintended consequences. With digital technologies now a “lifeline” and an essential service for many who must adapt to both working and living at home, consumers and citizens have a right to demand more. What might those demands entail?
First, and most importantly, we need to clean up the cesspool of free-wheeling data broker, advertisement and location-tracking companies. New laws should be passed to give users real power and to restrain how tech companies gather, process and handle their personal information. This includes meaningful, easy-to-understand “opt-in” measures, rules that limit data collection to specific and justifiable purposes, and better avenues for users to sue companies that transgress those laws. In particular, legislation should permit consumers to restrict the use of geolocation data by third parties, prohibiting targeted advertising based on visits to therapists, clinics and other “none-of-your-business” locations.
Second, the rights of “flex” workers, independent contractors and other “gig economy” workers need meaningful legal protection. Big tech platforms and other businesses should not be able to use COVID-19 as an excuse to spy on workers in warehouses, factories, rental cars and homes, or to clandestinely monitor their social-media feeds to disrupt labour organizing. Big tech CEOs and their shareholders should also be compelled to use the newfound prosperity they are reaping thanks to COVID-19 to improve their employees’ lives. It is both grotesque and unethical that the Jeff Bezoses of the world can lap up skyrocketing personal wealth while their front-line workers experience layoffs, longer hours, fewer benefits, disproportionate health risks and dehumanizing surveillance measures.
Third, tech platforms should be legally required to open up their algorithms and other proprietary technology to outside scrutiny and public interest audits. The giants of surveillance capitalism hold enormous power over our lives, including over the choices we make, the products we purchase, the news we see and the people with whom we associate. These platforms have increasingly become essential to just about everything we do, and they should no longer be able to operate as “black boxes” whose inner workings are obscured to all but a select few. It’s time to pry open the lid of the technologies that surround us.
Lastly, there’s something all of us can do. We’ve all become habituated to seeking technical solutions for complex social and political issues. And while technologies can produce enormous benefits, we’ll need a lot more than a few new gadgets to solve the problems of our time. We must resist the temptation to reflexively look to “apps” and “platforms” when there may be other more traditional and ultimately more enriching ways to organize our lives, respond to social problems and accomplish our goals.
Unlike many other industrial sectors, the tech platforms have emerged from the pandemic stronger and are already positioning themselves as indispensable to public health. It’s time to hold them – and all of us – to account.