The paper develops a threat model that describes the actors, incentives, and risks in online education. Our model is informed by our survey of 105 educators and 10 administrators who identified their expectations and concerns. We use the model to conduct a privacy and security analysis of 23 popular platforms using a combination of sociological analyses of privacy policies and 129 state laws… alongside a technical assessment of platform software.
As the researchers point out, the well-established norms that protect the privacy of students and teachers in the traditional educational context are absent from the new virtual classrooms. There, privacy protections are determined largely by the privacy policies of the companies that make the software – and, as is usually the case, few people bother to read those policies, let alone understand their details. As a result, practices that would be completely unacceptable in the physical classroom – surreptitiously recording students and teachers, say, or logging data about their studies – may be happening as a matter of course, without anyone being aware of that fact. Collecting and storing large quantities of personal data about every student is so easy that it often happens by default. Analyzing the privacy policies, the researchers found that 41% permitted a platform to share data with advertisers – a practice that conflicts with at least 21 US state laws – while 23% allowed a platform to share location data. Neither would be acceptable in most educational contexts, which underlines how poorly privacy is protected in virtual classrooms built on those platforms.
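The researchers' approach of checking policy permissions against legal prohibitions can be sketched as a simple cross-check. Everything in this example – the platform names, policy flags, and state-law entries – is hypothetical illustration, not data from the study:

```python
# Sketch: cross-checking what each platform's privacy policy permits against
# practices that particular state laws prohibit. All names and entries below
# are made up for illustration; they are not the study's actual data.

# What each (hypothetical) platform's policy permits.
policies = {
    "PlatformA": {"share_with_advertisers", "share_location"},
    "PlatformB": {"share_with_advertisers"},
    "PlatformC": set(),
}

# (state, practice) pairs where the practice is legally prohibited.
prohibitions = {
    ("State1", "share_with_advertisers"),
    ("State2", "share_location"),
}

def find_conflicts(policies, prohibitions):
    """Return sorted (platform, state, practice) triples where a
    policy-permitted practice collides with a legal prohibition."""
    return sorted(
        (platform, state, practice)
        for platform, permitted in policies.items()
        for (state, practice) in prohibitions
        if practice in permitted
    )

for conflict in find_conflicts(policies, prohibitions):
    print(conflict)
```

The real analysis is of course far harder – it involves interpreting legal text rather than matching flags – but the structure of the comparison is the same.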
On the plus side, the research also revealed the importance in the US of Data Protection Addenda (DPAs). These are side agreements where universities exploit their size to negotiate extra privacy protection for users of a platform. Recognizing this need, many providers of virtual classroom software offer templates that can be used as the basis of these agreements. The widespread use of DPAs is a reminder that it is often an option for organizations to negotiate stronger data protection for users – there is no requirement to accept low default levels of privacy.
Another serious issue with the rapid, almost desperate implementation of online teaching systems is that security tends to be overlooked in the rush to get something working fast. The researchers looked at binary security, known vulnerabilities, and bug bounties, drawing on the Common Vulnerabilities and Exposures (CVE) system as well as the National Vulnerability Database and its impact and exploitability scores. The platform most widely used among participants in the researchers’ survey, Zoom, was found to have a number of problems:
Zoom has many recent CVEs (11). While intense recent attention is no doubt a contributing factor, the substantial number of recent vulnerabilities suggests a systemic component to Zoom’s security issues. Further, as our evaluation postdated Zoom’s efforts to remediate the aforementioned security issues, our results likely understate recent problems with the software. On the other hand, Zoom’s rapid improvement in both software and process (that represented a response to unfavorable media coverage) points to a positive trajectory for Zoom.
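The impact and exploitability scores mentioned above come from CVSS, and the National Vulnerability Database publishes each CVE's base score alongside a qualitative severity rating. The bands below are the standard CVSS v3 ones; the example scores are invented for illustration and are not Zoom's actual CVE scores:

```python
# Sketch: bucketing CVSS v3 base scores into the standard qualitative
# severity bands used by the NVD (per the CVSS v3 specification).
# The sample scores at the bottom are hypothetical.

def cvss_severity(score: float) -> str:
    """Map a CVSS v3 base score (0.0-10.0) to its qualitative rating."""
    if not 0.0 <= score <= 10.0:
        raise ValueError(f"CVSS base score out of range: {score}")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

# Hypothetical batch of base scores for a platform's recent CVEs.
scores = [9.8, 7.5, 5.3, 3.1]
print([cvss_severity(s) for s in scores])  # ['Critical', 'High', 'Medium', 'Low']
```

A long tail of "Medium" findings and a short burst of "Critical" ones tell very different stories, which is why the researchers looked at the scores rather than just counting CVEs.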
The research concludes with some recommendations on ways to protect the privacy of students and teachers, including the following:
universities should not spend all their resources on a complex vetting process before licensing software. That path leads to significant usability problems for end users, without addressing the security and privacy concerns. Instead, universities should recognize that significant user issues tend to surface only after educators and students have used the platforms and create processes to collect those issues and have the software developers rapidly fix the problems.
Although that might seem surprising, it’s a pragmatic response to the difficult situation created by Covid-19. It simply isn’t possible to conduct a leisurely process of exploring every aspect of multiple platforms, before finally making a decision. Instead, the researchers are suggesting, it is more important to plan for problems after a platform has been installed, and students and educators start using it in earnest. The most important thing is to put in place a system to collect those issues in a systematic way so that they can be addressed.
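Such a system need not be elaborate: what matters is that reports are captured and that privacy and security problems surface ahead of everything else. Here is a minimal sketch of such an intake queue; the class names, priority levels, and example reports are all hypothetical:

```python
# Sketch: a minimal issue-intake queue of the kind the researchers recommend,
# so that problems reported by educators and students are collected
# systematically and the most serious ones are routed to the vendor first.
# The priority scheme and example reports are hypothetical.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class PlatformIssue:
    priority: int                        # 1 = privacy/security, 2 = usability, 3 = other
    description: str = field(compare=False)

class IssueQueue:
    """Collects reported issues and hands them back highest priority first."""
    def __init__(self):
        self._heap = []

    def report(self, priority: int, description: str) -> None:
        heapq.heappush(self._heap, PlatformIssue(priority, description))

    def next_issue(self) -> PlatformIssue:
        return heapq.heappop(self._heap)

# Usage: a privacy problem jumps the queue ahead of a usability complaint.
q = IssueQueue()
q.report(2, "Breakout rooms drop participants")
q.report(1, "Sessions recorded without a consent prompt")
print(q.next_issue().description)  # the priority-1 privacy issue comes out first
```

In practice this would sit behind a reporting form and feed a shared tracker, but the priority-queue shape is the core of the idea.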
It’s an idea that can be applied more widely. The pandemic has required many rapid changes to how life is conducted, whether in an educational context, in business, or at home. Most people just want something that lets them get on with their lives, at least to the greatest extent possible, which means they are likely to choose solutions quickly, and without the kind of research they would carry out in “normal” times. Software companies benefitting from that rapid uptake would do well to see the problems that will inevitably arise as an opportunity to spot and fix bugs quickly. If they don’t, they are likely to face the kind of public criticism Zoom has experienced for precisely this reason.
Featured image by Danslafrique.