Schools and Covid-19
We’re part of a coalition asking data protection authorities, policymakers, edtech providers, and educators to take the following steps to protect children around the world.
An estimated 90% of the world’s student population is affected by school closures in the Covid-19 pandemic. And, in the absence of physical classrooms, education technology companies are stepping in to fill the gap. There are plenty of reasons to be excited about the potential of technology to provide support, but it’s important to consider the long-term implications of the technologies we choose, and the consequences for those families who don’t have access to them in the first place.
That’s why we’ve signed on to this letter to policymakers, data protection authorities, and providers worldwide regarding rapid technology adoption for educational aims.
For many families across the world, the technology their child’s school is using isn’t even a relevant concern: they may lack reliable access to the internet, or access to a home computer. Without serious steps to address this inequality of access, the Covid-19 crisis risks creating two classes of kids, those with technological access and those without, further deepening already existing inequalities.
In California, Google has decided to provide internet access and Chromebooks for free. That’s a generous offer, but it needs to be accompanied by significant transparency and safeguards. What does Google intend to do with the data it is able to collect from kids who have no other choice? It’s worth asking of any company seeking to take on a role like this: what do they get out of it?
Data is extremely valuable to Google in many ways, and there has already been much controversy about its practices, from amassing health data to a range of other data practices around the world. Google collects a significant amount of information about students using its education offering, including location data, the mobile network they’re using, and their phone number.
Inviting tech companies into the classroom without thinking about risks and corresponding mitigation strategies is problematic and needs to be treated extremely sensitively. According to Common Sense, an American non-profit that evaluates edtech tools, 80% of the applications and services it reviewed in 2019 did not meet its minimum level of responsible safeguards.
The rush to use Zoom has already proved problematic: schools in New York City are reportedly being banned from using it because of its privacy and security flaws. In Brazil, teachers have been repurposing technologies like Facebook to communicate with their students, potentially one of the few times Facebook has been welcomed into the classroom. In Uganda, telecommunications companies are stepping in to provide education, with both MTN and Airtel offering their own education portals.
Data protection law affords some safeguards, if implemented and enforced. However, not all countries have such laws, and many that do struggle to enforce them effectively. In Indonesia, for example, where schools have also been closed, Ruangguru, “Indonesia’s No. 1 Online Learning Solution”, has offered free access to its e-learning platform, but its app on Google Play includes 8 separate trackers and requests 34 permissions, 9 of which are marked ‘Dangerous’ or ‘Special’ according to Google’s protection levels. And that’s just one app among the many on offer.
The consequences can be significant: the tracking, profiling, data distribution, and commercial targeting that may well result from this rush could see data sold on or reused without consent.
From UCAS - the UK’s university admissions service - selling access to students’ data for marketing, to US schools sharing personal student data with for-profit edtech vendors, students’ data has already proved valuable.
This unprecedented global pandemic is accelerating the uptake and distribution of tech solutions - from surveillance, to education, to identity. But, what will happen once the crisis is over? Will schools stop using their new tools? What will the companies do with the data they’ve gathered, and the records they’ve created on every child who’s used their services? This shutdown won’t go on forever, and although it seems far off at the moment, it’s more important than ever to protect our children’s futures.
Everyone is tired and stressed, and in an ideal world schools wouldn’t have to be wary of all the edtech companies popping up to offer free trials - but that’s not the world we live in.
That’s why we’re part of the coalition asking data protection authorities, policymakers, edtech providers, and educators to take the following steps to protect children around the world.
Data protection authorities
- To co-operate globally to publish guidelines, monitor practice, and enforce compliance of e-learning platforms, children’s apps, and other edtech.
- To consider the impacts of the current use of e-learning, and to conduct and publish children’s rights, equality and data protection impact assessments.
Policymakers and educators
- To recommend and adopt only platforms and resources for schools that adhere to the obligations to respect, protect, and fulfil the rights of the child in the digital environment, and to UN General Comment No. 16 (2013) regarding the business sector’s impact on children’s and students’ rights.
- To publish any decisions about new national-level product or service adoptions, and commit to review practices and their impacts with civil society, including the most affected and marginalised communities, once the emergency situation has ended.
Edtech providers
- To not exploit students’ participation in compulsory education for commercial gain, in particular at this time when consent cannot be considered freely given.
- To adhere to best practice consistent with the rule of law, and with suitable safeguards for students’ security and privacy, including accessible and inclusive curriculum needs, encryption, and data protection by default and by design; avoiding profiling, dark patterns, or interference from opaque nudge techniques, and behavioural and emotional analytics.
- To be fully transparent about processing personal data, automated decision-making, and the sources and assumptions made in any training data used in tools that employ artificial intelligence.
- To procure and recommend resources where children can learn untouched by monitoring, profiling, data mining, marketing, or manipulation for commercial exploitation.