Learning About Learners: Protecting Children's Data

This week we speak to Jen Persson, Director of Defend Digital Me, about the technologies being deployed in schools in England and Wales. 


Transcript

00:07.64
Caitlin
Welcome to The Technology Pill, a podcast that looks at how technology is reshaping our lives every day and exploring the different ways that governments and companies use tech to increase their power. My name's not Gus Hosein, I'm Caitlin and I'm PI's Campaigns Coordinator. Hi.

00:25.18
Caitlin
This week, we're talking about education technology. If you've listened to the podcast for a long time, you'll know this is the topic I am low-key obsessed with and bring up whenever I have the opportunity. Luckily for me, and I guess luckily for him as well, Gus is on holiday this week. So I'm going to get to talk to Jen Persson, who is the director of Defend Digital Me, about the unique environment in England specifically: how children's data gets extracted, gets used and gets reshared

00:53.63
Caitlin
in the UK educational environment, both by the Department for Education, often called the DfE by all the cool UK kids using the cool UK slang, but also by edtech companies that are coming into these really sensitive and interesting spaces to try and make a load of money.

01:12.77
Caitlin
Let's just jump right into this conversation because I'm really excited to have it.

01:28.89
Jen Persson
Hello, thank you for having me. So I'm Jen Persson, director of Defend Digital Me. Defend Digital Me was set up a decade ago in the UK to look at children's and learners' education data. So we campaign for safe, fair and transparent data across the education sector in England in particular.

01:47.48
Jen
But we now say "and beyond" as well, because education data is increasingly used as a go-to source, whether by governments and departments or edtech or other commercial types of users, to source information about children, or indeed adults from when they were children.

02:05.17
Jen
We're really looking at how that information can be used across the education sector and also the rest of the public sector. And we started out as a single-issue group, looking at how the UK government used national pupil records, that is, all the information that's collected about children and learners when they start school, which is kept on a rolling basis and updated until about age 25, and then retained forever about the course of their education, and how that database was being used without full transparency, without users and learners really knowing what and where

02:42.93
Jen
it was being collected, or what it was used for and the sorts of repurposing it was going to. And we're still looking at getting improvements in how those data are cared for, looked after and managed, and, importantly, trying to get changes in what people are told about that data and to give them controls over its use through an opt-in mechanism.

03:06.35
Caitlin
What caught your eye, I guess, about the National Pupil Database in the UK that led to Defend Digital Me?

Jen
So, importantly, there were lots of other people working across civil society with whom I was not familiar until groups and individuals got together. And there was a group called medConfidential that campaigned around the collection of GP medical records about all of us across the NHS.

03:35.69
Jen
And they did incredible work to raise awareness in people like me who knew nothing about how the state would collect information and personal data about all of us. And so that raised my awareness of how that kind of information was being used, particularly in the health sector.

03:50.30
Jen
And then it drew my attention to how it was being used in other sectors. I was, and still am, a mother of three children, who were then in school, and I really had no idea about what was collected and where it went. And it surprised and shocked me, and still continues to, what level of sensitive and very intrusive data is collected at an individual level about children.

04:15.18
Jen
And really trying to get that information out to parents and to families and to everyone really, because that data is kept forever. It affects people in England in particular up to about the age of 48.

04:28.97
Jen
Anyone who's been in state education has a record, and it's been given away to all sorts of third parties. And what particularly shocked me at that time, around 2012, 2013, was that it was being given to journalists, to charities, think tanks, commercial companies, including a company that, for example,

04:45.63
Jen
was given identifiable pupil-level information to create heat maps, used by estate agents, mapping where children came from and where they went to school.

04:56.25
Jen
So there are all sorts of reasons this data is collected for your everyday direct education, but I didn't realise it even left the school on a named basis at all. So I think the surprise of that really caught me out, and I wanted to make the level of understanding that I was starting to get available to other people.

05:14.20
Jen
And so we started with pupil data and then really worked through how the rest of that information was being sent across the rest of the education sector, and then out into the public sector and to other places.

05:26.67
Jen
We started to also look at other areas, including, for example, commercial uses of data. So EdTech, that's the technology that children use in the course of an education, or the administrative systems that schools are using about children.

05:41.99
Jen
And also then looking at the sort of policing and Home Office-type reuses of the same sorts of data: data that goes into education but comes out and goes to other bodies.

05:53.16
Caitlin
Has a lot changed in the last decade? Or is it times change, but broadly things stay the same?

Jen
It's a great question. I think there's a really good reference point, which actually goes back 25 years, which is the Database State report that was published by leading academics.

06:10.94
Jen
And that tackles a number of databases that the UK government was in the process of starting, or had established, 25 years ago.

06:22.58
Jen
And this primarily included population-wide data, or data about sectors of the population. And what I think has changed is the scale of those: databases that were about a population at a point in time have now been kept forever and are 25 years old. And if those records go back any length of time, they are really, really long, what they call longitudinal records.

06:50.76
Jen
And the use cases of the data have grown. And as people say, if you build it, they will come. Many of those databases were built initially without the safeguards and protective oversight that should have been in place, given the sensitivity of much of the data.

07:09.75
Jen
And also what's changed perhaps is that some of the infrastructure and the technological capabilities that existed 25 years ago have really been expanded on, as well as the malicious types of uses they can be put to, because we're now all far more connected. With the internet and online activity, data now don't just sit in a database but can be shared at speed and scale with a couple of clicks of a button, which means the initial risks that I think they scoped back then have really materialised. And a lot of those uses we now see are ones civil society warned of. For example, back in 2002, the warning from civil society was:

07:53.34
Jen
If you start collecting named pupil data, what could be done with it? And government ministers, of course, said at the time, we're not interested in using this at named level. We only want to be able to use it to join up records.

08:03.87
Jen
And that will give us better research data. And we'll then use that research data to better the system and improve the lives and the delivery of education for everybody in the system. Fast forward 10 years, and a different government started using it for immigration enforcement: not to improve the lives of individuals, but actually to chase down families or people that they felt were in breach of migration and immigration laws.

08:31.56
Jen
And that was something that was certainly never discussed at the time the data started to be collected. Last year, the same data set started to be used for welfare benefit fraud detection.

08:42.62
Jen
And again, there might be worthy reasons for that sort of pursuit of fraud. But I think the question is, where are both the legal and the moral duties to examine why that data was collected in the first place?

08:58.74
Jen
And if it's collected about children who go to school in order to fulfil their right to education, it's really important that those data are kept safe for those purposes. And particularly right now, where governments are concerned, in the wake of COVID, that more and more children are perhaps being home schooled or out of school, that bond of trust between a family and the school is really important to maintain. And so for those families, or for those with undocumented migration status, you pose the risk that attending a school is no longer seen as a benefit to the child, but potentially a risk to the family.

09:37.16
Jen
And I think you jeopardise the very purpose for which the information was collected in the first place, which is to fulfil the right of the child to education. And we seem to have lost sight of that. And I think, increasingly, what has changed most is simply that the thing people warned of 25 years ago and 10 years ago, every time new databases and new things get expanded, has proven true, which is...

10:00.04
Jen
If there is potential for scope creep, it will happen. And if we are really, genuinely looking at better, good use of safe data, we need to be focused more on data minimisation, which seems to have been completely abandoned by now in the big push for more data about everything, about everywhere, all of the time, in order to build artificial intelligence and AI technology.

10:32.86
Caitlin
Broadly, if you're a child going to school, what data is being collected? And maybe it makes sense to chunk it. So like before you start school, what data has already been collected about you by the state?

10:44.41
Jen
So in England, it's very different from even Wales and Scotland or Northern Ireland in the UK, because education is a devolved matter. That means the laws around education are different, and when a child has to start compulsory school, and when they can leave, for example, happens at different ages.

11:00.80
Jen
But if we were to focus only on England, there are particular families and children who will have a greater contact rate with the state than others.

11:13.15
Jen
And so, for example, where children are considered to face safeguarding risks or threats, there is a lot of data collected already about a child, even potentially pre-birth.

11:29.07
Jen
And those data sets, like the children in need data, are collected by local authorities for the very direct purposes of the care of a child. And that data, in addition, already goes to the Department for Education on an individual basis.

11:45.07
Jen
So it can start very, very young in some circumstances. In normal circumstances, it starts with the early years and preschool, so from roughly two to five, for children that attend any state-funded place. So where there is public taxpayer money going into institutions and places for preschool, those places, and information about the child that fills them, will already be collected.

12:12.62
Jen
So you can already have a pretty rich, detailed record of a child, even from age two in nursery. And then age five is really where, in education, it starts for everybody, because that's the age at which children, if they are going to go into state education rather than choose elective home education, for example, must be in state education, in the term of their fifth birthday.

12:36.36
Jen
And so that then really starts off a whole set of statutory tests and collection around assessment, about a child's progress. So you've got the baseline test, done as soon as a child enters the school, with assessments around motor skills and behaviour and initial reading and responses to models and patterns, those sorts of things. That's done now on an iPad, for example, by staff. And those test scores are already collected at age five.

13:09.66
Jen
And then you've got the progressive tests that go through different levels of assessment and progression at age eight, nine, 10 and 11, which are called the SATs, the key stage tests, in the UK.

13:22.49
Jen
And then, as children go into secondary school, of course, there's all the statutory testing around exams. So GCSEs, A-levels, BTECs, any of the other types of exams that are offered to young learners around the age of 16

13:37.74
Jen
and 18, before going to university. So that's the kind of assessment and progression of education that you might expect to be collected. What always surprises me is that it's at pupil level, on your named record.

13:51.43
Jen
So your score in the multiplication times tables test at age eight is collected and kept. And what's sort of concerning is there are also all the reasons for not taking the test. And that's the kind of data where you start to think, this isn't necessarily only going to be used for education purposes. If it says "newly arrived to the system", i.e. you're likely to be of an uncertain migration status, those sorts of things start coming in, where it isn't any more just about your learning and progress, but starts to be about the characteristics of the child.

14:24.09
Jen
And for every child that has a state education place, there is a census taken every term. So that means three times a year. And the schools don't ask the parents for the information; it goes directly from the schools, through local authorities or directly to the Department for Education, depending on which type of school you're in.

14:45.54
Jen
And that census collects things like your full name, date of birth, home address, those sorts of contact details, the geolocation of your property, and then goes through things like your characteristics. So ethnicity; special educational needs, so types of hearing impairments and visual impairments; anything to do with mental and physical health. There's a whole range of factors recorded there in quite a lot of detail. And then it goes through to things like whether you're in a military family, for example, or adopted from care, or, again, going back to those children in need flags.

15:22.39
Jen
So there are quite sensitive characteristics about the individual that also allow information about a family to be inferred. And then it touches things like indicators of living in poverty or socioeconomic disadvantage, what we call the free school meals indicator.

15:38.54
Jen
And we know about a quarter of children receive that flag at some point in their school lifetime at the moment in the UK, because one in four children is living in poverty.

15:50.43
Jen
And so we've got a mix of information that's collected every term; that's collected about 50 times across a child's school lifetime. And that, together with the assessment data, goes into the national pupil record.

16:04.22
Jen
And then, interestingly, it's joined up with lots of other data that comes from lots of other sources. There are about 25 different collections that go on throughout a child's lifetime across different types of census, and they all get linked up into what is then this melting pot, the National Pupil Database.

16:21.36
Jen
And then, in parallel, there's an individualised learner record, which covers other, alternative types of education, so further education and technical colleges. And all that information is now being linked to welfare data and tax records, so HMRC data.

16:37.79
Jen
And that surprised me again. We don't know that our education records are now being linked to our first jobs and our first salaries and what type of education we had.

16:48.75
Jen
And I think what's really interesting, because there's a lot of content there, is: what is it telling government? Why is it being collected? And I think one of the questions that's perhaps changing is, who is politically using it and benefiting from it?

17:04.54
Jen
And the sense I have is that increasingly everything is seen as a cost, and what something costs is being used as a proxy for what it is worth. And I think there's a great risk that we start seeing government policy based on what type of education is seen as having value, and who is seen as having value to the UK economy.

17:34.72
Jen
And you start to then say, well, should these children, should these learners with these types of characteristics or these types of disabilities be able to go into these types of education?

17:45.33
Jen
Or will you start to see it shape public policy, becoming more restrictive about access to higher education, for example, or to different types of grants or welfare support, dissuading certain people from going into education, which might be seen as a cost to the system and is now being measured as a return to the economy in terms of your tax receipts to the government?

18:09.45
Jen
All of this goes on behind the scenes, unseen, when you just think you're sending your children to school.

Caitlin
So, they're recording a huge amount of information, some of which, it's fair to say, has traditionally been quite dubious. How much trouble has the DfE gotten into, let's say, for their handling of this data and the accuracy of the inputs?

Jen
There have been some challenges, and we've been successful in some of them, but I would say overall, unfortunately, the direction of travel is still to collect more data, to use it for more things, and to do it with less transparency.

18:44.35
Jen
One place where there were challenges, for example, was around the expansion of the school census in 2016, where we first saw that there was information going from the Department for Education to the Home Office for immigration enforcement.

18:59.82
Jen
And at the same time, the Department for Education was about to collect nationality and country of birth from children in schools. And that raised flags for a whole great array of people working across civil society and charities and with young people and children, who were really concerned that this could potentially be misused

19:19.25
Jen
for the sort of migration crackdowns that had been promised by several Home Office ministers over the previous five years, which I was really unfamiliar with. And a huge number of anti-racism and pro-migrants' rights groups, youth workers and teachers across the education sector came together and said, we're really concerned about this.

19:41.70
Jen
And it was exactly then that we discovered not only were their concerns justified, but they were already being realised: there was an agreement already in place, live, that nobody knew about.

19:54.26
Jen
It had never been discussed in Parliament, and it showed that the Department for Education was sharing data on a monthly basis with the Home Office. And it was for the purposes, amongst many others, of furthering what was called the hostile environment.

20:08.20
Jen
So, to make it harder for people of uncertain or undocumented migrant status who wanted to live here, or to pursue criminal proceedings against people who'd overstayed visas, and things like that.

20:20.39
Jen
And so the Department did get a lot of scrutiny around that, particularly through parliamentarians, especially in the House of Lords. But unfortunately, it didn't stop more than the new collection. We were able to persuade government on that.

20:36.63
Jen
And that was thanks to the collaborative work of many, many campaigners, who formed an umbrella organisation, Against Borders for Children, a coalition of over 20 groups. And teachers and parents collectively refused to provide 25% of that return.

20:50.73
Jen
And so the quality of the data was therefore going to be no good for whatever purposes, nefarious or otherwise, the Department for Education claimed it would be useful for. And so they were able to get that data collection stopped.

21:01.70
Jen
But unfortunately, the department carried on with the monthly shares of other data, which it carries on today. The quality of data, I think, is a whole different issue. Again, if we start to think about AI, ethnicity is a well-known example.

21:14.36
Jen
Just last week, we found a local authority with a policy directing schools, where families had chosen not to provide ethnicity, which is their legal and lawful right, to fill it in themselves: basically telling the schools to go ahead and make it up.

21:31.13
Jen
It's one of those matters where you think, I don't see an effect on me, when my child goes to school, of whether their ethnicity is accurate or not. But what does it do if we start to use that information, perhaps in some other third-party system? We don't know where it's gone, because the Department for Education has handed out raw data for years.

21:52.27
Jen
So you don't have any control over where it's gone once it's been given out. Is it now with insurance companies? Is it with other places you wouldn't know about, that could be using it to make decisions about you as an adult?

22:03.43
Jen
I think that's pretty concerning. And those sorts of things, when they do start to matter, shape a whole narrative: what people perceive about their society, and what they perceive the direction of travel of society to be.

22:16.95
Jen
And I think we're seeing the real effects of those sorts of conversations and that direction of travel, even undermining the perception of how society is made up, what impact immigration has on society, and even democracy itself.

22:45.45
Caitlin
Assuming that nothing changes in terms of kind of the governmental vibe and the legislative kind of environment, where do you think things, if they continue on this trend, are heading?

22:57.69
Caitlin
And what do you think the kind of biggest dangers of the next few years, but then like on a much longer timeframe are?

Jen
Well, I'd love to think about what good would look like. What would be a really wise and sensible use of information that is (a) collected today, or (b) has been retained over a long time?

23:16.38
Jen
And how does it compare with what is? There are good models of what's called safe data management. In the UK in particular, there's what's called the Five Safes model: looking at data being retained in ways that are minimal or minimised, and also changed in ways that are less identifying and can therefore be more secure as well, alongside methods looking at encryption and how data might be stored.

23:40.88
Jen
And then you look at not distributing it, as they have done and are still doing for some use cases, to the recipients of data. Recipients shouldn't get what we used to call raw data: unprotected data as it is, identifying and in an accessible format.

23:56.67
Jen
Instead, they would have to go to a setting and access the information there. So instead of being sent the data, they would use it in a safe setting. And you can also make the data accessible only to people who are accredited or trained or have ethics reviews and oversight, rather than just sending it out raw to The Telegraph or BBC Newsnight or the other journalists that received it in 2012, without those sorts of oversights, or the training that you might consider necessary for accredited researchers.

24:27.42
Jen
And I think some of those directions of travel would be good. And we did move in some of those better ways of looking at data, I think, because of regulation that came with the General Data Protection Regulation in 2018.

24:39.77
Jen
Unfortunately, the Department for Education didn't then stop the unsafe uses as well, so they now manage both processes in parallel. I think what's concerning is there is so much data now that people have poor oversight, even within the departments. And it was a criticism in the audit that there was no proper record keeping; there was no ROPA, record of processing activities.

25:02.68
Jen
And it's those kinds of competing directions of travel I think we've got at the moment. We really have to ask ourselves, I think, and I say "we", you know, this is one of the challenges: who do you mean by we? Who gets a say in this?

25:15.39
Jen
Is this a few narrow people making policy in a department? Is it a government? Does the Cabinet Office even know that probably half of them have their own records in this data, being given away to third parties and reused commercially, because a lot of them are under 50 now and were state educated?

25:31.69
Jen
Is there actually awareness of who the "we" is that are making these decisions? And I think that would be perhaps the biggest potential for a better direction of travel: if people actually knew, if the government had the guts to really make a wise, informed policy choice, which would be built on the findings of lots and lots of committees and reports and recommendations that have been written over the last decade, which say: you need to tell people.

26:00.21
Jen
It's not just the law that you need to tell people, but this is good practice. You'd actually get better data out of it. Perhaps some people would choose not to provide everything that you currently collect. But really, would that be such a bad thing? Is everything that's collected right now used?

26:15.42
Jen
And we would argue we know it's not. We know there's lots of data collected that's not used for anything. And potentially there's lots of data collected that shouldn't be used. Look at, for example, equality monitoring data: sexual orientation, religion and disabilities are being collected when young people make the transition between secondary school and higher education and apply to university.

26:38.65
Jen
And that's been collected by the admissions processes of the organisations involved in access to higher education in England. And they don't have an adequate process for telling students what happens to this information and where it goes. And so now it's collected on a named basis.

26:55.25
Jen
And you might think that if you were to quite happily state your sexual orientation as part of equality monitoring in the UK, it would be kept as a statistic. But in fact, in higher education, it's been added back into the Department for Education and back into that national pupil record, which has your name on it.

27:12.25
Jen
And so sexual orientation, religion and disabilities are all now kept as equality monitoring data on a named basis, forever, at the Department for Education. And it's been kept by various funding bodies and other higher education authorities as well.

27:26.55
Jen
And that is the kind of data where I start to think, what's the direction of travel for that? If we look at the US right now, we see lots of big tech companies and others disavowing all their diversity and equality work and scrapping their commitments to accountability and transparency and equality.

27:47.24
Jen
And I think we look at that kind of data and think, what if? What if that data is sitting around on a named basis and there is a new government that doesn't fancy the look of a particular sexual orientation, or doesn't agree with a particular religion?

28:02.64
Jen
How might that information be used? So I think the big thing we need to really consider is what the risks are: to actually look at the realities of the infrastructure and the checks and balances we've got in place, and think, are they enough?

28:16.39
Jen
Are they sufficient? And are they going to be robust to potential misuse that you might not want to dream of today, but is a potential reality soon?

28:26.78
Jen
And so you need to build your infrastructure to manage those risks, and I don't think we have those safeguards in place yet. So we have two different paths we could take. And with the UK government being really focused on economic growth, and really pushing a big agenda around AI in the UK economy and UK technology as a leading field, I think we need to ask: what is the data we've got? Is it good quality? Are we looking at poor-quality data? Could we improve it by giving it back, or giving access back to the people it comes from, and letting them access that data, check it, make it accurate, and give updates on a regular basis?

29:09.16
Jen
Would we potentially have a slightly smaller data set, but one that might be far better quality, more accurate? Could we then look at that safe infrastructure for how it's used, do away with the parallel, weak, risky infrastructures, and make sure we've got the right checks and balances in place? We've got a lot of work to do in that field still.

29:27.94
Jen
I think we need to look at two things that the Department for Education is looking at for the future. One is the automated decision-making and AI sector, where they're talking about what's called a content store, with a pretty loose description right now of what might be in it.

29:44.60
Jen
But there's already a clear direction of travel, with consultation potentially to come, around the reuse of pupils', so learners', content that might be coming from their edtech, their everyday technology applications.

29:59.55
Jen
And when we think about that, we haven't talked much about those commercial applications, but those run 24/7: from when a child crosses the playground on CCTV, to the school's information management system logging data about them.

30:11.72
Jen
They can be using a number of different apps at any time of the day, whether it's doing quizzes and maths checks or using an AI-assisted tool for prompting questions and study.

30:24.24
Jen
That's very, very common now in the UK, which might shock people from other countries, where it's not as widespread and weakly overseen as it is here. We use biometrics in the canteens, often fingerprints for cashless payment systems, and there's even growing use of facial recognition by schools.

30:43.92
Jen
We don't really know all of the companies that are operating in schools, or the interaction between the data that's stored locally and the data that then goes to companies, and whether it's being used to train those sorts of biometric and particularly facial recognition tools.

30:58.52
Jen
And they could be using a number of different platforms, from Google to Microsoft to various others. And all of these companies are coming into education, many of them from businesses that weren't really designed for educational purposes at all, on top of apps that look at things like behaviour control.

31:18.08
Jen
So they're doing things like monitoring points that teachers can award a child in a class for good or bad behaviour. And those, I think, sometimes have a really quite nasty approach to their pedagogy.

31:31.07
Jen
They work on a sort of shaming basis. If you do something bad, you get a behaviour point, and then it might play a sound in front of your classmates and embarrass you that you got a bad point. Or you get your points totalled up at the end of a week or a month, and you're on a chart that gets projected up on a wall. And again, it's this sort of naming and shaming of children.

31:50.35
Jen
I think we're not thinking enough about these kinds of edtech and technologies in terms of how they shape a child, how they influence not only the child's cognitive but their social and mental well-being, and what it's really doing to our children to be governed by these rank-and-spank systems

32:11.72
Jen
that don't have oversight, that don't have any standards, that don't have any quality control, where we don't know if there's proven efficacy. And that is something the Department for Education has started to talk about: it's looking at an edtech evidence base and panel, although there's no openness at the moment about how that panel might be chosen.

32:29.60
Jen
We hope we'll be involved in that. So, looking at that sector: the Department for Education has a plan that might involve edtech, as well as everything we talked about earlier.

32:40.84
Jen
So whether it's pupil data coming out of your everyday education, or any of the exams data, the exams that you've written and exam content, all of that might be up for grabs to turn into AI developers' content.

32:55.70
Jen
And they're not really discussing the data protection implications openly right now. Apparently some seem to think, certainly the AI company mainly involved, as a spokesperson has told me, that the data protection angle is not a problem, which surprised me. The piece they think is a problem is IP, the rights of copyright holders.

33:15.88
Jen
And this opens up a whole new debate that's never really been had in the UK education system yet, which is children as rights holders, as producers of creative content, and also, of course, teachers.

33:28.61
Jen
And certainly for learners, there's no understanding at all that we somehow become a cash cow for AI development companies, who now mine the very essence of what we do on an everyday basis in education. I think it would almost be laughable.

33:48.14
Jen
Except that it's very real, and those discussions are happening, and they will be going on over the next six months or so. And while there's been lots of criticism of it from the media and creative industries, there's been nothing from the education sector yet.

34:01.65
Jen
And we really need to engage with that far better. So you've got that aspect of what's going on. Those opposed to privatisation in the UK might have said, in Margaret Thatcher's years and that sort of decade, that we sold off the family silver.

34:19.16
Jen
And it now seems like we've got nothing else left to sell but our children. And that really concerns me. And then, going even more into the core of their beings, there's genomics, and that link between education and health.

34:35.58
Jen
And the Department for Education published last August, to their credit, a study of what might be the implications of commercial genomic companies, and of information around genomics that parents or families or communities or others might have access to, and how that might influence education and the delivery of education in the future.

34:58.96
Jen
And I think if we don't grapple with that, it's easy to laugh it off as somehow a Gattaca-style model of the future: one that could be determining which children get to achieve which type of education, and where they're permitted to fulfil their full development and thrive into adulthood and become the part of society that they choose to be.

35:25.59
Jen
And it might be determined for you, if we start to see that kind of information being used in ways that were never intended, and that many would argue it's not fit for, which is some sort of determinism about whether your risk factors are somehow worth paying any attention to when it comes to education.

35:45.69
Jen
And I think many would say no. And I think we need to be very, very alert to the fact that what we want for the direction of education is made up of lots of different competing communities and individuals and ideas.

36:01.39
Jen
And there's that idea of how you might use data that's been collected about you from birth, that was perhaps collected for your health purposes from your heel prick test as a child when you were born, and is now somehow ending up as research data, which could be extended, consented to somehow by others on your behalf, to be used for other purposes.

36:23.54
Jen
And that that could then be some sort of predeterminant of your future is, I think, not beyond the realms of reality. Some believe in it, and they think that is the best thing for society.

36:35.55
Jen
So I think we're at a really interesting time, and I say interesting loosely, that really needs attention from people who want to stand up for things like human rights and democracy and the rule of law, to say: this matters to me right now.

36:50.15
Jen
What direction of travel we take really matters. What kind of society, and future of society, will we live in, not just this year and the next five years, but 10 years and beyond? And we're really going to shape that, I think, with all this big data that we've collected about children and education.

37:05.38
Jen
And it's really going across all sorts of companies and third parties, pretty much all behind closed doors, and we don't know about it. So I think we need much more oversight, but we also need to

37:17.14
Jen
democratise it. We need to understand what's collected about us. Who knows what about me, and who gets to control that, should really come back to the people it comes from. You know, you and me and our children.

37:27.68
Caitlin
That's literally me, having been to a state school in the UK and being under 48. Yeah, that's disturbing to think about. Might be worth a data subject access request.

37:38.32
Jen
Absolutely. Well, that's a great idea. You can make one; anyone can make one, and it's free. The Department for Education has a process on its website. Look up: Department for Education, subject access request, how do I access information about me?

37:49.46
Jen
It's a bit of a convoluted process, and it's not really suitable for children, but you can go, it's free, and they will provide you with a record. And you'll be as amazed as I was to find that they have your equality monitoring data recorded against your university modules.

38:03.75
Jen
They even know which modules of your university course you took. It's an incredible level of detail, going back to 1996 on an unnamed basis and 2002 on a named basis. So anyone that's been in state education since 2002 will have a named record there.

38:21.65
Caitlin
Wonderful.

38:29.84
Caitlin
Is there anything you wanted to talk about that you didn't get a chance to talk about?

Jen
There's loads of things, you know, like age assurance and age verification. Safety tech is a big thing for us, obviously. We're doing a lot of work with the Council of Europe and a team right now on AI literacy.

38:43.10
Jen
And it's been really, really eye-opening, because the more we've done on AI literacy, and it's a terrible term, but everybody just has to use it, the more you realise this really is about managing technology.

38:56.06
Jen
Who owns the future of your education system? Is it going to be owned by big tech companies? And I didn't mention it, but the business model of a lot of edtech companies depends on the reuse of data in some format or other.

39:10.90
Jen
And it's not necessarily selling the data; people often get hung up on that. It's actually being able to use the information, parents' contact details, for example, within an app, to then market them an add-on product or an ad-free version.

39:26.20
Jen
So your child might be using a quiz app, and now you get an email from the app saying, your child uses this, would you like them to have an upgrade? And they can get an ad-free version. And you then have this sort of two-tier system being encouraged across the sector, which is ostensibly free because it's free to schools.

39:44.40
Jen
But then you start having these behind-the-scenes payment systems. Like the parent who contacted us saying they didn't know what to do, because they were using an app for encouraging reading at home, and they'd been told by the school they had to use this app with their son.

39:59.83
Jen
And they said, the trouble is there are Disney ads on it, and he seems to spend more time watching Disney ads than improving his reading, because he's really easily distracted.

40:10.80
Jen
And so some of these systems might not bother some people that much, but they're really detrimental to others. And I think you end up with this unseen discrimination and commercial exploitation that is not equitable. What's fascinating is that a lot of those packages that come out of big tech companies were designed for industry.

40:30.74
Jen
They were designed for efficiency, productivity, measuring numbers, measuring output, measuring anything that was measurable. That's not necessarily what school systems were about, but it has driven that kind of mentality around what the infrastructure of education is, and how it's actually shaping the education that is delivered.

40:51.75
Jen
And so now it's very normal for students to have to go onto a Google Classroom and download their homework and upload a shared document or a PowerPoint.

41:02.29
Jen
But it's not necessarily a good thing for the way of learning to be driven by the tool that you have to use. And I think that's the risk we end up with: a lot of these systems are now driving how education is delivered, rather than looking at what the best-quality educational outcome is.

41:20.42
Jen
And what really concerns me is this idea that we've just handed over the delivery of education, without any scrutiny, to a variety of tech companies, and their values and their methodology and their look and feel and their requirements for dexterity or concentration or how they expect you to interact with the system

41:42.99
Jen
are somehow now the norm: if you want to get your education, this is what you have to be able to do. And that's new. And I think we haven't really scrutinised any of that. We've just kind of slipped into it.

41:53.73
Jen
And so, if we're not careful, you'll end up in a few years with the same sort of challenges in the education system that we're now seeing with social media.

Caitlin
Well, countries are banning the use of phones in schools. Brazil has just passed a big law saying phones in school aren't going to be allowed, which is fine, which is lovely.

42:12.20
Caitlin
Sure. But doing that at the same time as massively expanding the digital devices that you're spending loads and loads of time looking at and playing with at school, as if that's substantively different, just seems wrong. Especially when they're designed by the same companies designing social media apps, on the same basis, reusing the data so they can potentially retarget you on the social media app. The cognitive dissonance is quite confusing.

Jen
And it also raises that question of who knows what about the education system. Who knows what about you is one thing, but also, who knows what about the overall holistic picture of education?

42:47.48
Jen
And so we end up with dashboards being sold in these products which don't necessarily talk to each other. Now, a lot of companies are starting to consider interoperability, and they want to share data across the different systems, which will bring a whole new myriad of questions around privacy and data sharing.

43:05.39
Jen
But the idea is that a teacher can then see, across a number of different apps and platforms, how a child is doing according to these systems' methods of what counts and what's worth measuring: the child somehow being datafied, and ranked and spanked, as we call it, across their classroom, by who's in what sort of position in terms of achievement.

43:24.75
Jen
And you get this sort of datification of the learning path and that profiling of children, sort of building up their achievement over time.

43:35.84
Jen
I think another aspect, particularly of UK education, where we use a lot of these tools in a way that perhaps doesn't exist elsewhere, is that you're then, I think unconsciously, setting expectations of teachers, who you hope have the good professional training to really understand how these tools work and their flaws and their risks.

43:55.92
Jen
But I suspect most don't, because there's no data protection or digital training as standard in UK teacher education, teacher training. And so the concern is that everything's taken at face value; teachers don't necessarily understand what these dashboards are telling them, and misread them, or only have the time to read them in a very limited way. And I think then we're potentially pigeonholing children into pathways that say, we expect the outcome to be X over this period of time, given your previous performance.

44:28.29
Jen
And somehow, again, you're putting this predictive risk analysis, based on scoring and profiling, onto children, and it's based on something that isn't linear, which is children's progression in education. And it's provable

44:43.97
Jen
that progression and learning are not linear. And yet we're making everything flattened and normed, and you want to look for outliers, and anything that jumps out from the norm, outside of the curve, is somehow problematic, when actually it might just never have been designed to look like that to start with.

45:01.39
Jen
You know, a child who was 10 recently said to me: I'm meant to be achieving these scores, and I was told that I was above average. Well, who told me I should ever want to be average?

45:12.50
Jen
And I thought it was a very smart thing for a 10-year-old to be coming out with. He was already realising that there was this expectation of norm and average, and that if you were somehow different, you stood out from it, because of what data expectations were telling the teacher.

45:29.09
Jen
And it's shifting, I think, that relationship between how children perceive themselves and their relationship with the teacher. That's going to be even more important, and probably affected by AI, where we start to lose control and agency and autonomy over what teachers teach, how they understand what they taught, how they measure what was taught, and whether they are able to look into black-box systems and say:

45:53.39
Jen
I understand why the system measured you that way, and I agree or disagree. Even being able to overrule an automated scoring or points system, especially when they make mistakes. You go to these systems and you think, no, the maths is wrong.

46:06.31
Jen
It's got it wrong. And there's no way for children, apart from throwing a tantrum when they're doing their homework, super frustrated because they can't put the right answer in and they know they've got it right. There's usually no recourse for these things on the system.

46:18.13
Caitlin
But it's not like the UK doesn't have a massive and famous example of an algorithm doing predictive stuff incredibly badly. It's not like there was a huge scandal of all the grades getting f***ed up. Then a lot of AI systems are non-deterministic, and we've got research that we're working on at the moment which shows that if you put the exact same things into some recruitment AIs, you get different results out, because that's kind of how they're supposed to function.

46:40.83
Caitlin
But the idea that you can just apply that to education, with systems that hallucinate completely random nonsense, that lie, though "lie" is the wrong word because it implies some kind of decision-making capacity, is just really, really weird.

46:56.61
Jen
There's not enough critical analysis of it. And yet, to you, to me, to lots of people who will be listening, I think it's so obvious. So you wonder, why is it not obvious to policymakers? And I think that's something we have to really get to grips with: this dual messaging that comes from different sources, and who is the source of truth and who is the source of evidence.

47:17.01
Jen
And if it is driven by companies and their marketing, and that ties in better to what a policymaker wants to deliver in terms of cost cutting, for example, then without evidence to the contrary, you might well buy into it.

47:32.30
Jen
And things that are perhaps less provable, in terms of obvious risks and harms, are really challenging to challenge, to say, why should we do this? And sometimes the answer comes back: well, why should we not? We haven't yet seen the harms.

47:48.60
Jen
Of course, we try to operate on "let's avoid the harms" rather than having to prove that they happened. But you go back to that National Pupil Database and say, well, look, in 2002 you said you'd collect names and nothing would happen.

48:00.68
Jen
10 years later, you changed that policy. A different government came and changed that policy. And there is harm in terms of immigration enforcement and misuse and now targeting people on an operational basis.

48:11.11
Jen
And I think we do have growing evidence of harms. And I think it's a sad state of affairs that it seems to be seen as an either/or, as a trade-off: somehow you're either pro-tech or pro-privacy; you're pro-innovation and growth or pro-privacy, and you can't be both. And I think people who really have a sensible understanding of all this see that privacy underpins all of these good practices. And it's not just a human right for individuals, but for communities and collectives and society, to really understand that power imbalance between all those companies we're talking about, designing their systems with their values and their systems design and their sort of black-box algorithmic decision making, and, as you say, "decision making" is a poor choice of words, but the outcomes that happen for the person using the system are somehow

49:03.07
Jen
beyond your control. I think this is a thing that we don't get a chance to discuss with policymakers, because they're always in such a hurry, and they need to make laws, and they need to be seen to be doing the right thing all the time. And you actually need policymakers to be able to step back and say, what's our vision of good?

49:18.44
Jen
What is it that we really want to deliver? And it's not about technology; it's about the purpose of education. Why do we even educate society? And increasingly, across many Western and global North countries, the model is about productivity, about output, about producing workers for the system.

49:39.16
Jen
And those workers will then create wealth, and they will return that wealth in taxation to the government and to the state, and we will somehow have a productive society because we are wealthier than our predecessors.

49:53.91
Jen
I think the evidence right now is pointing to decades of that system really not working. If anything, the increased pace of technology, using more AI across more types of systems and more sectors, and especially the public sector, will only amplify that.

50:11.47
Jen
I think people have really lost the sense of control: I'm not in control of my everyday life anymore, I don't feel like I get a say, I don't know what's going to happen. I can't use this system, or I can't use the telephone system, or I can't pay cash for something, or I can't check out the way I want to in a queue at the shop.

50:28.80
Jen
And somehow you have that sense of loss of control. And that for me is a really concerning thing that policymakers are not engaging with: that lost sense of agency, which, if we don't tackle it, I think will become a sort of social void that the far right will take more and more space in. And politically, current policymakers who think they're doing the right thing, being all about economic growth, will actually find that the social impacts of that are inequitably spread. And therefore, politically, we'll have significant outcomes for democratic participation, and for whether people feel comfortable with the direction society is taking them.

51:11.27
Jen
And I don't know what we do about that. We can keep trying to put these mechanisms in place that fix some of those systems. But will we get policymakers that really want to collectively sit down and say, we are going to take a different path, and we understand this is how these things contribute holistically?

51:31.60
Jen
I think the crux of a lot of that is your education system, because it is the core of how individuals are shaped and formed, in family life, in school life, in preparation for modern life and society as a whole. And I think if we don't fix that in education, we're really on a downward path at the moment. And I'd like to be optimistic. I don't want to always end things on a downer.

51:54.95
Jen
We could imagine a different future, where you understand what data is collected about you. You know which tools and apps you use, and they're always going to be in your best interests, not in competing interests with the school to profile you and rank and spank you.

52:09.26
Jen
They're not in the competing interest of measuring the capacity of the teacher, or how good a teacher they are. They're not in the best interest of the system to measure your exam output. They're actually always going to be in your best interest: have you learned something? Has it helped you learn the way that you need to learn best?

52:25.37
Jen
And do you have access to that information? Can you be in control of it? And can you decide, to some extent, what the gaps are in your learning and what that might lead to next?

52:38.12
Jen
And that's a different vision from the current model of personalisation that a lot of people see, which is all about scoring and datification and profiling, with teachers able to steer things, but really only in the way that the company's systems allow.

52:50.96
Jen
So it's still steered ultimately by the values that the company puts into the system and controls through its design practices and its user experience. And I think we haven't grappled with that. There are different ways we could do that, and we could have a vision which is so much more positive.

53:07.89
Jen
But it does need some infrastructure, and it does need some funding. And we can't leave that to big tech companies who are prepared to do it for free because they see something in it for them. Their business models depend on freeware, or they're happy to give it away at low cost, and I think more people have to ask why.

53:23.84
Jen
What's in it for them? What are they getting out of it? And who's going to control our education system in a few years' time? We could have a really exciting time, with better educated, better informed, savvier, everyday practical skills for everybody of every capability who has the capacity to use some sort of digital skills, which really are required across so much of society now.

53:49.47
Jen
Yeah, I mean, I could be really, really optimistic about some of that. I just hope there are enough people who can work towards those things and get them delivered into policy that can make it happen. And it might be incremental. It might not happen in the next five years, and it's bit by bit, but we'll see.

54:18.18
Caitlin
I hope you enjoyed this conversation. Defend Digital Me are an interesting organisation, so if you want to find out more from them, there's a link in the description. If you want to read more, we've also collaborated with Jen on some articles that you can find on our website at pvcy.org/edtech.

54:34.05
Caitlin
There's more to come from our work with her, so keep an eye out. We've been really closely monitoring this kind of trend, and we've been really concerned, because a lot of these technologies, these kinds of databases, methods of data extraction and sharing, are really familiar to us from other avenues of work, like our government surveillance work and our corporate surveillance work.

54:54.61
Caitlin
So seeing them being deployed on children, in really sensitive, mandatory spaces, has been really, really concerning. We've got an article that we've done with Jen called Studying Under Surveillance: The Securitisation of Learning,

55:06.67
Caitlin
and one called The Unavoidable Rise of EdTech in Educational Spaces, which give more of an overview of the specific ways we're concerned. Then we've got a couple more coming out, one specifically on England and one looking more at the commercialisation aspects. We're not the only people raising the alarm, and I realise that conversation was pretty UK-specific, and maybe a little bit depressing in ways we didn't really mean it to be.

55:32.39
Caitlin
So I thought it would be worth sneaking in at the end and talking a bit more about the fact that the alarm is being raised by others too. The UN Special Rapporteur on the Right to Education has called out the adverse impacts of state surveillance through EdTech, including the intrusive nature of the technologies, the data collection, the role of commercial entities, and the fact that some technologies just shouldn't be deployed in the ways that they have been, in the spaces that they have been,

55:59.61
Caitlin
like facial recognition technology. If you want to learn more about what we think about that and learn more from our collaboration with Jen, pvcy.org/edtech. Thanks for listening.

56:10.60
Caitlin
You can sign up to be the first to learn more about our work at pvcy.org/podsignup, and we'll include some links to relevant articles and information in the description wherever you're listening, or on our website at pvcy.org/techpill.

56:25.58
Caitlin
Don't forget to rate and subscribe to the podcast on whichever platform you use. Music courtesy of Sepia. Podcast produced by Max Burnell for Privacy International. Thank you.
