Content type: Examples
Almost half of all job seekers are using AI tools such as ChatGPT and Gemini to help them write CVs and cover letters and complete assessments, flooding employers and recruiters with applications in an already-tight market. Managers have said they can spot giveaways that applicants used AI, such as US-style grammar and bland, impersonal language. Those whose AI-padded applications perform best are those who have paid for ChatGPT - overwhelmingly from higher socio-economic backgrounds, male, and…
Content type: Examples
About 30 Brazilian delivery drivers for Deliveroo and Uber Eats in Bristol, UK have resorted to living in encampments due to rapidly rising rents and pay at, or even below, minimum wage. Gig workers have reported increasingly harsh living conditions and long hours with low pay, leading to worsening mental health problems in the encampment. Meanwhile, Deliveroo and Uber Eats have both recently declared profits, and Deliveroo has even defeated a seven-year legal effort to gain workers'…
Content type: Examples
The UK's Information Commissioner's Office (ICO) has reprimanded Chelmer Valley High School in Chelmsford, Essex, for unlawfully implementing facial recognition technology in its canteen. The school failed to carry out a data protection impact assessment, and did not obtain adequate permission to process its students' biometric data or ask students to give consent by opting in. North Ayrshire Council - which implemented facial recognition in nine schools in Scotland - has also been warned by…
Content type: Examples
The UK's Department for Education intends to appoint a project team to test edtech against set criteria in order to choose the highest-quality and most useful products. Extra training will be offered to help teachers develop enhanced skills. Critics suggest it would be better to run a consultation first to work out what schools and teachers want.
Link to article
Publication: Schools Week
Writer: Lucas Cumiskey
Content type: Examples
The UK's new Labour government is giving AI models special access to the Department for Education's bank of resources in order to encourage technology companies to create better AI tools to reduce teachers' workloads. A competition for the best ideas will award an additional £1 million in development funds.
Link to article
Publication: Guardian
Writer: Richard Adams
Content type: Examples
The Los Angeles school district turned off “Ed”, a $6 million chatbot, after the company that was paid to develop it got into financial trouble. The incident provides a cautionary tale for Britain’s new Labour government, which has talked of using AI in schools to free up teacher time and revive public services on a tight budget, as has a report issued by the Tony Blair Institute. The failure of the algorithm used to predict GCSE and A-level grades during the Covid lockdown and the potential to increase…
Content type: Long Read
Social media is now undeniably a significant part of many of our lives, in the UK and around the world. We use it to connect with others and share information in public and private ways. Governments and companies have, of course, taken note and built fortunes or extended their power by exploiting the digital information we generate. But should the power to use the information we share online be unlimited, especially for governments who increasingly use that information to make material…
Content type: Examples
Former delivery driver Edrissa Manjang is pursuing a claim for harassment, indirect discrimination, and victimisation in the UK courts, alleging that a racially biased algorithm kicked him off the Uber Eats delivery app. After Uber Eats supplied information that contradicted Manjang's original claims, the judge gave him leave to amend his complaint but refused him permission to use emails sent to him during the litigation as evidence of harassment. Manjang, who is black and of African descent, says…
Content type: Examples
A new poll from the trade union Prospect finds that 58% of UK workers believe the government should protect jobs by regulating the use of generative AI. Only 12% believe the government should not interfere. The poll also found that workers are deeply uncomfortable with being surveilled at work and with companies' use of software to automate decisions about hiring and promotion.
https://prospect.org.uk/news/public-call-for-government-regulation-of-generative-ai-at-work
Publication: Prospect
Content type: Examples
Following a February 2024 ruling by the Information Commissioner's Office against Serco Leisure, national leisure centre chains are among dozens of UK companies removing or reviewing the use of facial recognition and fingerprints to monitor staff attendance. The ICO found that the Serco subsidiary had unlawfully processed the data of more than 2,000 employees at 38 centres.
https://www.theguardian.com/business/2024/apr/16/leisure-centres-scrap-biometric-systems-to-keep-tabs-on-staff-amid-uk-data…
Content type: Advocacy
Background
The Snowden revelations and subsequent litigation have repeatedly identified unlawful state surveillance by UK agencies. In response, the UK Parliament passed the highly controversial Investigatory Powers Act 2016 (IPA), which authorised massive, suspicionless surveillance on a scale never seen before, with insufficient safeguards or independent oversight.
Privacy International led legal challenges to this mass surveillance regime both before and after the Act became law. The Act…
Content type: Examples
Some UK schools have bought and installed sensors in toilets that 'actively listen' to pupils' conversations in order to detect certain keywords. The sensors are being sold to detect vaping, bullying, and other problems. However, privacy campaigners say these sensors are potentially a safeguarding issue and a violation of children's rights, and are likely to be unlawful. The sensors do not record or save any conversations, but send alerts to staff when triggered. Not all the schools…
Content type: Video
We explore the legal case, the ways the tag hasn't worked for long periods of time, and a dubious AI the Home Office has been using in decisions as to whether someone remains on a GPS tag.
Links
Read more from Katie's law firm, Wilsons Solicitors, about the case
PI's Complaint to the ICO (the UK's Data Protection Authority)
Read more about relevant cases in which PI has filed witness evidence
The five companies at the heart of the UK's GPS tagging system
We tested GPS ankle tags, read how our…
Content type: Examples
An app used by more than 100 Bristol schools has raised concern among criminal justice and anti-racism campaigners, who warn that giving safeguarding leads such easy access to pupils' and their families' contacts with police, child protection, and welfare services risks increasing discrimination against those from minority ethnic or working-class backgrounds. Staff using the app say it is often kept secret from parents and carers. The council website says the Think Family database, which the app…
Content type: Examples
UK government ministers are seeking to ensure schools benefit financially from any future use of pupils’ data by large language models such as those behind ChatGPT and Google Bard. Data from the national pupil database is already available to third-party organisations. The BCS head of education recommends that the Department for Education should write a clear public benefits statement to ensure that initiatives benefit pupils as well as providing financial benefits.
https://schoolsweek.co.uk/…
Content type: Examples
The UK's Behavioural Insights ("Nudge") Unit has trialled machine learning models to help automate some decisions made by regulators such as Ofsted (schools) and the Care Quality Commission (health and social care in England). The resulting algorithm uses data such as the number of children on free school meals, teachers' pay, the number of teachers for each subject, and parents' reviews of schools in order to predict which schools' performance might be suffering. The dataset deliberately…
Content type: Examples
English school head teachers were asked to fill out a census form, designed in partnership with the Department for Education and hosted by Capita, that included fields asking for pupils’ asylum status, ethnicity, and passport numbers and expiry dates. Families are meant to be advised that it is not mandatory to supply the information, but when they don’t, schools “ascribe” - that is, guess - children’s ethnicity. Privacy campaigners expressed concern that the census data would be used for immigration…
Content type: Examples
ICO warns North Ayrshire Council for adopting facial recognition in schools without parental consent
The Information Commissioner's Office has warned Scotland's North Ayrshire Council that it has likely infringed data protection law by using facial recognition technology in nine schools. North Ayrshire used the iPayimpact contactless system to pay for meals, and claimed that 97% of parents had given consent. The ICO said that although consent was the appropriate legal basis, the requirements had not been met, and that the council needed to explain how students' data would be collected,…
Content type: News & Analysis
Photo by Michelle Ding on Unsplash
Pat Finucane was killed in Belfast in 1989. As he and his family ate Sunday dinner, loyalist paramilitaries broke in and shot Pat, a high-profile solicitor, in front of his wife and children.
The Report of the Patrick Finucane Review in 2012 expressed “significant doubt as to whether Patrick Finucane would have been murdered by the UDA [Ulster Defence Association] had it not been for the different strands of involvement by the…
Content type: Long Read
Photo by Kristina Flour on Unsplash
The British government needs to provide assurances that MI5’s secret policy does not authorise people to commit serious human rights violations or to cover up such crimes.
Privacy International, along with Reprieve, the Committee on the Administration of Justice, and the Pat Finucane Centre, is challenging the secret policy of MI5 to authorise or enable its so-called “agents” (not MI5 officials) to commit crimes here in the UK.
So far we have discovered…
Content type: Press release
Privacy International, Open Rights Group, the Institute for Strategic Dialogue, Fair Vote, Who Targets Me? and Demos have today written to all the main UK political parties, demanding that they be transparent with the public about how they are using voters’ personal data in their electioneering. Twitter's announcement yesterday of its ban on political advertising is just the latest wake-up call to politicians about the risks to democracy of personal-data-driven microtargeting of political…
Content type: Examples
In February 2019, the UK Home Office told the Independent Chief Inspector of Borders and Immigration that it was planning to build a system that could check and confirm an individual's immigration status in real time for outside organisations such as employers, landlords, and health and benefits services. Lawyers and human rights campaigners expressed concerns that the project had received no scrutiny or public discussion, and that the Home Office's record suggested the result would be to unfairly lock…
Content type: Examples
In January 2019 the UK Home Office announced it would collaborate with France to overhaul its regime for suspicious activity reports in order to fight money laundering. In 2018, the number of SARs filed with the National Crime Agency rose by 10% to nearly 464,000. Banks, financial services, lawyers, accountants, and estate agents are all obliged to file SARs if they suspect a person or organisation is involved in money laundering, terrorist finance, or other suspicious activity. The system has…
Content type: Examples
In 2018, the UK Department for Education began collecting data for the schools census, a collection of children's data recorded in the national pupil database and including details such as age, address, and academic achievements. The DfE had collected data on 6 million English children when, in June 2018, opposition led the department to halt the project, which critics said was an attempt to turn schools into internal border checkpoints. In January 2019, however, in an answer to a Parliamentary…
Content type: Examples
By September 2018, at least five English local councils had developed or implemented predictive analytics systems incorporating the data of at least 377,000 people, with the intention of preventing child abuse. Advocates of these systems argue that they help councils struggling under budget cuts to better target their limited resources. The Hackney and Thurrock councils contracted the private company Xantura to develop a predictive model for them; Newham and Bristol have developed their own…
Content type: Examples
In October 2018, in response to questions from a committee of MPs, the UK-based Student Loans Company defended its practice of using "public" sources such as Facebook posts and other social media activity as part of the process of approving loans. In one case earlier in the year, a student was told that a parent's £70 Christmas present meant they did not qualify for a maintenance loan without means testing, because the gift showed they were not estranged from their family. SLC insisted…
Content type: Examples
In November 2018 the UK's Equality and Human Rights Commission warned that, since the UK government began forcing the English NHS to charge upfront in 2017, asylum seekers in Scotland and Wales have been deterred from seeking medical help by fears that medical personnel will comply with Home Office orders to forward their data. The commission, along with health charities and the Labour and LibDem political parties, called for the policy to be suspended. The Home Office policy of moving asylum…
Content type: Examples
The Home Office's Christmas 2018 announcement of the post-Brexit registration scheme for EU citizens resident in the UK included a note that the data applicants supplied might be shared with other public and private organisations "in the UK and overseas". Citing Section 31 of the Freedom of Information Act, the Home Office refused to answer The3Million's FOI request for the identity of those organisations. A clause in the Data Protection Act 2018 exempts the Home Office from…
Content type: Examples
In December 2018, a report, "Access to Cash", written by the former financial ombudsman Natalie Ceeney and independent from but paid for by the cash machine network operator Link, warned that the UK was at risk of sleepwalking into a cashless society and needed to protect an estimated 8 million people (17% of the British population) who would become disadvantaged as a result. Although cash use halved between 2007 and 2017, and debit cards overtook cash's share of retail transactions in 2017,…
Content type: Examples
In October 2018, British home secretary Sajid Javid apologised to more than 400 migrants, including Gurkha soldiers and Afghans who had worked for the British armed forces, who were forced to provide DNA samples when applying to live and work in the UK. DNA samples are sometimes provided by applicants to prove their relationship to someone already in the UK, but are not supposed to be mandatory. An internal review indicated that more people than the initially estimated 449 had received DNA…