Content type: Examples
Foodinho, the Italian food delivery subsidiary of the Spanish company Glovo, continues to accumulate millions of euros in fines for infringements of labour law such as collecting and misusing riders' data. New research studying Glovo's app indicates that the company appears to have created its own hidden scoring system to evaluate couriers' performance, and shares riders' personally identifiable after-hours location data with Google and other unauthorised third-party trackers. https://algorithmwatch.…
Content type: Examples
Companies like the Australian data services company Appen are part of a vast, hidden industry of low-paid workers in some of the globe's cheapest labour markets who label images, video, and text to provide the datasets used to train the algorithms that power new bots. Appen, which has 1 million contributors, includes among its clients Amazon, Microsoft, Google, and Meta. According to Grand View Research, the global data collection and labelling market was valued at $2.22 billion in 2022 and is…
Content type: Examples
Human raters have played a significant role in the rapid improvement of the machine learning models that fuel modern AI. The raters evaluate the algorithmic output of search engines and AI chatbots and provide "Reinforcement Learning from Human Feedback" (RLHF) – the technical name for the deployment of such ratings to improve AI models. The efforts of these workers, who are mostly located in the global South but include thousands in the US, are downplayed by the technology companies to whom…
Content type: Examples
Delivery drivers in Jakarta use GPS-spoofing apps to improve their chances of being selected by the Gojek delivery and transport app, an equivalent of Apple Pay, Postmates, Venmo, and Uber all in one. Gojek operates in more than 200 cities in Indonesia, Singapore, Vietnam, and Thailand. Other grey-market apps enlarge the details of orders that are too small to read, automate bidding, and apply filters to open orders. Some apps are distributed via Google Play; more are sold via driver…
Content type: Examples
Four people in Kenya have filed a petition calling on the government to investigate conditions for contractors reviewing the content used to train large language models such as OpenAI's ChatGPT. They allege that these conditions are exploitative and have left some former contractors traumatised. The petition relates to a contract between OpenAI and the data annotation services company Sama. Content moderation is necessary because LLM algorithms must be trained to recognise prompts that would generate harmful…
Content type: Examples
Behind the colourful bicycles and games rooms, Silicon Valley tech giants operate a strict code of secrecy, relying on a combination of cultural pressure, digital and physical surveillance, legal threats, and restricted stock to prevent and detect not only criminal activity and intellectual property theft but also employees and contractors who speak publicly about their working conditions. Apple has long been known for requiring employees to sign project-specific non-disclosure agreements (NDAs…
Content type: Examples
In the 2014 report "Networked Employment Discrimination", the Future of Work Project studied data-driven hiring systems, which often rely on data that prospective employees have no idea may be used, such as the results of Google searches and other stray personal data scattered online. In addition, digital recruiting systems that only accept online input exclude those who do not have internet access at home and must rely on libraries and other places with limited access and hours to fill in and…