Privacy protests lead Mattel to cancel child-focused AI smart hub

In 2017, after protests from children's health and privacy advocates, Mattel cancelled its planned child-focused "Aristotle" smart hub. Aristotle was designed to learn about and adapt to a child as they grew, while controlling everything from night lights to homework aids. Aristotle was, however, only one of many devices coming to market to take over functions that have traditionally been part of the intimate relationship between children and their parents: a smart cradle that rocks a baby autonomously, a cushion to calm colicky babies, and Mattel's Hello Barbie, which uses machine learning and cloud-based artificial intelligence to converse with its child owner.

These devices collect and store enormous amounts of data derived from their interactions with their child owners: the children's preferences, details of their family lives, and what they say. This poses a serious privacy risk, even where the law, as in the US, prohibits the sale of this information to advertisers. Hello Barbie, for example, learns information about family members and incorporates it into conversation. Child development experts raise an additional concern: little is known about the effect of such devices on early childhood development, especially when a device responds more immediately than any of the people in the child's environment.

https://www.washingtonpost.com/news/the-switch/wp/2017/10/04/mattel-has-an-ai-device-to-soothe-babies-experts-are-begging-them-not-to-sell-it/

Writer: Hayley Tsukayama
Publication: Washington Post
Publication date: 2017-10-04

What is Privacy International calling for?

People must know

People must be able to know what data is being generated by their devices, by the networks and platforms they use, and by the infrastructure within which those devices become embedded. People should be able to know and ultimately determine the manner of processing.

Limit data analysis by design

As nearly every human interaction now generates some form of data, systems should be designed to limit the invasiveness of data analysis by all parties to the transaction and across the network.

Control over intelligence

Individuals should have control over the data generated about their activities, conduct, devices, and interactions, and be able to determine who is gaining this intelligence and how it is to be used.

We should know all our data and profiles

Individuals need to have full insight into their profiles. This includes full access to derived, inferred and predicted data about them.