US finds border AI kiosk system too slow for operational use
In 2011, the US Department of Homeland Security funded research into a virtual border agent kiosk called AVATAR (Automated Virtual Agent for Truth Assessments in Real-Time) and tested it at the US-Mexico border on low-risk travellers who volunteered to participate. In the following years, the system was also tested by the EU border agency Frontex in 2014 and by Canada's Border Services Agency in 2016. The system used sensors, biometrics, and AI to analyse movements and changes in voice, posture, and facial gestures in order to flag individuals who might be lying. The research team behind it, which included the University of Arizona, claimed a success rate of 60 to 75% and said the system was consistently more accurate than humans. DHS found, however, that the system was too slow for operational use.
Writer: Jeff Daniels
People must know
People must be able to know what data is being generated by our devices, by the networks and platforms we use, and by the infrastructure within which those devices become embedded. People should be able to know, and ultimately determine, the manner of processing.
Limit data analysis by design
As nearly every human interaction now generates some form of data, systems should be designed to limit the invasiveness of data analysis by all parties to the transaction and across the network.
Control over intelligence
Individuals should have control over the data generated about their activities, conduct, devices, and interactions, and be able to determine who is gaining this intelligence and how it is to be used.
We should know all our data and profiles
Individuals need to have full insight into the profiles held about them. This includes full access to derived, inferred, and predicted data about them.
We may challenge consequential decisions
Individuals should be able to know about, understand, question, and challenge consequential decisions that are made about them and their environment. This requires that data controllers themselves have insight into, and control over, this processing.