Personalised online ads enable invisible - and illegal - discrimination

In 2016, researchers discovered that the personalisation built into online advertising platforms such as Facebook makes it easy to invisibly bypass anti-discrimination laws regarding housing and employment. Under the US Fair Housing Act, it is illegal for ads to explicitly state a preference based on race, colour, religion, gender, disability, or familial status. Even so, some housing policies - such as giving preference to people who already live in a neighbourhood - work to ensure that white neighbourhoods remain white. Those policies are at least visible. Facebook's advertising interface, by contrast, makes it easy to tick a box so that only specific demographics see a particular ad, and many of the signals used for personalisation - such as music choices or schools attended - act as proxies for race and other protected classes. In a 2015 experiment, Carnegie Mellon researchers found clear examples of gender discrimination in the Google advertising ecosystem.

In the early 1990s, a court case established that newspapers could be held liable for publishing discriminatory ads; to date, websites have been shielded from equivalent liability by the Communications Decency Act. Testing ad-targeting algorithms for bias also risks running afoul of the Computer Fraud and Abuse Act, which prohibits violations of a site's terms of service - such as creating fake profiles. In 2016 the ACLU filed a challenge to that law, arguing that it creates a chilling effect on research into discrimination; in April 2018 a federal judge in Washington, DC ruled that the case could move forward, denying the government's motion to dismiss.

https://www.vox.com/2016/12/12/13867692/poor-neighborhoods-targeted-ads-internet-cartoon
https://www.axios.com/suit-to-let-researchers-break-website-rules-wins-a-round-1522766474-c687c3de-d8c4-4b00-bd09-9759b29d7220.html

Writer: Alvin Chang; Joe Uchill
Publication: Vox, Axios
Date of publication: 2016-12-12; 2018-04-03