Campaign group Global Witness said Facebook failed to prevent discriminatory targeting of job adverts and that its algorithm was biased in choosing who would see them. In an experiment, almost all Facebook users shown adverts for mechanics were men, while ads for nursery nurses were seen almost exclusively by women.
Job advertising reflects the algorithm
Facebook says its system shows people the ads they are most likely to be interested in. Global Witness submitted two job ads for approval: one targeted only at women, the other at people aged 55 and over. Facebook approved both ads for publication, though it asked the advertiser to agree that it would not discriminate against these groups.
Global Witness pulled the ads before they were published. Facebook confirmed that its system helps serve people the ads they are most likely to be interested in, and said it was reviewing the report's findings.
In 2019, the US Department of Housing and Urban Development (HUD) accused Facebook of discriminating against certain ethnicities in housing advertisements. Facebook has since promised not to show discriminatory ads in the US and Canada.
Facebook says it is exploring the possibility of extending those measures to other countries. "The fact that it is possible to do this on Facebook in the UK is particularly shocking," said Naomi Hirst, who led Global Witness's investigation.
The experiment that led to action
Global Witness created four job ads, each linked to a real vacancy on the indeed.com platform. The ads specified only that they should be shown to adults in the UK.
"That meant that it was entirely up to Facebook's algorithm to decide who to show the ads to," Ms Hirst said, adding that the result was "downright sexist". In other words, once an advertiser sets no demographic targeting, it is Facebook's algorithm alone that decides which users see an ad.
With the algorithm, Facebook tries to show each ad to the people most likely to respond to it. But Global Witness says this perpetuates biases that already exist: in the past, for example, ads for mechanic jobs may have been placed in magazines aimed mainly at men.
Ms Hirst acknowledges that gender does not have to be a barrier to looking for a job as a mechanic: in the real world, anyone can just as easily buy a job-hunting magazine. But that, she says, is "just simply not true online".
Global Witness also asked barrister Schona Jolly QC to examine the findings. She agreed that "Facebook's system itself may, and does appear to, lead to discriminatory outcomes."
Global Witness has contacted the Information Commissioner with its evidence about how Facebook processes data for job adverts.
Ravi Naik, a data-rights lawyer representing Global Witness, also believes Facebook's advertising mechanisms may put the company in breach of equality laws.