Racist Photograph Analysis Software – Really?


Women with darker skin are more than twice as likely as lighter-skinned men to be told their photos fail UK passport rules when they submit them online, according to a BBC investigation.

So claims this BBC news article.

BBC News – UK passport photo checker shows bias against dark-skinned women https://www.bbc.co.uk/news/technology-54349538

One black student said she was wrongly told her mouth looked open each time she uploaded a photo to the government website, across five different photos.

Here is my own personal experience of the government's racist software.

I took a series of photos of my wife and selected the best of them to submit to the passport system. The selected photo was rejected on the basis that it didn't meet the criteria. The issue was that the system couldn't identify a clear demarcation between my wife's hairline and her face; my wife is white-skinned with blonde hair.

This experience has just highlighted the difference between automated systems and the human eye. In some circumstances the human eye is far superior.

The process, however, does allow you to challenge the automated rejection and to argue why it is wrong. According to the article, Ms Owusu did challenge the rejection and had her photo accepted.

I also challenged the rejection and the passport application was duly processed and my wife received her new passport.

This shows how “systemic racism” can spread, Elaine Owusu said.

This article has turned a frustrating technological problem into a problem of racism.

Is Ms Owusu suggesting that the software developers have programmed the system to give people of colour a hard time? To what end?

Elaine Owusu, I think you are just jumping on the racism bandwagon and I, for one, am sick of the racism card being played at every turn.

The issue of racism is being rammed down our throats every minute of the day. The TV channels show a colour bias in their programming, and in the adverts during and between programs. Open any newspaper and there it is.

All that this article proves is that the software is not perfect. It is not racist, just flawed. As technology advances the errors will reduce.