December 12 2024

Entrenched biases in fingerprint scanning and facial recognition, a new report shows

Emergency lights, Etolane, Flickr, under Creative Commons

A new report highlights racial and gender biases entrenched in the use of biometric fingerprint scanning and facial recognition technology, which disproportionately targets Black and Asian men. The Racial Justice Network and Yorkshire Resists have recently released an analysis of the increased use of the Biometric Services Gateway (mobile fingerprinting) by UK police forces, as well as of a live facial recognition pilot run by South Wales Police. The report draws on Freedom of Information (FOI) responses from 34 forces.

The Biometric Services Gateway (BSG) is a mobile application that allows an officer to scan an individual’s fingerprints and receive an instantaneous check against the police database (IDENT1) or the Immigration and Asylum Biometrics System (IABS). Currently 24 forces use the BSG. The grounds for stopping and scanning appear subjective and discretionary: most scans were conducted because an officer suspected an offence had been committed, or doubted the veracity of personal information (e.g. a name) given by the person stopped.
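To make that flow concrete, the sketch below is a toy model of the kind of scan-and-check lookup described above, querying a ‘police’ store and then an ‘immigration’ store. The data structures, function names and exact-match logic are illustrative assumptions; the real Gateway matches fingerprint features against IDENT1 and IABS in ways this sketch does not attempt to reproduce.

```python
# Illustrative sketch only: a toy model of a mobile fingerprint check
# against two separate databases, loosely mirroring the flow described
# above (IDENT1 for police records, IABS for immigration records).
# All names, structures and matching logic here are hypothetical.

from dataclasses import dataclass
from typing import Optional


@dataclass
class MatchResult:
    database: str   # which database produced the match
    record_id: str  # identifier of the matched record


def check_fingerprint(template: bytes,
                      ident1: dict[bytes, str],
                      iabs: dict[bytes, str]) -> Optional[MatchResult]:
    """Return the first match found, checking the police database first.

    In this toy model a 'template' is just a byte string and matching is
    an exact lookup; real systems compare fingerprint features and return
    probabilistic scores rather than simple hits.
    """
    if template in ident1:
        return MatchResult(database="IDENT1", record_id=ident1[template])
    if template in iabs:
        return MatchResult(database="IABS", record_id=iabs[template])
    return None  # no match on either database
```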

The report states that all but eight of the responding forces either did not collect or refused to provide ethnicity data. As Jake Hurfurt, Head of Research and Investigations at Big Brother Watch, put it, the discretionary nature of the practice ‘appears to be continuing and exacerbating racial disparity in police use of suspicionless surveillance powers.’ The data further showed that Black people are four times more likely, and Asian people twice as likely, to be stopped and scanned than white people. Men are around twelve times more likely to be stopped and scanned than women.
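For readers wanting to see how such ‘times more likely’ figures are typically derived, the sketch below computes a rate ratio: scans per head of population for each group, divided by the rate for the white population. The scan counts and population figures are placeholders chosen for illustration, not the report’s underlying data.

```python
# Illustrative sketch: how a disparity ratio is typically calculated.
# The figures below are invented placeholders, not the report's data.

scans = {"White": 10_000, "Black": 1_600, "Asian": 1_400}            # hypothetical scan counts
population = {"White": 1_000_000, "Black": 40_000, "Asian": 70_000}  # hypothetical populations

baseline = scans["White"] / population["White"]  # white scan rate per person

for group, count in scans.items():
    rate = count / population[group]
    print(f"{group}: {rate / baseline:.1f}x the white scan rate")
```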

Likewise, there are evident racial and gender biases in facial recognition technology. A 2019 study by the US National Institute of Standards and Technology (NIST) found that many algorithms were between 10 and 100 times more likely to misidentify a Black or East Asian face than a white face, with Black women the group most prone to misidentification. Similar algorithms have been used by South Wales Police on 35 people, six of whom were minors aged between 10 and 17. Furthermore, six forces stated that they routinely scan the police and immigration databases simultaneously, while a separate six forces stated that they scan specifically for immigration reasons. As the report highlights: ‘This both reflects and entrenches the damaging conflation of immigration with criminal activity…this technology is increasing the pervasive trend towards the criminalisation of migration.’

The report also suggests that police forces are increasingly taking on immigration and border-guard duties, further embedding Hostile Environment policies that disproportionately affect migrant and vulnerable communities. Campaigns such as LAWRS’s Step Up Migrant Women, which supports safe-reporting mechanisms for victims irrespective of immigration status, have long recognised this issue.

There is a wide disparity between forces in the percentage of scans resulting in a match and/or arrest, ranging from 1.22% to 57.72% for immigration matters and from 11.81% to 57.72% for police matters. This casts doubt on the objectivity and accuracy of such practices.
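These percentages are simply matches (or arrests) expressed as a share of the scans each force conducted; the short sketch below shows that calculation using invented FOI-style figures rather than the report’s actual returns.

```python
# Illustrative sketch: match rate per force = matches / scans * 100.
# Force names and figures are invented for illustration only.

foi_returns = {
    "Force A": {"scans": 2_500, "immigration_matches": 31, "police_matches": 300},
    "Force B": {"scans": 900, "immigration_matches": 520, "police_matches": 520},
}

for force, data in foi_returns.items():
    imm_rate = 100 * data["immigration_matches"] / data["scans"]
    pol_rate = 100 * data["police_matches"] / data["scans"]
    print(f"{force}: {imm_rate:.2f}% immigration, {pol_rate:.2f}% police")
```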

The report makes several recommendations, including an immediate end to the use of the BSG and the implementation of a firewall between the immigration and police databases. It also highlights that neither practice increases community safety; rather, both further erode community trust and deepen the marginalisation of vulnerable individuals.
