WE ARE A MAGAZINE ABOUT LAW AND JUSTICE | AND THE DIFFERENCE BETWEEN THE TWO
October 19 2021

Digital forensic evidence relied upon in court is prone to ‘bias and error’, says report

Photographer: Harland Quarrington (MOD images from Flickr)

Digital forensic evidence relied upon in court is prone to ‘bias and error’ with ‘inherent’ uncertainty and can lead to wrongful convictions, according to a new study which takes issue with the notion that such evidence is ‘objective’. The research, published in Forensic Science International, explores the twin issues of reliability and bias in decision-making in ‘the new and fast changing’ discipline of digital forensics. It draws on reports made by some 53 experts involved in criminal investigations, mainly from Norway but also from the UK, Denmark, Finland, Canada and elsewhere. The experts were asked to analyse the same 3GB evidence file, involving a scenario relating to a leak of confidential information, and were given four to five hours to examine it and write their reports.

The experts were given different types of contextual information: either strongly indicating guilt, containing ambiguous and weak indications of guilt, or else indicating innocence. Other experts received only the scenario and no contextual information. The purpose was to better understand how such information affected the experts’ judgment. The group provided with the context suggesting innocence made the smallest number of observations, ‘indicating that they were biased to find less evidence’; by contrast, the group presented with information suggesting guilt found the highest number of observations, ‘indicating they were biased to find more evidence’ – with the ‘weak guilt’ group finding ‘significantly more’ observations of guilt.

The report’s authors – Itiel E Dror, a senior cognitive neuroscience researcher at University College London, and Nina Sunde, of the Norwegian Police University College – highlight the lack of quality control in the digital forensics sector. It has been reported that digital evidence now features in around 90% of criminal cases. The report argues that the ‘ever-changing landscape of technology’, both in terms of the evidence itself and the forensic tools required to examine it, requires ‘constant adaptation’. ‘This constant flux may have contributed to what may be described as a “quality challenge” in digital forensics,’ they say, referring to concerns about a ‘wild west… where key components for quality management rarely seemed to be in place’.

They say that the growing use of digital evidence is often perceived as ‘objective and credible’; however, there is ‘a range of possible errors and uncertainties inherent in the evidence’ as well as ‘errors that can derive from the human factors involved in the processes in which the evidence is produced’. The authors also flagged low ‘reliability’ (including a lack of consistency) between experts in their observations, indicating ‘a serious and urgent need for quality assurance in digital forensic examinations’ to prevent ‘erroneous results from cascading into the investigation process’.

‘Digital forensics work involves many judgements and decisions that require interpretation and subjectivity. This means that the quality of the outcome of the process – digital evidence – is dependent on cognitive and human factors, which can lead to bias and error.’
Itiel E Dror and Nina Sunde

Earlier this year the Justice Gap reported that the forensic science regulator, Gillian Tully, in her final report, described her six-year tenure as ‘fraught with financial, reputational and capacity problems’. She noted that digital media investigators had not yet made ‘any significant steps towards implementing the required quality standards’. In some police forces they were kept separate from their digital forensics colleagues, ‘presumably… in an attempt to avoid the adoption of quality standards’.

Dr Tully said there was ‘an urgent need’ for ‘more fundamental change’ and that it was ‘inexcusable’ that the impact of ‘the shortfalls in capacity for toxicology and digital forensics which have been clear for many years’ fell on the frontline forensic science practitioners. She continued: ‘They bear the brunt of the stresses in the system, with consequent risks to their well-being and, potentially, to quality. The impact on justice is even more inexcusable.’

A report by the Police Foundation published in January on digital forensics highlighted the problems faced by the sector in the UK where ‘a fragmented police service is widely deemed ineffective’ and each force had ‘different governance, systems, priorities and capabilities’. The report noted that the ‘sheer volume of data trails’ was ‘growing at an alarming rate and in parallel with ever-improving criminal innovation’. ‘Silo working undoubtedly has a negative impact on quality and this is exacerbated by a lack of common standards,’ it said. ‘While ISO accreditation ensures that evidence submitted to courts is reliable, forces have generally struggled to attain it. Furthermore, it is normal practice to contract private forensic providers to cope with demand, but a lack of collaboration and the use of different tools can affect extraction results. This is of particular concern when presenting evidence in court.’

In a 2019 editorial for Science magazine, Dror wrote that forensic experts are ‘too often exposed to irrelevant contextual information’, mainly because they work with the police and prosecution. ‘Extraneous information – from a suspect’s ethnicity or criminal record to eyewitness identifications, confessions, and other lines of evidence – can potentially cause bias,’ he wrote. ‘This can give rise to conclusions that are incorrect or overstated, rather than what forensic decisions should be: impartial decisions, appropriately circumscribed by what the evidence actually supports.’

As a result of cognitive biases, science is ‘misused, and sometimes even abused, in court’. ‘Not only can irrelevant information bias a particular aspect of an investigation, it often causes “bias cascade” from one component of an investigation to another and “bias snowball,” whereby the bias increases in strength and momentum as different components of an investigation influence one another,’ he wrote.