Presenting digital evidence in court is an onerous and complex process. Dr Ian Kennedy CITP FBCS MCSFS outlines the processes, rules and future of digital forensics.

Contrary to the phrase ‘the evidence speaks for itself’, the presentation of digital forensic evidence to court does not readily convey inference to a jury (or other decision maker) deliberating over the guilt or innocence of individuals. Such evidence continues to pose a number of challenges for digital forensic practitioners (referred to hereafter simply as ‘practitioners’). Today’s practitioner operates in a very different environment to the one I described back in 2006, when I last wrote on the subject of presenting digital evidence to court as a practitioner working for the police. Practitioners face many challenges when investigating and preparing evidence for court. Let’s explore some of these.

Data volumes and diversity

In recent years, there has been an explosion of digital data. In his testimony to the House of Commons Select Committee, the head of digital in the Cyber and Communications Forensics Unit of the Metropolitan Police stated that 90 percent of crime now has a digital element to it.

Internet of Things (IoT) technology includes an almost unlimited range of devices with a diverse range of data including biometric data from wearable devices, doorbell or dashcam footage, personal weather station data, and energy usage data from smart plugs. Juries of the future may even begin to see the appearance of cyber physical system (CPS) related cases arising out of network breaches and control of Industrial Internet of Things (IIoT) devices.

Each of these devices can potentially capture, store and transmit data to cloud-based platforms in a variety of jurisdictions, making recovery of data for criminal proceedings challenging. Whilst any one case is unlikely to have many IoT data types to consider, it does mean courts have to contend with understanding the operation and meaning of an increasing variety of devices and their data.

One initiative launched in an effort to simplify the gathering of large quantities of dashcam evidence is Operation Snap. It allows members of the public to upload dashcam footage, accompanied by a witness statement.

What is interesting is that, to date, the authenticity of such footage (potentially submitted by parties with a vested interest in the outcome) has yet to be challenged, thereby risking its admissibility.


Admissibility of evidence

In the criminal justice system of England and Wales, the rules concerning evidence are quite relaxed and largely concerned with exclusion. Before it can be tendered at trial, this ‘potential evidence’ must be declared legally admissible to the criminal proceedings. The court has the final say on this matter and ultimately asks three questions of the proposed evidence. These questions examine whether the material is:

  1. Relevant to a fact in issue
  2. Not subject to any exclusionary rules (such as hearsay)
  3. Subject to any inclusionary rules (such as a witness statement)

A nuance concerning relevance is that the material must relate to a specific matter in dispute. If the opposing counsel have already conceded a point, any material that only supports that point is redundant, and therefore likely to be deemed inadmissible under the first test above. Thus, if the defence counsel have conceded that a blackmail email was sent to the victim from the defendant’s email account, any evidence supporting this particular point is redundant and therefore likely to be inadmissible. Note that this concession does not of itself prove the defendant is the perpetrator of the act.

Evolved regulatory landscape

The publication of the Government’s 2005 ‘Forensic science on trial’ report recognised even then that digital forensics was becoming ‘critical in a growing number of cases’. It prompted two significant changes to forensic science practice within the criminal justice system of England and Wales.

Firstly, Criminal Practice Direction (CPD) 19A.5 was introduced to safeguard against unreliable expert opinion evidence. Expert opinion evidence is generally permitted if the judge allows it and the practitioner is accepted as an ‘expert’.

Secondly, the report led to the establishment of the Forensic Science Regulator (FSR), which placed a spotlight of greater scrutiny on what may best be described as a presumption of reliability previously bestowed upon expert evidence.


A fair and impartial criminal justice system requires reliable evidence. What may surprise (perhaps alarm) you is that a 2021 study exploring the importance of forensic evidence for decisions on criminal guilt found that the inclusion of forensic evidence increased both the likelihood of a guilty verdict and the level of confidence in that verdict. According to some, lawyers and forensic scientists don’t always share a common understanding of what reliability even is.

Reliable or not, juries can struggle with fully understanding some technical evidence. Some academics argue juries simply end up trusting the expert’s interpretation. As humans, we often like to simplify things to understand the overall picture. Abstraction is one way of doing this, but it can come at a cost.


Digital evidence is almost always an interpretation and representation of what is physically stored on a device, such as the polarised states of magnetic flux on a hard disk. To make this human readable for the court, the practitioner uses tools and their own skills to decode and visualise the data.

This abstraction removes information from the underlying data, creating a simplified view of the same data. To see this in action, just look at the headers behind a typical email.

Removing information can introduce errors. A screenshot of a decoded email discards the information in the headers, leaving an incomplete picture of the data. Previous studies examining data acquired from the cloud have concluded that abstraction of cloud data produces inferior forensic evidence.
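The information loss that abstraction causes can be sketched in a few lines of Python. The raw message below is entirely fabricated for illustration; the point is that the fields a mail client renders (and a screenshot captures) are only a subset of the headers present in the underlying data.

```python
from email import message_from_string

# A minimal raw email (fabricated values, not from any real case)
raw = """\
Received: from mail.example.com ([203.0.113.7]) by mx.example.org;
 Mon, 01 Jan 2024 09:00:00 +0000
Message-ID: <abc123@example.com>
From: sender@example.com
To: recipient@example.org
Subject: Invoice attached
Content-Type: text/plain

Please find the invoice attached.
"""

msg = message_from_string(raw)

# Roughly what a screenshot of a mail client would show
rendered = {
    "From": msg["From"],
    "To": msg["To"],
    "Subject": msg["Subject"],
    "Body": msg.get_payload().strip(),
}

# What that abstraction discards: routing and identity metadata
discarded = {k: v for k, v in msg.items()
             if k not in ("From", "To", "Subject", "Content-Type")}

print(rendered)
print(discarded)  # the Received and Message-ID headers never reach the screenshot
```

Here the `Received` header, which records the path the message took and the originating server, survives only in the raw data. A practitioner working from the rendered view alone has no way to test whether the message really travelled the route it claims.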

With or without error, data will often require meaningful inference in the context of the investigation. This is typically provided by the practitioner, who draws on their expertise to interpret the data. In addition, the practitioner and their methods may come under scrutiny. One example of this is the potential issue of practitioner bias.

Practitioner bias

Just like everyone else, practitioners are human and so can be at the mercy of unconscious bias, which can affect the reliability of any evidence they tender at court.

Unconscious bias is not a new idea, but it has been recognised by the FSR, who articulate several categories of it. Each can ultimately impact the reliability of evidence tendered at court:

  • Expectation bias
  • Confirmation bias
  • Anchoring bias
  • Contextual bias
  • Role effects
  • Motivational bias
  • Reconstructive effects

The impact of bias is also recognised by rule 19.2 of the Criminal Procedure Rules of England and Wales, which covers the use of expert evidence in court. It states that any opinion a practitioner acting as an expert witness offers must be ‘objective and unbiased’. Furthermore, the expert’s duty is to assist the court, and this duty ‘overrides any obligation to the person from whom the expert receives instructions or by whom the expert is paid’. Thus, whether acting for the defence or the prosecution, the practitioner is expected to maintain a level of neutrality in their work and testimony.

Regardless of any bias, the conclusions drawn by the practitioner must be communicated to the court. This then presents another challenge: the use of appropriate scientific language.

Misuse of scientific language

Because much of the evidence can be complex, it must be expressed in a language that is both comprehensible to a layperson, yet exact enough to avoid any ambiguity or misinterpretation. A study of 500 randomly selected forensic science reports found that ‘questionable communication practices’ were common and that ‘little information’ was provided to support their ‘absolute conclusions’.

Phrases such as ‘it is likely’ are commonly used to express a conclusion. For example, you may encounter:

‘It is likely that the files were downloaded from the internet to the USB.’

As practitioners, if we express that something is ‘likely’ then we are indicating our conclusion is based on a likelihood. This is calculated using something called a ‘posterior probability’.

This is an area of statistics that calculates the probability of something happening given some known information or past event. Often missing from reports where a likelihood is mentioned is any detail on how that likelihood has been determined. In his book on forensic evidence in court, Craig Adam points out that this likelihood may be entirely subjective. The problem is that calculating probabilities essentially requires knowing all the possible outcomes for a given scenario, which is complex (or impractical) in digital forensics. Interestingly, an English judge has previously ruled ‘against using similar statistical analysis in the courts in future’, as it was misunderstood both by the court and by the expert who attempted to use it.
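To make the idea concrete, a posterior probability can be computed with Bayes’ theorem. The figures below are invented purely for illustration, which is precisely the problem highlighted above: in a real investigation, values such as these are rarely known and are often subjective.

```python
# Hypothetical figures for illustration only; none come from a real case.
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)

p_h = 0.01              # prior: probability the files were deliberately downloaded
p_e_given_h = 0.95      # probability of observing these artefacts if they were
p_e_given_not_h = 0.10  # probability of the same artefacts arising otherwise

# Total probability of observing the evidence at all
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: probability of the hypothesis given the evidence
posterior = p_e_given_h * p_h / p_e
print(f"{posterior:.3f}")  # → 0.088
```

Note how sensitive the result is to the assumed prior: with these invented inputs, artefacts that are 95% likely under the hypothesis still yield a posterior under 9%, because the hypothesis was assumed rare to begin with. A report that states ‘it is likely’ without exposing these inputs hides exactly this sensitivity.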

Another common phrase you can expect to encounter in expert testimony is the ambiguous expression ‘is consistent with’, which states some (unknown) degree of similarity between two things. An example of this might be:

‘The keywords used are consistent with an individual searching for hacking tools.’

When these types of expressions are used, typically there is no indication how common the ‘consistent’ features are in the wider population.

Coming from the lips of a practitioner with expertise in the subject, such words can be compelling to the court. The expertise of the practitioner is thus another potential risk to the reliability of evidence.

Exceeding expertise

Practitioners typically specialise in relatively narrow areas of expertise, such as DNA profiling, ballistics, computer forensics or mobile phones. Whatever their field, each must steer a very narrow course to stay within their area of expertise when presenting their testimony.

As a practitioner, you need to be very alert to the risks of exceeding your expertise, as many opportunities exist for stepping outside your area of specialism. For example, confirmation bias can happen when attempting to corroborate your findings from another (related) field. This might be a practitioner trained in mobile forensics attempting to interpret the findings of a cell site analysis survey.

Similarly, you may try to quantify the uncertainties in your data using statistics without fully understanding how to do so correctly. This issue led to the flawed evidence tendered by paediatrician Professor Sir Roy Meadow, who fell foul of the prosecutor’s fallacy in the now infamous trials of R v Clark [2003] EWCA Crim 1020, R v Cannings [2004] EWCA Crim 1 and R v Patel [2003]. This error continued to appear in cases as recently as R v London Borough of Croydon (EWHC 1473).
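The prosecutor’s fallacy itself can be shown numerically. The sketch below uses invented figures, not numbers from any of the cases cited, to show how the probability of a match given innocence differs dramatically from the probability of guilt given a match.

```python
# Hypothetical numbers for illustration; not drawn from any cited case.
# Suppose a matching characteristic occurs in 1 in 10,000 innocent people,
# in a pool of 1,000,000 potential suspects, exactly one of whom is guilty.

p_match_given_innocent = 1 / 10_000
population = 1_000_000

# Expected number of innocent people in the pool who also match
innocent_matches = (population - 1) * p_match_given_innocent  # ≈ 100

# Probability the defendant is guilty given only the match,
# assuming the guilty person matches with certainty
p_guilty_given_match = 1 / (1 + innocent_matches)

print(p_match_given_innocent)          # 0.0001 — chance of a match if innocent
print(round(p_guilty_given_match, 4))  # 0.0099 — a very different quantity
```

The fallacy lies in presenting the first number (a 1 in 10,000 chance of a match if innocent) as though it were the second (the chance of guilt given the match), when under these assumptions around a hundred innocent people in the pool would match too.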

Practitioners may also fall victim to a strategy sometimes used by opposing counsel: pursuing a line of questioning that leads the practitioner outside their field of expertise, with a view to undermining their testimony or creating confusion for the jury, thus introducing reasonable doubt.

The future

Greater automation is almost certain to be one of the challenges of the future. It may be driven, in part, by rising data volumes and huge backlogs in both forensic labs and the courts. This includes the use of AI technology, already employed to assist document review in legal cases. Law firms are investing in AI and the legal services market is expected to change. This means practitioners may in future be engaging not simply with lawyers, but with new roles created within the profession, such as legal data scientists and others working within legal technology departments.

Some writers such as Richard Susskind see a transformative future for the justice system with the evolution of online courts. The UK Government already recognises this for non-criminal cases through online dispute resolutions.

There is also a move to experiment with using virtual reality in courts. Despite all these developments, there remains a level of scepticism about the increasing use of technology in the courtroom, due to concerns over technical reliability and its potential impact on vulnerable individuals taking part in the process.

A future where evidence can truly speak for itself without a human in the loop sounds scary, but is thankfully a long way off.