The Pendulum of Private and Public Interests: Striking the Balance with New Technology

R (on the application of Edward Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)

The law must always keep pace with new and emerging technologies. In this particular case the technology was AFR (Automatic Facial Recognition), which goes beyond regular CCTV by comparing an individual's facial biometrics against a stored image. When an image is compared, a 'similarity score' is generated: a numerical value indicating the likelihood that the two faces match. At the time of the case, AFR had been deployed by the police in Cardiff on several occasions, and the Claimant contended that its deployment in his vicinity on 21 December 2017 and 27 March 2018 was unlawful because it contravened Article 8(1) of the European Convention on Human Rights, the right to respect for one's private life.
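
To make the 'similarity score' idea concrete, the sketch below shows one plausible way such a score could be derived, assuming each face has already been reduced to a numerical embedding vector. It is illustrative only: the judgment does not disclose the algorithm South Wales Police used, and the function names and threshold value here are hypothetical.

import numpy as np

def similarity_score(probe: np.ndarray, candidate: np.ndarray) -> float:
    """Return a 0-100 score; higher means the two faces are more alike."""
    cosine = float(np.dot(probe, candidate) /
                   (np.linalg.norm(probe) * np.linalg.norm(candidate)))
    return (cosine + 1) / 2 * 100  # map cosine similarity [-1, 1] onto [0, 100]

MATCH_THRESHOLD = 90.0  # hypothetical operator-set threshold

def is_possible_match(probe: np.ndarray, candidate: np.ndarray) -> bool:
    # A score at or above the threshold flags a possible match for review.
    return similarity_score(probe, candidate) >= MATCH_THRESHOLD

On this model the 'similarity score' is just a number; whether a scanned face is treated as a possible match depends entirely on the threshold the operator chooses.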

Novel instances such as this highlight the need to balance the protection of private interests against the public interest when harnessing new technologies to aid the detection and prevention of crime. The need to strike the right balance is a theme that permeates the judgment. The court noted Lord Steyn's emphasis (in the context of DNA retention) on the public benefits for law enforcement agencies of using new technologies, and his view that such agencies should be able to take full advantage of the opportunities they present (reference). However, Lord Reed's counter-observation, that there is growing concern in Western democracies about the surveillance, collection and use of personal data, was also noted. Given the use of such data by more totalitarian states such as China, this concern is well founded.

On 27 March 2018, AFR was deployed outside the Motorpoint Arena in Cardiff. A defence exhibition held there previously had attracted disorder: individuals had caused criminal damage and made two bomb hoax calls. The police compiled a watchlist of wanted individuals, categorised by colour: for instance, red (wanted for serious crimes), amber (wanted on warrant) and purple (suspected of committing a crime). Scanned faces were then compared against this watchlist.

In this case, the Claimant stated that he was unaware that AFR was in use and that his face was being scanned. However, it was not possible to verify whether his face had been scanned: if his facial biometric information had been processed, the system would have identified him as not being a person of interest and immediately deleted his biometric data. Furthermore, the officers involved provided the public with information about what was occurring and also publicised the deployment on social media.
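
The deletion point is central to the court's later proportionality analysis, so the sketch below models the two behaviours described above: comparing a scanned face against a colour-coded watchlist and discarding the biometric data at once when there is no match. Again, this is illustrative only; the data structures, names and threshold are assumptions, not a description of the actual police system.

from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class WatchlistEntry:
    name: str
    category: str          # e.g. "red" (serious crime), "amber" (wanted on warrant), "purple" (suspect)
    embedding: np.ndarray  # stored facial biometric

MATCH_THRESHOLD = 90.0     # hypothetical

def similarity_score(a: np.ndarray, b: np.ndarray) -> float:
    cosine = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return (cosine + 1) / 2 * 100

def scan_face(probe: np.ndarray, watchlist: List[WatchlistEntry]) -> Optional[WatchlistEntry]:
    """Compare a scanned face against the watchlist.

    Returns the best match above the threshold for an officer to review,
    or None. When None is returned the caller retains no copy of the probe
    biometric: it simply goes out of scope and is discarded, mirroring the
    near-instantaneous deletion relied on by the court.
    """
    best, best_score = None, 0.0
    for entry in watchlist:
        score = similarity_score(probe, entry.embedding)
        if score > best_score:
            best, best_score = entry, score
    return best if best_score >= MATCH_THRESHOLD else None

The key point for proportionality is that, for anyone not on the watchlist, nothing is retained once the comparison completes.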

The question before the court was whether this engaged Article 8(1), the right to respect for private and family life. Their Lordships drew attention to Wood, which rejected the suggestion that "the bare act of taking pictures" amounted to an interference with Article 8(1) rights. Furthermore, in the context of police activity, it was suggested that activity which is "expected and unsurprising" does not breach Article 8(1). The court nonetheless distinguished AFR on the basis that it goes beyond the mere taking of pictures: it captures an individual's biometric data, which is then further processed by comparison against the watchlist database. Such a use of the Claimant's biometric data goes well beyond the "expected and unsurprising", thus engaging Article 8(1).

It was established that the police have a basis in law for using AFR. Nevertheless, it had to be decided whether the interference with the Claimant's Article 8(1) rights was justified under the four-part test in Bank Mellat. It was held that the use of AFR pursued a legitimate aim capable of justifying interference with the Claimant's rights under Article 8(1), and that its use was rationally connected to that aim. The question then became whether a less intrusive measure could have been used without compromising the objective, and whether a fair balance had been struck between the rights of the individual and the interests of the community. As AFR was deployed in a limited, open and transparent way, the intrusion into the Claimant's Article 8(1) rights was also very limited. Moreover, any individuals who were scanned but not on the watchlist would instantly have had their data deleted, further mitigating any interference with their Article 8(1) rights. Consequently, the use of AFR did not amount to a disproportionate interference with the Claimant's Article 8(1) rights, any interference being limited owing to the near-instantaneous discarding of his biometric data. As such, the use of AFR was justified.

The court noted that, given the manner of AFR's current use, it was satisfied that the technology is being deployed proportionately. However, the use of AFR undoubtedly raises questions about the balance between the rights of the individual and the interests of the community, questions which will no doubt be revisited as the technology develops further.
