

The Pendulum of Private and Public Interests: Striking the Balance with New Technology


R (on the application of Edward Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)

The law must keep pace with new and emerging technologies. This case concerned AFR (Automated Facial Recognition), which goes beyond conventional CCTV by comparing an individual’s facial biometrics against a stored image. When an image is compared, a ‘similarity score’ is generated: a numerical value indicating the likelihood that the two faces match. At the time of the case, AFR had been used by the police in Cardiff on several occasions, and the Claimant contended that its deployment in his vicinity on 21 December 2017 and 27 March 2018 was unlawful because it contravened Article 8(1) of the European Convention on Human Rights, the right to respect for one’s private life.

Novel cases such as this highlight the need to balance the protection of private interests against the public interest when harnessing new technologies to aid the detection and prevention of crime; the need to strike the right balance permeates the judgment. The court noted Lord Steyn’s emphasis (in the context of DNA retention) on the public benefits of new technologies for law enforcement agencies, and his view that such agencies should be able to take full advantage of the opportunities they present (reference). However, the court also noted Lord Reed’s counter-observation of the growing concern in Western democracies about the surveillance, collection and use of personal data. Given the use of such data by more authoritarian states such as China, this concern is well founded.

On 27 March 2018, AFR was deployed outside the Motorpoint Arena in Cardiff. A defence exhibition held there previously had attracted disorder: individuals had caused criminal damage and made two bomb hoax calls. The police compiled a watchlist of wanted individuals, categorised by colour; for instance, red (wanted for serious crimes), amber (wanted on warrant) and purple (suspected of committing a crime). Scanned faces were then compared against this watchlist.

The Claimant stated that he was unaware that AFR was in use and that his face was being scanned. However, it was not possible to check whether his face had been scanned: if his facial biometric information was processed, the system would have identified him as not a person of interest and immediately deleted his biometric data. Furthermore, the officers involved informed the public at the scene of what was occurring, and notice was also given on social media.

The question before the court was whether this engaged Article 8(1), the right to respect for private and family life. The court drew attention to Wood, which rejected the proposition that “the bare act of taking pictures” amounted to an interference with Article 8(1) rights. Furthermore, in the context of police activity, it was suggested that the activity complained of was “expected and unsurprising” and thus did not breach Article 8(1). That said, the court distinguished AFR, which goes beyond taking mere pictures: it captures an individual’s biometric data, which is then further processed by comparison against the watchlist database. Such use of the Claimant’s biometric data goes well beyond the “expected and unsurprising,” thus engaging Article 8(1).

It was established that the police had a basis in law for using AFR. Nevertheless, it had to be decided whether the interference with the Claimant’s Article 8(1) rights was justified under the four-part proportionality test in Bank Mellat. It was held that the use of AFR pursued a legitimate aim capable of justifying interference with the Claimant’s Article 8(1) rights, and that its use was rationally connected to that aim. The questions then became whether a less intrusive measure could have been used without compromising the objective, and whether a fair balance had been struck between the rights of the individual and the interests of the community. As AFR was deployed in a limited, open and transparent way, the intrusion into the Claimant’s Article 8(1) rights was also very limited. Moreover, any scanned individuals who were not on the watchlist had their data instantly deleted, mitigating any interference with their Article 8(1) rights. Consequently, the use of AFR did not amount to a disproportionate interference with the Claimant’s rights, any interference being limited by the near-instantaneous discarding of his biometric data. As such, the use of AFR was justified.

The court noted that, given AFR’s current use, it was satisfied that the technology was being used proportionately. However, the use of AFR undoubtedly raises questions about the balance between the rights of the individual and those of the community, questions that will no doubt be revisited as the technology develops further.


The Power of Data


Every second, 40,000 searches are conducted on Google. Every minute, 31.35 million messages are sent on Facebook. Every day, 95 million photos are uploaded to Instagram, while a staggering 2.5 quintillion bytes of data are created. With a click of a button we transfer vast amounts of data, but we often underestimate its power. In recent times, the false consciousness of ‘privacy’ has been exposed: our own data has been weaponised, manipulated and misused to influence democratic decisions.

We have always known that our data is never entirely free from invasions of privacy; such intrusion has been justified as ‘government surveillance’ that keeps us safe. However, we did not know the extent to which our data was being sold by tech giants and used by firms to shape political decisions; a form of psychological warfare.

Cambridge Analytica (CA) was a political consulting firm founded in 2013 as a branch of Strategic Communication Laboratories (SCL). The company claimed to tailor messaging to persuade voters by extrapolating people’s personalities from survey data.

The exploits were first reported by The Guardian in 2015. The report claimed that Ted Cruz’s political campaign was using the psychological data of millions of Facebook users without their permission. It is worth noting that Facebook also owns Instagram and WhatsApp. The full extent of this ‘behavioural microtargeting’ was only revealed in 2018 by a former CA employee, Christopher Wylie. He provided first-hand testimony as well as documents revealing how Facebook user data was deployed in political campaigns around the world, particularly Trump’s campaign and Brexit.[1]

Facebook allowed the Cambridge University researcher Aleksandr Kogan to gain access to the data of 87 million of its users, data which was then passed to CA. This was achieved through a Facebook app called ‘thisisyourdigitallife’, formatted as a personality quiz. Users were paid to take part, but the app also collected data from the quiz takers’ Facebook friends. The data included personal information, the pages people liked, and even private messages. The data was analysed, and targeted ads were shown particularly to the ‘persuadables’: a demographic of undecided voters in the middle. These ads focused heavily on Hillary Clinton’s controversies, anti-immigration messaging and pro-Trump propaganda. In 2019, new evidence showed that CA had also worked on the ‘Leave’ campaign during Brexit, running its website and using targeted advertisements (reference). Further investigations showed that CA had a major influence on political campaigns globally: more than 100 campaigns in over 30 countries, including India and Trinidad.

Facebook failed to disclose the breach, and Mark Zuckerberg was called to testify before Congress. He argued that Facebook had not been aware that user data was being misused. New privacy policies and stricter controls were introduced, albeit too late. CA strongly denied all the allegations, but both SCL and CA have since gone into liquidation. The triggering factor was an undercover Channel 4 documentary in which CA bosses were filmed explaining how the organisation operated (reference). This included the scale of its work on Trump’s campaign, the use of fake identities to avoid investigation, untraceable propaganda, honey trapping and bribery.

Many predictions have been made about the future of data. What is clear is that almost every industry relies on it and nothing is immune to it. It is not data itself that is problematic, but rather its analysis. This misconduct has revealed a new wave of crime in which no one is held responsible. Data can be created, destroyed and transformed; it can be the solution, but in the wrong hands it can also be a weapon. Legal development is needed in this area to ensure that protection sits at its centre.

This scandal has transformed the relationship between tech and politics. It has also made the public more aware of misinformation, and mistrust of social media has grown. The truth is that social media is forever growing, and the digital footprints that already exist are hard to erase. A recent Netflix documentary, The Social Dilemma, showed former engineers from tech giants addressing the disturbing influence of algorithms in polarising society, with businesses often profiting from disinformation. Examples include conspiracy theories about the coronavirus, 5G, climate change, Pizzagate and flat-earthers. The amplification of mere hearsay through these tools of persuasion is blurring the line between truth and falsehood. Politicians, too, engage in the ‘spread of manipulative narratives with phenomenal ease’, as seen in Myanmar, because such platforms allow it. There is therefore a lack of control over who we are and what we believe, and this causes discord in society. The power of data is the absence of ours.

[1] BBC, ‘Cheating may have swayed Brexit poll – Christopher Wylie’ (BBC, 2018); and Alex Hern and Dan Sabbagh, ‘EU referendum won through fraud, whistleblower tells MPs’ (The Guardian, 2018)


Infosoc 2018: Informational Rights, Informational Wrongs, and Regulatory Responsibilities

Roger Brownsword - Professor of Law

Working Papers Series, edition 1
This is the first edition of the Bournemouth University Working Papers Series, launched in March 2018, which presents research articles by staff members of the Law Department at Bournemouth University.

The information that surrounds us in a digital world has many legal implications and raises a number of questions requiring further examination. As such, the Working Papers in Law Series introduces the field of informational rights and wrongs as one that invites further inquiry. The papers presented in this Series will explore various informational interests cutting across a large number of legal disciplines, as well as other topics of interest.

Led by Professor Roger Brownsword (Consulting Editor, BULR), the BUWPS provides a platform for legal inquiry in this fast-moving field, whilst representing the research of BU Law Staff, through this student-staff co-created journal.


The Balancing of Rights in a Democratic Society – Are the Media Too Free?

Chloe Beeney - Final Year LLB (Hons) Student

Published in Issue 1, September 2017

England and Wales thrives as a democratic society, promoting equal rights for all.

Although freedom of expression is a fundamental right, it is ultimately weighed against the equally crucial right to privacy and the right not to be defamed. However, with the advance of technology, and in particular the rise of the internet, the media are less restricted in what they publish, leading to greater infringement of individuals’ rights. Despite attempts to control this through the reform of defamation law, it is argued that the law is inadequate at guarding against conflicts between the media and individuals, leaving the media with greater freedom than before. With suggestions that this has become unmanageable, equal rights seem to be a thing of the past, and despite recent attempts to resolve the problem, the law is essentially not equipped to do so.