The Danger of Automatic Facial Recognition Technology and Court Restrictions


On 11 August the Court of Appeal found that the use of automatic facial recognition (AFR) technology was a proportionate interference with human rights, as the benefits outweighed the impact on the data subject. Is it time for the government to recognise the dangers of this technology?

The Grounds of Appeal

The Court found that South Wales Police’s (SWP) Data Protection Impact Assessment (DPIA) did not meet the statutory requirements: it failed to properly address the engagement of Convention rights, or the processing of the biometric personal data of persons whose facial biometrics are captured by AFR but who are not on the police watchlists used for AFR.

The use of AFR technology involves the collection, processing, and storage of a wide range of information, including 

(1) facial images; 

(2) facial features (i.e. biometric data); 

(3) metadata, including time and location, associated with the same; and 

(4) information to match with persons on a watchlist.
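As a purely illustrative sketch (not SWP’s actual system, and with hypothetical field names), the four categories of information above could be modelled as a single capture record plus a watchlist comparison step:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AFRCapture:
    facial_image: bytes               # (1) the raw facial image
    biometric_template: list[float]   # (2) facial features as a biometric template
    timestamp: datetime               # (3) metadata: when the face was captured
    location: str                     # (3) metadata: where the face was captured

def matches_watchlist(capture: AFRCapture,
                      watchlist: dict[str, list[float]],
                      threshold: float = 0.9) -> list[str]:
    """(4) Compare a capture's template against watchlist templates.

    Uses a simple cosine-similarity check; real AFR systems use far more
    sophisticated matching, so this is illustrative only.
    """
    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    return [name for name, template in watchlist.items()
            if cosine(capture.biometric_template, template) >= threshold]
```

The sketch makes the legal point concrete: every passer-by generates a full `AFRCapture` record, including a biometric template, whether or not they appear on any watchlist.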

Facial Biometric Features

Automatic facial recognition entails the processing of biometric data in the form of facial biometrics. The term “biometrics” is described as “the recognition of people based on measurement and analysis of their biological characteristics or behavioural data”. Facial biometrics bear some similarity to fingerprints: both can be captured without any form of intimate sampling, and both concern a part of the body that is generally visible to the public.

A significant difference, however, is that AFR technology enables facial biometrics to be procured without the co-operation or knowledge of the subject, without the use of force, and on a mass scale.


Facial Recognition Use by Law Enforcement

The Court held that the police had been given “too broad a discretion” over the watchlists against which scanned faces are compared, without the adequate data protection impact assessment required by the Data Protection Act 2018. According to the grounds of appeal, the police use of live automated facial recognition technology was not in accordance with the law for the purposes of Article 8(2) of the European Convention on Human Rights.

The police should therefore have thoroughly reviewed their DPIAs to assess the risks to data subjects.

The full article, including details of all five grounds of appeal and accompanying legal commentary, is available to download via the form below.

