The Danger of Automatic Facial Recognition Technology and Court Restrictions


On 11 August 2020 the Court of Appeal found that the use of automatic facial recognition (AFR) technology was a proportionate interference with human rights, as the benefits outweighed the impact on the data subject, even though it upheld other grounds of challenge to the technology's use. Is it time for the government to recognise the dangers of this technology?

The Grounds of Appeal

The Court found that South Wales Police's (SWP) data protection impact assessment (DPIA) did not meet the statutory requirements: it failed to properly address the engagement (or non-engagement) of the rights of, and the processing of the biometric personal data of, persons whose facial biometrics are captured by AFR but who are not on the police watchlists used for AFR.

The use of AFR technology involves the collection, processing, and storage of a wide range of information (a simple illustrative data model is sketched after this list), including:

(1) facial images;
(2) facial features (i.e. biometric data);
(3) metadata, including the time and location, associated with the same; and
(4) information used to match against persons on a watchlist.
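To make these categories concrete, here is a minimal sketch of how a single AFR capture could be modelled as a data record. The class and field names (AFRCapture, facial_features, watchlist_match, and so on) are hypothetical illustrations of the four categories above; they do not describe SWP's system or any real deployment.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class AFRCapture:
    """Hypothetical record of a single AFR capture event.

    The class and field names are illustrative only; they mirror the four
    categories of information listed above, not any real police system.
    """
    facial_image: bytes                    # (1) the captured facial image frame
    facial_features: list                  # (2) biometric template derived from the image
    captured_at: datetime                  # (3) metadata: time of capture
    location: str                          # (3) metadata: camera location
    watchlist_match: Optional[str] = None  # (4) ID of a matched watchlist entry, if any
```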

Facial Biometric Features

Automatic facial recognition entails the processing of biometric data in the form of facial biometrics. The term "biometrics" describes "the recognition of people based on measurement and analysis of their biological characteristics or behavioural data". It is worth noting that facial biometrics bear some similarity to fingerprints: both can be captured without any form of intimate sampling, and both concern a part of the body that is generally visible to the public.

A significant difference, however, is that AFR technology enables facial biometrics to be procured without the co-operation or knowledge of the subject, and without the use of force, and allows them to be captured on a mass scale.
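In general terms, an AFR system reduces each captured face to a numeric feature template and compares it against the templates of people on a watchlist, declaring a match when a similarity score crosses a threshold. The sketch below illustrates that general idea using cosine similarity; the function names and the 0.8 threshold are assumptions for illustration only, not the method or settings used in any police deployment.

```python
import math
from typing import Dict, List, Optional

def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two biometric feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe: List[float],
                            watchlist: Dict[str, List[float]],
                            threshold: float = 0.8) -> Optional[str]:
    """Return the ID of the best-matching watchlist entry above the threshold, or None.

    The threshold of 0.8 is purely illustrative; operational systems tune it to
    trade off false matches against missed matches.
    """
    best_id, best_score = None, threshold
    for person_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```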


Facial Recognition Use by Law Enforcement

The Court held that the police had been given "too broad a discretion" over the watchlists used to compare scanned faces against, without the adequate data protection impact assessment required by the Data Protection Act 2018. According to the grounds of appeal, the police use of live automated facial recognition technology was not in accordance with the law for the purposes of Article 8(2) of the European Convention on Human Rights.

The police should therefore have thoroughly reviewed their DPIAs to assess the risks to data subjects.

The full article, including details of all five grounds of appeal and accompanying commentary, is available to download through the form below.


