EDPB opines on the use of facial recognition in airports

By Marcus Evans (UK) & Shiv Daddar (UK) on July 18, 2024

Co-written by Swaathi Balajawahar, Trainee Solicitor

Introduction

On 23 May 2024, the European Data Protection Board (EDPB) issued Opinion 11/2024 on the use of facial recognition to streamline airport passengers’ flow (the Opinion). The Opinion considered the use of facial recognition technology (FRT) by airport operators and airline companies for the purpose of streamlining passenger flow at the airport (security checkpoints, baggage drop-off, boarding, and access to passenger lounges) where processing of personal data is subject to the General Data Protection Regulation (GDPR).

The EDPB regards the harms that can stem from centralised storage of biometric information (i.e. potential over-identification and tracking of the data subject by the service provider through scope creep, or unauthorised access by a third party allowing it to impersonate the data subject) as requiring a particularly strong justification of necessity.

The EDPB concluded in the Opinion that FRT can only be used to verify a passenger's identity in a manner compliant with the GDPR if the biometric template remains in the individual's hands, or where the biometric template is held in a central database but the encryption key is kept solely in the individual's hands. The EDPB also emphasised that data controllers must be able to justify their retention periods in line with GDPR requirements.
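
To make the "encryption key kept solely in the individual's hands" model concrete, the short Python sketch below shows client-side encryption of a placeholder biometric template using the cryptography library's Fernet scheme: the key is generated on the passenger's device and never shared, and the operator receives only the ciphertext. The data values and naming are illustrative assumptions on our part, not details taken from the Opinion.

from cryptography.fernet import Fernet

# Placeholder bytes standing in for a real biometric template
# produced at enrolment (e.g. a face embedding).
biometric_template = b"placeholder-face-embedding"

# The key is generated on, and never leaves, the passenger's device.
device_key = Fernet.generate_key()

# Only the ciphertext is handed to the operator's systems; without
# the device-held key the operator cannot recover the template.
encrypted_template = Fernet(device_key).encrypt(biometric_template)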

Scope

The Opinion analyses the compatibility of the use of FRT with the storage limitation principle (Article 5(1)(e) GDPR), the integrity and confidentiality principle (Article 5(1)(f) GDPR), data protection by design and by default (Article 25 GDPR) and security of processing (Article 32 GDPR). It sets out clearly that FRT requires the use of biometric data, which is granted special protection under Article 9 GDPR.

Compliance with other GDPR provisions, including analysis of the applicable legal basis, was not in scope of the Opinion, and the validity of consent as a lawful basis for the use of FRT was therefore not examined. However, the Opinion appears to assume that consent is required as the Article 9 condition for the processing of biometric data, and it stresses that any consent relied on would need to meet the requirements of the GDPR, which may have practical consequences (e.g. for consent to be valid, individuals must be able to choose whether or not to use these services without suffering any detriment, such as significantly longer delays).

Summary of Scenarios

The EDPB considered the compliance of the processing of passengers' biometric data under four different scenarios, summarised below. In all cases, only the biometric data of passengers who actively enrol and consent to participate should be processed, and the data controller must be able to demonstrate that there are no less intrusive alternative solutions that could achieve the same objective as effectively.

Scenario 1
Storage: Biometric template stored in the hands of the individual (e.g., on their device).
How facial recognition is implemented: Passengers' device used to transmit passengers' identification data (ID) / biometric template (e.g., through an encoded QR code which can be decoded by QR scanners at dedicated booths).
Enrolment (recording of ID / biometric template): Done by airport operator, remotely or at airport terminals. Occurs only once for a specific validity period.
Encryption: Not applicable as data stored locally on passengers' device.
Storage / retention period: Passengers' biometric data is retained only for a very short period and deleted after matching (i.e., verifying passenger identity via QR scanners) is completed.
Compliance: Could in principle be compatible with Articles 5(1)(e), 5(1)(f), 25 and 32 GDPR, subject to appropriate safeguards. The Opinion outlines recommended safeguards that should be implemented.

Scenario 2
Storage: Biometric database is stored within the airport premises, under the control of the airport operator but with the encryption key in the hands of the individual.
How facial recognition is implemented: App generates a QR code containing a key that is scanned at biometric checkpoints with a QR scanner and a camera. Passenger's index is sent to the database to request the encrypted template, which is downloaded and checked locally at the checkpoint and/or on the user's device. Only the matching result is known to and used by the checkpoint controller.
Enrolment (recording of ID / biometric template): Done by airport operator, remotely or at airport terminals. Occurs only once for a specific validity period.
Encryption: ID and biometric data is encrypted with a key stored on passengers' devices. A second layer of encryption is done by the airport operator with keys controlled by the airport operator.
Storage / retention period: Shortest possible retention period is advised, taking into account passengers that fly very rarely, and data subjects should be offered the option to set their preferred storage period.
Compliance: Could in principle be compatible with Articles 5(1)(e), 5(1)(f), 25 and 32 GDPR, subject to appropriate safeguards. Note that the recommended safeguards set out in the Opinion go beyond what is required under Scenario 1.

Scenario 3.1
Storage: Centralised storage in a database within the airport, under the control of the airport operator.
How facial recognition is implemented: Passengers present themselves at a pod equipped with a camera. Their biometric sample is then sent to a central airport server, which will attempt to match the data with that of the central biometric database. The passenger can thus be identified and checked for whether or not they are registered for a departing flight. Biometric matching is only performed when each passenger presents themselves at the checkpoint, but the data processing itself is done at a central server connected to the central database.
Enrolment (recording of ID / biometric template): Passengers need to enrol for each flight in a short period before their departure, remotely or at airport terminals.
Encryption: ID and biometric data is encrypted with a key held by the airport operator.
Storage / retention period: The storage period is typically 48 hours and data is deleted once the plane has taken off.
Compliance: Not compatible with Article 5(1)(f), Article 25 and Article 32 GDPR.

Scenario 3.2
Storage: Centralised storage in the cloud, under the control of the airline company.
How facial recognition is implemented: At the airport, the passengers go through dedicated pods equipped with a camera. The passengers' biometric data is sent through a request to an airline cloud server, where the matching of this data is performed against the central database. The passenger is identified and checked for whether or not they are registered for a departing flight.
Enrolment (recording of ID / biometric template): Enrolment is done once, for as long as the customer holds an account with the airline company, remotely or at airport terminals.
Encryption: ID and biometric data is encrypted with a key held by the airline company or its cloud service provider.
Storage / retention period: The storage period is defined by the airline company and can potentially last as long as the customer has an account with the airline company.
Compliance: Not compatible with Article 5(1)(e), Article 5(1)(f), Article 25 and Article 32 GDPR.
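
For illustration, the hedged Python sketch below mirrors the kind of checkpoint flow Scenario 2 contemplates, assuming the passenger presents their device-held key (for example via a QR code) and the operator's database holds only the encrypted template retrieved by the passenger's index: the template is decrypted and matched locally, the plaintext is discarded, and only the match result reaches the checkpoint controller. The byte-equality comparison stands in for a real biometric matching algorithm and is our assumption, not a detail from the Opinion.

from cryptography.fernet import Fernet, InvalidToken

def verify_at_checkpoint(encrypted_template: bytes,
                         passenger_key: bytes,
                         live_sample: bytes) -> bool:
    """Decrypt the stored template with the passenger-held key,
    compare it with the live capture, and keep only the result."""
    try:
        template = Fernet(passenger_key).decrypt(encrypted_template)
    except InvalidToken:
        # Wrong key or tampered ciphertext: treat as a non-match.
        return False
    # Placeholder comparison; a real system would score similarity
    # between face embeddings against a threshold, not byte equality.
    match = template == live_sample
    # The decrypted template goes out of scope here; only the boolean
    # result is passed on to the checkpoint controller.
    return match

# Example with placeholder data.
key = Fernet.generate_key()                     # held on the passenger's device
stored = Fernet(key).encrypt(b"enrolled-face")  # held in the operator's database
print(verify_at_checkpoint(stored, key, b"enrolled-face"))  # True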

Our take

The Opinion clearly sets out the EDPB's view that any use of FRT for the purpose of streamlining passenger flow at the airport (security checkpoints, baggage drop-off, boarding, and access to passenger lounges) will not comply with the requirements of the GDPR where a passenger's biometric template does not remain in the passenger's control (i.e. where the template is not stored locally on the passenger's device or where, in the case of encryption, the decryption key is not in the passenger's control). As has been the case since 2012, when the EDPB's predecessor, the Article 29 Working Party, published Opinion 3/2012 on developments in biometric technologies, centralised biometric databases under the control of private sector entities appear to be very difficult to justify. Indeed, this stance is becoming more explicit with the prohibitions and additional controls on facial recognition databases and remote biometric identification in certain use cases under the EU AI Act.

Service providers within the industry that may be acting as data processors (e.g. cloud service providers, ground handlers or providers that assist with designing new processes) should also note that the Opinion deals with potential non-compliance with Article 32 (security) of the GDPR, which also applies directly to data processors.

This could have significant practical consequences, particularly at a time when players in the aviation industry are looking to explore more technically advanced and novel measures to streamline the airport experience and reduce airport congestion. Some of these measures may not be dissimilar to Scenario 3.1 and Scenario 3.2 described in the Opinion and summarised above.

In light of this Opinion, airport operators and airline companies will need to take a step back and evaluate any FRT systems they use or are considering using, with an emphasis on user control of biometric templates and robust data security.

Marcus Evans (UK)

Marcus is a communications, media and technology lawyer based in London. He focuses on data privacy and IT services.

Shiv Daddar (UK)
  • Posted in:
    Privacy & Data Security
  • Blog:
    Data Protection Report
  • Organization:
    Norton Rose Fulbright
