Defining remote biometric identification technically and legally

For a clearer picture of what remote biometric identification is, and of the current and proposed laws around it, the association of civil and human rights organizations European Digital Rights (EDRi) has created a highly accessible guide to the technologies and the legal landscape.

The guide is prompted by the December 2022 compromise among the European Union digital ministers on the EU’s upcoming AI Act, which EDRi claims will water down the Act’s proposed ban. The European Parliament’s co-rapporteurs countered by proposing a tenth set of amendments to the AI Act to ensure risk criteria would classify any AI systems handling biometric data as high-risk.

Germany appears to be positioning itself against the December agreement and to be aligning more closely with the European Parliament, which wants a ban on mass biometric surveillance, ahead of the next rounds of dialogue on the AI Act, which seek an agreement between the Council, Commission and Parliament.

EDRi’s guide also comes within a context of civil society organizations claiming they have been excluded from participating in the drafting of an international AI treaty handled by the Council of Europe. The United States is reportedly behind the decision.

Acceptability and consent

The guide is split into three sections covering the technologies and laws and legal proposals surrounding them, then a further section on EDRi’s recommendations.

The first section addresses which biometrics use cases are deemed legally acceptable, such as unlocking a smartphone, and which are unacceptable to EDRi in terms of potential harm, such as being surveilled in public. EDRi argues the law in these areas needs to be strengthened.

As it involves biometrics processing, even the example of face scans or fingerprints to biometrically unlock one’s smartphone is only lawful within the bloc’s General Data Protection Regulation (GDPR) if it is done with “informed consent, the data are processed in a privacy-preserving and secure manner and not shared with unauthorized third parties, and all other data protection requirements are met,” notes the guide.

EDRi asks how being biometrically surveilled in public can be deemed to have received informed consent. Entering a public area covered by facial recognition cameras forces someone to undergo biometric processing. This is “coercive and not compatible with the aims of the GDPR, nor the EU’s human rights regime (in particular rights to privacy and data protection, freedom of expression and freedom of assembly and in many cases non-discrimination).”

The guide finds that providers are using exceptions within the GDPR to effectively conduct mass surveillance, a practice data protection authorities are now pursuing. EDRi argues that the law around remote biometric identification (RBI) in public places needs to be made more explicit.

The draft AI Act banned only live RBI, not post or retrospective RBI. It also banned only police use, not use by central or local government or by private companies. EDRi argues that, in human rights terms, there is no real difference between being identified in real time and after the fact, and that retrospective processing can prove more harmful.

The current Law Enforcement Directive does not clearly establish, through criteria specifying what counts as sensitive, which types of biometrics processing by police should be prohibited.

Biometric identification vs verification

What happens when biometrics are taken needs to be clearly understood, finds the guide, as the difference between biometric identification and biometric verification is legally significant.

The guide outlines the 1:1 matching used in verification, such as phone unlocking, versus the 1:N matching used in identification, such as surveillance. Verification generally does not require comparison against a database of people, and the sensitive data need not leave the device.
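The 1:1 versus 1:N distinction can be sketched in code. The following is a minimal illustration only, using toy three-dimensional templates and a made-up acceptance threshold; real systems use high-dimensional embeddings and calibrated thresholds, and the names here are hypothetical.

```python
import math

THRESHOLD = 0.95  # hypothetical acceptance threshold, chosen for illustration

def similarity(a, b):
    """Cosine similarity between two toy biometric templates."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, enrolled_template):
    """1:1 verification: compare the probe against ONE stored template.
    No database search is needed; the template can stay on-device."""
    return similarity(probe, enrolled_template) >= THRESHOLD

def identify(probe, database):
    """1:N identification: search the probe against EVERY stored template.
    Requires a central database holding many people's biometric data."""
    for name, template in database.items():
        if similarity(probe, template) >= THRESHOLD:
            return name
    return None

# Toy enrolled templates (illustrative values only)
alice = [0.9, 0.1, 0.3]
bob = [0.1, 0.8, 0.4]
db = {"alice": alice, "bob": bob}

print(verify([0.89, 0.12, 0.31], alice))  # 1:1 check against one template
print(identify([0.11, 0.79, 0.42], db))   # 1:N search across the database
```

The structural point the guide makes is visible in the signatures: `verify` needs only one template, while `identify` needs access to everyone's.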

It argues that the use of biometric identification for authentication purposes is growing, such as through pre-enrolment in systems for travel or for entering venues. EDRi warns that providers use language such as 'validation' and 'authentication' when what they are really doing is identification rather than verification.

Identification carries further risk as it requires a database prone to hacking and commercial exploitation. It can also introduce more risks around misidentification and empower biometric mass surveillance.

The addition of ‘remoteness’

Biometric verification, such as unlocking a phone or laptop or going through a passport gate, is an active move and not remote. CCTV cameras and sensors that can identify people are remote, even unseen.

“Although biometric identification is often referred to as 1:n (1-to-many matching), it’s actually more accurate to think of remote biometric identification as n:n (many-to-many matching).

“That’s because – even if only one person is being searched for – every single person gets scanned.”

This is biometric mass surveillance.
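EDRi's n:n framing can be made concrete with a short sketch. The numbers and names below are purely illustrative, not drawn from any real deployment: the point is that the number of people biometrically processed scales with everyone in view, not with the size of the watchlist.

```python
def scans_performed(passersby, watchlist):
    """Every face in view is scanned and compared against each watchlist
    entry, regardless of whether that person is actually being sought."""
    return len(passersby) * len(watchlist)

# Illustrative scenario: one suspect sought, a thousand people walk past
crowd = [f"person_{i}" for i in range(1000)]
watchlist = ["single_suspect"]

print(scans_performed(crowd, watchlist))  # 1000 people scanned for 1 target
```

Even with a watchlist of one, the cost in biometric processing falls on the entire crowd, which is what the guide means by many-to-many matching.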

Tighten and define

The guide concludes with a series of amendments to the AI Act to ensure people are protected from biometric mass surveillance. Remote biometric identification should be banned, both live and post, in all publicly accessible areas, by all actors, with no exceptions.

Specific rewordings of articles of the AI Act are included, to remove exceptions for police use and to ensure the law applies online. The guide includes suggested wording to define an RBI system and ‘remote’ as well as add new prohibitions for RBI in the draft.

Biometric verification would not be affected.
