Swedish GDPR Fine Highlights Legal Challenges in Use of Biometrics

A small fine of around $20,000 in Sweden highlights a potential problem for the use of biometrics in security throughout Europe, including for American firms with offices in Europe.

In late August 2019, the Swedish data protection regulator issued its first ever fine under the General Data Protection Regulation (GDPR). The fine was for 200,000 Swedish Krona, which is just over $20,700.

The action was brought against the Skelleftea municipality, where a local school had run a trial facial recognition system to track 22 students over a period of three weeks. The school had obtained the consent of both the students and their parents, and the trial was intended to improve school administration. The trial was a success, and the school had planned to expand it before the regulator stepped in and blocked it.

The regulator's decision was that the consent obtained did not satisfy GDPR consent requirements. According to the European Data Protection Board's commentary on the incident, "consent was not a valid legal basis given the clear imbalance between the data subject [the students] and the controller [the school]." The wider question for business and security is whether this same 'imbalance' also exists between employee and employer. 

It appears that it does, making the required use of biometrics (which GDPR defines as personal data, and indeed as a 'special category' of personal data) for purposes of authentication and access potentially problematic throughout Europe. This would also apply to the European offices of American companies.

"The data protection authorities in various EU countries," explains David Flint, a commercial law consultant at the Inksters law firm, "including the UK, have determined that the use of consent as a basis for lawful processing of personal data will not be sufficient in an employment situation (or indeed in an educational situation as held by the Swedish DPA). The basis for this decision is that there is a fundamental imbalance of power between the parties concerned. It is not open to the employee (or the student) to give informed independent consent as the service or right which they are intending to have -- employment or education -- is such that, were they to refuse, they would not be paid or would not receive schooling."

This is not a new decision. The Article 29 Working Party (comprising the data protection authorities of all EU member states, now replaced by the European Data Protection Board) wrote in 2017, "An imbalance of power also occurs in the employment context... It is unlikely that an employee would be able to respond freely to a request for consent from his/her employer... without feeling any pressure to consent. Therefore, WP29 deems it problematic for employers to process personal data of current or future employees on the basis of consent as it is unlikely to be freely given."

In such a relationship, 'consent' also fails Article 7 of GDPR, which states that consent must be accompanied by the ability to withdraw that consent. It is likely that an employee's withdrawal of consent in a biometric authentication situation would lead to termination of employment (or at least inability to continue working), making the original consent unlawful and further demonstrating the imbalance of power between the parties.

Consent is not the only acceptable justification for processing personal data, but it is not acceptable in an employee/employer relationship. Article 6 of GDPR lists five other options. The most likely justification for business is 'legitimate interest' (although this is excluded for public authorities, so could not be used by most schools).

Chris Pounder, director of Amberhawk Training and a specialist and practitioner in information law, explained the complexities of GDPR and biometrics. "While legitimate interest is justification for an employer to process employees' personal data in the context of the functioning of the business, granting access or authentication via biometric data is not fundamental to the operation of the business and is again likely to be illegal," he said. One of the problems is the classification of biometric data as a special category of personal data. Under GDPR, processing such data requires, first, a lawful reason under Article 6 (such as consent or legitimate interest) and, second, a 'condition' from Article 9.

In this context, consent is the only available lawful reason under Article 6; but there is no 'condition' under Article 9 that would make the processing lawful. And, of course, consent also fails the balance-of-power test.

It is worth noting that the police -- at least in the UK -- do not have similar problems over using facial biometrics. There are two fundamental reasons. Firstly, law enforcement is not bound by GDPR in this matter, but has separate legislation to which it must adhere. Secondly, law enforcement's use is different. It operates by comparing a scanned image to a watch list of lawfully obtained images. If there is no match, there is no identification of the scanned person. With no identification, there is no personal data -- and furthermore, a non-match image is discarded.

A second case to consider is a bank's use of biometrics -- such as HSBC's Voice ID. (Other banks have also introduced different types of biometric authentication.) It involves biometric personal data, and is justified by the user's consent. Here, the consent is acceptable because despite the apparent imbalance of power between user and bank, the user could choose to use a different bank, can revoke consent for the bank to hold the biometric data, and could choose to use a different (albeit more time-consuming) form of identification. This is a completely different scenario from the typical case of a business using biometrics to identify and authorize its staff.

The growing use of biometric identification appears to be unstoppable. Within a few years, biometrics and smart things will combine for greater efficiency in the workplace -- from unlocking doors to activating devices. This is problematic if not simply illegal within GDPR's jurisdiction. It will require legislators and system designers to find a way in which biometrics can be used without being a threat to personal privacy and personal data. 

For now, authorization might be possible, while identification would be illegal. Some borrowing from the current police method might work. If a building held a database of authorized biometrics that did not identify the individuals concerned, then an employee would be allowed or denied access based solely on the existence of a biometric match. It would keep unauthorized individuals out, but would not tell the business who has been allowed in.
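The anonymous-match approach described above can be sketched in code. This is only a minimal illustration, not a compliance recipe: the template format (fixed-length embedding vectors), the cosine-similarity comparison, and the 0.95 threshold are all assumptions made for the example; real biometric systems use vendor-specific templates and matchers.

```python
import math

# Hypothetical anonymous allow-list: biometric templates stored with NO
# identity attached -- only the fact that the holder is authorized.
ALLOW_LIST = [
    [0.12, 0.80, 0.55, 0.03],
    [0.90, 0.10, 0.33, 0.41],
]

MATCH_THRESHOLD = 0.95  # illustrative similarity cut-off


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def access_decision(scanned_template):
    """Return only allow/deny -- never WHICH stored template matched,
    so the system cannot tell the business who was let in."""
    allowed = any(
        cosine_similarity(scanned_template, stored) >= MATCH_THRESHOLD
        for stored in ALLOW_LIST
    )
    # Mirror the police model: the scanned template is discarded
    # immediately after the comparison.
    del scanned_template
    return allowed
```

The key design point is that `access_decision` returns a bare boolean: a match grants entry, a non-match denies it, and in neither case is the scan linked to a named individual or retained.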

"I think new legislation is needed to solve this," commented Pounder. For the time being, however, businesses within the remit of GDPR -- and that will include U.S. businesses with offices in Europe -- will have to be very careful over their use of biometrics as a form of staff identification.

Related: Biometrics: Dismantling the Myths Surrounding Facial Recognition 

Related: DHS HART Biometric Database Raises Security, Civil Liberties Concerns 

Related: As Facial Recognition Use Grows, So Do Privacy Fears 

Related: Huge US Facial Recognition Database Flawed: Audit

Kevin Townsend is a Senior Contributor at SecurityWeek. He has been writing about high tech issues since before the birth of Microsoft. For the last 15 years he has specialized in information security; and has had many thousands of articles published in dozens of different magazines – from The Times and the Financial Times to current and long-gone computer magazines.