Practical Pieces & Perspectives

December 30, 2020
Face it – Congressional Action Needed on Facial-Recognition, Other Biometric-Identification Technologies
Wamiq Babul
I.  Introduction

Even before the COVID-19 pandemic, biometric identifiers played an important role in American life.[1]  Experts predict that, to facilitate contactless transactions, post-pandemic businesses will increasingly rely on biometrics, especially facial-recognition technology.[2]  This article discusses some of the purposes, privacy concerns, and legal issues surrounding biometric data, highlighting the need for a federal act that governs its use.


II.  Background
Increased Convenience, Accuracy, and Security

Ranging from eye scans and fingerprints to facial recognition[3] and health and sleep data[4], biometric systems use the most personal and unique elements of human identity.[5]  Biometrics streamline transactions and security screenings, among other practices that increase convenience and accuracy.[6]  As more commercial transactions move online, Congress is considering the Improving Digital Identity Act and U.S. Digital IDs.[7]  Digital IDs – “likely based on smartphones and biometrics” – would help mitigate U.S. consumers’ losses from identity theft and fraud (estimated at $16 billion in 2018).[8]


Biometrics can also improve consumer goods.  For example, at an estimated error rate of one in a million, Apple’s iPhone models project 30,000 infrared dots onto a user’s face, using pattern-matching to establish the user’s identity.[9]  In employment settings necessitating optimal security, biometrics are perceived as more viable than traditional passwords.[10]  Facing threats of hacking and data breaches, entities are wary of “default, weak or even nonexistent passwords . . .”[11], often employing costly maintenance programs, such as systems for changing lost passwords.[12]


Privacy Concerns

An unclear “boundary between commercial and government data”[13] raises one privacy concern regarding biometrics.  Some Apple users fear that their facial scans are being saved in the “cloud” – or even worse, a government directory.[14]  In fact, Apple’s facial-identification software has been challenged for leading to an individual’s arrest for Apple Store thefts that he did not commit.[15]  Last year, Amazon – amid protests over racial inequality – banned police from using its “Rekognition” facial-recognition service[16], with Microsoft following suit[17] and IBM deciding to discontinue its facial-recognition technology altogether.[18]


In the absence of a uniform law on biometrics, some members of Congress have considered limiting governmental use of such technologies.[19]  The Facial Recognition and Biometric Technology Moratorium Act proposed banning federal agencies’ use of biometrics unless Congress enacts a law that specifically authorizes such use.[20]  Although the European Union in 2018 provided “robust protection[s]” for biometric data under the General Data Protection Regulation (GDPR),[21] the U.S. has yet to enact a nationwide statute.


III.  Discussion

State laws that regulate the use of biometrics attempt to balance the interests of businesses and private individuals.[22]  Lawmakers weigh consumers’ and employees’ rights to protect their biometric data against businesses’ use of such data to enhance security and provide higher-quality products and services.[23]  In the U.S., Illinois is the only state that grants private citizens the right to sue under its biometric laws.[24]


Illinois’s Biometric Information Privacy Act (BIPA)

Enacted in 2008 as a reaction to the company Pay By Touch’s attempts to sell consumers’ fingerprints[25], BIPA has evolved into a formidable encumbrance for companies that use biometrics.[26]  Early BIPA plaintiffs failed to prove a legal “injury-in-fact,” despite claiming that “just losing control of one’s biometric privacy is injury enough.”[27]  In 2019, however, the court in Rosenbach v. Six Flags held that plaintiffs alleging BIPA violations need not allege an actual injury or adverse effect to have statutory standing as “aggrieved” persons.[28]  The Illinois Supreme Court elaborated that a defendant was subject to liability even in the absence of actual damages.[29]  Under this interpretation, BIPA functions both to remedy past injury and to compel companies to safeguard biometric data before any substantial, irreversible harm occurs.[30]  These plaintiff-friendly developments have contributed to an increase in BIPA class-actions, with “700-plus” suits pending in Illinois state and federal courts.[31]


After the landmark Rosenbach decision, Patel v. Facebook brought biometric-privacy litigation to the forefront of the tech industry.[32]  Plaintiffs challenged Facebook’s “Tag Suggestions” feature, which analyzes Facebook’s existing photo database to help users identify persons depicted in newer photos.[33]  At the time, Snapchat[34], Google[35], and Shutterfly[36] faced similar lawsuits.  According to the complaint, Facebook did not release plaintiffs’ biometric information to third parties[37] but rather violated BIPA’s requirement that biometrics be collected only with the issuance of a written release.[38]  The court rejected Facebook’s argument (among others) that Illinois’s “statutes are not to be given extraterritorial effect,” which would have defeated the class-action by requiring a trial for each member to ascertain where the alleged violations for that member took place.[39]  Indicating BIPA’s substantive and procedural formidability, the case resulted in a $550 million settlement after exposing the tech giant to a maximum penalty of $35 billion.[40]


California Consumer Privacy Act

Allegations against Clearview AI last year highlight the developing relationship between the law, government, and biometric technology.  Clearview AI, a New York company, provides facial-recognition software to law enforcement agencies, businesses, and other third parties.[41]  According to multiple class-actions, Clearview created a database of “faceprints” by “scraping” 3 billion user images from Facebook, Google, Twitter, and Instagram before selling database access to third parties.[42]  Plaintiffs invoked both BIPA and the California Consumer Privacy Act (CCPA), alleging that their biometric data was “collected and/or used by Clearview AI without prior notice . . . and without their consent.”[43]  Despite the CCPA’s limited private right of action[44], plaintiffs invoked California’s Unfair Competition Law, which prohibits business practices that violate other laws[45]; since the CCPA took effect in January 2020, plaintiffs have often used it as a “backdoor” for asserting separate claims.[46]


A uniform law with “well-funded enforcement through the Federal Trade Commission”[47] could address such potential abuse (of privacy and litigation) without overburdening companies.  In 2019 (before the pandemic), California’s Attorney General estimated that companies would spend $55 billion implementing CCPA compliance measures.[48]  Clearview’s legal battles likewise reflect the current patchwork of laws that govern biometrics.[49]  Such high compliance costs may not necessarily provide “consumers with meaningful safeguards or additional security.”[50]  Rather, the average cost of $8 million for a data breach may compel companies – burdened with “compliance-confusion” – to compensate through other means that could harm consumers.[51]  Thus, Congress must enact biometrics legislation, either as a standalone law or as part of a comprehensive privacy regulation.


Each “Violation”

Congress must define the scope of a “violation” under biometric-privacy regulations.[52]  Consider a class-action of 100 employees who are challenging a company’s timekeeping policy of collecting fingerprints: is there a violation each time an employee provided a fingerprint, or are there only 100 violations, one per employee?  Under the former approach, massive damages can accumulate.  On the other hand, it hardly seems fair that a company could repeatedly flout biometric laws for a single damages payment.  Assessing a similar situation, the court in Peatry v. Bimbo Bakeries USA, Inc. refrained from narrowly interpreting BIPA’s reference to “each violation.”[53]  The court suggested that damages under BIPA may be measured broadly by “each scan and each disclosure to a third-party rather than by each person whose biometrics were collected and shared.”[54]  Last October, a federal district court in the Seventh Circuit dismissed fairness concerns about the per-scan approach, stating that damages must be tacked on “even though the consequences may be harsh, unjust, absurd, or unwise.”[55]  Therefore, the definition of “violation” should be made consistent, as it carries significant consequences for both plaintiffs and defendants.
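The stakes of this definitional choice can be illustrated with simple arithmetic.  The sketch below uses BIPA’s statutory amount of $1,000 per negligent violation (740 ILCS 14/20); the workforce size, scan frequency, and workday count are hypothetical figures chosen for illustration, not numbers from any actual case.

```python
# Illustrative sketch: how the definition of "violation" changes damages
# exposure under BIPA.  $1,000 per negligent violation comes from the
# statute (740 ILCS 14/20); the class size and scan counts are hypothetical.

NEGLIGENT_DAMAGES = 1_000  # dollars per violation (statutory amount)

employees = 100            # class members in the hypothetical
scans_per_day = 2          # e.g., clock-in and clock-out fingerprint scans
work_days = 250            # roughly one year of workdays

# Narrow reading: one violation per person whose biometrics were collected.
per_person_exposure = employees * NEGLIGENT_DAMAGES

# Broad reading (as suggested in Peatry): one violation per scan.
per_scan_exposure = employees * scans_per_day * work_days * NEGLIGENT_DAMAGES

print(f"Per-person exposure: ${per_person_exposure:,}")  # $100,000
print(f"Per-scan exposure:   ${per_scan_exposure:,}")    # $50,000,000
```

Even with these modest hypothetical inputs, the per-scan reading multiplies exposure by a factor of 500, which is why the choice between the two interpretations matters so much to litigants.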


IV.  Conclusion

Policy tends to lag behind technology, struggling to anticipate its advancements.  The competing interests regarding biometrics – individuals’ privacy and entities’ efforts to enhance security and quality of service – create a difficult trade-off.  But the necessity of a federal biometrics law is underscored by a post-COVID world in which companies may obtain biometric data at an unprecedented rate.
