Face it – Congressional Action Needed on Facial-Recognition, Other Biometric-Identification Technologies

By Wamiq Babul 

I.  Introduction

Even before the COVID-19 pandemic, biometric identifiers were serving an important role in American lives.[1]  Experts predict that to facilitate contactless transactions, post-pandemic businesses will increasingly rely on biometrics, especially facial-recognition technology.[2]  This article discusses some of the purposes, privacy concerns, and legal issues regarding biometric data, highlighting the need for a federal act that governs its use.

II.  Background

Increased Convenience, Accuracy, and Security

Ranging from eye scans, fingerprints, and facial recognition[3] to health and sleep data[4], biometric systems use the most personal and unique elements of human identity.[5]  Biometrics streamline transactions and security screenings, among other practices that increase convenience and accuracy.[6]  As more commercial transactions move online, Congress is considering the Improving Digital Identity Act and U.S. Digital IDs.[7]  Digital IDs – “likely based on smartphones and biometrics” – would help mitigate U.S. consumers’ losses from identity theft and fraud, estimated at $16 billion in 2018.[8]

Biometrics can also improve consumer goods.  For example, Apple’s iPhone models project 30,000 infrared dots onto a user’s face and use pattern-matching to establish the user’s identity, at an estimated error rate of one in a million.[9]  In employment settings requiring optimal security, biometrics are perceived as more viable than traditional passwords.[10]  Facing threats of hacking and data breaches, entities are wary of “default, weak or even nonexistent passwords . . .”[11] and often employ costly maintenance programs, such as systems for resetting lost passwords.[12]
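As a rough illustration of the pattern-matching step described above – a hypothetical sketch, not Apple’s proprietary Face ID pipeline, which is not public – a 1:1 verification routine might compare a freshly captured face template (a numeric feature vector derived from the depth capture) against the template enrolled on the device and unlock only when their similarity clears a threshold tuned toward a very low false-accept rate:

```python
# Hypothetical sketch of 1:1 biometric verification; illustrative only, not Apple's actual Face ID code.
import math

MATCH_THRESHOLD = 0.98  # illustrative cutoff; vendors tune this toward ~1-in-1,000,000 false accepts


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two feature vectors; values near 1.0 mean a close match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def verify(enrolled_template: list[float], captured_template: list[float]) -> bool:
    """Unlock only if the captured face template matches the enrolled one closely enough."""
    return cosine_similarity(enrolled_template, captured_template) >= MATCH_THRESHOLD


# Toy usage with made-up feature vectors:
enrolled = [0.12, 0.87, 0.45, 0.31]
same_user = [0.13, 0.86, 0.46, 0.30]
stranger = [0.91, 0.10, 0.05, 0.77]
print(verify(enrolled, same_user))  # True  (near-identical vectors)
print(verify(enrolled, stranger))   # False (dissimilar vectors)
```

The privacy-relevant design point in such a scheme is that only the enrolled template needs to be stored, and – as Apple insists is the case for Face ID – it can remain on the device rather than in any cloud database.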

Privacy Concerns

An unclear “boundary between commercial and government data”[13] raises one privacy concern regarding biometrics.  Some Apple users fear that their facial scans are being saved in the “cloud” – or even worse, a government directory.[14]  In fact, Apple’s facial-identification software has been blamed for leading to an individual’s arrest for Apple Store thefts that he did not commit.[15]  Last year, Amazon – amid protests over racial inequality – banned police from using its “Rekognition” facial-recognition service[16], with Microsoft following suit[17] and IBM discontinuing its facial-recognition technology altogether.[18]

In the absence of a uniform law on biometrics, some members of Congress have considered limiting the government’s own use of such technology.[19]  The Facial Recognition and Biometric Technology Moratorium Act proposed banning federal agencies’ use of biometrics unless Congress enacts a law that specifically authorizes such use.[20]  Although the European Union in 2018 provided “robust protection[s]” for biometric data under the General Data Protection Regulation (GDPR),[21] the U.S. has yet to enact a nationwide statute.

III.  Discussion

State laws that regulate the use of biometrics attempt to balance the interests of businesses and private individuals.[22]  Lawmakers weigh consumers’ and employees’ right to protect their biometric data against businesses’ use of such data to enhance security and provide higher-quality products and services.[23]  Illinois is the only U.S. state that grants private citizens the right to sue under its biometric laws.[24]

Illinois’s Biometric Information Privacy Act (BIPA)

Enacted in 2008 in reaction to the company Pay By Touch’s attempts to sell consumers’ fingerprints[25], BIPA has evolved into a formidable encumbrance for companies that use biometrics.[26]  Early BIPA plaintiffs failed to prove a legal “injury-in-fact,” despite claiming that “just losing control of one’s biometric privacy is injury enough.”[27]  In 2019, however, the Rosenbach v. Six Flags court held that petitioners alleging BIPA violations need not allege an actual injury or adverse effect in order to have statutory standing as injured persons.[28]  The Illinois Supreme Court elaborated that a defendant is subject to liability even in the absence of actual damages.[29]  Under this interpretation, BIPA functions both to remedy past injury and to compel companies to safeguard biometric data before any substantial, irreversible harm occurs.[30]  These plaintiff-friendly developments have contributed to an increase in BIPA class actions, with “700-plus” suits pending in Illinois state and federal courts.[31]

After the landmark Rosenbach decision, Patel v. Facebook brought biometric-privacy litigation to the forefront of the tech industry.[32]  Plaintiffs challenged Facebook’s “Tag Suggestions” feature, which analyzes the company’s existing photo database to help users identify persons depicted in newer photos.[33]  At the time, Snapchat[34], Google[35], and Shutterfly[36] faced similar lawsuits.  According to the complaint, Facebook did not release plaintiffs’ biometric information to third parties[37] but rather violated BIPA’s requirement that biometrics be collected only upon the issuance of a written release.[38]  The court rejected Facebook’s argument (among others) that Illinois’s “statutes are not to be given extraterritorial effect,” which would have defeated the class action by requiring a trial for each member to ascertain where that member’s alleged violations took place.[39]  Indicating BIPA’s substantive and procedural formidability, the Facebook litigation resulted in a $550 million settlement after exposing the tech giant to a potential maximum penalty of $35 billion.[40]

California Consumer Privacy Act (CCPA)

Allegations against Clearview AI last year highlight the developing relationship between the law, government, and biometric technology.  Clearview AI, a New York company, provides facial-recognition software to law enforcement agencies, businesses, and other third parties.[41]  According to multiple class actions, Clearview created a database of “faceprints” by “scraping” 3 billion user images from Facebook, Google, Twitter, and Instagram before selling access to that database to third parties.[42]  Plaintiffs invoked both BIPA and the CCPA, alleging that their biometric data was “collected and/or used by Clearview AI without prior notice . . . and without their consent.”[43]  Despite the CCPA’s limited private right of action[44], plaintiffs also invoked California’s Unfair Competition Law, which prohibits business practices that violate other laws[45]; since the CCPA took effect in January 2020, plaintiffs have often used it as a “backdoor” for asserting separate claims.[46]

A uniform law with “well-funded enforcement through the Federal Trade Commission”[47] could address such potential abuse (of privacy and of litigation) without overburdening companies.  In 2019 (before the pandemic), California’s Attorney General estimated that companies would spend $55 billion implementing CCPA compliance measures.[48]  Clearview’s legal battles reflect the current patchwork of laws governing biometrics.[49]  Yet such high compliance costs may not necessarily provide “consumers with meaningful safeguards or additional security.”[50]  Rather, the average cost of $8 million for a data breach may compel companies – burdened with “compliance-confusion” – to compensate through other means that could harm consumers.[51]  Thus, Congress must enact biometrics legislation, either as a standalone law or as part of a comprehensive privacy regulation.

Each “Violation”

Congress must define the scope of a “violation” under biometric-privacy regulations.[52]  Consider a class action of 100 employees challenging a company’s timekeeping policy of collecting fingerprints: is there a violation every time each employee provided a fingerprint, or are there only 100 violations?  Under the former reading, massive damages can accumulate, as the sketch below illustrates.  On the other hand, it hardly seems fair that a company could repeatedly flout biometric laws yet owe only a single damages payment.  Assessing a similar situation, the court in Peatry v. Bimbo Bakeries USA, Inc. declined to interpret BIPA’s reference to “each violation” narrowly.[53]  The court suggested that damages under BIPA may be measured broadly by “each scan and each disclosure to a third-party rather than by each person whose biometrics were collected and shared.”[54]  Last October, a district court within the Seventh Circuit dismissed fairness concerns about the per-scan approach, stating that damages must be tacked on “even though the consequences may be harsh, unjust, absurd, or unwise.”[55]  The definition of “violation” should therefore be consistent, as it carries significant consequences for both plaintiffs and defendants.
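To make those stakes concrete, the following back-of-the-envelope sketch compares statutory exposure under the two readings using BIPA’s liquidated-damages amounts ($1,000 per negligent violation, $5,000 per intentional or reckless violation); the workforce size and scan frequency are hypothetical assumptions chosen only for illustration.

```python
# Back-of-the-envelope comparison of exposure under two readings of "each violation."
# BIPA liquidated damages: $1,000 (negligent) or $5,000 (reckless/intentional) per violation.
# The workforce size and scan counts below are hypothetical, for illustration only.

EMPLOYEES = 100
SCANS_PER_DAY = 2        # e.g., clocking in and out with a fingerprint
WORK_DAYS = 250          # roughly one year of workdays
NEGLIGENT = 1_000        # dollars per negligent violation
RECKLESS = 5_000         # dollars per reckless or intentional violation

per_person = EMPLOYEES                            # one violation per employee
per_scan = EMPLOYEES * SCANS_PER_DAY * WORK_DAYS  # one violation per scan

for label, violations in [("per-person", per_person), ("per-scan", per_scan)]:
    print(f"{label:>10} reading: {violations:>6,} violations -> "
          f"${violations * NEGLIGENT:,} (negligent) / ${violations * RECKLESS:,} (reckless)")

# Output:
# per-person reading:    100 violations -> $100,000 (negligent) / $500,000 (reckless)
#   per-scan reading: 50,000 violations -> $50,000,000 (negligent) / $250,000,000 (reckless)
```

Under these assumptions the same course of conduct yields either a six-figure or a nine-figure exposure, which is why the choice of definition matters so much to both sides.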

IV.  Conclusion

Policy tends to lag behind technology, and lawmakers struggle to predict its advancements.  The competing interests surrounding biometrics – individuals’ privacy and entities’ efforts to enhance security and quality of service – create a difficult trade-off.  But the necessity of a federal biometrics law is underscored by a post-COVID world in which companies may obtain biometric data at an unprecedented rate.



[1] Lauren Stewart, Big Data Discrimination: Maintaining Protection of Individual Privacy Without Disincentivizing Business’ Use of Biometric Data To Enhance Security, 60 B.C. L. Rev. 349, 349 (2019).

[2] Joel Griffin, The Role of Biometrics in a Post COVID-19 World, Security Infowatch (July 7, 2020), https://www.securityinfowatch.com/access-identity/biometrics/article/21143152/the-role-of-biometrics-in-a-post-covid19-world; see also The Impact of COVID-19 on the California Consumer Privacy Act, Seyfarth Shaw (Apr. 7, 2020), https://www.seyfarth.com/news-insights/the-impact-of-covid-19-on-the-california-consumer-privacy-act-2.html (discussing private and public establishments’ need to screen and collect during the pandemic “physiological data of customers [and employees] entering the space for . . . temperature, prior testing results, and personal movement tracking based on cell phone information.”).

[3] Biometrics: authentication & identification (definition, trends, use cases, laws and latest news) – 2020 review, Gemalto (last updated Mar. 26, 2020),  https://www.gemalto.com/govt/inspired/biometrics.

[4] Natalie Prescott, The Anatomy of Biometric Laws: What U.S. Companies Need to Know in 2020, Mintz (Jan. 15, 2020), https://www.mintz.com/insights-center/viewpoints/2826/2020-01-15-anatomy-biometric-laws-what-us-companies-need-know-2020.

[5] Stewart, supra note 1, at 349.

[6] Biometrics: authentication & identification, supra note 3.

[7] Chris Burt, Author of US digital ID bill discusses central role of biometrics in ID2020, Biometric Update (Nov. 20, 2020), https://www.biometricupdate.com/202011/author-of-us-digital-id-bill-discusses-central-role-of-biometrics-in-id2020-webinar.

[8] J.C. Boggs, Michael Dohmann, Scott Ferber, The “Improving Digital Identity Act of 2020” Presents Bipartisan Digital Identity Infrastructure Reform, JD Supra (Oct. 2, 2020), https://www.jdsupra.com/legalnews/the-improving-digital-identity-act-of-29898/.

[9] David Cardinal, How Apple’s iPhone X TrueDepth Camera Works, ExtremeTech (Sept. 14, 2017, 2:15 PM), https://www.extremetech.com/mobile/255771-apple-iphone-x-truedepth-camera-works.

[10] Sarah Meyer, Biometric Identification – Knowing Who (and Where) You Are, Chief Privacy Officer Magazine: Data Privacy (Dec. 24, 2018), https://www.cpomagazine.com/data-privacy/biometric-identification-knowing-who-and-where-you-are/.

[11] See Roy Maurer, More Employers Are Using Biometric Authentication, Society for Human Resource Management (Apr. 6, 2018), https://www.shrm.org/resourcesandtools/hr-topics/technology/pages/employers-using-biometric-authentication.aspx (discussing cybercriminals’ persistence and companies’ concerns over “losing proprietary information, and private communications getting out because of compromised security.”).

[12] See, e.g., id. (revealing one director of program management in Microsoft’s identity division, who claims that he “spends over $2 million in help desk calls a month helping people change their passwords,” which incentivized Microsoft to launch Windows Hello, “which uses face scans or fingerprints to log in to Windows devices.”).

[13] Meyer, supra note 10.

[14] See Andrew Griffin, iPhone X: Is Apple Really Building A Huge Database of People’s Faces? What Does Face ID Actually Do?, The Independent (Nov. 6, 2017, 1:45 PM), https://www.independent.co.uk/life-style/gadgets-and-tech/news/iphone-x-apple-face-id-facial-recognition-privacy-security-truth-real-safety-a8040131.html (detailing the tech giant’s insistence that while “Your phone knows what your face looks like, in minute detail . . . Apple doesn’t”).

[15] Sigal Samuel, The growing backlash against facial recognition tech, Vox (Apr. 27, 2019, 8:00 AM), https://www.vox.com/future-perfect/2019/4/27/18518598/ai-facial-recognition-ban-apple-amazon-microsoft.

[16] Karen Hao, The two-year fight to stop Amazon from selling face recognition to the police, MIT Technology Review (June 12, 2020), https://www.technologyreview.com/2020/06/12/1003482/amazon-stopped-selling-police-face-recognition-fight/.

[17] Id.

[18] Id.

[19] Brandi Vincent, Senators Call for a Moratorium on Government’s Use of Facial Recognition, Nextgov (Feb. 14, 2020), https://www.nextgov.com/emerging-tech/2020/02/senators-call-moratorium-governments-use-facial-recognition/163131/.

[20] Facial Recognition and Biometric Technology Moratorium Act of 2020, S. 4084, 116th Cong. (2020).

[21] See Danny Ross, Processing biometric data? Be careful, under the GDPR, The Privacy Advisor (Oct. 31, 2017), https://iapp.org/news/a/processing-biometric-data-be-careful-under-the-gdpr/ (commending the GDPR’s active approach to biometric data privacy, which classifies biometric data as a “sensitive category of personal information, warranting robust protection.”).

[22] See generally 2007 ILL. ALS 994, 2007 Ill. Laws 994, 2007 ILL. P.A. 994, 2007 ILL. SB 2400 (outlining in the BIPA statute the relevant benefits promised by biometric usage for “streamlining financial transactions and security screenings.”).

[23] Stewart, supra note 1, at 349.

[24] See 5 Ways to Reduce Risk Under Ill. Biometric Privacy Law, Law 360 (Feb. 19, 2019,  4:21 PM), https://www.law360.com/articles/1128573/5-ways-to-reduce-risk-under-ill-biometric-privacy-law (discussing the ways in which BIPA has now finally cleared a path for private rights of actions despite the early judicial deference to arguments that private individuals could not sue for a lack of Article III standing).

[25] Chris Hoffman, Seventh Circuit Suggests that Unions Can Negotiate Workers’ Biometric Data Privacy Rights with Employers, Am. Bar Assoc. (Aug. 14, 2019), https://www.americanbar.org/groups/business_law/publications/committee_newsletters/cyberspace/2019/201908/unions/.

[26] See generally 2007 ILL. ALS 994, 2007 Ill. Laws 994, 2007 ILL. P.A. 994, 2007 ILL. SB 2400.

[27] Jennifer Lynch & Adam Schwartz, Victory! Illinois Supreme Court Protects Biometric Privacy, Electronic Frontier Foundation (Jan. 25, 2019), https://www.eff.org/deeplinks/2019/01/victory-illinois-supreme-court-protects-biometric-privacy.

[28] Hanley Chew et al., Five Steps to Help Reduce Risk of Using Biometrics Following Illinois Supreme Court BIPA Ruling, Fenwick & West (Jan. 29, 2019), https://www.fenwick.com/publications/Pages/Five-Steps-to-Help-Reduce-Risk-of-Using-Biometrics-Following-Illinois-Supreme-Court-BIPA-Ruling.aspx.

[29] Id.

[30] Id.

[31] Diane Flannery et al., Does Continued Collection of The Same Biometric Information Increase BIPA Violations? The Seventh Circuit (or Illinois Supreme Court) Has an Opportunity to Clear the Air, McGuireWoods: Password Protected law (Oct. 16, 2020), https://www.passwordprotectedlaw.com/2020/10/biometric-information/.

[32] See generally Patel v. Facebook Inc., 290 F. Supp. 3d 948 (N.D. Cal. 2018).

[33] Id.

[34] Will Yakowicz, A New Lawsuit Says Snapchat is Illegally Collecting Biometric Data, Slate (July 19, 2016), https://slate.com/business/2016/07/snapchat-sued-under-illinois-biometric-information-usage-law.html.

[35] Linn F. Freedman, Google Sued Under Illinois Biometric Information Privacy Act, Lexology: Data Privacy + Security Insider (Oct. 3, 2019), https://www.lexology.com/library/detail.aspx?g=51dd0122-9399-48e9-b6ef-fc357760d387.

[36] Corrado Rizzi, Shutterfly Violated Illinois Biometric Law by Collecting, Storing Face Scans from Uploaded Photos, Class Action Says, ClassAction (June 14, 2019), https://www.classaction.org/news/shutterfly-violated-illinois-biometric-law-by-collecting-storing-face-scans-from-uploaded-photos-class-action-says.

[37] Patel v. Facebook Inc., 290 F. Supp. 3d at 953 (N.D. Cal. 2018).

[38] Id.

[39] Id.

[40] Devin Coldewey, Facebook will pay $550 million to settle class action lawsuit over privacy violations, Tech Crunch (Jan. 29, 2020), https://techcrunch.com/2020/01/29/facebook-will-pay-550-million-to-settle-class-action-lawsuit-over-privacy-violations/.

[41] Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, The New York Times (Feb. 10, 2020), https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html.

[42] Jennifer L. Henn, Clearview AI Wants Facial Recognition Class Action Lawsuit Dismissed, Top Class Actions (Oct. 12, 2020), https://topclassactions.com/lawsuit-settlements/privacy/clearview-ai-wants-facial-recognition-software-class-action-dismissed/.

[43] Luana Pascu, California residents file class action against Clearview AI biometric data collecting citing CCPA, Biometric Update (Mar. 16, 2020), https://www.biometricupdate.com/202003/california-residents-file-class-action-against-clearview-ai-biometric-data-collection-citing-ccpa.

[44] Jeffrey N. Rosenthal et al., Analyzing the CCPA’s Impact on the Biometric Privacy Landscape, Legaltech News (Oct. 2020), https://www.law.com/legaltechnews/2020/10/14/analyzing-the-ccpas-impact-on-the-biometric-privacy-landscape/?slreturn=20201126000324.

[45] Id.

[46] See id. (highlighting plaintiffs’ usage of the CCPA as “predicate for causes of action under [CA’s] plaintiff-friendly Unfair Competition Law . . .”).

[47] See, e.g., Michael Beckerman, Americans Will Pay a Price for State Privacy Laws, The N.Y. Times (Oct. 14, 2019), https://www.nytimes.com/2019/10/14/opinion/state-privacy-laws.html (suggesting that federal oversight “would allow Americans to benefit from multiple privacy protections, including the option to delete their data, transparency in data collection and the ability to move data among services.”).

[48] Standardized Regulatory Impact Assessment: California Consumer Privacy Act of 2018 Regulations, CA DOJ Office of the Attorney General at 11 (Aug. 2019), http://www.dof.ca.gov/Forecasting/Economics/Major_Regulations/Major_Regulations_Table/documents/CCPA_Regulations-SRIA-DOF.pdf.

[49] Beckerman, supra note 47.

[50] See id. (arguing that the results from the IBM Security and Ponemon Institute suggest that complying with various state laws for “California businesses could cost billions.”).

[51] Id.

[52] Flannery et al., supra note 31.

[53] See Michael D. Hayes, Robert J. Tomaso, & Anne M. Mayette, Overview of Recent Decisions Interpreting the Illinois Biometric Information Privacy Act, Husch Blackwell (Oct. 15, 2019) (highlighting Bimbo Bakeries’  discussion of what “each violation” meant and its effect on the necessary amount in controversy for federal cases).

[54] Id.

[55] Flannery et al., supra note 31.