Anything You Say May Be Used Against You: Corporate Voiceprint Tactics Trigger Latest Privacy & Security Concerns

By Shruti Panchavati*

“We raid speech for its semantic meaning, and then discard the voice like detritus leftovers.”[1]

I. Introduction

Work is underway to integrate various biometrics into mobile devices, but the human voice is a natural choice for businesses because public attention on voiceprinting is shockingly low.  For instance, it came as no surprise when privacy concerns took shape even as Apple unveiled the fingerprint scanner on its newest iPhone 5S;[2] lawmakers and advocates declared it a hacker’s “treasure trove.”[3]  And yet, despite obvious functional similarities, Apple’s voice-collecting assistant “Siri” has received little public scrutiny, suggesting a widespread misunderstanding about the human voice, one that mobile giants have been quick to exploit.[4]  The result is chilling: in the absence of legal and regulatory guidelines, these corporations could be on their way to creating the largest name-to-voice database, without even trying.

An increasing number of mobile companies are combining voiceprint technology with broad privacy policies to gain an unfettered right to collect, store, and use an individual’s data for an indefinite period of time.  This Article examines Apple’s voiceprint policy and argues that modern-day remedial strategies have failed to protect users’ privacy and security.  In response, states should adopt and implement California’s Right to Know Act, which would allow users to access and track their digital footprint.  Part II of this Article highlights the sweeping implications of corporate voiceprinting.  Part III exposes the wide-reaching privacy and security implications of Apple’s ill-named “Privacy” Policy.  Part IV recommends a practical, effective solution that balances the privacy concerns of the user against the commercial interests of the mobile industry.

II. An Audible Signature

Voiceprinting (also referred to as “voice biometrics”) creates a mathematical representation of the sound, pattern, pitch, and rhythm of an individual’s voice, which can then be used for any number of purposes, such as recognition or identification.[5]  The technology has the distinct advantage of basing authentication on an intrinsic human characteristic—the human voice.  It is our audible signature and, just as no two fingerprints are alike, no two voices are alike.[6]  It is also a powerful guide to the speaker’s most terrifyingly intimate details.[7]  With just a few words, the voice can reveal an individual’s gender, age, height, health, emotional state, sexual orientation, race, social class, education, and relationship to the person being spoken to.[8]  It is a remarkably rich resource that is largely taken for granted, in part because of the spread of mobile devices.
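The “mathematical representation” described above can be sketched, purely for illustration, in a few lines of Python.  This toy reduces a signal to three numbers (an estimated pitch, an energy level, and a zero-crossing rate); it is not any vendor’s actual algorithm, and real voice-biometric systems use far richer spectral models.

```python
# Toy illustration only: summarizing a voice signal as a small
# numeric "print." Real systems extract far richer features.
import numpy as np

def toy_voiceprint(signal, sample_rate):
    """Return (pitch_hz, energy, zero_crossing_rate) for a signal."""
    signal = np.asarray(signal, dtype=float)
    # Average signal power.
    energy = float(np.mean(signal ** 2))
    # Zero-crossing rate: fraction of samples where the waveform
    # changes sign (a crude voicing/noisiness measure).
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal))) > 0))
    # Crude pitch estimate: the autocorrelation peak within a
    # plausible human pitch range (50-400 Hz).
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo = sample_rate // 400   # smallest lag to consider (400 Hz)
    hi = sample_rate // 50    # largest lag to consider (50 Hz)
    lag = lo + int(np.argmax(ac[lo:hi]))
    pitch = sample_rate / lag
    return pitch, energy, zcr

# A synthetic "voice": a 120 Hz tone standing in for a speaker.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 120 * t)
pitch, energy, zcr = toy_voiceprint(tone, sr)
print(round(pitch, 1), round(energy, 3), round(zcr, 4))
```

Even this simplistic summary shows why voice data is biometric data: a handful of numbers derived from a few seconds of speech can characterize the speaker.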

Mobile technology appears to have dissociated the voice from the body, lulling the public into a false sense of security about corporate voiceprinting.  To see its implications, consider that financial service organizations have already implemented voice biometrics to let users check account balances, make payments, and track transactions using only their voice.[9]  Additionally, governments across the globe are investing in voice biometrics that would allow them to tuck away millions of voiceprints for surveillance and law enforcement.[10]  Indeed, the human voice is now more valuable than any password or PIN, and the widespread corporate collection, storage, and use of our audible signatures raise grave privacy and security concerns, prompting the question: can mobile companies be trusted to handle this technology responsibly?

III. Unraveling Apple’s Voiceprint Policy

On October 4, 2011, Apple unveiled the iPhone 4S with Siri, a built-in interactive personal assistant.[11]  While Siri was not the first foray into speech-recognition technology,[12] it is the most popular: within only five months of release, the iPhone 4S sold about 15.4 million units.[13]  Siri is undoubtedly a remarkable technological achievement, but combined with Apple’s overbroad Privacy Policy, it can have many unforeseeable consequences for unsuspecting users.

Apple’s iOS 7 Software License Agreement notes, in relevant part, that “[w]hen you use Siri or Dictation, the things you say will be recorded and sent to Apple in order to convert what you say into text and to process your requests.”[14]  In other words, anything said to Siri is recorded and sent to the company’s data center in North Carolina, where Apple converts the spoken words into a digital code.[15]  Not mentioned in the Privacy Policy is that the company assigns each user a randomized number and, as voice files from Siri requests are received, associates the data with that number.[16]  After six months, Apple “disassociates” the user number from the voice clip but maintains records of these disassociated files for up to eighteen months for “testing and product improvement purposes.”[17]  However, it remains unclear what Apple actually does when it “disassociates” these files, or what it means to use user voiceprints for “testing and product improvement purposes.”  Moreover, without any regulatory oversight, there is no guarantee that Apple ever actually deletes these records after eighteen months, or at all.
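The retention scheme the Policy describes can be sketched as pseudocode-style Python.  This is a hypothetical illustration of the described process (randomized per-user number, “disassociation” at six months, retention up to eighteen more), not Apple’s actual implementation; note that stripping the identifier removes the label but keeps the recording itself.

```python
# Hypothetical sketch of the described retention scheme; not
# Apple's code. Months are simulated as plain integers.
import uuid
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VoiceClip:
    audio: bytes
    user_number: Optional[str]  # randomized ID; None once "disassociated"
    age_months: int = 0

def age_store(store: List[VoiceClip], months: int) -> List[VoiceClip]:
    """Advance time and apply the described disassociation/retention rules."""
    kept = []
    for clip in store:
        clip.age_months += months
        if clip.age_months >= 6:
            clip.user_number = None   # strip the user number at 6 months...
        if clip.age_months < 24:      # ...but keep the audio itself for up
            kept.append(clip)         # to 18 more months (24 total)
    return kept

user_number = str(uuid.uuid4())       # randomized number assigned per user
store = [VoiceClip(audio=b"hey siri ...", user_number=user_number)]
store = age_store(store, months=7)
print(store[0].user_number)           # None: the ID has been stripped
print(len(store))                     # 1: the recording is still retained
```

The sketch makes the Article’s concern concrete: “disassociation” as described deletes only the linking number, while the voice data, which is itself identifying, persists.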

Siri’s Privacy Policy further states that “[b]y using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including [the user’s] voice input and User Data, to provide and improve Siri, Dictation, and dictation functionality in other Apple products and services.”[18]  The information collected includes “all types of data associated with your verbal commands and may also include audio recordings, transcripts of what [is] said, and related diagnostic data.”[19]  What Apple is describing is a voiceprint; thus, by accepting the licensing agreement, a user consents to the company’s collection, storage, and use of his or her voice biometric data.  Additionally, Apple gives itself the right to share this data with any of its unnamed partners and subsidiaries, without notice or cause, for an indefinite period of time.

It may be argued that Apple and similar companies know better than to misuse user information because doing so would be a poor public relations strategy, and there is no evidence that corporations are currently exploiting their position.[20]  However, the problem remains that no one—users, lawmakers, privacy advocates, or politicians—knows what is happening behind closed doors, and Apple is not saying either way.[21]  The personal data economy has become a largely elusive and highly lucrative world and, as always, the real concern in privacy and security is not what is happening, but what could happen.

IV. Recommendation & Conclusion

With the widespread use of voiceprint technology in mobile phones, it is no surprise that companies such as Apple have digital portfolios on each user.  Banning voice biometric technology is not a desirable option, and, admittedly, companies do need some information about a user and his or her preferences to operate applications, such as Siri, efficiently.[22]  However, present-day remedies do not provide sufficient protection against corporate intrusions and data theft.[23]

In the face of this dilemma, California’s “Right to Know” Act would set an unprecedented level of corporate transparency, giving users the right to access and track their own private data.[24]  Specifically, the Act requires any business that holds a customer’s personal information to disclose it within thirty days of the customer’s request, along with the names and contact information of all third parties with which the business has shared that customer’s data during the previous twelve months.[25]  Additionally, if the company refuses disclosure, the user has the legal right to bring a civil claim to force compliance.[26]  The Act mirrors the right to access data that is already available to residents of Europe, suggesting that technology giants, such as Apple, already have the procedures in place to respond.[27]  As more companies implement efficient strategies to facilitate the process, the Act will not only introduce corporate transparency into the digital age, but will likely make it the norm.

It may be argued that the Right to Know Act is too modest and does not actually give users the right to correct or delete their personal data.  These are certainly important considerations down the road and, in a perfect world, users would have full and complete control of all of their information.  However, it may be a long time, if ever, before such robust privacy and security strategies can be implemented.  In the meantime, the Right to Know Act is an important first step toward putting privacy and security back in the hands of the user.


*J.D. Candidate, University of Illinois College of Law, expected 2015.  B.S. Psychology with Neuroscience Option, Pennsylvania State University, 2012.  I am grateful to the editors of the Journal of Law, Technology, and Policy for their advice and insight on this piece.

[1] Anne Karpf, The Human Voice: How this Extraordinary Instrument Reveals Essential Clues About Who We Are 13 (Bloomsbury USA, 1st ed. 2006).

[2] Chenda Ngak, Should You Fear Apple’s Fingerprint Scanner?, CBS News (Sept. 24, 2013, 10:12 AM),

[3] Charlie Osborne, iPhone Fingerprint Scanner Sparks Privacy Worries, CNET (Sept. 17, 2013, 9:55 AM),

[4] See Kevin C. Tofel, How to Enable Experimental “OK Google” Voice Recognition on your Chromebook, Gigaom (Nov. 21, 2013, 8:33 AM), (noting that Google Voice is already a popular feature on the Android smartphone and Chrome).

[5] Authentify, Voice Biometric Authentication, (last visited Sept. 15, 2014).

[6] See id. (“A voice biometric or ‘voice print,’ is as unique to an individual as a palm or finger print.”).

[7] Karpf, supra note 1, at 10–11.

[8] Id.

[9] Omar Zaibak, 3 Banks Using Voice Biometrics for Security and Authentication, Voice Trust (Mar. 24, 2014),

[10] Noel Brinkerhoff, Governments Begin to Build Voice Print Databases, All Gov (Oct. 6, 2012),

[11] Press Release, Apple, Apple Launches iPhone 4S, iOS 5 & iCloud (Oct. 4, 2011), available at

[12] Bernadette Johnson, How Siri Works, HowStuffWorks, (last visited Sept. 15, 2014).

[13] Id.

[14] iOS Software License Agreement, Apple, available at (last visited Sept. 15, 2014) (emphasis in original).

[15] John W. Mashni & Nicholas M. Oertel, Does Apple’s Siri Record and Store Everything You Say?, Technology Law Blog (July 17, 2012),

[16] Eric Slivka, Anonymized Siri Voice Clips Stored by Apple for Up to Two Years, MacRumors (Apr. 19, 2013, 6:42 AM),

[17] Id.

[18] iOS Software License Agreement, supra note 14.

[19] John Weaver, Siri is My Client: A First Look at Artificial Intelligence and Legal Issues, N.H. B. J., Winter 2012, at 6, available at

[20] See Matthew Panzarino, Apple Says It Has Never Worked With NSA To Create iPhone Backdoors, Is Unaware of Alleged DROPOUT JEEP Snooping Program, Tech Crunch (Dec. 31, 2013), (indicating that Apple denied creating any iPhone “backdoors” for the National Security Agency that would allow NSA to monitor Apple’s users).

[21] Barbara Ortutay, Apple Privacy Concerns: Experts Slam Apple Over ‘Locationgate,’ The Huffington Post (June 28, 2011),

[22] iOS Software License Agreement, supra note 14.

[23] Australian Associated Press, Facebook Gave Government Information on Hundreds of Australian Users, The Guardian (Aug. 28, 2013, 2:41 AM) (noting the failure of a claim by an Austrian law student, who invoked a “habeas data” right by demanding Facebook data).

[24] Rainey Reitman, New California “Right to Know” Act would Let Consumers Find out who has their Personal Data—and Get a Copy of it, Electronic Frontier Foundation (Apr. 2, 2013),

[25] Assemb. B. 1291, 2013–2014 Leg., Reg. Sess. (Cal. 2013), available at

[26] Id.

[27] Reitman, supra note 24.