Caught Between Old Crimes And New Tech: Anti-Human Trafficking Efforts In The Modern Digital Age

By Jessica Wilkerson 

Introduction 

As society has become increasingly intertwined with and reliant upon the Internet, so have criminal investigations. While this explosion in digital evidence has in many ways been a boon—some commentators speak of a “golden age of surveillance”[1]—the growth and continued evolution of relevant technologies pose significant challenges to the prosecution of criminal acts.

This is especially true in the context of human trafficking investigations, which tend to heavily leverage digital infrastructures like mobile phones and the Internet. This article explores two evolving technologies—device encryption and DNS-over-HTTPS—to provide an explanation of how they work, and the challenges, both practical and legal, that they create for law enforcement efforts to combat human trafficking. In doing so, this article aims to create a deeper understanding of these technologies, dispel myths or confirm theories about their impacts, and explore proposals for ways in which necessary advancements in technology can, should, and must coexist with the needs of law enforcement to prosecute crime.

A.   Device Encryption

In human trafficking investigations, access to device-based evidence is crucial. For example, in a case involving the commercial sex trafficking of three minor girls, the defendant’s phone was critical in establishing that he took photos of the minor victims to advertise them online and that he then in fact created advertisements for the victims.[2] Additionally, the victims’ phones themselves served as evidence that the defendant had provided the phones for the purpose of receiving calls from clients.[3] The recovered evidence was therefore not only useful in establishing the substance of the trafficking charges,[4] but also for satisfying the necessary mens rea:[5] that based on the evidence gathered from the defendant’s phone, and the fact that he provided phones to the minor victims, “a jury could infer that [the defendant] knew” that the minor victims would be “caused to engage in a commercial sex act.”[6]

In fact, digital devices have become so central to human trafficking investigations that law enforcement officers seeking warrants often aver that “[a]ny cell phone or electronic device . . . may contain evidence of human trafficking” and that “individuals involved in [human trafficking] use cell phones and media devices to pay and post the advertisements for the victim’s services on the internet.”[7] However, as additional activity has moved online, and as digital devices have become the primary mechanisms through which individuals interact with the Internet, each other, and society at large, several companies have begun adding security measures to their products that have made access to this evidence more difficult. These measures have primarily involved the deployment of encryption.

1.   Deployment of Encryption on Devices

Unlike in the “flip phone” era, in which investigators could seize a phone and search it at their leisure with appropriate legal process or consent, as encryption began to spread across devices, data like texts, contacts, and more became “scrambled” and unreadable without the proper key. Due to practical, legal, and other obstacles to obtaining passcodes to render this information accessible,[8] law enforcement instead turned to the devices’ OS developers for assistance in bypassing the encryption.[9] At the time, law enforcement was able to receive such assistance because the encryption was software-based: the OS developers—typically Apple or Google—could bypass the encryption when served with legal process because they could bypass their own software.[10] However, over time both companies learned of secondary issues arising from this capability.[11] Consequently, both Apple and Google began making changes to their products to eliminate it.

Focusing on Apple, beginning with the iPhone 6 and iOS 8, the company made three key changes to the way its devices implement encryption and passcode use that significantly increased the deployment of both: Secure Enclave Processor (SEP) integration, changes to the set-up process to strongly encourage passcodes, and the introduction of the biometric authentication capabilities known as Touch ID[12] and Face ID.[13] Together, these changes make encryption the default, as well as hardware-based. Because the encryption is hardware-based, neither Apple nor Google (whose products have analogous capabilities) can bypass it.[14] And while few would argue that increased data protection for digital devices amounts to a net negative, the spread of default, strong encryption has impacted law enforcement investigations.[15]
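
To illustrate the underlying principle, the sketch below (in Python, using the third-party cryptography package) derives an encryption key from a passcode and encrypts sample data with it. It is a conceptual illustration only: the passcode, key-derivation parameters, and sample data are hypothetical, and a real device additionally entangles the passcode with a device-unique key held inside the Secure Enclave, which cannot be extracted. The point is simply that data encrypted under a passcode-derived key is unreadable to anyone, including the device’s manufacturer, who lacks the passcode.

```python
# Conceptual sketch only: real devices entangle the passcode with a
# hardware-bound key inside dedicated silicon, but the principle is similar.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stretch a short passcode into a key suitable for symmetric encryption."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))


salt = os.urandom(16)                     # stored alongside the ciphertext
key = derive_key("482916", salt)          # hypothetical six-digit passcode
ciphertext = Fernet(key).encrypt(b"texts, contacts, photos, location history")

# Without the passcode (and, on a real device, the hardware-bound key),
# recovering the plaintext from this ciphertext is computationally infeasible.
print(ciphertext[:40])
```

On an actual device, the hardware-bound portion of the key means that any attempt to guess the passcode must run on the device itself, which is why outside assistance from the OS developer is no longer technically possible.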

2.    Specific Impacts on Law Enforcement 

Law enforcement refers to the continued deployment of default, strong encryption as “Going Dark.”[16] As hyperbolic as the name may seem, the consequences are difficult to exaggerate. Law enforcement has lost the ability to search the devices of suspected terrorists, drug smugglers, human traffickers, and child predators whenever a suspect both uses a device with default, strong encryption and protects it with a passcode. Not only are investigators incapable of bypassing the encryption themselves; they can also no longer approach a given device’s OS developer for assistance, regardless of the legal process used.[17] For this reason, law enforcement often refers to default, strong encryption as “warrantless encryption.”[18]

In the context of the Going Dark debate, several proposals exist to address the issues created by default, strongly encrypted devices, including “backdoors.”[19] However, backdoors are notoriously difficult to design and operate in a secure manner,[20] and experts have roundly condemned them.[21] Another proposal involves so-called “legal hacking,” i.e., finding or purchasing exploits to bypass features such as encryption.[22] But exploitable vulnerabilities sometimes do not exist,[23] and exploits can become “stale” quickly, making legal hacking potentially expensive and unreliable.[24] Another alternative is metadata analysis,[25] which focuses on the trail of digital “footprints” that individuals leave behind as they navigate the Internet.[26] However, law enforcement argues that metadata simply cannot replace actual content that is oftentimes available only on encrypted devices.[27] Other commentators have suggested that better training can alleviate some Going Dark issues.[28] The final proposal involves a legal procedure known as compelled decryption, but this is a proposition that will lead to years of robust legal challenges.[29]

B.   DNS-over-HTTPS

Though few likely realize it, anyone who uses the Internet uses the Domain Name System (“DNS”), a backbone protocol of the Internet that translates human-readable web addresses like “www.law.edu” into computer-readable internet protocol (IP) addresses like 52.162.253.182. Because computers cannot route traffic using the former, and humans cannot memorize thousands of the latter, DNS serves as the modern Internet “phonebook,”[30] but it is one with limitations[31] and privacy concerns.[32]
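
To make the translation step concrete, the minimal Python sketch below asks the operating system’s standard resolver to translate a hostname into the IP addresses a computer actually uses to route traffic; the hostname is the illustrative one used above, and the port is simply the standard HTTPS port.

```python
# Ask the system's DNS resolver for the "phonebook" entry of a hostname.
import socket

hostname = "www.law.edu"  # illustrative hostname from the text above
addresses = {info[4][0] for info in socket.getaddrinfo(hostname, 443)}
print(f"{hostname} resolves to {sorted(addresses)}")
```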

1.    Traditional DNS and Human Trafficking Investigations 

The flip side of the privacy concerns related to DNS traffic is that same traffic’s usefulness in countering criminal and terrorist activity, including human trafficking. Such law enforcement use typically falls into three buckets: content blocking, the examination of internet browsing histories, and the receipt of internet subscriber information.

          a.     Content Blocking

Content blocking is the straightforward blocking of “known bad” content online. Ordinarily, DNS takes a request for a website, translates it, and returns the result. With content blocking, the resolver instead first checks the requested material against a “block list” and cuts off the translation if there is a match. The website then fails to load, and traffickers—sellers or customers—are unable to access the material.[33]

This prevents some traffickers from accessing the digital infrastructure they require to carry out their crimes.[34] In so doing, these block lists help prevent the spread of trafficking material and the ability of traffickers to connect both with their potential victims and their potential customers.
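
To make the mechanism concrete, the short Python sketch below shows the comparison step a resolver performs before translating a request. The block-list entries, function name, and wiring are hypothetical placeholders rather than any ISP’s actual implementation; real block lists, such as the IWF URL list noted above, are far larger and maintained by dedicated organizations.

```python
# Sketch of a resolver that checks requests against a block list before
# translating them; entries here are placeholders, not a real block list.
import socket

BLOCK_LIST = {"known-bad.example", "another-bad.example"}  # hypothetical entries


def resolve_with_blocking(hostname):
    """Refuse to translate block-listed domains; resolve everything else normally."""
    if hostname.lower().rstrip(".") in BLOCK_LIST:
        return None  # translation cut off: the site simply fails to load
    return socket.gethostbyname(hostname)


print(resolve_with_blocking("known-bad.example"))  # None: the request is blocked
print(resolve_with_blocking("www.iana.org"))       # ordinary translation proceeds
```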

          b.     Internet Browsing Histories

Law enforcement typically leverages internet browsing histories once investigators have already identified a suspect and have received legal authorization to gather information on that suspect’s Internet usage. Often, this involves a “trap-and-trace” order, pursuant to which Internet Service Providers (“ISPs”) will “mirror”[35] a suspect’s Internet browsing history. This allows investigators to see, for example, whether a suspect is visiting websites related to online advertisements for commercial sexual services, the sexual abuse of children, or other such trafficking-related sites.

Often, the records obtained through this process serve as critical evidence of not only the substantive elements of a given trafficking crime, but also the associated mens rea, i.e., that the visiting of an exploitative website was not a “mistake” or an “accident” when it—and others like it—were visited several times over an extended period of time.[36] In addition, the monitoring of such traffic helps identify other service providers that may have relevant information and who may then be served with legal process in the search for additional evidence.

          c.     Internet Subscriber Information

Finally, investigators often learn about the possible existence of traffickers by monitoring “known bad” material online. When it is accessed, investigators capture the IP addresses of the visitors, and then tie the captured IP addresses to physical identities by serving ISPs with appropriate legal process for the production of relevant internet subscriber information.[37] That information typically leads to a physical address, which investigators can then use to identify suspects who may have accessed the “known bad” material.

2.    DNS-over-HTTPS 

The investigative strategies detailed above each rely in one way or another on DNS. More to the point, they rely on a given ISP having access to unencrypted DNS traffic. However, recent developments related to the DNS protocol raise questions about their continued viability.

Specifically, an “updated” version of the traditional DNS protocol now exists that encrypts DNS traffic as it travels over the Internet: DNS-over-HTTPS (“DoH”). The basics of both protocols—DNS and DoH—remain the same. However, DoH makes two changes—the on-device encryption of DNS packets[38] and the use of a specialized DoH server, rather than a traditional DNS server[39]—that create two main downstream effects.

The first is the “blinding” of ISPs’ ability to view unencrypted DNS traffic. With DoH, the packets containing DNS requests are encrypted on the user’s machine.[40] Consequently, when those packets travel from the user’s machine to the ISP for routing, that ISP—unless that ISP is also the user’s DoH provider[41]—is no longer able to examine them to see which users are requesting which websites.

Second, the relocation of DNS traffic from ISPs to what are commonly referred to as “edge” providers[42] raises concerns about the different legal regimes under which the two groups operate.[43] The primary concern involves § 230 of the Communications Decency Act, which provides certain legal protections for website operators (edge providers).[44] Section 230 has been interpreted broadly, and has been used as a successful bar to lawsuits alleging damages proximately caused by or facilitated through websites, including in several cases of human trafficking.[45]

Critics fear that § 230 could be leveraged by edge providers to resist requests for DoH (DNS) data. However, § 230(e)(1) specifically states that § 230 “shall not be construed to impair the enforcement of [obscenity and child exploitation laws], or any other Federal criminal statute.”[46] The relevant statutes—including the Pen Register and Trap and Trace Act,[47] among others—would likely be considered such “Federal criminal statute[s],” even though they are procedural laws for collecting evidence rather than substantive laws that define offenses. Because no such challenge has yet been raised, however, the issue remains unsettled.
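
To make the first downstream effect described above, the “blinding” of ISPs, concrete, the Python sketch below performs a DoH lookup using the JSON-style query interface that some public DoH resolvers expose; the Cloudflare endpoint shown and the requests dependency are assumptions of the sketch, not features of the DoH specification itself. Because the DNS question and answer travel inside an ordinary encrypted HTTPS exchange, an ISP carrying this traffic sees only a TLS connection to the DoH provider, not the domain being requested.

```python
# Sketch of a DoH lookup: the DNS question rides inside an encrypted HTTPS
# request, so an on-path ISP cannot read which domain is being resolved.
import requests

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"  # one public DoH resolver


def doh_lookup(hostname):
    """Ask the DoH server to translate `hostname` via its JSON query interface."""
    resp = requests.get(
        DOH_ENDPOINT,
        params={"name": hostname, "type": "A"},
        headers={"Accept": "application/dns-json"},
        timeout=10,
    )
    resp.raise_for_status()
    answers = resp.json().get("Answer", [])
    return [record["data"] for record in answers if record.get("type") == 1]  # 1 = A record


print(doh_lookup("example.com"))
```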

3.    Specific Impacts of DoH Deployment on Investigative Techniques 

Above, this article explained that law enforcement use of traditional DNS traffic typically falls into three buckets: content blocking, the use of internet browsing histories, and the receipt of internet subscriber information.[48] It also highlighted that these investigative techniques rely on DNS traffic being unencrypted and available to a given ISP. Because DoH generally vitiates both of these characteristics, it will impact each technique.

With respect to content blocking, non-DoH-providing ISPs will lose the ability to perform content blocking entirely.[49] This is because such block lists require the comparison of unencrypted customer-requested websites against the block lists’ contents. Since these ISPs will no longer have visibility into the DNS requests themselves, they will no longer be able to perform such comparisons and the associated “blocks” of exploitative material.

With respect to internet browsing histories, non-DoH-providing ISPs will not be capable of turning over unencrypted—and therefore useful—records because they will not possess the key necessary to decrypt that traffic and make it usable to an investigation.[50]

The receipt of internet subscriber information, unlike the other two “buckets” of investigative techniques described above, will remain unchanged.[51] With regards to these three investigative techniques, then, the deployment of DoH will impact only the first two: content blocking and the receipt of internet browsing histories. However, these impacts may be limited or rendered irrelevant with concurrent updates to the investigative techniques themselves.

4.    Policy Proposals for Ensuring Continued Access to DNS Traffic 

As outlined above, the deployment of DoH creates several impacts[52] because over time investigative techniques have been built on the assumption that ISPs will have access to unencrypted DNS traffic. Thus, many concerns related to DoH deployment have centered around the “blinding” of ISPs.

While it is true that the two impacted techniques—content blocking and the receipt of internet browsing histories—are currently reliant on ISPs having access to unencrypted DNS traffic, they do not necessarily need to be. With adjustments, the techniques themselves can be updated to function independently of that access.

          a.     Content Blocking

Currently, ISPs can implement content blocking because they can view unencrypted DNS traffic and compare it to block lists of “known bad” websites. However, content blocking is dependent on ISPs only because stakeholders have chosen to make it so.

At its core, content blocking involves the comparison of a website request against a list of “known bad” materials. That core capability in no way depends on who performs the comparison; it depends only on that party having access to unencrypted DNS traffic. Thus, with the appropriate technological updates, DoH deployment will necessitate only a shift in where content blocking is performed.

Put bluntly, as a counterbalance to the loss of content blocking capability by ISPs as a direct result of DoH deployment, DoH providers should commit to performing appropriate content blocking through their own services. While the exact methods for doing so may not yet exist, the underlying technologies make it possible. Should DoH providers fail to take such preemptive, voluntary measures on their own, legislatures should require that such measures be undertaken by law.
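
One way to picture this shift is that the comparison sketched earlier simply runs on the DoH provider’s resolver, after the incoming request has been decrypted server-side. The fragment below is a minimal sketch under that assumption; the function name, block-list entries, and wiring are hypothetical.

```python
# Sketch: the same block-list comparison, now performed by the DoH provider,
# the only party able to read the decrypted request. Names are hypothetical.
import socket

BLOCK_LIST = {"known-bad.example"}  # hypothetical "known bad" entries


def handle_decrypted_request(hostname):
    """Server-side check the DoH provider can run once it decrypts the request."""
    if hostname.lower().rstrip(".") in BLOCK_LIST:
        return None  # refuse the translation: the exploitative site fails to load
    return socket.gethostbyname(hostname)
```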

          b.     Internet Browsing Histories

Similarly, law enforcement is currently able to request and receive internet browsing histories from ISPs because ISPs have access to unencrypted DNS traffic. But ISPs having such access is not the key to the investigative technique’s success. What matters is who has access to the unencrypted DNS traffic.

Like content blocking, then, DoH deployment will shift this capability from ISPs to DoH providers. Instead of serving process on ISPs, investigators should serve DoH providers. As discussed above, doing so may require clarification of which legal processes must be used, and whether there are any legal gaps. Where such gaps exist, appropriate reforms should be made.

Similarly, DoH providers should commit to assisting any such efforts, and should work with law enforcement to ensure continued availability of internet browsing histories to investigators. Should DoH providers prove resistant, legislatures should update their legal frameworks to specifically require that DoH providers develop the capabilities to perform content blocking where appropriate, and to assist law enforcement in gaining access to browsing histories upon receipt of appropriate legal process.

Conclusion 

The deployment of default, strong encryption on devices and the implementation of DoH are critically important for privacy and cybersecurity writ large, and both present significant challenges for law enforcement, particularly for those investigating human trafficking. However, the evolution of technology and the investigation of criminal activity need not exist in opposition to each other. As discussed in this article, avenues for reconciling these competing interests exist.


[1] ‘Going Dark’ Versus a ‘Golden Age for Surveillance’, Center for Democracy and Technology (Nov. 28, 2011), https://cdt.org/blog/%E2%80%98going-dark%E2%80%99-versus-a-%E2%80%98golden-age-for-surveillance%E2%80%99/.

[2] United States v. Corley, 679 F. App’x 1, 7 (2d Cir.), cert. denied, 138 S. Ct. 205, 199 L. Ed. 2d 135 (2017).

[3] Id.

[4] People v. Guyton, 20 Cal. App. 5th 499, 229 Cal. Rptr. 3d 117 (Ct. App. 2018), review denied (May 9, 2018); State v. Sholar, 2018 WI 53, 381 Wis. 2d 560, 912 N.W.2d 89 (photos, text messages, videos, and metadata from two phones formed the basis for the charges against two separate defendants for sex trafficking); State v. Jackson, 2018-NMCA-066, 429 P.3d 674, cert. denied (Oct. 15, 2018) (photos and emails associated with one defendant’s phone were used to establish a connection between the defendant and several ads placed on a website advertising the sexual services of several girls, some of whom were minors at the time that the trafficking occurred). Two separate 2019 cases describe how photos, text messages, and other documents provided evidence consistent with “pimping, pandering, and human trafficking.” People v. Calhoun, 38 Cal. App. 5th 275, 250 Cal. Rptr. 3d 623 (Ct. App. 2019), review denied (Oct. 30, 2019); People v. Boatwright, No. H044347, 2019 WL 5883683 (Cal. Ct. App. Nov. 12, 2019).

[5] People v. Higueros, No. 2D CRIM. B276709, 2018 WL 2112122 (Cal. Ct. App. May 8, 2018). Photographs and videos recovered from a defendant’s phone established not only that he had in fact had sexual relations with an underage female, but that his taking of the photos and videos, and his later sending them to a third party, provided “substantial evidence from which a reasonable trier of fact could determine that [the defendant] induced [the victim] to engage in a commercial sex act.” The defendant’s phone was thus critical in not only providing evidence of the various crimes that had occurred, but, once again, was necessary in establishing the mens rea of the human trafficking charge.

[6] Id.

[7] North Carolina v. Boykins, No. COA 18-949, 2019 WL 5721819 (N.C. Ct. App. Nov. 5, 2019). See also United States v. Rogers, No. 4:16CR39 JAR/NCC, 2017 WL 3476778 (E.D. Mo. July 19, 2017), report and recommendation adopted, No. 4:16CR00039 JAR, 2017 WL 3458371 (E.D. Mo. Aug. 11, 2017) (containing warrant language used by the investigating officer regarding how “cellular telephones are used to facilitate [human trafficking] crimes”).

[8] See, e.g., Orin Kerr, Compelled Decryption and the Privilege Against Self-Incrimination, 97 Texas L.R. 768, 768-799 (2019), https://texaslawreview.org/wp-content/uploads/2019/03/Kerr.V97.4.pdf (discussing the Fifth Amendment’s foregone conclusion doctrine as applied to law enforcement requests for device passcodes).

[9] Danny Lewis, What the All Writs Act of 1789 Has to Do With the iPhone, Smithsonian (Feb. 24, 2016) https://www.smithsonianmag.com/smart-news/what-all-writs-act-1789-has-do-iphone-180958188/.

[10] Answers to your questions about Apple and security, Apple, https://www.apple.com/customer-letter/answers/ (“We’ve built progressively stronger protections into our products with each new software release”).

[11] Patrick Howell O’Neill, Cellebrite: Hacking into iPhones is Harder than Ever, CyberScoop (Oct. 6, 2017) https://www.cyberscoop.com/cellebrite-iphone-8-hacking/ (describing how Apple’s updated designs have made it harder for companies who sell device cracking capabilities to bypass security and encryption controls); WhatsApp, Inc. and Facebook, Inc.  v. NSO Group Tech. Ltd. and Q Cyber Tech. Ltd., https://www.documentcloud.org/documents/6532441-WhatsApp-Facebook-v-NSO-Group.html (lawsuit by WhatsApp/Facebook accusing NSO Group of impermissibly exploiting WhatsApp software in violation of the United States Computer Fraud and Abuse Act).

[12] See iOS Security iOS 12.3, https://www.apple.com/business/docs/site/iOS_Security_Guide.pdf (“Touch ID is the fingerprint sensing system that makes secure access to iPhone and iPad faster and easier”).

[13] Starting with the iPhone X, released in 2017, the company began offering iPhones and other similar devices with facial recognition. Touch ID and Face ID both require that a given device also have an associated passcode; enabling either biometric feature requires first setting a passcode, which also serves as a backup. Id.

[14] Apple’s integration of the “Secure Enclave Processor” (SEP) into its products moves encryption to the hardware level. This change eliminates the ability of a given device’s OS developer to bypass or otherwise “work around” the encryption. Matthew Green, Why can’t Apple decrypt your iPhone?, A Few Thoughts On Cryptographic Engineering (Oct. 4, 2014), https://blog.cryptographyengineering.com/2014/10/04/why-cant-apple-decrypt-your-iphone/ (detailing the changes to Apple’s design that moved the encryption functionality from the software to the hardware).

[15] See infra Section 2.

[16] Going Dark — FBI, Federal Bureau of Investigation, https://www.fbi.gov/services/operational-technology/going-dark.

[17] See supra note 14.

[18] Chris Bing, Here comes the next round of encryption legislation, CyberScoop (Apr. 3, 2018) https://www.cyberscoop.com/new-encryption-bill-dianne-feinstein-chuck-grassley-senate-judiciary-comittee/.

[19] Report of the Manhattan District Attorney’s Office on Smartphone Encryption and Public Safety (Nov. 15, 2015),  https://cyber.harvard.edu/pubrelease/dont-panic/DA_Report_Smartphone_Encryption_Public_Safety_11182015.pdf (“Congress should enact a statute that requires any designer of an operating system for a smartphone or tablet manufactured, leased, or sold in the U.S. to ensure that data on its devices is accessible pursuant to a search warrant”).

[20] Harold Abelson et al., Keys under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications (Cambridge, MA: MIT Computer Science & Artificial Intelligence Lab, 2015), 10, https://people.csail.mit.edu/rivest/pubs/AABBx15x.pdf (providing technical discussion of practical difficulty in securing backdoor capable/”exceptional access” systems).

[21] Id.; Andrea Peterson, The debate over government ‘backdoors’ into encryption isn’t just happening in the U.S., The Wash. Post (Jan. 11, 2016) https://www.washingtonpost.com/news/the-switch/wp/2016/01/11/the-debate-over-government-backdoors-into-encryption-isnt-just-happening-in-the-u-s/.

[22] Cellebrite, https://www.cellebrite.com/en/home/; NSO Group, https://www.nsogroup.com/; Neri Zilber, The Rise of the Cyber Mercenaries, Foreign Policy (Aug. 31, 2018) https://foreignpolicy.com/2018/08/31/the-rise-of-the-cyber-mercenaries-israel-nso/.

[23] Andrea Peterson, Inside the economics of hacking, The Wash. Post (Nov. 5, 2015), https://www.washingtonpost.com/news/the-switch/wp/2015/11/05/inside-the-economics-of-hacking/.

[24] Dan Goodin, Zeroday exploit prices are higher than ever, especially for iOS and messaging apps, Ars Technica (Jan. 7, 2019)  https://arstechnica.com/information-technology/2019/01/zeroday-exploit-prices-continue-to-soar-especially-for-ios-and-messaging-apps/.

[25] Steven M. Bellovin, Matt Blaze, Sandy Clark & Susan Landau, Going Bright: Wiretapping Without Weakening Communications Infrastructure, IEEE SECURITY & PRIVACY, Jan/Feb 2013, at 64–66, available at https://www.cs.columbia.edu/~smb/papers/GoingBright.pdf (discussing ability of law enforcement/others to combine disparate types of digital data to obtain robust evidence).

[26] Id.; Peter Swire and Kenesa Ahmad, Encryption and Globalization (November 16, 2011). Columbia Science and Technology Law Review, Vol. 23, 2012; Ohio State Public Law Working Paper No. 157. Available at http://dx.doi.org/10.2139/ssrn.1960602.

[27] Encryption Working Group Year End Report, H. Comm. on the Judiciary & H. Comm. on Energy and Commerce, (Dec. 2016),  http://energycommerce.house.gov/sites/republicans.energycommerce.house.gov/files/documents/114/analysis/20161219EWGFINALReport_0.pdf.

[28] They argue that law enforcement is not sufficiently aware of the non-device-based digital evidence that exists, and as a result, investigators have become unnecessarily reliant on gaining access to devices. Relatedly, these commentators also point to cases where law enforcement has served companies with the wrong types of legal process and cases where law enforcement has asked companies for data that they do not have, or for another company’s data. Id. While better training is certainly needed, offering training at sufficient size, scale, or comprehensiveness to make up for the value of having a fully decrypted device in a given investigator’s hand, available when and where that investigator needs it, is infeasible. The Bureau of Justice Statistics reports that there are over 18,000 law enforcement agencies in the country. They vary in size, budget, sophistication, location, and in nearly all other respects, and all of them are faced with thousands of investigations per year, each presenting unique circumstances and challenges. Brian Reaves, Census of State and Local Law Enforcement Agencies, 2008, Dept. of Justice (2008), https://www.bjs.gov/content/pub/pdf/csllea08.pdf.

[29] See supra note 8.

[30] That said, DNS has its problems. Common misconfigurations, errors, or bugs in certain implementations can render DNS servers inaccessible to the systems that need them, essentially shutting off Internet access for those unable to bypass the servers by requesting a site’s IP address directly. Incorrect entries in DNS translation tables can be difficult to detect and even more difficult to fix.

[31] The number one issue with traditional DNS, and the one that has prompted hundreds of hours of discussion and pages upon pages of proposals for improvement, is traditional DNS’s operation in “the clear.” DNS Over HTTPS (doh), IETF Data Tracker, https://datatracker.ietf.org/wg/doh/about/; Geoff Huston, DOH! DNS over HTTPS explained, Asia-Pacific Network Information Centre (Oct. 12, 2018) https://blog.apnic.net/2018/10/12/doh-dns-over-https-explained/. In other words, traditional DNS is unencrypted, which allows anyone who is able to passively observe DNS traffic—an extremely low bar to clear—to identify which websites are being visited by which Internet users.

[32] The privacy impacts have been raised numerous times, over numerous years. Jon Brodkin, FTC investigates whether ISPs sell your browsing history and location data, Ars Technica (Mar. 27, 2019) https://arstechnica.com/tech-policy/2019/03/ftc-investigates-whether-isps-sell-your-browsing-history-and-location-data/. The Federal Communications Commission in 2016 implemented broadband privacy rules to, in part, address this issue, although those rules were then rolled back in 2017 by Congress. Kimberly Kindy, How Congress dismantled federal Internet privacy rules, The Wash. Post (May 30, 2017) https://www.washingtonpost.com/politics/how-congress-dismantled-federal-internet-privacy-rules/2017/05/29/7ad06e14-2f5b-11e7-8674-437ddb6e813e_story.html; Protecting the Privacy of Customers of Broadband and Other Telecommunications Services, PL 115-22, April 3, 2017, 131 Stat 88, https://www.congress.gov/115/plaws/publ22/PLAW-115publ22.pdf.

[33] Such block lists may include, for example, known sites where child sexual abuse is livestreamed for both national and international consumption. See e.g., URL List | Internet Watch Foundation, https://www.iwf.org.uk/become-a-member/services-for-members/url-list (“The IWF URL List is a list of webpages where we’ve found images and videos of child sexual abuse”).

[34] Exposing child victims: The catastrophic impact of DNS-over-HTTPS, Internet Watch Foundation (June 10, 2019), https://www.iwf.org.uk/news/exposing-child-victims-catastrophic-impact-of-dns-over-https.

[35] In cases where internet traffic is “mirrored,” it is duplicated in real-time and sent to a designated web service controlled by a given law enforcement agency, who may then examine it. See, e.g.,  How Traffic Mirroring Works, Amazon Web Services, https://docs.aws.amazon.com/vpc/latest/mirroring/traffic-mirroring-how-it-works.html.

[36] The human trafficking offenses contained in 18 U.S.C.A. §§ 1589-92 each require that a defendant “knowingly” undertake to cause a victim to be trafficked through force, fraud, or coercion. Under federal law, this requires a showing that a defendant was aware of his or her actions and did not act or fail to act out of ignorance, mistake, or accident. See, e.g., 1A Fed. Jury Prac. & Instr. § 17:04 (6th ed.) (“The term “knowingly”, as used in these instructions to describe the alleged state of mind of Defendant, means that [he] [she] was conscious and aware of [his] [her] [action] [omission], realized what [he] [she] was doing or what was happening around [him] [her], and did not [act] [fail to act] because of ignorance, mistake, or accident”).

[37] United States v. Acosta, 619 F.3d 956 (8th Cir. 2010) (FBI agent posted fake child sexual abuse materials on known exploitative website and collected IP addresses that attempted to access the materials, one of which was then associated with defendant); In re Austin B., 208 A.3d 1178 (R.I. 2019) (police detective observed IP address eventually linked to suspect being used to connect to “peer-to-peer” file-sharing system in order to access and download child sexual abuse images).

[38] When using DoH, unlike with traditional DNS, the “packet” containing the DNS request—which behaves on the Internet much like a letter behaves within the physical mail system—is encrypted before it leaves a given user’s machine. Thus, when that DNS packet leaves a user’s device and travels out into the Internet, it is indecipherable to anyone but the holder of the appropriate decryption key.

[39] Instead of being sent to a traditional DNS server—typically run by a given individual’s or business’ Internet Service Provider (ISP), such as Verizon, Comcast, etc.—DoH packets are sent to a DoH server. Only once it reaches the DoH server is the packet decrypted, the DNS request analyzed, and the appropriate IP address for a given website looked up in the DoH server’s “phonebook” of translation tables. At that point, the same process is then completed in reverse: the packet is re-encrypted with the translated IP address now stored in the “letter,” and sent back to the user.

[40] See supra note 38.

[41] While at the time of this writing, there are few ISPs offering DoH services, there is nothing to stop ISPs who have in the past deployed traditional DNS from offering DoH services as well.

[42] In the Matter of Preserving the Open Internet Broadband Indus. Practices, 25 F.C.C. Rcd. 17905 (2010) (defining edge providers as those “providing content, applications, services, and devices accessed over or connected to broadband Internet access service (‘edge’ products and services)”).

[43] Letter from 20 Child Safety Groups to the Hon. Lindsey Graham, the Hon. Dianne Feinstein, S. Comm. on the Judiciary, the Hon. Jerrold Nadler, the Hon. Doug Collins, H. Comm. on Judiciary, the Hon. Roger Wicker, the Hon. Maria Cantwell, S. Comm. on Commerce, Science, and Transportation, the Hon. Frank Pallone, the Hon. Greg Walden, H. Comm. on Energy & Commerce, (Sep. 17, 2019), https://endsexualexploitation.org/articles/20-child-safety-groups-call-on-congress-stop-google-before-it-jeopardizes-online-child-safety/.

[44] 47 U.S.C.A. § 230 (West); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 19 (1st Cir. 2016).

[45] Id. at 29 (dismissing Trafficking Victims Protection Act claims against website and explaining “The appellants’ core argument is that Backpage has tailored its website to make sex trafficking easier. Aided by the amici, the appellants have made a persuasive case for that proposition. But Congress did not sound an uncertain trumpet when it enacted the CDA, and it chose to grant broad protections to internet publishers. Showing that a website operates through a meretricious business model is not enough to strip away those protections”). See also M.A. ex rel. P.K. v. Vill. Voice Media Holdings, LLC, 809 F. Supp. 2d 1041 (E.D. Mo. 2011); J.S. v. Vill. Voice Media Holdings, L.L.C., 184 Wash. 2d 95, 359 P.3d 714 (2015); and Backpage.com, LLC v. Cooper, 939 F. Supp. 2d 805 (M.D. Tenn. 2013) (dismissing alleged trafficking offenses on CDA 230 grounds).

[46] 47 U.S.C.A. § 230 (West).

[47] 18 U.S.C.A. §§ 3121 et seq. (West).

[48] See supra Section 1(a)-(c).

[49] It is important to note that this loss of capability will not occur where an ISP is also a given customer’s DoH provider. In that case, the ISP-as-DoH-provider will possess the appropriate decryption key to “unlock” the DNS request packet, and therefore will have access to the packet’s unencrypted contents: the website request itself. In such cases, the deployment of DoH becomes, essentially, irrelevant to the ISP’s ability to perform content blocking. While it will require updated techniques and strategies, ISPs who choose to offer DoH will be able to continue performing content blocking unimpeded.

[50] This will change as economic and other incentives drive ISPs to deploy DoH. Such efforts are already underway. See Encrypted DNS Deployment Initiative, https://www.encrypted-dns.org/.

[51] This is because, unlike the other two techniques, the receipt of internet subscriber information does not depend on DNS at all. The assignment of IP addresses to subscribers, and the records that link those addresses to physical identities, are generated by an entirely separate process that takes place prior to and apart from DNS translation.

[52] See supra Section 3.