


Practical Pieces & Perspectives

February 15, 2021
Caught Between Old Crimes And New Tech: Anti-Human Trafficking Efforts In The Modern Digital Age
Jessica Wilkerson
Introduction 

As society has become increasingly intertwined with and reliant upon the Internet, so too have criminal investigations, which now draw on an ever-expanding pool of digital evidence. While this explosion in digital evidence has in many ways been a boon—some commentators speak of a “golden era of surveillance”[1]—the growth and continued evolution of the relevant technologies pose significant challenges to the prosecution of criminal acts.

 

This is especially true in the context of human trafficking investigations, which tend to heavily leverage digital infrastructures like mobile phones and the Internet. This article explores two evolving technologies—device encryption and DNS-over-HTTPS—to provide an explanation of how they work, and the challenges, both practical and legal, that they create for law enforcement efforts to combat human trafficking. In doing so, this article aims to create a deeper understanding of these technologies, dispel myths or confirm theories about their impacts, and explore proposals for ways in which necessary advancements in technology can, should, and must coexist with the needs of law enforcement to prosecute crime.

 

A.   Device Encryption

In human trafficking investigations, access to device-based evidence is crucial. For example, in a case involving the commercial sex trafficking of three minor girls, the defendant’s phone was critical in establishing that he took photos of the minor victims to advertise them online and that he then in fact created advertisements for the victims.[2] Additionally, the victims’ phones themselves served as evidence that the defendant had provided the phones for the purpose of receiving calls from clients.[3] The recovered evidence was therefore not only useful in establishing the substance of the trafficking charges,[4] but also for satisfying the necessary mens rea:[5] that based on the evidence gathered from the defendant’s phone, and the fact that he provided phones to the minor victims, “a jury could infer that [the defendant] knew” that the minor victims would be “caused to engage in a commercial sex act.”[6]

 

In fact, digital devices have become so central to human trafficking investigations that law enforcement officers seeking warrants often aver that “[a]ny cell phone or electronic device . . . may contain evidence of human trafficking” and that “individuals involved in [human trafficking] use cell phones and media devices to pay and post the advertisements for the victim’s services on the internet.”[7] However, as additional activity has moved online, and as digital devices have become the primary mechanisms through which individuals interact with the Internet, each other, and society at large, several companies have begun adding security measures to their products that have made access to this evidence more difficult. These measures have primarily involved the deployment of encryption.

 

1.   Deployment of Encryption on Devices

In the “flip phone” era, investigators could seize a phone and, with appropriate legal process or consent, search it at their leisure. As encryption began to spread across devices, however, data like texts and contacts became “scrambled” and unreadable without a passcode. Due to practical, legal, and other obstacles to obtaining passcodes to render this information accessible,[8] law enforcement instead turned to the devices’ OS developers for assistance in bypassing the encryption.[9] At the time, such assistance was possible because the encryption was software-based: the OS developers—typically Apple or Google—could bypass the encryption when served with legal process because they could bypass their own software.[10] However, over time both companies learned of secondary issues arising from this capability.[11] Consequently, both Apple and Google began making changes to their products to eliminate it.

Focusing on Apple: beginning with the iPhone 6 and iOS 8, Apple made three key changes to the way its devices implement encryption and passcode use that significantly increased the deployment of both: Secure Enclave Processor (SEP) integration, changes to the set-up process to strongly encourage passcodes, and the introduction of the biometric authentication capabilities known as Touch ID[12] and Face ID.[13] Together, these changes made encryption both default and hardware-based. Because it is hardware-based, neither Apple nor Google (whose products have analogous capabilities) can bypass it.[14] And while few would argue that increased data protection for digital devices amounts to a net negative, the spread of default, strong encryption has impacted law enforcement investigations.[15]
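To illustrate why hardware-entangled, passcode-based encryption locks out even the device’s own manufacturer, consider the following simplified Python sketch. It is a conceptual model only, not Apple’s or Google’s actual key hierarchy; the hardware secret, key sizes, and iteration count are illustrative assumptions.

# Conceptual sketch: why passcode-entangled, hardware-based encryption
# cannot be bypassed by the device vendor. Simplified illustration only,
# not Apple's or Google's actual design.
import hashlib
import os
import secrets

# Stand-in for a secret fused into the device's secure hardware at
# manufacture; in real designs it never leaves that hardware.
HARDWARE_UID = secrets.token_bytes(32)

def derive_data_key(passcode: str, salt: bytes) -> bytes:
    """Derive the data-encryption key from BOTH the passcode and the
    hardware-bound secret, so that neither alone is sufficient."""
    passcode_material = hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), salt, iterations=100_000
    )
    return hashlib.sha256(HARDWARE_UID + passcode_material).digest()

salt = os.urandom(16)
key = derive_data_key("123456", salt)

# Without the correct passcode, the key cannot be re-derived, even by a
# party that knows the algorithm (such as the OS developer).
assert derive_data_key("123456", salt) == key
assert derive_data_key("guess", salt) != key

Because the key depends both on a secret bound to the device’s hardware and on a passcode known only to the user, no amount of cooperation from the software’s author can reproduce it.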

 

2.    Specific Impacts on Law Enforcement 

Law enforcement refers to the continued deployment of default, strong encryption as “Going Dark.”[16] As hyperbolic as the name may seem, the consequences are difficult to exaggerate. Law enforcement has lost the ability to search the devices of suspected terrorists, drug smugglers, human traffickers, and child predators whenever the suspect uses a device with default, strong encryption and protects it with a passcode. Not only are investigators incapable of bypassing the encryption themselves, but they also can no longer approach a given device’s OS developer for assistance, regardless of the legal process used.[17] For this reason, law enforcement often refers to default, strong encryption as “warrantless encryption.”[18]

 

In the context of the Going Dark debate, several proposals exist to address the issues created by default, strongly encrypted devices, including “backdoors.”[19] However, backdoors are notoriously difficult to design and operate in a secure manner,[20] and experts have widely condemned them.[21] Another proposal involves so-called “legal hacking,” i.e., finding or purchasing exploits to bypass features such as encryption.[22] But exploitable vulnerabilities sometimes do not exist,[23] and exploits can become “stale” quickly, making legal hacking potentially expensive and unreliable.[24] Another alternative is metadata analysis,[25] which focuses on the trail of digital “footprints” that individuals leave behind as they navigate the Internet.[26] However, law enforcement argues that metadata simply cannot replace the actual content that is oftentimes only available on encrypted devices.[27] Other commentators have suggested that better training can alleviate some Going Dark issues.[28] The final proposal involves a legal procedure known as compelled decryption, but this is a proposition that will lead to years of robust legal challenges.[29]

 

B.   DNS-over-HTTPS

Though few likely realize it, anyone who uses the Internet uses the Domain Name System (“DNS”), a backbone protocol of the Internet that translates human-readable web addresses like “www.law.edu” into computer-readable internet protocol (IP) addresses like 52.162.253.182. Because computers cannot route traffic using the former, and humans cannot memorize thousands of the latter, DNS serves as the modern Internet’s “phonebook,”[30] but it is one with limitations[31] and privacy concerns.[32]
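As a concrete illustration of the translation DNS performs, the short Python sketch below asks the operating system’s resolver to translate the example hostname used above; the specific result will vary by network and over time.

# Minimal illustration of the DNS "phonebook": translate a human-readable
# hostname into a machine-routable IP address.
import socket

hostname = "www.law.edu"  # example hostname from the text
ip_address = socket.gethostbyname(hostname)  # ordinary, unencrypted DNS lookup
print(f"{hostname} resolves to {ip_address}")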

 

1.    Traditional DNS and Human Trafficking Investigations 

The flip side of the privacy concerns related to DNS traffic is its use in countering criminal and terrorist activity, including human trafficking. Such law enforcement use typically falls into three buckets: content blocking, the examination of internet browsing histories, and the receipt of internet subscriber information.

 

          a.     Content Blocking

Content blocking is the practice of blocking “known bad” content online. Ordinarily, DNS takes a request for a website, translates it, and returns the result; with content blocking, the resolver instead checks the requested website against a “block list” and cuts off the translation if there is a match. The website then fails to load, and traffickers, whether sellers or customers, are unable to access the material.[33]
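The comparison at the heart of content blocking can be sketched in a few lines of Python. The block-list entries and resolver structure below are hypothetical, intended only to show the matching step described above, not any ISP’s actual implementation.

# Hypothetical sketch of DNS-based content blocking: before resolving a
# requested domain, check it against a "known bad" block list and refuse
# to translate the name if there is a match.
import socket
from typing import Optional

BLOCK_LIST = {"known-bad-example.invalid"}  # hypothetical entries

def resolve_with_blocking(domain: str) -> Optional[str]:
    if domain.lower() in BLOCK_LIST:
        # Cut off the translation: the website simply fails to load.
        return None
    return socket.gethostbyname(domain)

print(resolve_with_blocking("known-bad-example.invalid"))  # None (blocked)
print(resolve_with_blocking("www.law.edu"))                # an IP address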

 

This prevents some traffickers from accessing the digital infrastructure they require to carry out their crimes.[34] In so doing, these block lists help prevent the spread of trafficking material and the ability of traffickers to connect both with their potential victims and their potential customers.

 

          b.     Internet Browsing Histories

Law enforcement typically leverages internet browsing histories once investigators have identified a suspect and received legal authorization to gather information on that suspect’s Internet usage. Often, this involves a “trap-and-trace” order, pursuant to which Internet Service Providers (“ISPs”) will “mirror”[35] a suspect’s Internet browsing history. This allows investigators to see, for example, whether a suspect is visiting websites related to online advertisements for commercial sexual services, the sexual abuse of children, or other trafficking-related sites.

 

Often, the records produced through this process serve as critical evidence not only of the substantive elements of a given trafficking crime, but also of the associated mens rea, i.e., that visiting an exploitative website was not a “mistake” or an “accident” when it, and others like it, were visited several times over an extended period of time.[36] In addition, the monitoring of such traffic helps identify other service providers that may have relevant information and who may then be served with legal process in the search for additional evidence.

 

          c.     Internet Subscriber Information

Finally, investigators often learn about the possible existence of traffickers by monitoring “known bad” material online. When it is accessed, investigators capture the IP addresses of the visitors, and then tie the captured IP addresses to physical identities by serving ISPs with appropriate legal process for the production of relevant internet subscriber information.[37] That information typically leads to a physical address, which investigators can then use to identify suspects who may have accessed the “known bad” material.

 

2.    DNS-over-HTTPS 

The investigative strategies detailed above each rely in one way or another on DNS. More to the point, they rely on a given ISP having access to unencrypted DNS traffic. However, recent developments related to the DNS protocol raise questions about the continued viability of these strategies.

 

Specifically, an “updated” version of the traditional DNS protocol now exists that encrypts DNS traffic as it travels over the Internet: DNS-over-HTTPS (“DoH”). The basics of both protocols—DNS and DoH—remain the same. However, DoH makes two changes—the on-device encryption of DNS packets[38] and the use of a specialized DoH server, rather than a traditional DNS server[39]—that create two main downstream effects.
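To make these mechanics concrete, the sketch below performs a lookup over DoH using one public resolver’s JSON interface; the choice of Cloudflare’s endpoint and the use of the requests library are assumptions made for illustration, and production DoH clients typically use the binary wire format defined in RFC 8484.

# Illustrative DoH lookup via a public resolver's JSON interface.
# The resolver URL and the requests library are assumptions for this sketch.
import requests

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"  # one public DoH resolver

def doh_lookup(domain: str) -> list:
    # The DNS question travels inside an ordinary HTTPS request, so
    # intermediaries such as the user's ISP see only encrypted traffic
    # addressed to the DoH provider.
    response = requests.get(
        DOH_ENDPOINT,
        params={"name": domain, "type": "A"},
        headers={"Accept": "application/dns-json"},
        timeout=5,
    )
    response.raise_for_status()
    return [answer["data"] for answer in response.json().get("Answer", [])]

print(doh_lookup("www.law.edu"))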

 

The first is the “blinding” of ISPs’ ability to view unencrypted DNS traffic. With DoH, the packets containing DNS requests are encrypted on the user’s machine.[40] Consequently, when those packets travel from the user’s machine to the ISP for routing, that ISP—unless that ISP is also the user’s DoH provider[41]—is no longer able to examine them to see which users are requesting which websites.

 

Second, the relocation of DNS traffic from ISPs to what are commonly referred to as “edge” providers[42] raises concerns about the different legal regimes under which the two groups operate.[43] The primary concern involves § 230 of the Communications Decency Act, which provides certain legal protections for website operators (edge providers).[44] Section 230 has been interpreted broadly, and has been used as a successful bar to lawsuits alleging damages proximately caused by or facilitated through websites, including in several cases of human trafficking.[45]

 

Critics fear that § 230 could be leveraged by edge providers to resist requests for DoH (DNS) data. However, § 230(e)(1) specifically states that § 230 “shall not be construed to impair the enforcement of [obscenity and child exploitation laws], or any other Federal criminal statute.”[46] The relevant statutes, including the Pen Register and Trap and Trace Act,[47] among others, would likely be considered such “Federal criminal statute[s],” even though they are procedural laws governing the collection of evidence rather than substantive laws defining offenses. However, because such a challenge has not yet been raised, the issue remains unsettled.

 

3.    Specific Impacts of DoH Deployment to Investigative Techniques 

Above, this article explained that law enforcement use of traditional DNS traffic typically falls into three buckets: content blocking, the examination of internet browsing histories, and the receipt of internet subscriber information.[48] It also highlighted that these investigative techniques rely on DNS traffic being unencrypted and available to a given ISP. Because DoH generally vitiates both of these characteristics, it will impact each technique.

 

With respect to content blocking, non-DoH-providing ISPs will entirely lose the ability to perform content blocking.[49] This is because block lists require the comparison of unencrypted, customer-requested websites against the block lists’ contents. Since ISPs will no longer have visibility into the DNS requests themselves, they will no longer be able to perform such comparisons and the associated “blocks” of exploitative material.

 

With respect to internet browsing histories, non-DoH-providing ISPs will not be capable of turning over unencrypted, and therefore useful, records because they will not possess the key necessary to decrypt that traffic and make it usable to an investigation.[50]

 

The receipt of internet subscriber information, unlike the other two “buckets” of investigative techniques described above, will remain unchanged.[51] With regards to these three investigative techniques, then, the deployment of DoH will impact only the first two: content blocking and the receipt of internet browsing histories. However, these impacts may be limited or rendered irrelevant with concurrent updates to the investigative techniques themselves.

 

4.    Policy Proposals for Ensuring Continued Access to DNS Traffic 

As outlined above, the deployment of DoH creates several impacts[52] because over time investigative techniques have been built on the assumption that ISPs will have access to unencrypted DNS traffic. Thus, many concerns related to DoH deployment have centered around the “blinding” of ISPs.

 

While it is true that the two impacted techniques—content blocking and the receipt of internet browsing histories—are currently reliant on ISPs having access to unencrypted DNS traffic, they do not necessarily need to be. With adjustments, the techniques themselves may be updated to exist independently.

 

          a.     Content Blocking

Currently, ISPs may implement content blocking because they may view unencrypted DNS traffic and compare it to block lists of “known bad” websites. However, content blocking is currently dependent on ISPs only because stakeholders have chosen to make it so.

 

At its core, content blocking involves the comparison of a website request against a list of “known bad” materials. That core capability in no way relies upon who is doing that comparison: instead, it relies upon who has access to unencrypted DNS traffic. Thus, with the appropriate technological updates, DoH deployment will instead necessitate only a shift in where content blocking is performed.

 

Put bluntly, as a counterbalance to the loss of content blocking capability by ISPs as a direct result of DoH deployment, DoH providers should commit to performing appropriate content blocking through their own services. While the exact methods for doing so may not yet exist, the underlying technologies make it possible. Should DoH providers fail to take such preemptive, voluntary measures on their own, legislatures should require that such measures be undertaken by law.
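As a hypothetical sketch of what provider-side blocking might look like, the Python fragment below shows a DoH resolver checking each decrypted request against a block list before answering. The handler structure, status codes, and list contents are illustrative assumptions rather than any provider’s actual implementation.

# Hypothetical sketch of content blocking performed by a DoH provider
# rather than an ISP: the provider necessarily sees the decrypted request
# in order to answer it, and can check the name against a block list first.
BLOCK_LIST = {"known-bad-example.invalid"}  # illustrative entries

def handle_doh_request(requested_domain: str, resolve) -> dict:
    """`resolve` stands in for the provider's ordinary upstream resolution."""
    if requested_domain.lower() in BLOCK_LIST:
        # Refuse to answer (NXDOMAIN-style), so the site fails to load.
        return {"Status": 3, "Answer": []}
    return {"Status": 0, "Answer": resolve(requested_domain)}

# Example use with a stand-in resolver:
print(handle_doh_request("known-bad-example.invalid", lambda d: []))
print(handle_doh_request("www.law.edu", lambda d: [{"data": "52.162.253.182"}]))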

 

          b.     Internet Browsing Histories

Similarly, law enforcement is currently able to request and receive internet browsing histories from ISPs because ISPs have access to unencrypted DNS traffic. But ISPs having such access is not the key to the investigative technique’s success. What matters is who has access to the unencrypted DNS traffic.

 

Like content blocking, then, DoH deployment will create a shift in the capability from ISPs to DoH providers. Instead of serving process on ISPs, investigators should serve DoH providers. As discussed above, doing so may require clarification of which legal processes need be used, and whether there are any legal gaps. Where such gaps exist, appropriate reforms should be made.

 

Similarly, DoH providers should commit to assisting any such efforts, and should work with law enforcement to ensure continued availability of internet browsing histories to investigators. Should DoH providers prove resistant, legislatures should update their legal frameworks to specifically require that DoH providers develop the capabilities to perform content blocking where appropriate, and to assist law enforcement in gaining access to browsing histories upon receipt of appropriate legal process.

 

Conclusion 

The deployment of default, strong encryption on devices and the implementation of DoH are critically important for privacy and cybersecurity writ large, and both present significant challenges for law enforcement, particularly those investigating human trafficking. However, the evolution of technology and the investigation of criminal activity need not exist in opposition to each other. As discussed in this article, avenues for reconciling these competing interests exist.
