Volume 2018 — Issue 2
"Does Technology Drive Law? The Dilemma of Technological Exceptionalism in Cyberlaw" by Meg Leta Jones
Seemingly plagued by newness, the law, it is often claimed, cannot keep up with new technology. Digital technologies have only reinforced the legitimacy of this now well-established idiom. The sentiment has gone unchecked for decades, even in light of social and historical research that reveals the cultural nature of technology. In the field of law and technology (cyberlaw), the theory of technological exceptionalism is used to measure whether new technologies are transformative enough to uproot existing legal foundations. This Article is an attempt to disconfirm technological exceptionalism as a viable theory for cyberlaw research and policymaking by analyzing a number of information and communication technologies often labeled ‘exceptional’: the printing press, the Internet, photographic cameras, computers, and drones. If technologies can be exceptional—if their attributes drive social change and laws—the same linear pattern should appear across cultures where the technology is introduced: a technology enters society and allows for certain activities that place significant strains on social orders, existing law and legal concepts are applied but fall short, and necessary changes are made to account for the new technological capabilities. Because the theory of technological exceptionalism does not hold up—because the story of law and technological change is much more varied, messy, and political—it should be discarded, and new theories of and approaches to law and technological change, such as the legal construction of technology, should be pursued.
"'One if by Land, Two if by Sea': The Federal Circuit’s Oversimplification of Computer-Implemented Mathematical Algorithms" by Christian Dorman
The modern, connected world relies on advanced computer-implemented mathematical algorithms to manage the storage and movement of digital data. Whether these algorithms, including those related to error correction, compression, and encryption, should be patent eligible sits on the razor’s edge of today’s patent-eligibility questions. While the generic computer implementation of abstract ideas is generally not patent eligible, the answer is less clear when a claimed abstract idea provides a “technological improvement.” The Federal Circuit recently held in RecogniCorp that claims directed to image encoding were patent ineligible as being directed to an abstract idea without an inventive concept. This decision is hard to reconcile with past case law, especially considering that the image encoding itself provides a technological improvement to the computer implementing it by increasing the computer’s efficiency. This Article argues that the RecogniCorp decision was misguided and that the claimed image encoding should have been deemed patent eligible based on the technological improvement to the computer’s efficiency. Even more damning, though, are the Federal Circuit’s blanket statements denying patent eligibility to claims directed to any computer-implemented mathematical algorithm, whatever technological improvement that algorithm may provide. Considering the importance of data processing in modern technology, the effects of such a restriction would be dire. This Article stresses the critical need for a second look at the RecogniCorp decision to ensure the patent eligibility of computer-implemented mathematical algorithms that provide technological improvements.
"Good Intentions and the Road to Regulatory Hell: How the TCPA Went from Consumer Protection Statute to Litigation Nightmare" by Stuart L. Pardau
The Telephone Consumer Protection Act (TCPA) and attendant FCC regulations contain textual ambiguities that are inherent in any statute or regulation but have been exacerbated by the drastic changes in technology since the early 1990s. Those ambiguities have been exploited by the plaintiffs’ bar to the outsized detriment of the business community. The purpose of this Paper is to highlight some of those ambiguities and propose some common-sense solutions to help the TCPA achieve its original purpose as a consumer protection statute that does not incentivize frivolous and exploitative litigation.
To that end, Part II of the Paper briefly discusses the history of telemarketing and the TCPA, including Congress’ and the President’s express intent to curb abuses without unduly hampering business communications; Part III highlights six ambiguities in TCPA jurisprudence—the definition of “autodialer,” the definition of “telemarketing” and “advertisement,” the nature of “dual purpose” communications, the nature of “consent,” the identity of a “called party,” and the identity of the party that “initiates” the call (i.e., vicarious liability issues)—which have been exploited by the plaintiffs’ bar; Part IV charts the drastic spike in TCPA litigation over the last ten years; Part V suggests some solutions to bring clarity back to the TCPA and stem the tide of abusive litigation; and Part VI concludes.
"Preserving Capital Markets Efficiency in the High-Frequency Trading Era" by Gaia Balp & Giovanni Strampelli
Although high-frequency trading (HFT) has become an important feature of financial markets internationally, its impact on the functioning of equity markets is still under discussion, as HFT can negatively affect market quality and stability. Regulatory measures recently adopted on both sides of the Atlantic to better control HFT-related risks chiefly focus on the stability, orderly functioning, and integrity of markets, but give insufficient consideration to how HFT interacts with the allocative function of price discovery. In order to fill this gap, this Article focuses on how HFT-related informational inequalities among investors threaten equity markets’ (long-term) efficiency. Subscription to newswires and market data feeds, along with co-location, grants HFTs early access to market-moving information that allows for latency arbitrage and trading ahead of other investors, which can discourage informed (slower) traders from carrying out costly fundamental analysis. HFT thereby challenges the theoretical framework underlying the Efficient Capital Markets Hypothesis and can negatively affect price accuracy, real resource allocation, and equity markets’ allocative efficiency. Against this backdrop, this Article develops an analytical framework for possible regulatory strategies that seek to limit the negative effects of HFT on allocative market efficiency by reducing HFTs’ speed advantage or by incentivizing fundamental informed traders to enter markets where they face costly pressures to compete with HFTs. Restricting the sale of trade data feeds or mandating speed bumps may discourage HFT and weaken its positive effects in terms of increased liquidity and better short-term price discovery, without, however, definitively curbing HFT-related risks to long-term price accuracy; replacing the current continuous trading regime with a batched-auctions regime, meanwhile, would require major regulatory changes. The introduction of a continuous, event-driven, and faster issuer disclosure regime could limit these possible drawbacks by providing
"Facing the Facts on Biometric Phone Locks: Your Face and Thumb are NOT Secure" by Bilal Adra
"Business Method Patents: Let the PTAB Kill Them All? A Case for Narrow Reading of CBM Review Eligibility" by Roman Perchyts
"The National Bioengineered Food Disclosure Standard: A Solution to the GMO Labeling Political Debate?" by Zoe Spector