Hack Backs, Hatchbacks, and Cyberattacks: Effectively Legislating Cybercrime

By Zachary Breit | Inquiry Essay

My hometown of Fanwood, NJ is a microscopic blip on the map of a state better known for its proximity to New York than for any of its own offerings. Barely a square mile in area, my tiny suburb might be the most uninteresting, uneventful town in existence. That’s what I’d believed, anyway, until 2017, when a little-known resident of Fanwood named Paras Jha pleaded guilty to developing the infamous Mirai malware. His creation made its debut on October 21, 2016, when it was used to knock out Internet service across much of the East Coast (Graff). I still remember my programming class coming to a standstill that very afternoon, as we lost access to all the websites we needed to complete our daily assignments. The malware was originally coded by Jha and his Rutgers University roommates, who were business partners in the Minecraft server industry. As part of their scheme to deny players entry into rival servers, Jha’s team designed malware that could flood their competitors with debilitating amounts of network traffic (Graff). Soon enough, the comparatively benign intentions behind the project were subverted when Jha released his code to a community of malicious hackers. Almost immediately, the malware spread to thousands of unsecured embedded devices, which collectively downed services as large as Spotify, Netflix, and the New York Times on that fateful October day (Graff).

After hearing about the Mirai malware, I was immediately drawn to the subject of cybercrime—after all, Jha’s hacking incident was probably the first substantial news story to come out of my town in decades. As I went down a rabbit hole of online articles, blog posts, and YouTube videos about cybersecurity, I eventually stumbled upon an article published in WIRED that immediately caught my attention: “Hackers Remotely Kill a Jeep on the Highway—With Me in It.” The article elucidated a vulnerability in 2014 Jeep Cherokees that was uncovered by cybersecurity experts Chris Valasek and Charlie Miller. By connecting to these Jeeps through a security flaw in the entertainment/navigation system, Valasek and Miller were able to “fully kill the engine, abruptly engage the brakes, or disable them altogether” (Greenberg). As if that weren’t scary enough, because the Jeep Cherokee has Internet connectivity, all of these attacks could be conducted remotely, from a laptop “anywhere in the country” (Greenberg). Miller and Valasek’s discovery provides ample evidence that the threat of hacking transcends the digital world and has the potential to harm us physically.

The thought of having my brakes cut or my accelerator malfunction on the highway is deeply chilling. Even more frightening, though, is the fact that Jeep Cherokees and the thousands of Internet-connected devices compromised by Jha’s malware might only scratch the surface of targets that are vulnerable to large-scale cyberattacks. As more and more of our daily lives migrate from the physical to the digital realm—for evidence, look no further than the new wave of Internet-connected cars, speakers, and refrigerators—we must consider the potential for malicious entities to wield that technology against us. Considering how threatening cyberattacks could be to individuals, companies, and society, it is unsettling to learn that the “software industry has matured in a ‘legislative void’ in which software developers generally bear no liability for their products” (Daley 537). What laws, if any, should be enacted to help prevent and mitigate the damage of cybercrime? Unsurprisingly, legal experts in the field of cybersecurity are still engaged in heated debates over this emerging question.

One prevailing opinion is that corporations should bear most of the burden of protecting technology from attacks. Legal scholars such as John Daley theorize that lenient economic and legal ramifications for cybersecurity flaws are among the biggest shortcomings of current legislation (537). Not only have software developers been able to “blame cybercrime… on the expertise… of third party criminals… and on careless users,” but technology consumers have often failed to boycott products with bad security (536). Because there are so few regulations governing cybersecurity, vulnerabilities in software have become increasingly common. This regulatory gap likely arose from fears that stringent rules would stifle technological innovation.

Daley reaffirms the importance of enabling innovation and believes that, under strict security regulations, companies could not develop software without immediate fear of public and legal backlash (538). Instead, he advocates for an open-source software model, in which companies openly publish the code for their products and allow everyday users to contribute to its development. This approach already underlies many “popular development tools, such as Hadoop and Kafka, as well as… businesses, such as Cloudera and MongoDB” (Daley 539). On its surface, the open-source model might seem counterintuitive—after all, wouldn’t keeping proprietary code private be a better way to prevent hackers from finding vulnerabilities? Perhaps unexpectedly, the security of software benefits from freely inspectable, open-source code “by exposing the work of developers to scrutiny” (Daley 540). With their work in the public eye, programmers would likely follow better security practices for fear of being exposed. Additionally, cybersecurity experts and hobbyists can more closely examine the published code to discover lurking bugs that may have gone unnoticed during the development process (Daley 540). According to Daley, the net result of an open-source software model is that “it could increase accountability more broadly and lead to much more secure [software]” (540). By mandating open-source development, Daley’s proposal could encourage companies to build more robust software without stifling their innovation.

Despite the likely efficacy of Daley’s open-source requirements, corporate unwillingness to release source code could prevent their widespread adoption. Legal experts such as Caleb Kennedy therefore consider a compromise: requiring industry-wide databases that catalog known security flaws in common software. In the automotive and aviation industries, for example, “the products from each manufacturer are similar enough that they often face common challenges regarding security threats” (Kennedy 350). This situation has prompted the creation of networks in which companies disclose discovered security vulnerabilities to one another. The Aviation Information Sharing and Analysis Center, or A-ISAC, has already helped airlines share flaws in their avionics software so that the entire industry can quickly develop and implement security patches (Kennedy 350). In the automotive industry, a similar Auto-ISAC debuted in 2015 to collectively improve the cybersecurity practices of car manufacturers (350). By sharing resources and disclosing vulnerabilities, companies can address security issues swiftly and collaboratively. Compared to Daley’s open-source model, Kennedy’s proposal to encourage more industries to adopt vulnerability disclosure networks better caters to corporate interests and avoids overturning the industry norm of keeping source code private.
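
To make the idea concrete, the simplified sketch below imagines what a shared vulnerability record might look like. The field names and example values are invented for illustration and are not drawn from Kennedy’s article or from any real ISAC’s format; the point is simply that a structured report lets a member company quickly check whether a newly disclosed flaw affects its own products.

```python
# A hypothetical, simplified vulnerability record of the kind an ISAC-style
# network might circulate; field names and example values are invented for
# illustration and do not reflect any real ISAC's format.
from dataclasses import dataclass

@dataclass
class VulnerabilityReport:
    reporter: str            # member company disclosing the flaw
    component: str           # shared software or hardware component affected
    affected_versions: list  # versions known to be vulnerable
    description: str         # plain-language summary of the flaw
    patch_available: bool    # whether a fix has been published

def affects_member(report, member_components):
    """Return True if the member uses an affected version of the component."""
    versions_in_use = member_components.get(report.component, [])
    return any(v in report.affected_versions for v in versions_in_use)

if __name__ == "__main__":
    report = VulnerabilityReport(
        reporter="Example Motors",
        component="infotainment-firmware",
        affected_versions=["2.1", "2.2"],
        description="Remote code execution via the cellular interface",
        patch_available=False,
    )
    my_components = {"infotainment-firmware": ["2.2"]}
    print(affects_member(report, my_components))  # True: this member needs the patch
```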

Kennedy further placates corporate interests by asserting that current product recall laws may be too stringent. In the automotive industry, for instance, “large civil penalties… along with the cost of the recall itself might dissuade manufacturers from reporting vulnerabilities to the auto-ISAC” (Kennedy 351). If companies fail to use disclosure networks out of fear of legal or economic consequences, the networks will be rendered pointless. To ensure that vulnerabilities are disclosed, Kennedy wants to relax existing recall regulations and implement policies that allow companies to patch software bugs over the Internet without the need for a recall (351). In this way, Kennedy hopes, there would be fewer drawbacks to making security flaws known to the public.

Although regulating software development practices could be beneficial, other scholars place more importance on expanding the legal toolkit that governs how companies can prevent and mitigate cyberattacks. In their article “US Policy on Active Cyber Defense,” Angelyn Flowers and Sherali Zeadally, who specialize in cybersecurity law, claim that today’s companies are under constant attack from malicious hackers and do not have the proper legal support to deter them (292). When describing existing cyber defense laws, Flowers and Zeadally draw a distinction between active and passive cybersecurity. Passive cybersecurity, the more legally tenable method of cyber defense, involves reacting to cyber threats as they arise; examples include malware monitoring and firewalls (Flowers and Zeadally 292). Active cyber defense, on the other hand, can involve deceiving potential attackers, intrusively analyzing digital traces to determine the source of an attack, and even offensive hacking to nullify and deter attacks in progress (Flowers and Zeadally 293). According to Flowers and Zeadally, new laws that allow corporations to practice active cyber defense would reduce the number of cyberattacks by instilling fear in potential criminals. They contend that the persistence and intensity of cyberattacks, evidenced by the 782% increase in attacks against the US government from 2006 to 2012, are proof that passive cyber defense is not enough to address threats against digital systems (292). If active cyber defense were legalized, as Flowers and Zeadally suggest, organizations could employ more aggressive cybersecurity strategies that might deter criminals from launching attacks and would provide new forms of recourse in the event of a breach.
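
Flowers and Zeadally’s distinction can be made concrete with a small, hypothetical sketch of purely passive defense. The script below assumes an invented log format and an arbitrary threshold; it only watches the defender’s own records for repeated failed logins and blocks the offending address at the local firewall. Active defense, and the legal controversy surrounding it, begins where a defender reaches beyond its own network.

```python
# A minimal sketch of "passive" cyber defense: it monitors the defender's own
# logs and blocks suspicious sources locally, never touching outside systems.
# The log format ("<timestamp> FAILED_LOGIN <ip>") and the threshold are
# assumptions made for illustration.
from collections import Counter

FAILED_LOGIN_THRESHOLD = 5  # assumed cutoff for "suspicious"

def find_suspicious_ips(log_lines):
    """Count failed logins per source IP and flag repeat offenders."""
    failures = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) == 3 and parts[1] == "FAILED_LOGIN":
            failures[parts[2]] += 1
    return [ip for ip, count in failures.items() if count >= FAILED_LOGIN_THRESHOLD]

def block_locally(ip):
    """Stand-in for adding a rule to the defender's own firewall."""
    print(f"Blocking {ip} at the local firewall")

if __name__ == "__main__":
    sample_log = ["2016-10-21T12:00:01 FAILED_LOGIN 203.0.113.7"] * 6
    for ip in find_suspicious_ips(sample_log):
        block_locally(ip)
```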

The adoption of active cyber defense remains an unsettled issue. While scholars such as Flowers and Zeadally argue that it is necessary, others assert that it may even be counterproductive. One such opponent is Chris Cook, a US Department of Justice lawyer and expert in cybersecurity law. Cook fears that allowing corporations to pursue active cyber defense to identify the perpetrators of hacks (e.g., data breaches) may prompt retaliatory attacks from criminals that would not have otherwise occurred (215). On top of these reactive attacks, Cook worries that there could be an international response if organizations intruded into the networks of hackers based in foreign nations (207). After all, perpetrators of cyberattacks can live virtually anywhere in the world, meaning that a private company “hacking back” in response to a network breach could be construed as an aggressive action by foreign governments. To facilitate digital investigations without triggering conflict between nations, Cook suggests applying an existing law passed in 2018, the Clarifying Lawful Overseas Use of Data (CLOUD) Act. Although it serves other purposes as well, the CLOUD Act would allow companies to request digital information from foreign law enforcement (208). In this way, corporations could turn to allies for digital forensic evidence instead of conducting potentially dangerous cyber investigations of their own. Unlike active cyber defense, which risks offending foreign nations, the CLOUD Act could gain broader support, or even adoption, among other countries due to its collaborative nature.

As we place more and more trust in technology to control infrastructure as critical as our electrical grid, or even to keep people’s hearts beating, the necessity of cybersecurity becomes increasingly evident. Despite all the discord among legal experts on how to deal with cybercrime, there seems to be a common thread connecting their opinions: technological security is an issue that affects us all. Whether we implement open-source regulations or legalize active cyber defense is still up for debate, but the current laws that deal with cybercrime are likely insufficient. Without concrete policy proposals and swift action from legislators, technological innovation could, like those 2014 Jeep Cherokees, grind to an abrupt halt.

 

Works Cited

Bamsby, Robert E., and Shane R. Reeves. “Give Them an Inch, They’ll Take a Terabyte: How States May Interpret Tallinn Manual 2.0’s International Human Rights Law Chapter.” Texas Law Review, vol. 95, no. 7, June 2017, pp. 1515–1530. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=asn&AN=123896556&site=ehost-live.
Banks, William. “State Responsibility and Attribution of Cyber Intrusions After Tallinn 2.0.” Texas Law Review, vol. 95, no. 7, June 2017, pp. 1487–1514. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=asn&AN=123896555&site=ehost-live.
Cook, Chris. “Cross-Border Data Access and Active Cyber Defense: Assessing Legislative Options for a New International Cybersecurity Rulebook.” Stanford Law & Policy Review, vol. 29, no. 2, July 2018, pp. 205–236. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=asn&AN=130412093&site=ehost-live.
Daley, John. “Insecure Software Is Eating the World: Promoting Cybersecurity in an Age of Ubiquitous Software-Embedded Systems.” Stanford Technology Law Review, vol. 19, no. 3, Spring 2017, pp. 533–546. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=asn&AN=123284244&site=ehost-live.
Flowers, Angelyn, and Sherali Zeadally. “US Policy on Active Cyber Defense.” Journal of Homeland Security & Emergency Management, vol. 11, no. 2, June 2014, pp. 289–308. EBSCOhost, doi:10.1515/jhsem-2014-0021.
Graff, Garrett M. “How a Dorm Room Minecraft Scam Brought Down the Internet.” Wired, Conde Nast, 13 Dec. 2017, www.wired.com/story/mirai-botnet-minecraft-scam-brought-down-the-internet/.
Greenberg, Andy. “Hackers Remotely Kill a Jeep on the Highway—With Me in It.” Wired, Conde Nast, 21 July 2015, www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/.
Kennedy, Caleb. “New Threats to Vehicle Safety: How Cybersecurity Policy Will Shape the Future of Autonomous Vehicles.” Michigan Telecommunications & Technology Law Review, vol. 23, no. 2, Spring 2017, pp. 343–356. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=asn&AN=128799471&site=ehost-live.