Do Robot Cars Dream of Legal Liability?

By: Mason Hudon

In the next decade, driverless cars will likely become commonplace on American streets. Promising fewer fatalities from car crashes, more time for occupants to work or play, and potentially more streamlined and efficient transit systems, so-called autonomous vehicles (AVs) have captured the hearts and minds of those who dream of a better tomorrow. Inevitably, however, AVs are not without their critics, and in the wake of serious pedestrian and operator casualties in places like Arizona, Taiwan, and Florida, legal professionals are left with a difficult question: whose fault is it when a vehicle without a driver causes a death or injury that gives rise to a tort suit or criminal charges?

The answer, unfortunately, is complicated. For standard non-AV vehicles, liability for car accidents is relatively simple in both criminal and civil actions. The tortfeasor or lawbreaker is easy to identify and charge: it is the person behind the wheel, actually operating the vehicle and making the decisions that led to serious consequences. In contrast, a driverless car accident may be due to operator error, automated vehicle error, or a combination of both. Because of these complexities, lawmaking on the topic has lagged behind technological innovation, and both prosecutors and civil plaintiffs’ lawyers have, up until now, taken the path of least resistance.

For example, in 2018, during routine testing in Maricopa County, Arizona, an Uber-owned autonomous vehicle operated by Uber contractor Rafaela Vasquez hit and killed a pedestrian walking her bike across a roadway at night. “In the six seconds before impact, the self-driving system classified the pedestrian as an unknown object, then as a vehicle, and then as a bicycle,” and ultimately relied on Vasquez to activate emergency brakes. By the time Vasquez, who had apparently been watching The Voice on her cellphone, was alerted by the vehicle, it was far too late for her to react. In fact, a National Transportation Safety Board report found that the Uber vehicle did not inform Vasquez that a collision was imminent until 1.3 seconds before impact. Only then did the car’s software allow the driver to use emergency braking maneuvers, which had been deactivated for the majority of Vasquez’s trip that day. She was subsequently charged under a theory of criminal negligence, but many critics decried the fact that she was charged for putting her trust in software that should have been programmed to handle the situation for her, with Jesse Halfon writing for Slate.com, “the decision to use criminal sanctions against only the backup driver in this case is legally, morally, and politically problematic.”
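
To see why 1.3 seconds left essentially no margin, a rough back-of-the-envelope calculation helps. The figures below are illustrative assumptions rather than numbers drawn from the NTSB report: a vehicle speed of roughly 40 mph and a commonly cited driver perception-reaction time of about 1.5 seconds.

```python
# Back-of-the-envelope check: how far does a car travel during a
# 1.3-second warning window, and how does that compare with the time
# a human needs just to perceive and react?
# Assumed values for illustration only, not figures from the NTSB report.

MPH_TO_FPS = 5280 / 3600        # miles per hour -> feet per second

speed_mph = 40                  # assumed vehicle speed
warning_window_s = 1.3          # seconds between the alert and impact
reaction_time_s = 1.5           # commonly cited perception-reaction time

speed_fps = speed_mph * MPH_TO_FPS

distance_in_window = speed_fps * warning_window_s
distance_during_reaction = speed_fps * reaction_time_s

print(f"Speed: {speed_fps:.1f} ft/s")
print(f"Distance covered in the 1.3 s warning window: {distance_in_window:.0f} ft")
print(f"Distance covered during a 1.5 s reaction time: {distance_during_reaction:.0f} ft")
# Because the assumed reaction time exceeds the warning window, the
# operator cannot even begin braking before impact under these numbers.
```

Under these assumptions, the car covers roughly 76 feet during the warning window, less than the distance a typical driver travels before even beginning to brake.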

When asked why Uber was not charged in this situation, University of Washington School of Law Professor Ryan Calo opined that, in the absence of clearer guidelines for charging driverless car crimes, the prosecutors opted for a simple, relatable story that a jury could easily understand. “That’s a simple story, that her negligence was the cause of [the pedestrian’s] death,” Calo said, “[b]ring a case against the company, and you have to tell a more complicated story about how driverless cars work and what Uber did wrong.” Additionally, University of South Carolina School of Law Professor Bryant Walker Smith commented, “[…] I’m not sure it tells us much about the criminal, much less civil, liability of automated driving developers in future incidents.” The question thus remains: did Uber actually do something wrong here and “escape” criminal liability? It appears so.

According to the National Transportation Safety Board’s final report on the collision, while the vehicle operator was, in fact, the probable cause of the accident, “[c]ontributing to the crash were the Uber Advanced Technologies Group’s (1) inadequate safety risk assessment procedures, (2) ineffective oversight of vehicle operators, and (3) lack of adequate mechanisms for addressing operators’ automation complacency—all a consequence of its inadequate safety culture.” But, “[i]nstead of grappling with those nuances, [the prosecutors] appear to have elected to pursue an easy target in the name of hollow accountability.”

Many also question the sentiment, espoused by a Maricopa County prosecuting attorney, that a driver has “a responsibility to control and operate [a] vehicle safely and in a law-abiding manner” when the vehicle at issue is, by definition, “driverless.” The entire point of an AV is to allow the driver to disengage and attend to other matters like work, gaming, resting, or engaging with other passengers. Moreover, studies have shown that operators of driverless cars are prone to vigilance decrement, a significant decrease in alertness, after only 21 minutes of autonomous vehicle operation due to boredom and a lack of stimulation. If drivers still have to worry about liability for car accidents when they are not even operating the vehicle, the viability of AVs may be seriously called into question. A partial solution, however, may be found in product liability jurisprudence for tort actions.

As of 2015, “Florida, Nevada, Michigan, and the District of Columbia [had] enacted statutes limiting product liability actions against [Original Equipment Manufacturers (OEMs)] when the action [was] based upon a defect in an autonomous vehicle.” The statutes generally provide that “OEMs are not liable for defects in an autonomous vehicle if the defect was caused when the original vehicle was converted by a third party into an autonomous vehicle or if equipment installed by the autonomous vehicle creator was defective.” In the Maricopa County case, however, Uber was able to settle out of court with the decedent’s family and escape criminal liability entirely, while its contractor seemingly took the fall for the criminal aspect of the accident, even though she was informed of an imminent collision only 1.3 seconds before the crash occurred.

The emergence of AVs on our roads will require not only greater guidance from legislatures in charging and holding companies like Uber accountable through statutes, but also a reevaluation of existing tort law and criminal liability standards. A 2014 Brookings Institution report focusing on long-run AV development asserted that “federal attention to safety standards for autonomous vehicles will be needed, and those standards will have [to have] liability implications.” Such federal safety standards may take the form of mandatory operator-accessible emergency brakes, a requirement that operators maintain their vehicles to a certain standard in order to use autonomous features, or a rule allowing AV use only on qualified roads with few to no crosswalks. These potential standards, however, will undoubtedly have to be extensively tested before they can be implemented and incorporated into a federal regulatory scheme, likely requiring states to pick up the slack in the interim. Moving forward, states should consider situations like the one that occurred in Maricopa County, Arizona, and ensure that they are holding the right entity accountable, especially when it comes to criminal sanctions, rather than simply taking the easy way out.

Liability Theater: Can Live Theater Reopen in the Time of COVID-19?

By: Paige Gagliardi

“The show must go on!” …But can it during a pandemic?

The novel coronavirus presents a slew of new barriers for the American live entertainment industry. Broadway has been shut down since March 12 and will remain so until June 2021. Over 23,000 live events have been cancelled, and an estimated 90 percent of independent venues are expected to close permanently as the entertainment industry faces a loss in revenue of over $160 billion. While artists and venue owners alike cry out for a safe way to reopen, returning to the pre-pandemic model of live theater could expose venues to an exorbitant amount of liability.

Since the initial industry shutdown, unique forms of entertainment production and consumption have emerged. From filming on “closed sets,” to record-breaking streaming of taped Broadway productions, to drive-in concerts, the entertainment industry has begun to adapt. But due to the unique conditions posed by indoor auditorium seating, backstage work, and performance, many concert halls, stadiums, and historic theaters remain closed. In an attempt to provide consistency and to cope with ever-changing state regulations on reopening, entertainment industry unions such as IATSE (the International Alliance of Theatrical Stage Employees) have released recovery plans for future live events. In its 27-page document, IATSE laid out new guidelines for its union venues. The guidelines require a designated COVID-19 Compliance Officer (very similar to those now seen on every Hollywood set) who oversees and monitors adherence to protocols and training. Further, the guidelines require a venue to have a written COVID-19 safety plan, reduced personnel, diagnostic testing, daily screening, adequate ventilation, health-safety education, touchless ticket scanning, reduced patron capacity, and paid sick leave for staff pursuant to the Families First Coronavirus Response Act.

But are these internal measures enough to decrease the spread of COVID-19 in a live entertainment environment? They may not be, in the midst of union infighting and industry turmoil. While the leaders of 12 major unions met in solidarity in May, SAG-AFTRA and Actors’ Equity Association, two of the nation’s largest performing arts unions, are now locked in a jurisdictional dispute over whose territory, meaning whose regulations and personnel, the taping of live theater productions falls under. And although twelve states have begun enacting legislation to narrow liability related to and stemming from COVID-19, such legislation does not absolve employers of their duty to maintain safe operations for workers and customers.

Under the Occupational Safety and Health Act (OSHA), an employer has a legal obligation to provide a safe and healthful workplace. Because the Act does not have any specific regulation addressing the virus, protection against it falls under the employer’s general duty of care. The “General Duty Clause,” Section 5(a)(1) of the Act, requires an employer to protect its employees against “recognized hazards” to safety or health that may cause serious injury or death. Applying this clause to COVID-19 is no abstract fear; although tracing the source of an infection can be hard to prove, companies such as Princess Cruise Lines, Walmart, and at least three elder care facilities already face wrongful death lawsuits. So, unless the issue of liability for COVID-19 transmission is addressed in future legislation, unions and venue owners must proactively seek to limit any potential liability. Patrons may need to sign a participation waiver before entering, or their ticket may include a waiver of any claims arising from the transmission of a communicable disease (just as the back of a baseball ticket traditionally contains a waiver of liability for physical injuries sustained from a foul ball).

All that said, unless something changes quickly, the live entertainment industry as we know it could become another casualty of this pandemic. As entertainment lawyer Jason P. Baruch of Sendroff & Baruch, LLP put it, theater “is not likely to be economically viable with social-distancing requirements in place that cull audiences by half or more…With the exception of the occasional one-person show, concert or small play, most [shows] simply will not be producible until the theaters can be filled again.”

Live theater, with its earliest records in Western history dating to the 6th century BCE, has survived revolution, oppression, and disease; so while there is no doubt live theater will return, it will confront many legal and economic challenges upon reopening. Live theater may never be the same, and the government and venue responses to these issues of liability will shape how it survives and determine when it will flourish again.

What Does Washington’s New Non-Compete Law Have in Store for the Tech Industry?

By: Shelly Mittal
The rivalry between Amazon and Google is often on display. One area where we recently saw this rivalry spread its tentacles was in attracting talent. When Google hired Amazon’s marketing executive Brian Hall in April of this year, Amazon decided to enforce a non-competition agreement against him. This caused quite a splash in the industry.

Enforcing non-competition agreements against former employees is not a new trend in the tech industry. Amazon itself has brought a series of lawsuits to enforce such agreements, including one against Philip Moyer, a former Amazon Web Services sales executive who took a job with Google Cloud last year.

Non-competition agreements (often called non-competes) are contracts or clauses in employment agreements that prohibit employees from joining competitors or starting a competing firm for a specified amount of time and within a specified geographic region, in order to protect trade secrets, client lists, and other intangible assets. They have always been particularly controversial in the tech industry, which must balance attracting talent against protecting sensitive information and preventing unfair competition by former employees.

Many states have developed common law, through court decisions, that governs non-competes, while others have enacted statutes. Washington courts, too, formulated a reasonableness standard, under which non-competes were enforced if they were reasonable in scope, geographic reach, and duration, as determined on a case-by-case basis. By contrast, non-competes are unenforceable in states like California, the heart of the global tech industry. Opponents of non-competes credit this approach with fueling the growth of Silicon Valley, which depended on a liberal flow of employees from big tech companies to startups. Washington recently passed a new law restricting non-competes, effective January 1, 2020. While the new law does not go so far as to ban them, it does impose new restrictions. Under the new law, a non-compete will only be enforceable if (a) the employer discloses the terms of the covenant in writing when making an offer or earlier; (b) the employee earns more than $100,000 a year; and (c) the non-compete lasts no longer than 18 months.
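
As a rough illustration of how these conjunctive requirements operate, the sketch below encodes the three conditions as a simple checker. The function and field names are hypothetical, and the logic is deliberately simplified; the statute contains additional nuances and exceptions not modeled here.

```python
from dataclasses import dataclass

@dataclass
class NonCompete:
    """Simplified, hypothetical model of a Washington non-compete."""
    disclosed_in_writing_at_or_before_offer: bool
    annual_earnings_usd: float
    duration_months: int

def likely_enforceable(nc: NonCompete) -> bool:
    """All three statutory conditions must hold for the covenant to be enforceable."""
    return (
        nc.disclosed_in_writing_at_or_before_offer   # (a) written disclosure at or before the offer
        and nc.annual_earnings_usd > 100_000         # (b) earnings above the salary threshold
        and nc.duration_months <= 18                 # (c) duration capped at 18 months
    )

# Example: a covenant disclosed with the offer, for an engineer earning
# $138,000, lasting 12 months, satisfies all three conditions.
print(likely_enforceable(NonCompete(True, 138_000, 12)))   # True
print(likely_enforceable(NonCompete(True, 95_000, 12)))    # False: below the threshold
```

Because the conditions are conjunctive, failing any one of them, most commonly the salary threshold, takes the covenant outside the statute’s enforceable zone.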

So how does the salary threshold affect tech employees?

According to the 2019 Hired report, tech salaries have been on the rise in Seattle, with average pay jumping from $125,000 in 2015 to $138,000 in 2018, roughly a 10 percent increase. Based on that average salary and reported growth, it is safe to say that most of Washington’s tech employees fall above the $100,000 salary threshold in Washington’s new non-compete law. This means two things for the Washington tech industry: first, the salary threshold does not exempt most Washington tech employees from being bound by non-competes; and second, because employers can enforce non-competes against their employees, they will be less likely to rely on retention incentives like bonuses or stock compensation to restrict employee mobility. So, while cafeteria workers and receptionists (non-tech employees) at big tech companies will be free to leave and start their own ventures, those with tech expertise will, most often, have to wait.

The low salary threshold may also make things more difficult for startups. Because Washington’s new salary threshold is low in comparison to a typical tech salary, startups may struggle to recruit top talent from larger tech companies, which will be able to effectively restrict their employees through non-competes. In other words, Washington’s new non-compete statute is an imperfect law for startups: its salary threshold can potentially free some tech employees to take new jobs or go out on their own, but by and large, it helps big tech companies restrict employees.

So is it all bad?

Although a higher salary threshold would have been a better choice for the tech industry, the predictability that the new law provides by defining the ‘reasonable’ standard and taking discretion away from the courts can help Washington’s tech industry continue to grow. It can reduce both hard-to-assess risks and potential litigation costs for employers and employees alike. Less focus and expenditure on these concerns will likely result in a more profitable employment market and could foster industry growth. Therefore, although the new law does not eliminate the enforceability of non-competes for Washington tech workers, the present low rate of enforcement combined with the certainty the new law brings is a welcome change.

Peeved with Your Pre-Order? Legal Solutions to Videogame False Advertising

By: Moses Merakov

It’s no secret that videogame publishers and their developers often make broad, sweeping claims about their games in an effort to sell their products. Whether that takes the form of tangibly misrepresenting a game’s graphical fidelity or omitting that it will use predatory microtransactions, many gamers have become disenchanted with the industry and with large triple-A publishers. A consumer who pre-orders a title may receive, upon release, a wildly different product from what was initially promised. Is there any legal recourse for these consumers?

Potentially yes. A consumer can sue for false advertising. At the federal level, Section 5 of the Federal Trade Commission Act (FTC Act) declares “unfair or deceptive acts or practices in or affecting commerce” illegal, although that provision is enforced by the FTC rather than by private plaintiffs. At the state level, many states have statutes parallel to the FTC Act or at least allow consumers to pursue common law false advertising claims.

In Washington State, the legislature enacted the Consumer Protection Act, which similarly codified that “[u]nfair methods of competition and unfair or deceptive acts or practices in the conduct of any trade or commerce are” unlawful (RCW 19.86.020). To establish a claim under the Washington Consumer Protection Act (CPA), a plaintiff must prove five elements: (1) an unfair or deceptive act or practice that (2) affects trade or commerce and (3) impacts the public interest, and (4) the plaintiff sustained damage to business or property that was (5) caused by the unfair or deceptive act or practice. Hangman Ridge Training Stables, Inc. v. Safeco Title Ins. Co. (1986).  “Failure to satisfy even one of the elements is fatal to a CPA claim.” Sorrel v. Eagle Healthcare (2002). While each element carries its own set of particular criteria and micro-elements, courts generally construe such statutes as liberally as possible in order to protect consumers from conniving sellers. Panag v. Farmers Ins. Co. of Washington.

If the courts and legislature are so friendly to consumers, and the claims are seemingly easy to pursue, why is there a distinct lack of false advertising cases in the videogame industry? Many consumers simply do not feel it is worth pursuing a legal claim because their 60-dollar game turned out not to include the particular features that were promised. The cost of litigation far outweighs the $60. It would likely take the coordinated effort of a law firm amalgamating thousands of consumers into a class-action lawsuit to make a claim against a videogame company profitable.

Additionally, even assuming gamers successfully band together to bring a lawsuit, publishers and their developers are often careful with their marketing phrasing so as to avoid false advertising claims. A notorious example involves the videogame Crash Team Racing Nitro-Fueled. In 2019, publisher Activision and developer Beenox repackaged a 20-year-old game, Crash Team Racing, releasing a rehashed version with minor content additions and graphical improvements. Knowing that many gamers are frustrated by modern gaming’s extensive use of microtransactions, and hoping to assure players that the updated version would stay true to its roots, a member of the Nitro-Fueled team claimed in an E3 convention presentation that the entire game would avoid microtransactions. According to Beenox, the game would offer new content for free throughout its lifecycle.

The game eventually released without microtransactions. Only months later, however, Activision and Beenox introduced microtransactions and changed certain game mechanics to encourage consumers to pay additional money to obtain in-game content faster. Consumers were met with a seemingly different product than what had been advertised to them. While this certainly angered many of the game’s consumers, a closer look at the language the Nitro-Fueled E3 presenter used reveals that he separated microtransactions for cosmetic items from microtransactions for new playable content. While new playable content was guaranteed to be free, cosmetic items were not. Thus, the two companies are likely not liable for false advertising for later adding microtransactions for cosmetic items.

All in all, while consumers may pursue false advertising claims against deceptive publishers and developers, such claims may prove economically unviable or difficult to sustain once the marketing language is examined closely.

Is Your Personal Health Still Personal? Privacy Issues With Wearable Tech

By: Shelly Mittal

Who does not love the convenience of instant health data at their fingertips? Like everything else, however, this convenience comes at a price. With so much insight into our daily steps, calories, sleep patterns, body fat, heart rate, and more, wearables have given a whole new meaning to personal health. Wearable technology is any device worn on the body that is equipped with sensors to collect information from both the body and the surrounding environment. This ability to quantify our health has the potential to radically improve human health and fitness. Consequently, the wearable technology industry is projected to maintain double-digit growth through 2024, which speaks to its acceptance among users. However, security vulnerabilities in wearable health devices pose significant challenges to users’ data privacy.

While most engineers focus on extending battery life, creating rich functionality with minimal computational resources, and minimizing design constraints, the security of these devices often takes a backseat. First, these devices run the risk of unauthorized physical access to data because, often, no user authentication (e.g., a PIN, password, or biometric check) is required; the limited computational power of wearables also means that some more sophisticated security mechanisms are simply absent from the device. Second, wearable devices tend to connect to our smartphones or tablets wirelessly via Bluetooth, NFC, or Wi-Fi. This need for communication creates another entry point into the device, making it prone to information leakage, and the lack of encryption, in some cases, leaves data in transit insecure. Third, many wearables run their own operating systems and need to be patched and updated to guard against the latest security vulnerabilities.
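
To make the encryption point concrete, here is a minimal sketch of what application-layer encryption of a synced health payload could look like, using a symmetric cipher from Python’s cryptography package. The payload fields and the pre-transmission workflow are illustrative assumptions, not any vendor’s actual implementation.

```python
# Minimal sketch: encrypting a health payload before it leaves the device,
# so that data in transit over Bluetooth/Wi-Fi is not readable if intercepted.
# Illustrative only; real wearables provision keys and sync data in vendor-specific ways.
import json
from cryptography.fernet import Fernet  # symmetric, authenticated encryption

# In practice the key would be established during device pairing and stored
# in secure hardware; generating it inline here is purely for demonstration.
key = Fernet.generate_key()
cipher = Fernet(key)

health_payload = {
    "steps": 8432,
    "resting_heart_rate_bpm": 61,
    "sleep_minutes": 412,
}

# Serialize and encrypt before transmission.
ciphertext = cipher.encrypt(json.dumps(health_payload).encode("utf-8"))

# The companion app, holding the same key, can decrypt after receipt.
received = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
assert received == health_payload
print("Ciphertext length:", len(ciphertext))
```

Even a scheme this simple would prevent a passive eavesdropper on the wireless link from reading the payload, which is exactly the gap left when data in transit is unencrypted.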

These security vulnerabilities, combined with the regulatory gaps, paint a scary picture for data privacy. The regulatory framework for the wearable technology industry is in flux, with hardly any application of the Food, Drug and Cosmetic Act (FD&C) or the Health Insurance Portability and Accountability Act (HIPAA). Although these wearables collect the most intimate health information, the collection and use of this information is not governed by HIPAA because health data such as step counts, calories, and sleep history is not formally considered Protected Health Information (PHI) unless collected by your doctor or insurance provider. Only health care providers, health plans, and health care clearinghouses (referred to as “covered entities” under HIPAA) are subject to HIPAA’s extensive privacy regulations. Companies that make wearables and collect health data are not yet subject to HIPAA. So, for as long as the Department of Health and Human Services (the regulatory body under HIPAA) does not focus its attention on wearables, the privacy of wearable users depends mostly on the privacy policies they accept while setting up their devices.

Businesses are free to draft their own privacy policies for controlling information and data that falls outside the scope of HIPAA. In January 2015, the Federal Trade Commission (FTC), which has relatively more enforcement power in the wearables industry, issued guidance on the privacy and security protections that should accompany the Internet of Things (IoT), including wearables. The guidance also calls for a disciplined and structured approach to the design, development, and management of these devices and the data they produce.

The privacy policies, unilaterally drafted by companies, are often vague and include a lot of “may(s)” to give the companies flexibility. Ambiguous terms give them enough wiggle room to use health data for their own benefit. Therefore, it is more important than ever not to skip the privacy policy page and to give it a thorough read before accepting. Users need to know whether their data is actually encrypted, whether the companies periodically review and monitor access to that data, who owns the data, and how they can gain more control over it. Hence, the solution to present privacy concerns lies in applying the FTC’s Fair Information Practice Principles of notice, choice, and consent in this self-regulating space of wearables.