Your Employer Can Monitor You While You Work From Home—Should They?

By: Joshua Waugh

Since “pandemic life” began, as many as 40% of American workers have worked from home. If you’ve been lucky enough to trade the crowded bus or the gridlocked highway for the shorter bedroom-to-laptop commute, chances are you’ve wondered just how closely your employer is watching you. The truth is that telework, for all its benefits, also has a major downside: near limitless opportunity for high-tech surveillance. And while it is clear that employers have the legal capability and the technology to monitor their employees, it’s less clear that employee surveillance is actually a good idea at all.

Can my employer really monitor me?

It is no secret that American privacy and technology laws are often lacking. At the federal level, the primary law dealing with electronic privacy is the Electronic Communications Privacy Act (ECPA), which was passed in 1986. The law is so old that Title I of the Act only contemplates a third party’s “interception” of a message sent by “wire, oral, or electronic communication”; Title I does not address the possibility of accessing stored communications, such as email, after transmission is complete.

Furthermore, Title I of the ECPA has been interpreted to include a carveout specifically allowing employers to monitor employees as long as the employer can show a legitimate business purpose. The ECPA also permits employers to electronically surveil employees with their consent, which, given the often imbalanced employer-employee power dynamic, is not great for the ordinary employee.

Title II of the ECPA, or the Stored Communications Act (SCA), provides more protection to employees, though the law is still just as dated as Title I. Under the SCA it is fairly well established that your employer can’t log in to your personal email without your permission. So rest assured, your employer cannot see the thousands of unread advertising emails in your inbox unless you give them access.

All of that said, there is not much legislation on electronic privacy at the federal level. That may seem surprising considering we’ve seen privacy controversy after privacy controversy from practically every big tech company in recent years, but electronic privacy regulation seems to be generally left to the states. The end result is that only Californians (and to a lesser extent Coloradans and Virginians) enjoy broad statutory protections against electronic employer surveillance. In most of the other states, as long as you are using an employer’s device or network, your employer may surveil you as much as they’d like. And surveillance software is readily available, including keyloggers that record every keystroke you make, activity monitors, and even software that records every website or app you access on the device. In fact, if your workplace is using the Microsoft Office 365 Suite, your employer is already able to monitor and analyze your work activity.

Where do we go from here?

If you’re concerned about your general lack of privacy rights living in America, you are not alone. Researchers have published studies showing that extensive employer surveillance can breed distrust among employees and can be a significant hindrance to worker productivity and other positive performance outcomes. The feelings of distrust are even stronger when employees discover that they were being surveilled without their knowledge.

Despite evidence suggesting employee surveillance may have negative effects, surveys show that 62% of executives planned to use monitoring software in 2019, and that number is certain to have grown during the pandemic work-from-home era. Meanwhile, we’re also in the midst of a radical transformation in the labor force: the U.S. Bureau of Labor Statistics reported that 2.9% of the entire U.S. workforce, 4.3 million people, quit their jobs in August 2021. By all appearances, the Great Resignation is accelerating, as 4.4 million workers went on to quit during September 2021, topping August’s record numbers.

At a time when people are rethinking their relationship with work, struggling with burnout, and dealing with burdensome household issues such as child- and elder-care, employers should spend less time secretly surveilling their employees and instead put effort into employee engagement. Employee engagement is essentially the opposite of paranoid surveillance: companies engage their workers by providing flexibility and building trust. Engagement is more likely to boost productivity than surveillance, and, more importantly in today’s climate, it has been shown to increase employee retention. Ultimately, under current U.S. law, your employer can surveil you to its heart’s content in most states, but you can also resign if you feel your privacy rights have not been respected. As more and more of the labor force decide to do so, we’ll just have to wait and see how legislators respond.

The FTC Takes on Health and Fitness Apps’ Rampant Privacy Problems

By: Laura Ames

More and more Americans are turning to mobile health and fitness applications, but many worry about the lack of regulations that would ensure that developers of these products keep user information secure and private. The Federal Trade Commission (“FTC”) recently addressed this concern with a policy statement (“Statement”) including app developers among the entities who must follow certain notification procedures after security breaches. However, many question the Statement’s practical effects and whether the FTC had the authority to issue it.

Health App Trends

Mobile health and fitness apps have gained popularity in recent years, and the COVID-19 pandemic only accelerated this growth. In fact, the United States led the world in health and fitness app downloads as of October 2020 with 238,330,727 downloads that year alone. Even with this increased usage, a recent poll showed that over 60% of U.S. adults felt at least somewhat concerned regarding the privacy of their health information on mobile apps. These worries appear to be well-founded. Flo Health Inc., the developer of a menstrual cycle and fertility-tracking app, currently faces a consolidated class action alleging the company disclosed users’ health information to third parties without users’ knowledge. This is not an isolated concern. A recent study of over 20,000 health and fitness apps found that a third of these apps could collect user email addresses and more than a third transmitted user data to third parties such as advertisers.

The Original Health Breach Notification Rule

Congress enacted the Health Information Technology for Economic and Clinical Health (“HITECH”) Act as an investment in American health care technology. Subtitle D of this Act delegated authority to the FTC to promulgate breach notification requirements for breaches of unsecured protected health information. In 2009, the FTC issued its Health Breach Notification Rule (“HBNR”) covering vendors of personal health records (“PHR”) and PHR-related entities who experienced a security breach. The HBNR requires these entities to notify affected individuals and the FTC. Crucially, the HITECH Act defines a PHR as an electronic record that can be drawn from multiple sources.

The FTC has never enforced the HBNR, but the possibility of changes to the rule has been on the horizon for some time. In 2020, the FTC requested public comments on the HBNR as part of its rulemaking process, describing the request as merely a periodic review of the rule. However, before that comment period ended, the Commission issued a policy statement that turned heads.

The FTC Makes a Bold Move

On September 15, 2021, the FTC issued the Statement, with two of the five Commissioners dissenting. The FTC’s stated goal was to clarify the HBNR and put entities on notice of their security breach obligations. The FTC explained that the HBNR is triggered when “vendors of personal health records that contain individually identifiable health information created or received by health care providers” experience a security breach. The first major revelation was that the FTC considers developers of health apps or connected devices to be health care providers because they provide health care services or supplies.

Additionally, the FTC stated that it interprets the rule as covering apps that are capable of drawing information from multiple sources, such as through a combination of consumer inputs and application programming interfaces (“APIs”). The Statement gave two examples of apps covered under this understanding. First, an app is covered if it collects information directly from users and has the capacity to draw information through an API that enables syncing with a user’s fitness tracker. Second, an app is covered if it draws information from multiple sources even when the health information comes from only one source. For example, if a consumer uses a blood sugar monitoring app that draws health data only from that consumer’s inputs but also draws non-health data from the phone’s calendar, that app is covered by the HBNR.
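
To make the multiple-sources trigger more concrete, here is a minimal, hypothetical sketch in Python. The class and function names are invented for illustration and come from neither the FTC’s Statement nor any real SDK; the point is simply that an app like this combines consumer inputs (one source) with the capacity to sync readings through a fitness-tracker API (a second source), the combination the FTC says brings an app within the HBNR.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable

@dataclass
class HealthRecord:
    source: str        # "user_input" or "tracker_api"
    metric: str        # e.g., "blood_glucose", "steps"
    value: float
    recorded_at: datetime

@dataclass
class HealthApp:
    """Hypothetical app that draws information from more than one source."""
    records: list[HealthRecord] = field(default_factory=list)

    def log_user_entry(self, metric: str, value: float) -> None:
        # Source 1: information entered directly by the consumer.
        self.records.append(HealthRecord("user_input", metric, value, datetime.now()))

    def sync_from_tracker(self, fetch_readings: Callable[[], list[tuple[str, float]]]) -> None:
        # Source 2: readings pulled through a (hypothetical) fitness-tracker API.
        # Under the FTC's interpretation, the mere capacity to do this sync matters.
        for metric, value in fetch_readings():
            self.records.append(HealthRecord("tracker_api", metric, value, datetime.now()))

# Example usage with a stand-in for the tracker API:
app = HealthApp()
app.log_user_entry("blood_glucose", 5.4)
app.sync_from_tracker(lambda: [("steps", 8200.0), ("resting_heart_rate", 61.0)])
```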

The FTC also sought to remind entities that a breach is not limited to cybersecurity intrusions but also includes unauthorized access to information. Under this interpretation, companies that share information without a user’s authorization would also be subject to the Rule. Although the FTC has never enforced the Rule, the Statement also signaled the FTC’s willingness to do so, noting that businesses could face civil penalties of $43,792 per violation per day.

Obviously, these clarifications could subject many app developers and other companies to the FTC’s rule. However, in the eyes of some, including the two dissenting Commissioners, the Statement is not a mere clarification but a fundamental policy change, one that could not only create confusion but also exceed the FTC’s statutory authority and circumvent its rulemaking process.

Critiques and Larger Questions

Some legal experts argue that the Statement represents an expansion of the HBNR that could lead to further confusion for app companies and others. The two dissenting FTC Commissioners go further in their own statements.

Commissioner Christine S. Wilson argued that the Statement both short-circuits the FTC’s rulemaking process and improperly expands the FTC’s statutory authority by redefining terms without legislative approval. Commissioner Noah Joshua Phillips agreed that the Statement’s first problem is its issuance in the middle of a request for public comment. Wilson also pointed out that the FTC’s own business guidance on the HBNR directly contradicted the Statement by saying that “if consumers can simply input their own” health data on a business’s site, for example, a weekly weight entry, then the business is not covered by the rule. And Wilson expressed concern that this interpretation of “health care provider” is a potentially slippery slope. For instance, does Amazon qualify as a health care provider given that users can purchase Band-Aids and other medical supplies through its phone app?

In the coming months, we may see the FTC force app developers to notify customers of data disclosures, but the debate around the Statement also reveals larger questions facing health care. Fundamental questions that once seemed easy to answer, such as who qualifies as a health care provider, are growing murkier. In the wake of COVID-19’s effects on telehealth and health technology in general, the intermingling of health care and technology shows no sign of slowing down. If that is the case, then legislation and regulation surrounding health care will have to keep scrambling to catch up with this rapid technological evolution.

Carpenter v. United States – What future for digital privacy?

By Jabu Diagana

On November 29th, 2017, the Supreme Court will hear Carpenter v. United States and decide whether the government violates the Fourth Amendment when it accesses a third party’s record of an individual’s cell phone location without a warrant.

Carpenter arose from a string of 2011 interstate robberies; the defendant was convicted based in part on his phone location data, also known as cell-site location information (CSLI). CSLI is maintained by wireless carriers and is a record of the cell towers our phones connect to every time we transmit calls, texts, emails, or any other digital information. It usually includes the precise geolocation of each tower as well as the day and time the phone tried to connect to it. The government obtained Carpenter’s CSLI under the Stored Communications Act (SCA), a 1986 federal statute which provides that a “governmental entity may require a provider of electronic communication service or remote computing service to disclose” records using either a warrant or, as in Carpenter, a court order issued “if the governmental entity offers specific and articulable facts showing that there are reasonable grounds to believe that the contents of a wire or electronic communication, or the records or other information sought, are relevant and material to an ongoing criminal investigation.”
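
For a concrete picture of the records at issue, here is a minimal, hypothetical sketch of the fields a single CSLI entry as described above might contain. The field names and values are invented for illustration; no carrier’s actual schema is implied.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CellSiteRecord:
    """One hypothetical CSLI entry: which tower a phone connected to, and when."""
    subscriber_id: str      # identifier the carrier uses for the account
    tower_id: str           # identifier of the cell site the phone connected to
    tower_latitude: float   # precise geolocation of the tower (not of the phone itself)
    tower_longitude: float
    connected_at: datetime  # day and time the phone tried to connect
    event_type: str         # e.g., "call", "text", "data"

# A carrier's CSLI log is effectively a long list of such entries, one per
# connection, which is why months of records can trace a person's movements.
example = CellSiteRecord("subscriber-123", "tower-0456", 42.3314, -83.0458,
                         datetime(2011, 4, 2, 14, 35), "call")
```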

Stated differently, the real question is to what extent the SCA allows the government to obtain CSLI without a warrant. Or, to put it more bluntly, is the SCA unconstitutional?

The Sixth Circuit holding in Carpenter turned on the “third-party doctrine.”

The third-party doctrine originated in Smith v. Maryland, a 1979 case in which the Supreme Court found that installing and using a pen register to record a phone user’s dialed numbers was not a search under the Fourth Amendment and therefore required no warrant. According to the Smith court, although the contents of our phone conversations are protected, information about the sender or receiver is not, since callers willingly disclose that information to the phone company every time they place a call. Following this logic, the Sixth Circuit first found that the third-party doctrine authorizes the government to access CSLI as “business records” directly from a cell phone company without a warrant. Additionally, it found that when people use their cell phones, they should be aware that their location data is shared with the service provider and should not have any “reasonable expectation of privacy” with respect to that data.

Although Carpenter is about users’ cell location information, the principle at issue extends to other aspects of our digital privacy, given all the data we now share with third parties through smartphones, wifi hotspots, apps, and cloud-based services. As Justice Sotomayor highlighted in her United States v. Jones concurrence, whatever our current societal expectations of privacy may be, they can “attain constitutionally protected status only if our Fourth Amendment jurisprudence ceases to treat secrecy as a prerequisite for privacy.”

Whether Carpenter is affirmed or reversed, the Court’s discussion will likely revolve around the impracticability of the third-party doctrine in the digital age. Does sharing with one mean sharing with many? It is tempting to recommend that the Court abandon the third-party doctrine, but that may be overly simplistic. If the Court chooses to modify it instead, where should the line be drawn? Should there be a difference between information voluntarily conveyed to a third party and information merely stored in the cloud? There is also a time component to this issue: how long is continuous tracking too long? All of these questions, theoretical for now, will be fundamental to the future of our privacy.

Are My Emails Beyond the Grasp of the U.S. Government?

By Mackenzie Olson

Companies like Microsoft and Google store a great deal of customer data in storage centers overseas. As of July 2016, Second Circuit precedent indicated that, due to the foreign location of those centers, the U.S. government could not compel these companies to turn over that data, even by search warrant. The case that rendered this decision was In the Matter of Warrant to Search a Certain E–Mail Account Controlled and Maintained by Microsoft Corporation. (But also take note of the dissent from the denial of en banc review.) Because the Warrant case arose in the Southern District of New York, the Second Circuit Court of Appeals was its final arbiter. Accordingly, the Court of Appeals’ judgment controls as precedent only in that circuit. And though its opinion has been persuasive elsewhere, at least one judge, sitting in the Third Circuit, now disagrees with its outcome.

On February 3, 2017, Magistrate Judge Thomas J. Rueter of the Eastern District of Pennsylvania issued an opinion and subsequent orders compelling Google to turn over certain data stored in overseas facilities, pursuant to two previously issued search warrants.

In his opinion, Judge Rueter explains that “the present dispute centers on the nature and reach of the warrants issued pursuant to section 2703 of the Stored Communications Act, 18 U.S.C. §§ 2701 (“SCA”).”

He frames the relevant issues as follows: “The court must determine whether the [g]overnment may compel Google to produce electronic records relating to user accounts pursuant to search warrants issued under section 2703 of the SCA, or in the alternative, whether Google has provided all records in its possession that the [g]overnment may lawfully compel Google to produce in accordance with the Second Circuit’s ruling.” Rueter ultimately holds that “compelling Google to disclose to the [g]overnment the data that is the subject of the warrants does not constitute an unlawful extraterritorial application of the [SCA].”

In its reporting of the decision, news outlet Reuters particularly emphasizes Judge Rueter’s reasoning that “transferring emails from a foreign server so FBI agents c[an] review them locally as part of a domestic fraud probe d[oes] not qualify as a seizure . . . because there [i]s “no meaningful interference” with the account holder’s “possessory interest” in the data sought . . . [the retrieval] has the potential for an invasion of privacy, [but] the actual infringement of privacy occurs at the time of disclosure in the United States.”

Orin Kerr, law professor at The George Washington University Law School, notes numerous problems with Judge Rueter’s decision. “The issue in this case is statutory, not constitutional. Even if you accept the (wrong) framing of the issue as being whether the SCA applies outside the United States, the answer has to come from what Congress focused on, not where the constitutional privacy interest may or may not be. Where you place the Fourth Amendment search or seizure strikes me as irrelevant to the extraterritorial focus of the statute.”

Kerr further contends that, “Even accepting the court’s framing, I don’t think it’s right that no seizure occurred abroad. As I see it, copying Fourth Amendment-protected files seizes them under the Fourth Amendment ‘when copying occurs without human observation and interrupts the stream of possession or transmission’. . . . That test is satisfied here when the information was copied. The court suggests that bringing a file back to the United States is not a seizure because Google moves data around all the time and ‘this interference is de minimis and temporary.’ I don’t think that works. Google is a private company not regulated by the Fourth Amendment, so whether it moves around data is irrelevant.”

It will come as no surprise that Google plans to appeal the decision to the Third Circuit. A slew of other tech and media companies that previously filed amicus curiae briefs in the Microsoft case, such as Apple, Amazon, AT&T, eBay, and Verizon, will likely file briefs again.

Key questions remain. What will the Third Circuit decide on review?

Will the court follow the precedent set by the Second Circuit in Warrant?

Will it adopt the reasoning of the dissenters in the denial of Warrant's en banc review?

Will it follow Judge Rueter’s reasoning in the case at bar?

Or will it render an entirely novel opinion?

And though we can be sure that the losing party will petition the Supreme Court, one must also consider whether a final player might emerge in the form of Congress intervening directly. After all, the SCA was enacted in 1986, and many consider it not only out of date but also relatively unworkable for modern technological issues. The time certainly seems ripe for a statutory update.


Game of Drones

By Jessy Nations

Sometime during the past decade or so we started taking the idea of making robots a part of our everyday lives more seriously. Naturally, we went from joking about making machines serve us by doing our menial chores, to teaching them to kill. Once our base needs for violence and subservience were satisfied, we quickly began adapting this technology for the highest, noblest, and most human of all endeavors: bothering our neighbors. Meanwhile, our local legislatures are trying to rein these nuisances in and we have to work with seemingly outdated common law theories until they’re finished.

I’m talking, of course, about small flying robots known as drones. What was once the pinnacle of modern robotics – despite being a glorified RC helicopter with a camera – is now available from the corner 7-Eleven for $30. (No seriously. I’ve almost bought one out of curiosity.)
