
By Eleanor Lyon
“You never want a serious crisis to go to waste,” Rahm Emanuel, President Obama’s Chief of Staff, once famously said. It seems that Attorney General William Barr and Senator Lindsey Graham were listening.
Last month, Bloomberg reported that Senator Lindsey Graham was planning to introduce legislation that would purport to hold tech companies accountable for allowing child pornography to be shared on their sites. The central change proposed by the bill is that it would allow companies to be sued for recklessly distributing child pornography. If this sounds like a reasoned and principled stand to you, you might want to look a little more closely at what it means to “recklessly distribute” child pornography.
Privacy advocates warn that the bill is really targeting end-to-end encryption, a method of protecting online communications by preventing the company hosting the interaction from deciphering what the sender and the recipient are saying to one another. WhatsApp has long offered end-to-end encryption on its messaging platform, and Facebook integrated the practice into its “Messenger” app after acquiring WhatsApp. US Attorney General William Barr has repeatedly condemned the practice, arguing that it supports the proliferation of child pornography and organized crime. In searing testimony at the “Lawful Access Summit” hosted by the Department of Justice in October, FBI Director Christopher Wray argued that Facebook’s end-to-end encryption (then still a plan for the future) would be a “dream come true for predators and child pornographers.”
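To make the mechanics concrete: in an end-to-end encrypted exchange, the key that unlocks a message exists only on the sender’s and recipient’s devices, so the server in the middle stores and relays only ciphertext it cannot read. The sketch below is a deliberately minimal illustration of that principle using a one-time pad; real systems such as WhatsApp use the far more sophisticated Signal Protocol, and all names here are illustrative, not any product’s actual API.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte."""
    return bytes(a ^ b for a, b in zip(data, key))

# The sender and recipient share a secret key; the relaying server never sees it.
message = b"meet at noon"
key = os.urandom(len(message))  # one-time pad held only at the endpoints

ciphertext = xor_bytes(message, key)   # this is all the server ever handles
plaintext = xor_bytes(ciphertext, key) # the recipient decrypts with the shared key

assert plaintext == message
```

Because the server holds only `ciphertext` and never `key`, it cannot comply with a demand to read the conversation; the only way to grant such access is to weaken the scheme itself, which is what a mandated “backdoor” amounts to.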
Haven’t I heard this argument before?
Yes. Battles between the DOJ and large tech companies have been in the news since 2016, when Apple resisted an order from a federal judge to help the FBI decrypt the phone of one of the San Bernardino shooters. Federal prosecutors in that case had been granted a search warrant to access the phone but risked permanently losing the data on it after ten incorrect password attempts. So, they asked a judge to force Apple to break into its own product. Apple refused to comply, but the FBI managed to access the phone by other means, avoiding a protracted legal battle.
But Apple’s stance ruffled powerful feathers. Presidential candidate Hillary Clinton, FBI director James Comey, and most of the Republican presidential candidates at the time called for “backdoors” to be built into devices to ensure that law enforcement would always be able to access the data of accused criminals. Privacy advocates, of course, argued that this would violate the consumer’s right to be secure in their personally identifying information, and pointed out that end-to-end encryption protects victims of intimate partner violence, journalists, and political dissidents from dangerous surveillance, and that all people have a reasonable expectation of privacy in the messages they may choose to send to one another.
So, what is new here?
The latest salvo in the fight between federal law enforcement and Silicon Valley is called the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019,” or the “EARN IT Act.” The acronym “EARN IT” appears to be a reference to the broad immunity from liability which online platforms currently enjoy under Section 230 of the Communications Decency Act (CDA). Section 230 provides that online platforms are not civilly liable for the speech of their users on the platform. As Riana Pfefferkorn, Associate Director of Surveillance and Cybersecurity at the Stanford Center for Internet and Society, explains it, “[i]f you defame me on Twitter, I can sue you for defamation, but I can’t sue Twitter. Without the immunity provided by Section 230, there might very well be no Twitter, or Facebook, or dating apps, or basically any website with a comments section.” But EARN IT would allow victims of child pornography (probably those whose images are being circulated, though this is not clear in the current text of the Act) to sue Twitter directly if their images were being exchanged on its platform.
EARN IT asks online platforms to do just that: earn their liability shield. It replaces the broad liability protections with two safe harbor provisions. Companies can earn back their immunity from suit either (1) by complying with the as-yet-undefined “best practices” to be written by a 15-member commission created by the Act or (2) by implementing “reasonable measures” to combat the spread of child pornography on their platforms, though following this second prong does not guarantee a liability shield.
Is that really so bad?
It’s the first safe harbor provision that worries privacy advocates. Online service providers are already expected to monitor their sites for child pornography and report it to the DOJ under 18 U.S.C. § 2258A in exchange for civil and criminal liability protection, and online platforms complied with this law 45 million times last year. But their ability to monitor the content being exchanged on their sites cannot possibly extend to systems that are encrypted end to end, as the company itself cannot see what is being sent on those messaging platforms. Under EARN IT, they could nevertheless be liable for failing to detect and report child pornography exchanged there.
Experts believe that this 15-member commission would likely limit acceptable encryption types under the authority of the Act, declaring that end-to-end encryption without a backdoor for law enforcement can never be a “best practice.” Why does this seem likely? Because Attorney General Barr, who has routinely railed against encryption, receives the recommendations of the commission and has the power to unilaterally amend them before they are finalized. And even if the commission does not declare that providing end-to-end encryption is per se reckless with regard to the risk of child pornography, courts could find that such encryption amounts to reckless ignorance of the risk to children under the new, lowered standard. Proponents of the law admit that “EARN IT will require companies that offer end-to-end encryption to weigh the consequences of that decision” and “don’t doubt that this will make the decision to offer end-to-end encryption harder.”
Is there a solution?
The choice we are being asked to accept, between rampant child pornography on the one hand and unimpeded surveillance on the other, is a false dichotomy. Tech companies and the DOJ and its allies alike are playing on our fears to manipulate us to their own ends. Raising the specter of rampant litigation is the oldest trick in the playbook of industries that have grown used to their liability shields, but the government’s demand for unfettered access to private communications, the overwhelming majority of which are innocent, is equally unreasonable. The time is coming for us to demand more from tech companies, perhaps by labeling them information fiduciaries with a duty not to harm us. But in the new world of ubiquitous surveillance capitalism, consumers are demanding that their conversations be protected from the prying eyes of the companies that facilitate those exchanges, and we should not give up what few spaces we have carved out for ourselves because of disingenuous threats from our own government.