Who’s Your Drug Dealer? Snapchat and Section 230 Under Scrutiny

By: Caroline Dolan

While Snapchat may no longer feel as trendy as it once was, the social media platform is alive and well. However, the same cannot be said for all its adolescent users. Snapchat’s unique filters and features attract 406 million daily active users, but the app is being dubbed “a haven for drug trafficking” by grieving parents. Numerous parents are seeking justice for their children, who used Snapchat to purchase drugs that, unbeknownst to them, were laced with fentanyl. While Section 230 would normally immunize a social media platform from civil liability and be grounds for dismissal, a Los Angeles judge has denied Snapchat’s invocation of Section 230 immunity and overruled twelve of its sixteen demurrers. In other words, the judge has determined that the causes of action asserted by the Plaintiffs have merit and will continue through the litigation process.

A Snapshot of Section 230 

The Communications Decency Act (“CDA”) of 1996 was passed in light of the internet’s rise and Congress’s desire to protect children from exposure to dangerous content, particularly pornography. However, out of fear that platforms would overly censor themselves to avoid liability, Congress also passed the Internet Freedom and Family Empowerment Act, better known as Section 230. Section 230(c) governs the liability of providers of an “interactive computer service” and states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In other words, an interactive computer service like Twitter or Facebook cannot be held civilly liable for what you or your friend post. Nor can it be held liable for voluntarily moderating content in good faith. While Section 230 does not shield against federal criminal liability, electronic communications privacy law, or intellectual property violations, it is still a wide-reaching shield, and online platforms rarely hesitate to invoke it.

The Suit Against Snap

Represented by the Social Media Victims Law Center, the Plaintiffs asserted that Snap’s product features and business choices resulted in the serious injury and foreseeable deaths of their children. The Plaintiffs alleged that Snap’s automatic message deletion, location mapping, and “quick-add” features create an inherently dangerous app and enable kids to connect with adult predators. The parents contend that Snap has been aware that the app is an “open air drug market,” yet has failed to implement any meaningful changes to improve age and identity verification or prevent foreseeable consequences. Notably, the Plaintiffs did not allege that Snap is liable for failing to eliminate or moderate the content of the third parties selling drugs, but rather that the “feature-packed social media app” facilitates an unreasonably dangerous avenue for strangers to contact vulnerable adolescents.

Snap demurred to all sixteen alleged causes of action and invoked general immunity under Section 230. It advocated for an extremely broad reading of the statute and asserted a “but for”/“based on”/“flows from” construction wherein, “if the alleged harm flows from the content provided by third parties, Section 230 applies.” In Snap’s view, it should be entitled to Section 230 immunity because the Plaintiffs’ children would not have been injured but for the content of the third-party drug dealers.

The Ninth Circuit’s three-prong test established in Barnes v. Yahoo!, Inc. applies Section 230 immunity to “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.”

In Neville v. Snap, Inc., the judge agreed with Snap that it is (1) an “interactive computer service” that Section 230 would absolve from liability as (2) a “publisher or speaker” of (3) information posted by third-party users. However, the judge concluded that the Plaintiffs had not sought to treat Snap as a “publisher or speaker.” Rather, the allegations centered on a purportedly unreasonably dangerous and defective product. The judge recognized the unsettled bounds of Section 230 but nonetheless found that the claims related to Snap’s product and business decisions were “independent … of the drug sellers’ posted content” and beyond Snap’s “incidental editorial functions” (e.g., choosing to publish, remove, or edit content), which courts have consistently held are protected under Section 230.

Snapchat On The Docket

Section 230 does not have the same meaning or relevance as it did nearly thirty years ago. Yet it continues to run through pressing issues related to AI technology, national security, and public health and safety. The Supreme Court has continued to sidestep these questions but may soon be forced to define the statute’s scope more clearly. Neville v. Snap, Inc. may clarify the outer bounds of Section 230 as well as provide justice and solace to the victims’ families.

No Filtering Snapchat’s Third Party Woes

[Image: Snapchat’s Ghost logo]

By Mackenzie Olson

Snapchat is an app that allows users to send one another “snaps,” which are pictures that disappear after a few seconds. Users can also add a “filter” to their pictures to alter or enhance them. However, Snapchat filters are quite unlike those of other apps. Sure, many iPhone photos instantly become more attractive—or at least more “like”-able—under the effects of the photo sharing app Instagram’s many popular filter options. (If in doubt, opt for the Valencia filter. It’s nearly foolproof.) Snapchat filters, however, can turn a user into a surreal version of him or herself. Ever wondered what you might look like as a dog? A zombie? Or with your best friend’s (or the Starbucks lady’s) face? Snapchat offers all of these options, among others, and they are virtually risk-free.

“Virtually.”


“Disappearing Forever” Too Good to be True? Snapchat Reaches Settlement with FTC

By Chris Ferrell

On May 8th, the Federal Trade Commission (“FTC”) announced that Snapchat, a mobile application company, had agreed to settle with the FTC over several charges, including deceptive advertising, failure to maintain security features, and deceptively collecting data from application users. The FTC alleged that Snapchat deceived users by claiming that their “snaps” (pictures users take with their cell phones and send to other users) would “disappear forever” after being viewed. According to Snapchat, users send 400 million photos and videos per day. However, recipients of a snap can save it in several ways, including taking a “screen shot” of the picture, downloading the picture as original content, or, at the extreme, hacking into other Snapchat users’ accounts and stealing their photos. We’ve previously covered the legal ramifications of taking a screenshot of snaps in the context of revenge porn.

The FTC further alleged that Snapchat’s failure to secure its “Find Friends” feature resulted in a security breach that enabled attackers to compile a database of 4.6 million Snapchat usernames and phone numbers. Snapchat also allegedly took contacts from Apple iOS users’ address books, as well as geolocation information from people using Android-based phones. Snapchat does not have to pay a fine, but, under the settlement, it is prohibited from misrepresenting the extent to which it maintains the privacy and security of users’ information. Snapchat must also implement a comprehensive privacy program that will be monitored by a third-party privacy group for the next 20 years. Although Snapchat claims to have already addressed the FTC’s concerns by “improving the wording of their privacy policy” and implementing security countermeasures, is that enough to allow applications like Snapchat to continue to exist?