By: Janae Camacho
Over 20 years ago, Xanga was founded as a site for sharing music and book reviews. Looking back, it was one of the earliest social media platforms. In those early days of the social web, the most social media-like interaction available was logging on to Xanga and creating a blog. Today, many young people cannot recall a time when social media and the internet did not exist. The internet and social media platforms have become permanent fixtures in everyday life; it is almost impossible to find someone not plugged into one of the many social media apps, such as TikTok, Facebook, and Instagram. While we have all become accustomed to using these apps and granting them access to our information, the effects of social media on children, and social media companies' access to children's personal data, have become an increasing concern of parents and legislators alike.
Various groups have called for greater protection of children and their data. These calls have grown louder as large social media companies seek to expand their customer base, focusing on their most significant demographic, teens, and even looking to include younger children. Legislators such as Rep. Kathy Castor (D-FL) have proposed further protections for children above the age of 13. On July 29, 2021, Rep. Castor introduced the Protecting the Information of our Vulnerable Children and Youth Act (Kids PRIVCY Act). The Act would, among other things, extend existing protections to teens ages 13 through 17, require express consent to collect their personal information, and ban advertisements directed at children.
While the measures proposed in the Kids PRIVCY Act may help limit social media's adverse effects on children, extending protections to those over 13 is unlikely, on its own, to prevent those harms. The complexity of enforcing age verification while maintaining minors' privacy presents a significant barrier to implementation.
What is the Children’s Online Privacy Protection Act (COPPA)?
In the 1990s, the ease with which websites could gather private data from children led the public to pressure Congress to protect children's data through legislation. In response, Congress passed the Children's Online Privacy Protection Act. COPPA was written in 1998, at the cusp of the internet's widespread adoption, to prevent online platforms from collecting and using the personal data of children under the age of 13 for tracking and ad-targeting purposes. The rule applies to operators of commercial websites and online services directed at children under 13, as well as operators of general-audience websites or online services with "actual knowledge" that they are collecting, using, or disclosing personal information from children under 13.
What is the Protecting the Information of our Vulnerable Children and Youth Act (Kids PRIVCY Act)?
The Protecting the Information of our Vulnerable Children and Youth (Kids PRIVCY) Act is a bill that Representative Kathy Castor (D-FL) proposed to update COPPA for an internet far more invasive than that of the 1990s. The proposed legislation seeks to, among other things, (1) extend coverage to teens ages 13 to 17; (2) prohibit the collection of personal information from 13- to 15-year-olds without their express consent; (3) change COPPA's "actual knowledge" standard to a "constructive knowledge" standard; (4) provide a private right of action to children's parents; and (5) ban targeted advertising directed at children. The bill's drafters incorporated many of the critical elements of the UK's Age Appropriate Design Code, with the aim that companies will place the best interests of young people over profits.
The Difficulties of Age Verification
Many companies have found it challenging to comply with COPPA's age-restriction standard because there is no adequate, verifiable way to identify the age of the person creating an account. Companies have relied on age-verification methods such as requiring users to input their birth date, which is ultimately ineffective because the date is taken at face value and never independently verified. Without actual verification, children under 13 can bypass the age restriction simply by submitting a birth date indicating they are at least 16 years old. According to a 2021 study by the Irish research center Lero, children under 13 could bypass many websites that use birthday-based age screening. The study also found that while stricter rules might help protect minors' privacy, children were able to bypass even the more stringent age-verification techniques.
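The face-value weakness described above can be sketched in a few lines. This is an illustrative Python example, not any platform's actual code: the gate simply trusts whatever birth date the user submits, so a child who types an earlier year clears the threshold.

```python
from datetime import date

MIN_AGE = 13  # COPPA's cutoff for parental-consent requirements

def age_on(birth_date: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def passes_age_gate(claimed_birth_date: date, today: date) -> bool:
    """Face-value check: trusts whatever birth date the user submits."""
    return age_on(claimed_birth_date, today) >= MIN_AGE

today = date(2022, 1, 1)
# A 10-year-old's real birth date fails the gate...
print(passes_age_gate(date(2011, 6, 15), today))  # False
# ...but the same child can simply enter an earlier year and pass.
print(passes_age_gate(date(2005, 6, 15), today))  # True
```

Nothing in the check ties the claimed date to the actual user, which is precisely the gap the Lero study documented.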
Although there are other ways companies can determine whether users are under 13, such as "classifiers" that predict a user's age and requirements that suspected young users submit proof of age, concern remains over whether expanding the age range subject to children's privacy protections would actually prevent the misuse of minors' data. Given the difficulty of effectively verifying children's ages, some tech companies, such as Meta and its subsidiary Instagram, have suggested possible remedies. In a December 8, 2021 committee hearing, the head of Instagram, Adam Mosseri, suggested that parents could state their child's age aloud through the app or voluntarily input the child's age into the phone itself, where it could be accessed directly by every app on the device. Additionally, Meta has confirmed that it is looking into working with other tech companies to share information in "privacy-preserving ways" to help determine a user's age. These solutions may raise privacy concerns of their own, especially if voiceprints are used to verify the identity of the person attesting to the child's age. And even if these proposed remedies address the age-verification problem, it remains unclear how companies would verify that the person providing the child's age is actually a parent.
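Mosseri's device-level idea can be made concrete with a purely hypothetical sketch: a parent enters the child's age once at the operating-system level, and every app on the phone reads that single value instead of collecting a birth date itself. The class and function names below are invented for illustration; no real platform exposes such an interface today, and the sketch deliberately leaves open the problem the text raises, since nothing confirms that the person setting the age is a parent.

```python
from typing import Optional

class DeviceProfile:
    """Stands in for an OS-level setting a parent configures once."""

    def __init__(self) -> None:
        self._age: Optional[int] = None

    def set_age(self, age: int) -> None:
        # Hypothetical: entered once by a parent; nothing here
        # verifies that the person entering it actually is a parent.
        self._age = age

    def get_age(self) -> Optional[int]:
        return self._age

def app_allows(profile: DeviceProfile, min_age: int) -> bool:
    """An app gates features on the shared device-level age.

    If no age has been set, treat the user as unverified rather
    than assuming they are an adult.
    """
    age = profile.get_age()
    return age is not None and age >= min_age

profile = DeviceProfile()
profile.set_age(12)               # parent enters the age once
print(app_allows(profile, 13))    # False: under-13 features blocked
print(app_allows(profile, 10))    # True
```

The design choice worth noting is that age collection happens in one place instead of in every app, which is the data-minimization appeal of the proposal; the unresolved question is still who attests to the value.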
Further regulation of these companies is needed to protect minors from the influences of social media and behavioral advertising, but there is no clear way to implement such protections effectively. As the current difficulty in enforcing COPPA's age limits shows, children can continue to circumvent such measures whether or not the age limit is raised. Furthermore, any attempt to verify minor users' ages effectively would require collecting still more of their personal information. The privacy concerns those measures raise, such as third-party information sharing, personal data security, and the use of biometric data without consent, could outweigh the perceived protection of an increased age requirement. Given the challenges of age verification, legislators now face the task of protecting children and teens from the harms of social media and the internet. In doing so, they must balance the need for accurate, verifiable age-gating against privacy concerns, minimizing the data that these companies collect from children.