Photo from FaceApp
By: Justin Brascher
Chances are you have recently scrolled through social media and come across a picture of someone you know, only they looked about 50 years older. Such technology is the product of FaceApp, an app developed by the Russian company Wireless Lab. The company’s purported AI technology can take a user’s face and alter the image with a number of different filters. The most popular of these is the aging filter, which took social media by storm last year, becoming one of the biggest social media trends of 2019. At one point FaceApp was the most-downloaded app in the iOS App Store. However, concerns over FaceApp’s privacy policy and its connections to Russia quickly took center stage.
FaceApp’s questionable privacy policy
FaceApp’s terms of service quickly raised significant privacy concerns among its users. One section that drew particular criticism states that a user grants FaceApp, “A perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.”
Simply put, by using the app the user grants FaceApp and its developer, Wireless Lab, a full license to do whatever they want with the user’s picture, without any royalty or monetary payment due. This creates the potential for FaceApp to build a giant library of data from the facial scans it has conducted and then sell that information to the highest bidder, with no requirement to inform anyone. This potential is especially troubling given that FaceApp is owned by a Russian company at a time when Russian intrusion into American data systems has been on the rise.
Compounding these fears, by granting access to FaceApp a user may also grant access to other apps on their phone, such as the search app and Safari on iPhone. When a user first opens FaceApp, it asks for permission to access certain features of the phone, such as the camera, in order to operate. This is normal for interactive apps like FaceApp. However, the app also asks for permission to access apps that are inconsequential to its operation, such as Safari. Most users grant these permissions without question, never reading the fine print. As a result, most users do not look closely enough to realize that FaceApp still functions without access to these other apps; the permission is simply not required. This means that not only does the app have access to a user’s face, allowing the company to do whatever it wants with it, but it also has access to much of a user’s other information.
Implications of such data collection
Such an ability immediately brings to mind the Cambridge Analytica scandal that has dominated headlines over the last few years. In a similar way, a large corporation now has access to the personal information of millions of users, without any restrictions on how it may be used. This raises serious privacy concerns. However, it should be noted that, to date, there have been no reports of Wireless Lab doing anything improper with the information it has collected.
Further research determined that user information would most likely stay on Amazon servers in America, allaying worries about that information feeding into the already pervasive Russian hacking problem. However, that does not detract from the legitimate concern raised by the terms of service and the potential uses of user information.
Even though no one has reported their information being misused as a result of FaceApp, the episode raises the question of just how far we are willing to let user agreements go. How much of our information can we sign over in agreements that we never read, and at what point does this become simply malicious behavior on the part of the company in question?

One beneficial step would be for agreements like this one, which so blatantly appears designed to take user information and package it for sale to questionable third parties, to receive more scrutiny from the judicial system. Another solution could be modeled on the liability waivers people sign when entering places such as theme parks and ski resorts. In those cases, an important factor is public policy: courts look at whether the activity the waiver pertains to is “highly regulated,” and if it is, courts likely will not enforce the waiver. Similarly, if someone wants to keep a company like Wireless Lab from using their information, a legal analysis of whether agreements like FaceApp’s terms of service are consistent with public policy would be hugely beneficial. While FaceApp has yet to cause a major incident, it only takes one successful nefarious plot to take all of our information and make it available to anyone who wants it.