By: Andy Paroff
If you ever happen to visit the website for United States Citizenship and Immigration Services (USCIS), you might notice a friendly-seeming face pop up in the corner of the screen. That face belongs to Emma, a “computer-generated virtual assistant” who can “answer your questions and even take you to the right spot on [USCIS’s] website.”
What is a chatbot?
Pop-up chatbots such as Emma are relatively old in internet terms; some have been around for a decade or more. In fact, Microsoft’s Clippy debuted with Office 97, giving many computer users a bad first experience with virtual assistants. Today, chatbots usually appear on websites, asking how they can direct you to what you are looking for. Chatbots can be a helpful and interactive alternative to making a call or sending an email. However, one does not generally expect to see a chatbot on a governmental agency website, especially when that agency is often embroiled in accusations of human rights abuses and other controversial debates. The smiling face of a chatbot might make someone feel they can share information they otherwise would not share with a USCIS agent over the phone or in person. But what information does Emma gather, and what information does Emma retain once a user is done with its features?
Who might use Emma?
People visiting the USCIS website are generally immigrants or undocumented people seeking to achieve legal status in the country. The information they may share with a chatbot such as Emma can be highly personal and, in some cases, dangerous. Imagine a situation where someone who has recently come into the country without legal status logs on to USCIS’s website to learn more about their options for gaining legal status. In the eyes of the United States Government, that person is in the country illegally and would most likely be subject to deportation. Now imagine that the person decides to interact with Emma, and in doing so divulges identifying information such as their location or the names of family members. This person would probably not reveal such information to an immigration officer for fear of deportation, yet they have told it to a virtual assistant in the employ of the agency. Has this person put themselves in danger of deportation by sharing information with Emma?
USCIS acknowledges privacy concerns
By asking users to share personally identifiable information (PII) with Emma, USCIS has opened itself up to many potential privacy concerns. Recognizing this issue, the Department of Homeland Security (DHS, USCIS’s parent agency) conducted a Privacy Impact Assessment (PIA) in May 2017. In the PIA, DHS moves to quash any fears by immediately addressing the privacy question. The second page states that Emma “only maintains the customer interaction . . . data in its memory for the duration of the session. This information is neither stored nor retained for analytical or improvement purposes. Once the interaction with [Emma] is complete, the collected/attached data is cleared from the servers without future storage or use.” In other words, DHS is quick to claim that Emma does not keep any of the information given to it.
Information Emma does collect
However, the PIA goes on to acknowledge that some technical information is in fact stored for an extended period of time. “On the backend, [Emma] automatically collects session information about each interaction . . . This includes: date and time of visit, [USCIS] content visited, mobile or desktop device. [Emma] also collects and stores session/technical data (e.g., IP addresses, server data) for 2 years.” (emphasis added). DHS admits that this PII is accessible in Emma’s servers to database administrators. PII such as an IP address can be a source of major concern in the wrong hands, as it can reveal the user’s approximate geographic location.
Another potential issue the PIA raises is the possibility of a query with Emma being escalated to a live chat with a Customer Service Representative (CSR). This happens in situations where Emma is unable to find information on the USCIS website that adequately answers the user’s question. The chat log is then forwarded to a CSR, who will help resolve the issue. DHS points out that PII shared with Emma will be “automatically masked” when transferred to the CSR. However, the CSR is free to ask the user for more information, thereby potentially revealing the PII again. Here, the PIA contradicts itself. While DHS initially claims that information gathered by the CSR will only be retained for the duration of the chat, it later admits that the “Live Chat conversation between the agent and the customer will be stored for 13 months from the day of the chat interaction.” Only a few officers have access to the chat logs; however, the logs may be “shared on a ‘need to know’ basis.” Additionally, DHS states that there “is no privacy impact related to information sharing because information is not shared with external entities.” While on the surface this seems reassuring, the fear amongst immigrants is not that their information will be shared with external entities, but rather that it will be shared with USCIS’s sister agency, Immigration and Customs Enforcement (ICE), the enforcement arm of DHS’s immigration capacity.
Ultimately, despite Emma’s smiling face and cheerful disposition, immigrants have good reason to be wary of sharing personal information with the chatbot. While USCIS claims not to retain any of the information given to Emma, the agency does collect some potentially endangering PII through the technical backend of the application. Additionally, Emma can forward the user to a live chat with a CSR, which can also lead to information being recorded and potentially shared within the agency. An inquisitive undocumented person would be wise to seek immigration information from other sources and not directly from USCIS, no matter how charming and harmless its chatbot appears.