Alexa: Are You Going to Testify Against Me?

By: Melissa Torres

Life seems pretty great in a world where we can turn off the lights, play music, and close the blinds simply by speaking it into existence. But what happens when your conversations or household noises are used against you in a criminal investigation?

Smart speakers, such as Google Home and Amazon Alexa, are marketed as great tech gifts and the perfect addition to any home. A smart speaker is a speaker controlled by your voice through a “virtual assistant.” It can answer questions, perform various automated tasks, and control other compatible smart devices once you say its “wake word.”

According to Amazon.com, for a device to start recording, the user has to wake it by saying the default wake word, “Alexa.” The website states, “You’ll always know when Alexa is recording and sending your request to Amazon’s secure cloud because a blue light indicator will appear or an audio tone will sound on your Echo device.” According to Amazon, built-in technology called “keyword spotting” ensures that unless the wake word is used, the device does not listen to any other part of your conversations.

Similarly, Google states, “Google Assistant is designed to wait in standby mode until it detects an activation, like when it hears ‘Hey Google.’ The status indicator on your device will let you know when Google Assistant is activated. When in standby mode, it won’t send what you’re saying to Google servers or anyone else.” 
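Stripped of the marketing language, both companies are describing the same design: audio stays in a short, local buffer while a small on-device detector listens only for the wake word, and nothing is streamed to the cloud until that detector fires. The Swift sketch below is a loose illustration of that gating logic, not Amazon’s or Google’s actual code; the names AudioFrame, detectsWakeWord, and streamToCloud are invented for the example.

// A loose sketch of "keyword spotting" gating, for illustration only.

struct AudioFrame {
    let samples: [Float]          // a few milliseconds of microphone audio
}

final class WakeWordGate {
    private var localBuffer: [AudioFrame] = []
    private let localLimit = 100  // keep only a short window on the device
    private var streaming = false

    // Stand-in for the small on-device detector; real devices use a trained model.
    private func detectsWakeWord(in frame: AudioFrame) -> Bool {
        return false
    }

    // Stand-in for the network upload that only begins after activation.
    private func streamToCloud(_ frame: AudioFrame) {
        // send the frame to the assistant's servers
    }

    func process(_ frame: AudioFrame) {
        if streaming {
            streamToCloud(frame)                 // request audio leaves the device
            return
        }
        localBuffer.append(frame)                // in standby, audio stays local
        if localBuffer.count > localLimit {
            localBuffer.removeFirst()
        }
        if detectsWakeWord(in: frame) {
            streaming = true                     // light the indicator, start streaming
        }
    }
}

The privacy questions in the rest of this piece arise when that detector fires by mistake, or when the audio that does reach the cloud later becomes of interest to investigators.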

Consumers consent to being recorded when they willingly enter a contract with these smart devices by clicking “I agree to the terms and conditions.” However, most people assume this consent covers only what they say after using the “wake word.” Despite assurances from tech giants that these devices do not record without being prompted, there have been many reports suggesting otherwise. And in recent years, these smart devices have garnered attention after being cast as star witnesses in murder investigations.

In October 2022, someone fatally shot two researchers before setting fire to the apartment in which they were found. According to the report, Kansas police believe the killer was inside the apartment with the duo for several hours, both before and after their deaths. Investigators found an Amazon Alexa device inside the apartment and obtained a search warrant for access to the device’s cloud storage, hoping it may have recorded clues as to who was responsible for the murders. If the police obtain relevant information, they may be able to use it in court, depending on how that evidence is classified.

Under the Federal Rules of Evidence, all relevant evidence is admissible unless another rule specifies otherwise. Specifically, statements that are considered hearsay are not admissible unless an exception applies. Hearsay is any statement made outside of court that is offered to prove the truth of the matter asserted. Although these devices technically do produce statements, courts have held that a statement is something uttered by a person, not a machine. There is, however, an important distinction between computer-stored data and computer-generated data. Computer-stored data that was entered by a human has the potential to be hearsay, while computer-generated data produced without the assistance or input of a person is not considered hearsay. How these statements will be classified, and whether they will be permitted in court, is up to the judge.

This isn’t the first time police have requested data from a smart speaker during a murder investigation. In 2019, Florida police obtained search warrants for an Amazon Echo device, believing it may have captured crucial information surrounding an alleged argument at a man’s home that ended in his girlfriend’s death. In 2017, a New Hampshire judge ordered Amazon to turn over two days of Amazon Echo recordings in a case where two women were murdered in their home. In those cases, the parties consented to handing over the data held on the devices without resistance. In 2015, however, Amazon pushed back when Arkansas authorities requested data in a case involving a man found dead in a hot tub. Amazon explained that while it did not intend to obstruct the investigation, it also sought to protect its consumers’ First Amendment rights.

In its court filing, Amazon’s legal team wrote, “At the heart of that First Amendment protection is the right to browse and purchase expressive materials anonymously, without fear of government discovery,” later explaining that the protections for Amazon Alexa were twofold: “The responses may contain expressive material, such as a podcast, an audiobook, or music requested by the user. Second, the response itself constitutes Amazon’s First Amendment-protected speech.” Ultimately, the Arkansas court never decided the issue, as the implicated individual offered up the information himself.

Thus, a question is still unanswered: exactly how much privacy can we reasonably expect when installing a smart speaker? As previously mentioned, these smart speakers have been known to activate without the use of a “wake word,” potentially capturing damning conversations. Without a specified legal standard, there’s not much consumers can do right now to keep their private information from being shared, fueling the worry that these devices can be used against them. Tech companies like Amazon and Google suggest going into the settings and turning off the microphone when you aren’t using it, and users also have the option to review and delete recordings, but both safeguards require trusting the company to actually honor them. The only sure way to protect yourself from these devices is simply not to purchase them. If you can’t bring yourself to do that, be sure to unplug the devices when you’re not using them. Otherwise, it’s possible these smart speakers may be used as evidence against you in court.

The Cellphone: Our Best Helper or an Illegal Recorder? 

By: Lauren Liu

We have all experienced that shocking moment when we realized that the advertisement or post appearing on our screen happens to be the exact topic that we talked about in a very private conversation. Although we did not Google or browse that topic on the internet, somehow, that idea of upgrading our laptop or buying that new pair of shoes slipped into our browser and started waving at us from across the screen. We are in awe, and can even feel violated.

Such experiences have become so common that we forget how much the browsers and apps we use track us, and how much our cellphones may be listening in on our conversations. The revelations of Thomas le Bonniec, a former contract consultant for Apple, have only deepened these concerns. According to le Bonniec, Apple created a quagmire for itself involving many ethical and legal issues, including Siri’s eavesdropping. In many instances, iPhones have recorded users’ private conversations without their awareness and without any deliberate activation of Siri, the assistant that listens for users’ voice commands and assists with their needs. The problem stems from the fact that every smartphone, iPhone and Android alike, is a sophisticated tracking device with sensitive microphones that can capture audio from the user, or even from anyone in the vicinity. Furthermore, with the bandwidth of 4G LTE, these recordings can be stored and uploaded to the seller’s database without the owner’s knowledge or consent. Le Bonniec noted Apple’s explanation that the recordings were gathered for analytics and to improve transcription. Even so, his account of Apple’s internal operations raised serious privacy concerns among customers and potential legal issues.

In response to such concerns, companies created long consent forms for customers to sign before purchasing the product. The legal definition of consent is that a person with sufficient mental capacity and understanding of the situation voluntarily and willfully agrees to a proposition. Under that definition, a majority of customers arguably could not have validly consented, because most never read or fully understand the forms they sign. With Siri in particular, customers often do not clearly understand what Siri listens to or how their iPhones record their conversations. Most ordinary iPhone users assume that Apple only evaluates voice commands and questions after they activate Siri for a specific request.

Federal law (18 U.S.C. § 2511) requires one-party consent, meaning that a person can record a phone call or conversation so long as that person is a party to it. A person who is not a party to the conversation can only record if at least one party consents with full knowledge that the communication is being recorded. Most state laws follow the federal approach. Whether Apple or Siri should legally be considered a party to a conversation remains an open question, but common sense suggests most consumers would say it is not. It also remains unclear whether signing a consent form without a comprehensive understanding of its content constitutes valid consent. Thus, even if a customer signs such a form, it is possible that he or she still has not consented to being recorded.

In addition to learning about the law, consumers should also ask questions regarding potentially illegal recordings by electronic devices. How much private information is obtained? What confidentiality agreements were in place, and what oversight was implemented? Are actual audio recordings retained, and if so, for how long? With so much ambiguity still remaining, these questions can at least begin the process of addressing consumers’ concerns and reducing potential legal disputes for sellers.

Careful! Big Brother is Watching (or rather Listening)

By: Enny Olaleye

Earlier this year, social media users may have been surprised to see #LiveListen trending on websites such as Twitter and TikTok. The hashtag refers to an Apple innovation called Live Listen, an accessibility feature designed to help the hearing-impaired by letting users turn their devices (iPhone, iPad, etc.) into a remote microphone that sends sound to their AirPods. However, what Apple intended as a simple accessibility feature quickly transformed into a social media craze once users discovered they could use the function to eavesdrop on other people’s conversations.

Activating Live Listen is as easy as opening your iPhone’s Settings app. Once activated, the feature allows users to hear conversations more clearly by tuning out background noise. With your AirPods in your ears and your iPhone near the person you are trying to hear, Live Listen transmits the audio to your AirPods. While exploring the feature, users soon found that as long as their AirPods were connected, they could listen in on any conversation happening in the room where the iPhone was placed, even from a different room. Live Listen remains active until the AirPods are put back in their case or disconnected from the device. In other words, even if the connected iPhone or iPad is hidden somewhere out of sight, it can still clearly pick up conversations in the room around it.
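To see why this raised eyebrows, it helps to appreciate how little machinery a microphone-to-headphones relay actually requires. The Swift sketch below, which assumes an iOS app that already has microphone permission, simply routes the phone’s microphone into whatever audio output is currently connected, such as a pair of AirPods. It is not Apple’s Live Listen implementation, only an illustration of the underlying idea; the function name startMicPassThrough is invented for the example.

import AVFoundation

// Illustrative sketch only: route the phone's microphone straight to the
// current audio output (e.g. connected AirPods).
func startMicPassThrough() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord lets the app capture the mic and play audio at the same
    // time; .allowBluetoothA2DP lets the output route go to Bluetooth headphones.
    try session.setCategory(.playAndRecord, options: [.allowBluetoothA2DP])
    try session.setActive(true)

    let engine = AVAudioEngine()
    let mic = engine.inputNode
    // Wire the microphone input directly into the output mixer: whatever the
    // phone hears is immediately played back over the connected headphones.
    engine.connect(mic, to: engine.mainMixerNode, format: mic.inputFormat(forBus: 0))
    try engine.start()
    return engine   // the caller must keep this reference alive for audio to keep flowing
}

The point is not that anyone needs to write code to do this, since Apple ships the feature; it is that relaying live audio from a hidden phone is technically trivial, and the only real constraints are legal and ethical ones.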

Social media users began to label the feature a “game-changer,” publicly sharing the ways they planned to use it to eavesdrop on their friends, partners, and even their employers.

Thus, the question arises: Are AirPods our newest security threat? 

When you think of the word “wiretapping,” or what is commonly referred to as “eavesdropping,” you may imagine a black-and-white scene with a bunch of men in suits wearing oversized headphones, huddled around a clunker of a machine and looking intently into the distance. Well, thanks to Ring cameras, high-definition drones, and of course smartphones, wiretapping laws have expanded greatly from what they were back in the day of drama-filled, black-and-white crime shows. The Electronic Communications Privacy Act of 1986 (ECPA) made it a federal crime to engage in illegal wiretapping or electronic eavesdropping, or to possess, use, or disclose information obtained through it. The statute applies to face-to-face conversations, emails, texts, phone calls, and other “electronic communications” that are reasonably expected to be private.

“But—I don’t plan to record the conversation; I just want to listen in.” Still…no. 

Aside from the literal act of using AirPods as a wiretapping device, the ECPA considers it a felony to intentionally intercept an electronic communication, which covers setting up your AirPods to listen in on private conversations. Further, the ECPA also makes it a felony to attempt to intercept an electronic communication, which includes merely setting up the Live Listen feature for the purpose of listening in on a reasonably private conversation. Regardless of whether you are recording or just listening in, the consequences of even attempting to wiretap or eavesdrop include imprisonment of up to five years (if criminal intent can be proven) and a fine of up to $250,000.

With the advancement of technology showing no signs of slowing down, a bigger question arises: if your peers can so easily listen in on your conversations, what does that mean for those with more resources and power?

Electronic surveillance, whether through AirPods or government-funded access to encryption tools, is fundamentally at odds with personal privacy. Under the Fourth Amendment, government agencies must obtain a warrant, approved by a judge, before engaging in wiretapping or electronic surveillance. However, while government agencies are required to secure a warrant, judges almost never turn down their wiretap requests. Once authorized, both wiretapping and electronic eavesdropping enable the government to monitor and record conversations and activities without revealing the presence of its listening devices.

Legislation concerning wiretapping and privacy rights continuously lags behind the fast-paced advancement of technology. Even so, products as simple as AirPods and iPhones will never be tagged as security threats, simply because they are already everywhere. The old warning that “Big Brother is Watching You” is slowly coming to fruition as user privacy can be bypassed at our own fingertips. While the expansion of electronic surveillance was originally meant to reduce serious violent crime after 9/11, it has only led to heightened violations of privacy rights in the United States.

“So now what?” 

Well, simply put: in most circumstances, listening in on conversations that are “reasonably expected” to be private, without the consent of those participating, will most likely constitute a federal crime. Thus, activating Live Listen and using it outside its designated role as an accessibility feature is not a good idea. Protecting yourself and your information is a bit more difficult. Avoiding the entire “surveillance economy” by giving up Apple products, Google, and Twitter is just very unlikely (I still haven’t been able to give up Amazon Prime). However, taking action can be as small as browsing only over secure connections (look for the little lock in the address bar) or as large as pressing your state’s representatives to pass legislation centered on protecting our individual privacy rights; either is a step in the right direction.

The bottom line is this: without the assurance that our private communications are, indeed, private, privacy rights will continue to be glossed over, and decisions based on free will and personal choice will slowly be replaced by decisions rooted in prudence and fear.