By: Zoe Wood
In February of this year, I wrote about how lethal autonomous weapons systems—or Killer Robots, depending upon who you ask—are under-regulated both nationally and internationally. In March, New York City councilmember Ben Kallos proposed what is likely the nation’s first law regulating law enforcement use of robots armed with weapons.
Proposed Int. No. 2240-A is just fifteen lines long and it bans two types of conduct. First, the New York Police Department (NYPD) “shall not authorize the use, attempted use or threatened use of a robot armed with any weapon.” Second, the NYPD “shall not authorize the use, attempted use or threatened use of a robot in any manner that is substantially likely to cause death or serious physical injury, regardless of whether or not the robot is armed with a weapon.” A weapon is “a device designed to inflict death or serious physical injury,” and a robot is “an artificial object or system that senses, processes and acts, to at least some degree, and is operated either autonomously by computers or by an individual remotely.”
Councilmember Kallos proposed Int. 2240 in response to a late-February 2021 incident during which the NYPD brought an (unarmed) robot to an active crime scene in the Bronx. Kallos reportedly watched footage of this event “in horror.” “Robots can save police lives, and that’s a good thing,” says Kallos, “[b]ut we also need to be careful it doesn’t make a police force more violent.”
The robot in question strongly resembles a dog despite its complete lack of fur, ears, or anything resembling a face, and is generically called “Spot” by its manufacturer, Boston Dynamics. Spot is approximately two-and-three-quarters feet tall and three-and-a-half feet long. It weighs roughly 70 pounds, can run for up to 90 minutes on one battery charge, and travels at about three miles per hour. Why exactly would the Spot robot be useful at an active crime scene? The robot’s primary utility is surveillance. According to Boston Dynamics, “Spot is an agile mobile robot that navigates terrain with unprecedented mobility, allowing you to automate routine inspection tasks and data capture safely, accurately, and frequently.” At an active crime scene like the one in late February in the Bronx, Spot was likely brought in by the NYPD to surveil the residence with cameras before police officers entered the premises. Although Spot has the ability to operate autonomously, the NYPD has reportedly only operated its Spot robot by remote control.
A prominent robotics company owned by Hyundai Motor Group, Boston Dynamics is perhaps best known for its dancing robots. However, the advertising on its website focuses on solutions for construction, industrial inspection, and warehouse work. Notably, policing is not actively advertised as a use case for Spot. In fact, policing is mentioned just twice, embedded within two FAQs. The same goes for military use of Spot, which is not advertised and appears just once in an FAQ.
In fact, Boston Dynamics is opposed to weaponizing its robots. CEO Robert Playter has explained that “[a]ll of our buyers, without exception, must agree that Spot will not be used as a weapon or configured to hold a weapon.” This language is included in the Software License section of the Spot robot’s Terms and Conditions of Sale, which reads in relevant part: “The License will automatically and immediately terminate, and we may disable some or all Equipment functionality, upon (a) intentional use of the Equipment to harm or intimidate any person or animal, as a weapon or to enable any weapon,” or “(b) use or attempted use of the Equipment for any illegal or ultra-hazardous purpose.” Playter explains “[a]s an industry, we think that robots will achieve long-term commercial viability only if people see robots as helpful, beneficial tools without worrying if they’re going to cause harm.”
There are several glaring reasons why this paragraph in Spot’s Terms and Conditions of Sale is insufficient to ensure that Spot will not be weaponized. First, and foremost, while Terms and Conditions of Sale are binding, they lack the staying power of a law or regulation and can be changed from contract to contract. Second, there are many different kinds of robots currently in use by law enforcement, and they should all be regulated uniformly. Regulation lacking in uniformity could lead to a complex legal landscape that is difficult to enforce, riddled with loopholes, and which favors certain technologies over others. Third, and most specifically, the Boston Dynamics Terms and Conditions of Sale do not define key terms such as “intimidate” and “ultra-hazardous.” Finally, the commercial viability of robots, as cited by Boston Dynamics, should not be the primary motivation behind regulations that ban killer robots. What makes a product commercially viable can change over time, and besides, human-centric policy that values people over property should form the basis for regulation that bans killer robots.
This is where Councilmember Kallos’s bill excels: it would have staying power, would apply to all police robots uniformly, and appears to be motivated by human-centric policy considerations. And although the bill may seem premature, there have been several instances of law enforcement robots using force against people, at least once resulting in death. Most famously, in 2016, the Dallas Police Department used a military robot—a Remotec Andros—to end a standoff with a sniper who had killed five police officers and injured several more. Faced with an hours-long standoff, Dallas police officers placed a pound of C-4 explosive on the robot’s extension arm and maneuvered it by remote control into the building where the sniper hid. When the C-4 exploded, it killed the sniper and ended the standoff. Less famously, police in Dixmont, Maine used the same type of robot with the same type of explosive when they were called to the home of a man having a mental health crisis. When officers could not get the man to come out of the house, and after the man began shooting out of his window at the armored police vehicle, police maneuvered a robot armed with an explosive device toward the man’s home. When the device detonated, it caused the man’s house to collapse, but miraculously killed neither the man nor the dog and kitten living with him.
Still, Councilmember Kallos’s bill is vulnerable to some criticism. First, Int. No. 2240 does not appear to take into account the years of organizing work done by the Campaign to Stop Killer Robots. The Campaign advocates for maintaining meaningful human control over the use of force, and Int. No. 2240 stops just short of this. While it prohibits “use of a robot in any manner that is substantially likely to cause death or serious physical injury,” it does not ban outright the use of force by remote-controlled or autonomous robots. The Campaign has published policy explaining why such a ban is necessary, and Int. No. 2240 would benefit from being informed by it.
Finally, Int. No. 2240 does not contemplate that weaponization, lethal or otherwise, is not the only concerning aspect of police robots. Although a bill that bans weaponization need not necessarily address these other concerns, discussion about regulating police robots should keep in mind their myriad uses, including their capacity for surveillance.