Google, Microsoft, and Amazon have all been under pressure these last few months over being suppliers of technology to the law enforcement and defense markets. Pretty much all technology vendors have sold into these markets since, well, the beginning of technology as we know it. And much of the technology we know and love today grew out of government, particularly military, requirements and projects. The Internet and GPS are the two most obvious examples, but others are all around us. Supercomputers, though now used for many commercial applications, exist almost entirely because of the decades-long, insatiable thirst of the U.S. nuclear weapons labs for compute power. The current level of microchip ubiquity owes a lot to concerns that U.S. industry would be unable to keep up with the military's need for advanced semiconductors, which is why the military funded SEMATECH for the first decade of its existence. While there has always been some public opinion risk in selling technology into law enforcement and defense markets, the current wave of pressure is based on a new dynamic. The cloud changes everything: the technology provider no longer just (fairly quietly) sells hardware and software into a controversial market, it also operates that technology (rather publicly) for its customers. Make the service something AI-related, the 21st Century equivalent of 20th Century nuclear weapons and energy concerns, and you have a topic ripe for public discourse.
Before getting more directly into the ACLU taking issue with the Amazon Rekognition service that AWS offers, I want to set a little more context. The current cloud leaders are primarily (or at least, in the case of Microsoft, heavily) focused on consumer-direct offerings. It's a lot easier to use public indignation as a weapon when a company sells to the public than when its customers are other industrial companies. For example, how much public pressure could you put on IBM or Digital Equipment to stop selling for defense use? You go to Lockheed or Boeing or Northrop Grumman and say "stop buying from these guys because they sell to the CIA, Air Force, Navy, etc." and they look at you like you have two heads (or none at all, actually) because those are their customers too. Bad analogy? OK, you go to Ford and tell them to stop, and they start telling you about their Ford Aerospace subsidiary that sells to the defense market. Ford Aerospace (which Ford sold to Loral in 1990) was a subcontractor on multiple nuclear missiles, amongst other things. Now Ford would seem to have been a target for protests against nuclear weapons, but I suspect any such effort in the 50s-80s would have backfired. How about GE, GM, Goodyear, Chrysler, etc.? Same story. And while Digital Equipment Corporation never made military-specific products, its products were used everywhere in the defense and law enforcement realms.
Until at least Watergate, and really up until the end of the Cold War, being a supplier to the defense of the United States was nearly always a net PR positive. Let's not forget that John F. Kennedy was elected President partially by hammering home the message that the U.S. was behind in nuclear missiles, the so-called Missile Gap, and then appointed Ford executive Robert McNamara to be Secretary of Defense. Or that Ronald Reagan later used the industrial might of the U.S. to force an end to the Cold War. I don't bring this up to be political, but rather to point out that these issues are often orthogonal to political party or philosophy.
So now to the ACLU. The ACLU has gone to war against Amazon Web Services for offering facial recognition technology (Amazon Rekognition) to law enforcement agencies. Note that Rekognition is not specifically a facial recognition product, nor does it target law enforcement requirements. It is a generalized image (and video) recognition technology, and it is this generality that makes it a cost-effective commercial offering. Facial recognition is, not surprisingly, a popular use case. The ACLU's first attack came back in May, when they discovered Rekognition was being used by some law enforcement agencies for facial recognition. Then this week they launched another barrage by showing that, using default settings, Rekognition falsely identified members of Congress as matching images found in a mugshot database. I felt really bad for the Rekognition leadership, former co-workers and friends, as I'm sure they never expected to find themselves being attacked by the ACLU. In some ways, though, this was clearly coming. The ACLU doesn't have much influence with law enforcement; the relationship is generally adversarial. Nor does the ACLU have much of a fan base amongst members of the current Congress. So attacking a technology supplier, particularly one that is part of a consumer-focused company, is one of the few tools at the ACLU's disposal. In other words, if you can't get law enforcement to stop using facial recognition, maybe you can make it harder for them to obtain the technology.
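For a sense of the mechanics behind that kind of test, here is a minimal sketch using the Rekognition API via boto3. The ACLU hasn't published its exact code, so this is only an illustration of the general flow, not a reproduction of their experiment; the bucket name, image keys, and file name below are hypothetical placeholders.

```python
# Sketch only: build a face collection from mugshot images, then search it
# with a portrait photo using Rekognition's default match threshold.
# Bucket, keys, and file paths are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "mugshot-collection"  # hypothetical collection name
rekognition.create_collection(CollectionId=COLLECTION_ID)

# Index each mugshot (stored in a hypothetical S3 bucket) into the collection.
mugshot_keys = ["mugshots/0001.jpg", "mugshots/0002.jpg"]  # placeholder keys
for key in mugshot_keys:
    rekognition.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": "example-mugshot-bucket", "Name": key}},
        ExternalImageId=key.split("/")[-1],  # used to identify matches later
    )

# Search the collection with a portrait photo. Omitting FaceMatchThreshold
# means the service default (80%) applies, i.e. the "default settings" at issue.
with open("portrait.jpg", "rb") as f:
    result = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": f.read()},
        MaxFaces=5,
    )

for match in result["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], f'{match["Similarity"]:.1f}%')
```

Note that nothing in those calls is specific to law enforcement; the same flow could just as easily be de-duplicating a photo library.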
For all the hoopla here, AWS has no exclusivity on providing facial or general image recognition technology. Beyond other commercial technology suppliers, the FBI, Homeland Security, and other large law enforcement agencies have privately developed and operated systems for doing facial recognition. What AWS has done with Rekognition is democratize the availability of this technology, making it affordable for (amongst many others) smaller law enforcement agencies. If AWS stops selling Rekognition to law enforcement it will have no impact on, for example, the NYPD's use of facial recognition. It may instead create a country of have and have-not agencies, where the NYPD has the ability to scan a crowd for a kidnapped child but small departments do not. Admittedly that's the positive spin on Rekognition; a more negative spin is that New York will become an Orwellian nightmare while small cities and towns remain free of the surveillance state. If you believe preventing small agencies from having access to Rekognition will keep the surveillance state at bay, then I have a bridge to sell you in Brooklyn, surveillance cameras (which you can rip out) and all. What will really happen is that an alternate service will emerge, from a provider without a consumer business and perhaps privately held (so even shareholder pressure doesn't work). Or Congress could even mandate that a federal government-developed solution be offered to local law enforcement agencies at subsidized pricing.
This leads back to where this is really going: attacking Rekognition is all about trying to force the federal government to put in place acceptable (to the ACLU, of course) rules for the use of facial recognition technology. Microsoft's Brad Smith argued for this exact end-game a couple of weeks ago. While regulation, even more so premature or overreaching regulation, is not something I'm a fan of, some regulation in this space is inevitable. Without it we will end up with a patchwork of legal rulings that attempt to map 21st Century technology onto our Bill of Rights and onto century-old laws that are aging badly in the face of new technology. Brad called out some very good issues that should be addressed.
Today's blowup is largely a technology stunt by the ACLU. Let's say you want to present a picture with an animal in it and ask one of three questions. Question one is "Is there a dog in this picture?" Question two is "Is it a Bernese Mountain Dog?" Question three is "Is it MY Bernese Mountain Dog?" The use cases for these three questions may be very different, and the confidence level each requires may be different as well. The default confidence level for Amazon Rekognition is 80%, which is fine for doing quick scans of photos looking for dogs. Yes, you will get the occasional false positive, such as a coyote, fox, or house cat. Asking the Bernese Mountain Dog question likely requires more than 80% confidence to avoid an overwhelming number of false positives, because there are enough other breeds with similar coloring. Or take the Greater Swiss Mountain Dog: the differences (the most obvious to a casual observer being coat length) are subtle enough that at 90% you may still see a lot of Swissies in with the Berners. Trying to pick "my" dog out of the crowd probably requires 95% confidence, and even then will yield occasional false positives, something I know from my own experience of looking at a Berner picture and mistakenly thinking it was my dog. So when the ACLU used an 80% confidence level to match members of Congress with mugshots and it yielded a bunch of potential Congressional criminals, that should have come as no surprise. 80% is basically what you'd get from a mediocre criminal sketch artist's drawing: enough to take a closer look at someone, but not a definitive match. Had the ACLU used a 95% confidence level it would have seemed like less of a stunt and more of a real warning about use of the technology, but I suspect the press will mostly echo the ACLU's message.
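To make the threshold point concrete, here is a small sketch of what those questions look like against the actual API. The file names are hypothetical, and the "is it MY dog" case is illustrated with face comparison rather than dog recognition, since Rekognition's identity matching is face-specific. The real difference between a broad scan and a meaningful identity check comes down to one number.

```python
# Sketch: the same service answers very different questions depending on the
# API call and the confidence/similarity threshold you ask for.
# File names here are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Question 1: "Is there a dog in this picture?" A broad label scan where an
# 80% floor is usually good enough, coyotes and house cats notwithstanding.
with open("park_photo.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=80)
dogs = [(l["Name"], round(l["Confidence"], 1)) for l in labels["Labels"] if l["Name"] == "Dog"]
print("Dog labels:", dogs)

# Question 3: "Is this the same individual?" Identity matching is where a
# higher bar (e.g. 95%) matters to keep false positives down.
with open("known_face.jpg", "rb") as src, open("crowd_face.jpg", "rb") as tgt:
    faces = rekognition.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        SimilarityThreshold=95,  # raise the bar for identity matching
    )
for match in faces["FaceMatches"]:
    print(f'Similarity: {match["Similarity"]:.1f}%')
```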
For me, the ACLU's attack on Amazon Rekognition damages their credibility, and as a sometimes contributor/member it probably sends me into another cycle of being negative on them. I just don't like seeing good, and indeed broadly game-changing, technology being used as a whipping boy to get around their (or anyone's) public policy impotence. I guess I'm just not generally a "the ends justify the means" kind of guy.
Tangential to your main point, but what does an 80% confidence level indicate in the canine image recognition case? That about 20% of the flagged images turn out not to contain a dog? That you should expect to miss about 20% of the pictures that do contain dogs? Or something fuzzier that every vendor defines subjectively and flexibly depending on the type of query?
It probably depends on whether the question is "is there a dog in this picture" or "is this animal a dog". Ultimately it means whatever the algorithm says its confidence is.
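To illustrate that distinction with toy numbers (not real Rekognition output): the service hands back a confidence score for each prediction, and the two rates described above only emerge once you compare those scores against ground truth at a chosen threshold.

```python
# Toy sketch: a confidence score is the model's own number for each prediction.
# Whether "80%" means roughly 20% of flagged pictures lack a dog, or roughly
# 20% of dog pictures get missed, only falls out of a check against ground truth.
from typing import List, Tuple

# (confidence the model reported for "dog", whether the picture really has a dog)
predictions: List[Tuple[float, bool]] = [
    (97.2, True), (84.1, True), (81.5, False), (66.0, True), (92.3, False),
]

def rates_at_threshold(preds: List[Tuple[float, bool]], threshold: float):
    flagged = [(c, truth) for c, truth in preds if c >= threshold]
    # Fraction of flagged pictures that do not actually contain a dog.
    false_flag_fraction = sum(1 for _, t in flagged if not t) / max(len(flagged), 1)
    # Fraction of genuine dog pictures that fall below the threshold.
    total_dogs = sum(1 for _, t in preds if t)
    miss_fraction = sum(1 for c, t in preds if t and c < threshold) / max(total_dogs, 1)
    return false_flag_fraction, miss_fraction

print(rates_at_threshold(predictions, 80.0))  # (0.5, 0.333...)
print(rates_at_threshold(predictions, 90.0))  # (0.5, 0.666...)
```

In other words, 80% is the model's self-reported score; the false-flag and miss fractions are things you have to measure, and they shift as you move the threshold.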