
Your Face is Not Your Own

By KIM BELLARD

I swear I’d been thinking about writing on facial recognition long before I discovered that John Oliver had devoted last night’s show to it.  Last week I wrote about how “Defund Police” should be expanded to “Defund Health Care,” and included a link to Mr. Oliver’s related episode, only to have a critic comment that I should have just given the link and left it at that.

Now, I can’t blame anyone for preferring Mr. Oliver’s insights to mine, so I’ll link to his observations straightaway…but if you’re interested in some thoughts about facial recognition and healthcare, I hope you’ll keep reading.

Facial recognition is, indeed, in the news lately, and not in a good way.  Its use, particularly by law enforcement agencies, has become more widely known, as have some of its shortcomings.  At best, it remains unreliable at accurately identifying the faces of minorities and women; at worst, it poses significant privacy concerns for, well, everyone.  The fact that someone using such software could identify you in a crowd from publicly available photographs, and then track your past and subsequent movements, is the essence of Big Brother.

For once, technology companies are at least pretending to be concerned.  IBM was the first, saying it was getting out of the facial recognition business entirely, including research, due to concerns about bias and the potential for abuse.  CEO Arvind Krishna’s letter to Congress urged:

We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies. Artificial Intelligence is a powerful tool that can help.   

Both Amazon and Microsoft subsequently put moratoriums on police use of their facial recognition software.  “We’re implementing a one-year moratorium on police use of Amazon’s facial recognition technology… We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested,” Amazon announced.  

Microsoft President Brad Smith echoed the call, stating: “We’ve decided that we will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology.”    

Of course, Amazon and Microsoft are still selling their software to other parties, and there are several companies still selling to law enforcement, such as NEC and Clearview AI.  

The seemingly sudden (and narrowly focused) concern for our privacy rights has been spurred by the resurgence of the Black Lives Matter movement, in the wake of the nationwide protests over the murder of George Floyd.  It made people uneasy that facial recognition might be used by the police to identify protesters, who were, of course, protesting the police.

There are at least two ironies here.  One is that the masks many protesters have been wearing due to coronavirus concerns make it more difficult for facial recognition software to identify them, although that is a technical challenge developers are addressing.   Masks have served a dual privacy/pandemic role in Hong Kong for some time, and there is an interesting design battle going on for masks and clothing that help defeat or at least confuse facial recognition, so it remains an open question whether they are a roadblock or just a speed bump.  

The other irony is that it was smartphone video, another almost constant form of surveillance, that captured Mr. Floyd’s demise.  In The Wall Street Journal, Joanna Stern wrote:

Many white Americans, myself included, failed until recently to grasp one of the biggest impacts of the smartphone: its ability to make the world witness police brutality toward African-Americans that was all too easy to ignore in the past. We could now see, with our own eyes, the black sides of stories that were otherwise lost when white officers filed their police reports.

One activist told her: “The smartphone is a weapon that tells the story. This is going to tell what happened to me, this is what will tell what took place.”

Technology gives, technology takes away.

Years ago I speculated that facial recognition could be used to identify when we might be sick, and perhaps even to diagnose us; that is now within reach of existing technology.  We’ve got an array of surveillance measures tracking who we are, where we are, what we’re doing, and even how we might be feeling.

And we thought it was bad when we realized Google was reading our emails or Facebook was monetizing our interests.  

Whether we like it or not, whether we realize it or not, we not only don’t know who is monitoring what we do on the internet; we also don’t know when or which cameras are watching us, nor who is using what software to do what with those images.  Americans like to believe we have a constitutional right to privacy, but most would be surprised to find that such rights are more implied than explicit.

We should applaud the positions Amazon, IBM, and Microsoft are taking on facial recognition, and we should welcome an explicit discussion about what its limits should be, but we shouldn’t kid ourselves that the technology won’t advance faster than our privacy laws.  Privacy has been a problem in tech for some time now, and despite efforts like the E.U.’s GDPR and California’s Consumer Privacy Act, few of us feel any more reassured about our online privacy (note how COVID-19 is pushing GDPR to its limits).  Facial recognition only adds to those concerns.

Few of us would protest using facial recognition to reunite lost or missing children with their families, to capture escaped killers, or perhaps even to isolate people infected with a deadly infectious disease.  Few, though, would be comfortable with third parties, whether law enforcement agencies or simply advertisers, always knowing where we are and where we have been.  There’s a line to be drawn, but it’s going to be a blurry one.

I don’t know exactly where the line is, but I’d start with cui bono: who benefits?  

As Ms. Stern wrote about smartphone cameras, “Like any technology story, what we do with them, and the world we want them to capture, is up to us.” 

Kim is a former e-marketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and now a regular THCB contributor.