Does This Picture Mean the 4th Amendment Is Dead?

January 26, 2020 | Topic: Security | Blog Brand: The Buzz | Tags: Facial Recognition, Surveillance


The language of the Fourth Amendment strongly suggests that warrants are required for searches, and the Supreme Court has permitted only narrow exceptions to the warrant requirement.


Last week, The New York Times reported on a facial recognition technology company offering law enforcement, federal government agencies, and companies the ability to identify people simply by uploading a photograph. Clearview AI has compiled more than three billion images by scraping Facebook and other platforms. It analyzes uploaded facial images and returns public photos that match the photo subject, along with links to where those photos appeared. Clearview’s service, says the Times, “could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew.”

Code within the technology would allow it to pair with augmented-reality glasses, so that someone walking down the street could, perhaps, identify each person encountered in short order. Such glasses or stationary monitors could conceivably be tuned to alert on criminal suspects.
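To make the mechanics concrete, here is a minimal sketch of the matching operation that underlies both the upload-a-photo service and any real-time alerting built on top of it. This is not Clearview’s actual code: the function names, the database entries, the similarity threshold, and the placeholder signature function are illustrative assumptions standing in for a real face-embedding model.

```python
import hashlib

import numpy as np


def face_signature(image_bytes: bytes) -> np.ndarray:
    """Placeholder signature: a real service would run a trained
    face-embedding model; a hash just yields a fixed-length vector."""
    digest = hashlib.sha256(image_bytes).digest()
    return np.frombuffer(digest, dtype=np.uint8).astype(np.float32)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Database built by scraping public pages: each signature is paired with
# the URL where the photo appeared (entries here are illustrative only).
database = [
    (face_signature(b"scraped-photo-1"), "https://example.com/profile/a"),
    (face_signature(b"scraped-photo-2"), "https://example.com/profile/b"),
]


def identify(probe_image: bytes, threshold: float = 0.95) -> list:
    """Return the source URLs whose stored signatures resemble the probe."""
    probe = face_signature(probe_image)
    return [url for sig, url in database
            if cosine_similarity(probe, sig) >= threshold]


# Uploading a photo returns links to wherever matching photos appeared.
print(identify(b"scraped-photo-1"))
```

The design point is simply that once signatures exist for billions of scraped photos, identifying a new face is a fast lookup rather than a fresh investigation.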


The reporter who wrote the story, Kashmir Hill, has been at it for years, using an experiential curiosity to draw out the peculiarities, trends, and threats animating our onrushing digital society. She lived on bitcoin for a week back when cryptocurrency was new and cool.

Her own role in the story was interesting: Clearview wasn’t responding to her inquiries, so she invited police officers with access to its technology to run a search on her face. The company then contacted those officers to ask whether they were talking to the media. The vignette helps make real the Orwellian possibilities of this technology offering.

The reaction to the story about Clearview has been strongly negative, at least in my field of vision. Sundar Pichai, the CEO of Google’s parent company, Alphabet, has called for a moratorium on the technology. One exotic proposal would thwart Clearview’s collection of facial images through a class-action copyright lawsuit.

If facial recognition technology or particular implementations of it are to see a legal response, I’d like it to be founded on some process that articulates what right, existing or inchoate, the technology offends. Decisions about technology deployment should not be based on feelings, or even on similarity to what China does. Rather, we should ask what we get and what we give up.

The possibility of identifying suspects in near-real time is a tantalizing potential benefit of this technology. It could dramatically speed apprehensions, cutting crime sprees short. You could get automated all-points bulletins that immediately scour cities for wanted persons. Those are substantial law enforcement and security benefits indeed.

On the other side of the ledger are a number of values, including privacy. Facial recognition technology will have error rates, and some studies have shown higher error rates when used on minorities. These are matters for programming and tuning so that any use of the technology does not violate Due Process and Equal Protection rights — not small challenges.

As to privacy, the technology could undo the sense, dominant since urbanization, that one can walk down the street mostly unrecognized, unrecorded, and unmonitored. Losing that would be a big change and a big loss for legal and social freedom, as well as for some dimensions of personal security. Manifold details of the technology raise more specific privacy issues, such as who has access to identification data and what happens to it.

I’ve given some thought to one dimension of this enormous problem. In 2017, I opined about facial recognition technology in a paper that laid out a methodology for administering the Fourth Amendment faithfully to its text, even in a high-tech context. Some rather exotic arguments in that paper have rapidly become timely. Law enforcement use of facial recognition (even through a commercial product) is, of course, subject to Fourth Amendment constraints.

Instead of the “reasonable expectation of privacy” test that has dominated Fourth Amendment cases for much of the last 50 years, I believe courts should return to applying the Fourth Amendment by its terms: “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.” That language asks whether there has been a seizure or search of particular things and, if so, whether that was reasonable.


Is there a seizure involved in facial recognition? Perhaps with driver’s license photos, but when photos are taken consensually, generally no. In “Administering the Fourth Amendment in the Digital Age,” I said:

Our faces are exposed to the public every day, of course. Facial recognition can be done on photographs that were taken voluntarily or with the acquiescence of the subject, so collecting the appearance of the face is typically not a seizure. There is no right to exclude others from such imagery per se.

What about search?

Gathering a facial image (in the visible spectrum) does not give exposure to concealed things, so collection of a facial image is not a search on that basis. That does not foreclose the question whether exposed facial images once collected might be searched.

There is a common trope that things in public are exposed, so they can’t be searched. It’s not quite right. In Kyllo v. United States (2001), the Supreme Court noted that, when the Fourth Amendment was adopted, “to ‘search’ meant ‘[t]o look over or through for the purpose of finding something; to explore; to examine by inspection; as, to search a house for a book; to search the wood for a thief.’” Often, exposure given to concealed things signals that a search has happened. But the “purpose of finding something” is the essence of searching. You can search a wood, which is open to all. You can also search faces displayed in public, I believe.

Facial recognition systems rely on what I’ve called pre-search. With today’s technologies, you can canvass an area before you know what you’re going to look for. Facial recognition relies on searching every face to create a signature, then searching all those faces again, via their signatures, when a candidate for matching is sought. As I point out in my paper:

Searching has two conceptual parts, which generally occur in a particular order. First, the specific thing to be searched for is identified. Next, the field in which it may be found is examined. Searching a forest, for instance, involves identifying the person, instrumentality, or evidence to be found, then marching through the area with eyes peeled for that thing. Facial recognition reverses these processes. It collects the material to be canvassed—facial signatures—then at any later time canvasses the earlier-collected facial signature data for a match. The fact that the steps in the process are reversed should not change the conclusion that facial recognition is a search technology and the use of it is a search. Conversion of a facial image to a facial signature that can be scanned for matches is a search of the face itself to render data that make the face amenable to being the object of a later search. It is enough of a step in the process of searching that it is best recognized as a search occurring at the time the processing is done. Facial recognition is an example of exposed things being searched because of the “purpose of finding something.” There is no other purpose to facial recognition technology.
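A minimal sketch of that reversed order may help, again with hypothetical names and an exact-hash stand-in for the approximate matching a real system would perform: the signatures are computed up front for every collected face, and the canvass for a particular person comes only later.

```python
import hashlib

# Maps a face signature to where and when it was collected (illustrative).
signature_store = {}


def face_signature(image_bytes: bytes) -> str:
    """Placeholder: a real system would compute a face embedding with a
    trained model and match it approximately, not by exact hash."""
    return hashlib.sha256(image_bytes).hexdigest()


def enroll(image_bytes: bytes, source: str) -> None:
    """Step one, the 'pre-search': every collected face is processed into
    a signature before anyone in particular is being sought."""
    signature_store[face_signature(image_bytes)] = source


def canvass(probe_image: bytes):
    """Step two: once there is a candidate to find, the earlier-collected
    signature data is canvassed for a match."""
    return signature_store.get(face_signature(probe_image))


enroll(b"frame-from-camera-7", "Main St. camera, 9:14 a.m.")
print(canvass(b"frame-from-camera-7"))  # -> Main St. camera, 9:14 a.m.
```

On this view, the work done in enroll() is itself a step in the searching, even though the canvass() call may come much later.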

The language of the Fourth Amendment strongly suggests that warrants are required for searches, and the Supreme Court has permitted only narrow exceptions to the warrant requirement. A warrant for a program that searches all faces could not issue under the Fourth Amendment, as it would fail the particularity requirement. It would be a general warrant.

The methodology on display here, as I said, is exotic. But it is the best I’ve been able to do at matching the Fourth Amendment’s text and rationale with the new technological environment we’re rapidly entering. It’s an environment that includes things like rapid DNA sequencing. Law enforcement uses of facial recognition will have to be very sharply tailored indeed if we are going to achieve the Supreme Court’s oft-stated goal of “preserv[ing] that degree of privacy against government that existed when the Fourth Amendment was adopted.”

This article by Jim Harper first appeared at the American Enterprise Institute.

Image: Reuters.