Facing the Music:
The Dubious Constitutionality of Facial Recognition Technology
John Brogan
April 2, 2002
First Final Draft
UI Cyberspace Law Seminar 2002



Contents
I. Introduction
II. Background
III. Facial Scanning as a Fourth Amendment Event
IV. An Alternative Approach
V. Conclusion
Endnotes
I. Introduction

National identification cards are in the news again,1 and in the wake of the events of September 11th, the political climate seems ripe for their adoption.2 Proponents of such a system point to its value as a tool for the protection of innocent individuals,3 downplaying its significance as a vehicle for monitoring individual activities.4

In truth, the government already has everything it needs to keep tabs on its residents: omnipresent video cameras, databases of information, and facial matching technology.  Combined, this technology provides unprecedented power to keep track of the most intimate details of human life: the places we go, the activities we engage in, and the people with whom we associate.5

The growth of biometric identification,6 and in particular facial scanning technology, raises serious questions about the continued vitality of a variety of constitutional protections.

In this article, I argue that the deleterious effects of facial scanning technology may be curtailed by distinguishing between wide area facial scans7 and focused facial scans.8  Recognizing this distinction, courts should determine that wide area scans are per se unconstitutional while focused scans serve a legitimate law enforcement purpose when supported by a minimal level of suspicion that a particular individual is engaged in criminal activity.

Part II of this article considers the background of biometric identification, tracing the lineage of technological evolution from the early approaches of criminologists to modern crime databases.  The section then moves to a consideration of the component parts of facial scanning: surveillance, recognition, and information synthesis.

Part III considers the validity of facial scanning as a Fourth Amendment event under the Constitution, examining critical cases decided by the United States Supreme Court from Katz v. United States through Kyllo v. United States.

Part IV argues that the generic category of facial scanning should be bifurcated into two different types of scans: wide area scans and focused scans.  It then proposes that wide area scans should be severely limited and that focused scans should be available to law enforcement on a minimal level of suspicion.

II. Background

A. The Historical Origins of Biometric Identification

Biometric identification is not a new phenomenon in law enforcement.  For almost as long as scientists have known that each human being has certain physical attributes unique to that individual, criminologists have taken advantage of that fact to aid in the investigation of crimes.  The first recorded instance of biometric identification was Alphonse Bertillon's database of criminals in nineteenth-century Paris.9  Bertillon recorded various physical statistics -- the length of a criminal's fingers, the circumference of his head -- in an effort to quickly identify recidivists.10

 Bertillon's system quickly evolved and the recognition of fingerprints as a means of identification opened new doors for investigating criminal activity.11  But the true breakthrough in biometric databases has been the advent of the digital age.

A collection of information is only as useful as the ability to organize it efficiently and access it rapidly.  Moreover, the growing transience of society makes it necessary to have multiple stores of information that are both self-contained and integrated.  The databases must be self-contained in order to quickly access a limited amount of information based on its geographical relevance; they must be integrated in order to allow law enforcement agencies to access a broader field of information on individuals moving through different jurisdictions.

The necessity of solving these two problems became clear in the mid-1980s to police departments struggling under the weight of excess information.  In Los Angeles in 1985, a manual match of an anonymous fingerprint would have required an examination of over 1.7 million fingerprint cards, a Sisyphean task for a single worker.12

The answer to this problem lies in a series of decentralized but connected databases, known as a distributed database.13  The distributed database meets the two needs of law enforcement by being both self-contained and interoperable.  As a result, local police can quickly identify a fingerprint by first checking the print in their local computer system before broadening the scope of the search to other jurisdictions.14
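For the technically minded reader, the local-then-broader lookup described above can be reduced to a short sketch.  The jurisdiction data, identifiers, and the identify function here are invented purely for illustration; a real system matches fingerprint features under error tolerances, not exact lookup keys.

```python
# Hypothetical sketch of a distributed fingerprint lookup:
# each jurisdiction keeps its own self-contained index, and a
# query escalates outward only when the local search fails.

LOCAL_DB = {"print-4471": "J. Doe"}          # local jurisdiction's index
REGIONAL_DBS = [{"print-9021": "A. Smith"}]  # neighboring jurisdictions
NATIONAL_DB = {"print-0007": "R. Roe"}       # federal index

def identify(print_id):
    """Check local records first, then broaden the scope of the search."""
    if print_id in LOCAL_DB:
        return LOCAL_DB[print_id]
    for db in REGIONAL_DBS:
        if print_id in db:
            return db[print_id]
    return NATIONAL_DB.get(print_id)  # None if no match anywhere

print(identify("print-4471"))  # found locally: J. Doe
print(identify("print-0007"))  # found only at the national level: R. Roe
```

The structure mirrors the point in the text: each database is useful on its own for geographically relevant queries, while the escalation path is what makes the collection interoperable across jurisdictions.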

While fingerprinting is probably the most common and widely known form of biometric identification, it is by no means the only example.  A variety of forms of biometric identification have now become commonplace, including retinal recognition,15 handprint scans,16 and DNA matching.17  In addition, other databases have proliferated throughout the public and private sectors that contain a vast quantity of somewhat overlapping information.18  As a result, a surprisingly complete portrait of an individual can be gleaned from the trivial linkage of just a few databases.

A full examination of just how much information these databases track is beyond the scope of this article; however, one commentator has suggested that

the database problem cannot adequately be understood by way of the Big Brother metaphor--even when adapted to account for private sector databases. Although the Big Brother metaphor certainly describes particular facets of the problem, . . . [the more apropos metaphor is] Franz Kafka's depiction of bureaucracy in The Trial--a more thoughtless process of bureaucratic indifference, arbitrary errors, and dehumanization, a world where people feel powerless and vulnerable, without any meaningful form of participation in the collection and use of their information.19

B. The Components of Facial Scanning

Turning from the question of databases generally to the more specific consideration of using these databases for law enforcement purposes, it is important to begin by examining the underlying steps involved in a particular scan.

A database of information, in itself, tells us very little about future activity.  Although it is sometimes useful in predicting future events from patterns of empirical criminal behavior, the real goal of maintaining a large amount of biometric information is to reconstruct the past.

Investigators arrive at a crime scene and must attempt to piece together what happened at that location from small clues: a fingerprint, a strand of hair, skin cells, or semen.  In each case, a piece of physical evidence taken from a crime scene is checked against existing records to determine if a match exists. If a match is found, the investigator's job is made easier; if not, the information is stored in the computer and available for future cataloguing and cross-checking.

At an abstract level, the database scan involves three steps.

First, baseline information is entered into the database.  This information could be fingerprint information, DNA information, or something else entirely.  The information may be obtained through some standard arrest procedure,20 from an employer,21 from an insurer, or by alternative means.

Second, comparison information must be acquired from another source.  Lifting fingerprints or other DNA information from a crime scene is one way in which comparative data might be acquired.

Third, the two pieces of information must be checked against each other.  This is the phase at which an individual or a computer checks the anonymous comparative data against the baseline data and determines if the two pieces of biometric data originated from the same source.  The outcome of such an analysis is binary; that is, at the end of the process one is left with a statement that is either true or false.  The two pieces of biometric information came from the same person or they did not.
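The three steps, and the binary character of the outcome, can be illustrated with a minimal sketch.  The sample data and the trivial equality test below are invented for illustration; an actual system compares extracted feature profiles under statistical error tolerances rather than raw strings.

```python
# Step 1: baseline information is entered into the database.
baseline_db = {"subject-17": "AGCTTAGC"}  # e.g., a DNA profile on file

# Step 2: comparison information is acquired from another source.
crime_scene_sample = "AGCTTAGC"  # e.g., lifted from a crime scene

# Step 3: the two pieces of information are checked against each other.
def matches(sample, record):
    """Binary outcome: the samples came from the same source, or they did not."""
    return sample == record

results = {sid: matches(crime_scene_sample, rec)
           for sid, rec in baseline_db.items()}
print(results)  # {'subject-17': True}
```

Note that whatever the underlying biometric, the output of the third step is a true-or-false answer for each baseline record, which is what allows the process to be automated at scale.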

Facial scanning operates much like other forms of biometric identification.  In order to make the system work, there must first be a database of baseline images against which the comparative image must be checked.22  The comparative image must then be acquired from an input source and analyzed against the baseline data.  On the surface, these processes look very similar, but as one looks more closely, the differences become clear.

III. Facial Scanning as a Fourth Amendment Event

A. Katz and Its Progeny

1. Katz v. United States23

The Fourth Amendment to the United States Constitution guarantees that individuals shall be free from "unreasonable searches and seizures."24  Nevertheless, a substantial amount of time and effort has been expended in determining what actually constitutes a search.  For the purposes of our inquiry, the seminal case on this issue is Katz v. United States, which established in Justice Harlan's concurrence a seemingly simple two-pronged test.25  The test provides that a governmental action constitutes a search if an individual has a subjective expectation of privacy and if society is willing to objectively recognize that expectation as reasonable.26

Katz broke from prior law in that it shifted from an interpretation of the Fourth Amendment that focused on sacrosanct spaces to a broader view of privacy as tied to the individual.27  But as much as Katz gave in the way of an expanded notion of privacy, it sustained an important caveat: society is not prepared to accept as private that which is openly displayed to the public.28  As a result, that which is done or said in a space that can be viewed or overheard by a member of the general public is not off limits to law enforcement officials merely because they are an arm of the government.

Two of the Court's subsequent cases demonstrate how large an exception the public view doctrine carves into the Fourth Amendment.

2. Smith v. Maryland29

In Smith v. Maryland, the Court held that an individual did not have a reasonable expectation of privacy in the phone numbers he dialed.30  Smith involved a case in which the phone company, at the request of police but without a warrant from a magistrate, placed a pen register on Smith's telephone line in order to ascertain what numbers he was dialing.31  Smith sought to suppress evidence obtained from the pen register, arguing that he had a subjective expectation of privacy in the phone numbers he dialed and that his expectation was reasonable.32

The Court disagreed, reasoning that society is unwilling to accept that there is a reasonable expectation of privacy in the phone numbers one dials.33  Pointing to the fact that people are generally aware that the phone company keeps track of long distance numbers for billing purposes, the Court found that tracking the local numbers produced no incremental invasion.34  In addition, the Court asserted that because a phone number neither exposes the content of the actual phone conversation nor constitutes content in itself, its acquisition does not amount to a search within the meaning of the Fourth Amendment.35

The potential impact of Smith in light of the growing necessity of the Internet, and more specifically e-mail, as a form of communication is unclear.  A number of scholars have already expressed concern about justifying the constitutionality of the Carnivore36 system on Smith.37  Although a lengthier discussion of these issues is beyond the scope of this article, it is nevertheless important to recognize that as the law currently stands, it appears relatively clear that the utilization of technology to intercept information traveling through the public domain does not amount to a constitutional search event under the Fourth Amendment.38

This sobering realization leaves two alternatives.  The first possibility is that non-content-based communications will never be protected by the Fourth Amendment.  Alternatively, an information transfer may constitute a search but require something more than the trace of a numerical signature,39 as in Smith, to invoke constitutional protection.40  It is to this latter possibility that we will return later in this section.41

3. California v. Ciraolo42

In Ciraolo, the Court considered the issue of whether an overflight of a person's backyard that is surrounded by a high fence constitutes a search under the Fourth Amendment.43  In making the threshold determination, the Court considered the two factors established in Katz: did Ciraolo have a subjective expectation of privacy regarding the curtilage of his home and, if so, was society prepared to accept that expectation as reasonable.44

The Court sidestepped the first issue, pointing out that although the high fences and the geographic proximity of the marijuana crop to his home evidenced some subjective expectation of privacy, Ciraolo had not engaged in the most privacy-protecting conduct.45  For instance, the Court remarked that Ciraolo did not attempt to hide his gardening activities from the view of someone standing on top of a double-decker bus or truck.46  As such, the Court concluded that a mere hope that no one will see into an enclosed space is different from a subjective manifestation of privacy.47

The Court disposed of the objective prong more easily, concluding that because the public routinely flies over property in commercial airplanes, society is unprepared to recognize the belief that one's property is free from aerial surveillance as objectively reasonable.48

Ciraolo is particularly interesting for our purposes on two levels.  First, it is a case that deals with a space that has long been considered deserving of the greatest level of protection from government intrusion -- the home and the areas immediately adjacent to it.  Second, it is not a case that squarely addresses the technology issue, even though it is essential to the disposition of the case.49

The space question can be dismissed quickly, albeit not perfunctorily.  What Ciraolo apparently tells us is that even the most sacred spaces are not immune from the government's prying eyes. To wit, if you want to sunbathe nude in your backyard, build a roof over it. But if our expectations of privacy regarding our own backyards are unreasonable, what can we expect of spaces away from the home?  The answer, seemingly, is very little.

Second, Ciraolo wonderfully demonstrates an area of concern discussed by a number of scholars.50  The concern is that the ever-expanding power of technology will continue to erode objective notions of privacy to the point that even the most significant invasions of privacy will be commonplace.51  In Ciraolo, the notion that air transportation had become, at the time of the decision, so ubiquitous as to negate an objective expectation of privacy is telling.52  It suggests that the Court may be less concerned with technological encroachments whose primary purpose is not surveillance-oriented.  Thus, the airplane is not as threatening a technological intrusion as the spike mike.

The result is both confusing and ironic.  The Court has consistently ruled that enhancing technologies do not constitute searches under the Fourth Amendment.53  Binoculars merely enhance what an officer could see with his own eyes;54 night vision goggles do much the same. Dog sniffs expand upon the sense of smell.55  But barring an argument that airplanes just expand on our ability to flap our arms, it's hard to see what justifies the intrusion.  The pressures that erode privacy can therefore be seen as bi-directional: general technology raising the objectivity bar while sense-enhancing technologies narrow the breadth of protection.

4. Kyllo v. United States56

The Court grappled most recently with the threshold search question in Kyllo.  At issue in that case was whether a thermal scan of Kyllo's residence constituted a presumptively unreasonable search requiring a warrant under the Fourth Amendment.57

Writing for the majority, Justice Scalia concluded that the scan was a search, presumptively unreasonable without a warrant.  In making this determination, the Court retraced its previous decisions, emphasizing the test cobbled from Katz that there must be a subjective expectation of privacy that is determined to be objectively reasonable.58  Bracketing Smith and Ciraolo as cases where the defendant failed to meet his objective burden under the Katz test, the majority took particular exception to two aspects of the scan involved in Kyllo.

First, the Court emphasized its longstanding commitment to the home as a particular zone of privacy worthy of the greatest degree of protection.59  Within that space, it found that individual and societal expectations of privacy are at their zenith.

Second, given the importance of the protected space, the Court held that the use of specialized technology to acquire information about the interior activities of a home, normally obtainable only with a warranted search, was unreasonable per se.  In making this determination, the majority refused to accept the Ninth Circuit's analysis, adhered to by the dissent, that there is a discernible distinction between devices that monitor data emanating from a home and technologies that scan its interior.  Justice Scalia noted the government's contention that

there is a fundamental difference between what it calls "off-the-wall" observations and "through-the-wall surveillance." But just as a thermal imager captures only heat emanating from a house, so also a powerful directional microphone picks up only sound emanating from a house -- and a satellite capable of scanning from many miles away would pick up only visible light emanating from a house. We rejected such a mechanical interpretation of the Fourth Amendment in Katz, where the eavesdropping device picked up only sound waves that reached the exterior of the phone booth. Reversing that approach would leave the homeowner at the mercy of advancing technology -- including imaging technology that could discern all human activity in the home.60
In the wake of Kyllo, a critical question remains: does the Court's decision apply only to specialized technologies aimed at the home?  In the narrowest reading, it seems that it would.  The majority is particularly concerned with intrusions into the home.61  But a slightly broader reading could offer supporters of privacy more hope.

The Court's notion of objective privacy hinges on whether the technology performing the scan is in general use by the public.62 Furthermore, the reaffirmation of Katz in the Kyllo decision suggests that notions of privacy remain tied to the individual rather than the space.63  As a result, although it is clear that the Court was quick to invalidate the search in Kyllo because it intruded into the home, it does not follow that the home is the only locus of protection; it simply means that applying the rationale of Katz to other spaces is more problematic.64

Operating within the framework established by Kyllo and its predecessors, the possible constitutional challenges to wide area facial scans begin to emerge more clearly.

B. To Scan or Not to Scan?

After the events of September 11, law enforcement officials are under extreme pressure to protect against future catastrophes.65  In this climate, numerous means of crime prevention have been bandied about, including national identity cards and state-sanctioned torture.66  Focusing on facial scanning as one of these preventive strategies, this section considers whether or not it rises to the level of a search under the Fourth Amendment.

Before directly answering the question, it is worthwhile to spend some time looking at where the technology stands today -- where it is being deployed, how it is being used, and who is watching.

1. Scanners, Scanners Everywhere . . .

In early 2001, some 72,000 fans attending Super Bowl XXXV were facially scanned while entering through the turnstiles of Raymond James Stadium in Tampa.67  Using technology called FaceTrac, 128 points of each attendee's face were scanned and checked against an FBI crime database.68  Although no arrests were made, 19 known criminals were identified as a result of the scan.69  At the time, privacy groups condemned the scan,70 but in the wake of the World Trade Center attack, resistance to the technology is quickly fading.71

In fact, a number of airports in various cities already have the technology in place, including Tampa,72 London,73 Fresno,74 Providence,75 Kansas City,76 Boston,77 and, shortly, Washington D.C.78  Virginia Beach is using the technology at the Oceanfront to track "runaways, wanted felons and people suffering from dementia."79  In San Francisco, at least 30 high-resolution cameras are being installed at every Bay Area Rapid Transit (BART) stop to scan passengers moving throughout the city.80

Remarking on the growing omnipresence of cameras, former National Security Agency counsel Stewart Baker suggested that "'George Orwell underestimated our enthusiasm for surveillance . . . . He correctly predicted we'd have cameras everywhere.  What he failed to imagine is that we'd want them so bad[ly] we'd pay for them.'"81

This mentality is reflected by security specialists preparing for potential terrorist strikes on sports arenas, who are eager to use face scanning devices like the ones operated at the Super Bowl in Tampa.82  Furthermore, "[f]acilities managers are working closely with law enforcement, watching for" danger signs like "[m]eetings and public protests by dissident groups."83

In short, public support for increased security measures is so high that the political checks that should preserve privacy cannot reliably operate.  As such, it is the judiciary that will likely have to take a leading role in safeguarding the general public from itself.
 
2. Making Facial Scanning a Search

By all indications, convincing courts to treat facial scanning as a search under the Fourth Amendment is at best an uphill battle and at worst impossible.  At least one commentator addressing the issue suggests that facial scans are unlikely to rise to the level of a search.84  As discussed earlier, the public view doctrine established in Katz and subsequently expounded in Smith and Ciraolo probably encompasses all of the spaces in which an individual might be subject to a facial scan.85  Because airports, public transit stations, and sports stadiums are all open spaces in which an individual's face could be readily viewed and identified by a police officer, the use of a video camera to achieve the same end, arguably, is no different.

In spite of this fact, a number of characteristics of wide area facial scans create a chasm so great between personal police observation and the kind of observation engaged in by the computer that some level of Fourth Amendment protection seems necessary.
 
a. Approaching the Problem

Finding an analog between wide area facial scans conducted by computers and traditional forms of police activity is difficult because the scans have an amorphous quality that is difficult to characterize precisely.  In particular, the level of invasiveness of a scan depends, in large part, on the way the system works.

For instance, the FaceIt system developed by Visionics, Inc. is capable of recognizing single or multiple faces in either one-to-one or one-to-many matching mode.86  Depending on the implementation of the system, authorities may have access to a relatively small pool of images in the database87 or an extremely large number.88

Furthermore, depending on the level of integration, the system could return only the name of the individual or it could return cross-referenced data available in any linked database.  This could include, potentially, any criminal information, tax information, credit information, health information, and vehicle registration information.  Likewise, the database could be read-only, which would prevent the system from adding information, or it could be writable, allowing the system to automatically add certain details every time it found a match.  This information could include the time at which the match was made, the location of the camera making the match, and possibly the images of individuals photographed shortly before or after a match.

In effect, simply by hooking the right databases together, law enforcement officials could track the movements of an individual, the people with whom she associated, and any other information tied into the system.  To give an extreme, but not unfathomable example, each time one stopped at a toll, stepped on the subway, passed through a turnstile, withdrew money from an ATM, or went to work, the system could keep a record, and this would be made possible by the mandatory photo identification issued by each state's Department of Motor Vehicles.
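The gulf between the two matching modes just described can be sketched abstractly.  The feature vectors, similarity measure, and threshold below are invented for illustration and do not reflect Visionics' actual algorithm; they merely show why one-to-many matching implicates every enrolled identity rather than a single suspect.

```python
# Toy contrast between the two matching modes of a facial scan.
# A "probe" face is reduced to a feature vector; enrolled faces
# live in a baseline database of such vectors.

THRESHOLD = 0.9  # invented acceptance threshold

def similarity(a, b):
    """Fraction of matching features between two equal-length vectors."""
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

def one_to_one(probe, enrolled):
    """Traditional policing analog: compare the probe against one known face."""
    return similarity(probe, enrolled) >= THRESHOLD

def one_to_many(probe, database):
    """Wide area analog: compare the probe against every enrolled identity."""
    return [name for name, feats in database.items()
            if similarity(probe, feats) >= THRESHOLD]

db = {"alice": [1, 2, 3, 4], "bob": [9, 8, 7, 6]}
print(one_to_one([1, 2, 3, 4], db["alice"]))  # True
print(one_to_many([1, 2, 3, 4], db))          # ['alice']
```

As the sketch suggests, nothing in the one-to-many mode requires any prior suspicion about the probe: the system simply tests everyone against everyone.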
 
b. Applying the Katz analysis

Given the diversity of systems that can be and have been implemented in various jurisdictions, coupled with the ease with which a system could be upgraded or downgraded as necessary, attempting to argue that particular versions of the system may or may not be legitimate would be futile.  As such, this argument proceeds on the premise that a bright-line rule is more appropriate than a case-by-case analysis for the purpose of categorizing any given facial scanning activity as an event under the Fourth Amendment.

Operating from this premise, we briefly revisit the two-tiered requirement of Katz: the government engages in a search when it violates an individual's subjective expectation of privacy that society is prepared to recognize as reasonable.89

i. The Subjective Prong

To what extent do individuals have a subjective expectation of not being facially scanned when they appear in public?  The answer obviously depends on the individual.  Nevertheless, there are some fairly good reasons why any particular person might subjectively expect that he is not being scanned.

Conceding that most people have no subjective expectation that they will not be watched or captured on videotape as they move about in public, there must be something more that the individual relies on in order to meet the first prong of Katz.  One possibility is that although an individual expects to be seen by the government, he does not expect to be recognized by the government.  That is, absent specific illicit conduct that would give the government a reason to learn his name, all but the most paranoid elements of society expect that the government is not watching for them in particular.  Another possibility is that even if one expects to be identified, there is still no expectation that the identification will be tied to a veritable cornucopia of data detailing the most intimate details of one's life.  In either case, most people would likely retain some subjective expectation of privacy as they move about in public, even though that expectation is substantially less than what they would have in a more intimate space like the home.

ii. The Objective Prong

The more difficult question is whether such an expectation would be recognized by society as reasonable.  As discussed above, the Court's analysis of this prong is linked to two factors.90  First, to what extent is the technology in question in general use by the public?  Second, does the technology allow law enforcement to achieve an objective that would normally be circumscribed by the Fourth Amendment?

The first issue can be disposed of fairly easily.  Although the individual elements of facial scanning technology -- cameras, recognition software, and databases91 -- are widely available, the power of a scanning system lies in its breadth: the dizzying quantity of interlinked cameras and baseline databases.  To that extent, facial scanning systems are in no more common use by the general public than was the thermal sensing technology at issue in Kyllo.

The more difficult question is whether the technology provides an end run around the Fourth Amendment.  Arguably, it does, for a number of reasons.  First, although law enforcement officials claim that facial scanning systems merely allow "machines to do what . . . [police have] always done" by "giv[ing] policeman pictures and put[ting] them on every corner . . . . ,"92 in truth the two activities are quite different.

Under the traditional method, police start with specific, articulable information: a person for whom they are looking, a reason why they are looking for that person.  The police then take a photograph of that person and compare the face of each person that passes them to the baseline photograph.  In the parlance of Visionics' technology, this is a one-to-one match: each new face is matched against one existing face.93

When the computer performs a wide area scan (or one-to-many match),94 it engages in a task that no police officer individually, or any police force as a whole, could achieve.  It examines each face against as many as a billion faces for a match.95  But more importantly, it does so for no reason.  A wide area scan is not looking for someone in particular; it is looking for anyone, suspicious or not, who happens to wander past.  Furthermore, to the extent that the database tracks the location of faces it successfully scans, it operates as a homing device on a person's movements.  In the words of one commentator:

The new surveillance goes beyond merely invading privacy . . . to making irrelevant many of the constraints that protected privacy . . . . For example, mass monitoring allows police to eliminate cumbersome court hearings and warrants.  Immediately after a crime, cops check cameras in the vicinity that may have captured the perp on tape.

So, as surveillance expands, it has the effect of enlarging the reach of the police.  Once it becomes possible to bank all these images, and to call them up by physical topology, it will be feasible to set up an electronic sentry system giving police access to every citizen's comings and goings.96

Of course, the technology feared by that author in 1998 is now a reality.

c. The Nature of the Search

Accepting for the moment, arguendo, that a search has occurred, the question remains: what kind of search?  The obvious problem is that a facial scan doesn't look like the kind of search we're worried about -- the kind in which the contents of someone's house are rummaged through in pursuit of a particular item.

In order to find the answer to this question, we have to look past the basic scan, the recognition of one picture against another, to what the scan really tells police: where an individual is at a particular time.

This kind of information would be invaluable to police in that it could help predict where crimes, like drug deals and burglaries, might be committed, but that predictive power comes at the price of allowing the government to intrude into the privacy of a large segment of the population.

IV. An Alternative Approach

This article has repeatedly emphasized the difference between wide area scans and focused scans as tools of law enforcement.97  This distinction is critical to finding an alternate solution that balances society's interest in preventing crime with the individual's interest in being free from governmental intrusions.98

This section first considers and rejects a state legislative solution to the problem of wide area scans before proposing a more comprehensive judicial solution.

A. The Virginia Plan99

In response to the City of Virginia Beach's implementation of facial recognition technology at the Oceanfront, the Virginia House of Delegates passed a bill to severely circumscribe the use of the system.100  Virginia's proposed scheme requires law enforcement officials to acquire authorization from the circuit court before installing scanning technology, and only when the technology is reasonably likely to provide information pertaining to the commission of a felony, individuals with outstanding felony warrants, terrorists, or missing persons.101

In addition, the bill limits use of an installed system to 90 days, provides for court oversight, caps extensions at 60 days, and requires the deletion of images of all persons not falling into one of the four enumerated categories "as soon as possible, but in no event . . . for more than ten days."102

The text of the bill also carves out a series of exceptions.  First, the bill is not intended to curtail security measures taken to protect ports of entry such as public airports and harbors.103  Second, it provides a grace period during which any evidence obtained from scans remains admissible in court.104

While the House of Delegates' actions are certainly a step in the right direction, a number of problems remain with respect to privacy issues.105

First, although the bill enumerates four categories that would justify use of the scanning technology, the categories themselves are so broad that it would be virtually impossible for a court to deny a request.  In particular, because prediction of future criminal activity is nearly impossible absent specific articulable facts, courts would likely have to rely on statistical analyses to justify requests.  For instance, could a court reasonably deny a request to place scanners in an area where a large number of runaways had previously been apprehended using traditional methods?  For that matter, should courts defer to police use of the technology in high-crime areas?  As a practical matter, local judges are unlikely to deny police department requests in all but the most exceptional cases.

Second, the fact that the bill controls the installation of the technology rather than the type of search is problematic.  As discussed above, the real concern with wide area scans is that they cast an extremely wide net that brings a substantial number of individuals for whom there is no suspicion of criminal activity within the gaze of the state.  Under the proposed bill, the arguably unconstitutional activity is limited in temporal duration but not in scope.

Third, to the extent that the bill requires a judicial decree each time that law enforcement wants to use its system, it unnecessarily hampers the police's ability to protect society's interest in the effective investigation of criminal activity.  Focusing on the type of search rather than the type of technology employed is a more appropriate solution.

Fourth, although the bill provides for the admissibility of evidence acquired from facial scans prior to the cut-off date, it does not explicitly provide for the exclusion of such evidence after the grace period.  The only punishment for failing to comply with the bill is being held in contempt of court.  Because the goal of exclusionary rules is to deter future police misconduct, any statute that failed to address this issue would likely be fatally flawed.106

B. The Judicial Option

Having discussed some of the pitfalls of one specific legislative solution, I consider what a judicially created solution might look like.  In proposing this solution, I do not rule out the possibility that a legislative solution could provide some protection against the invasiveness of facial scanning; nevertheless, the judiciary's flexibility to deal with particular fact situations on a case-by-case basis may make it the superior forum for enforcing these rights.

My proposal is that the judiciary should recognize two types of facial scans: wide area scans and focused scans, sometimes described as one-to-many and one-to-one searches, respectively.  As a general rule, the judiciary should find that wide area scans are presumptively unreasonable searches of the person.  Nevertheless, law enforcement officials should be allowed to use facial scanning for focused searches so long as they reasonably believe that such a scan is necessary to prevent the commission of a serious crime or to aid in the investigation of a crime that has already been committed.
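The structural difference between the two kinds of scan can be sketched in code.  This is a purely hypothetical illustration -- the similarity function, threshold, and "signature" vectors stand in for whatever proprietary template format and matching algorithm a vendor such as Visionics actually uses -- but it captures the point: a focused scan compares a captured face against a single suspect, while a wide area scan necessarily processes every face in view against an entire watchlist.

```python
import math

def similarity(a, b):
    """Cosine similarity between two facial 'signature' vectors
    (a stand-in for a vendor's proprietary template format)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

THRESHOLD = 0.9  # arbitrary illustrative cutoff

def focused_scan(captured_face, suspect_template):
    """One-to-one: compare one captured face against one suspect."""
    return similarity(captured_face, suspect_template) >= THRESHOLD

def wide_area_scan(captured_faces, watchlist):
    """One-to-many: compare every passerby against an entire database.
    Every face in the crowd is processed, suspected or not."""
    hits = []
    for face_id, face in captured_faces:
        for name, template in watchlist.items():
            if similarity(face, template) >= THRESHOLD:
                hits.append((face_id, name))
    return hits
```

The constitutional distinction tracks the loop structure: focused_scan touches a single individual for whom some suspicion presumably exists, whereas the nested loop in wide_area_scan sweeps in everyone who happens to pass the camera.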

The goals of establishing such a system are twofold: first, to reduce the number of individuals for whom there is no suspicion of criminal involvement who are processed and tracked by the system; second, to increase the likelihood that individuals who have committed or are about to commit a crime will be caught.  In turn, the increased likelihood of detection should deter future crime.

In addition, management of such a scheme by the judiciary is essential, because the penalty for utilizing the scanning system in an unauthorized way will be the exclusionary rule.  Because the scan will often be the first interaction between the police and the suspect, exclusion of all fruits of the initial search will act as a powerful deterrent to illicit uses of the system.

V. Conclusion

The dystopic vision of a society in which the government tracks its citizens once seemed like the stuff of science fiction: the work of Orwell and Huxley, the abstract criticism of Foucault, the demented ramblings of conspiracy theorists.  The truth is that the new millennium, pointedly ushered in by the attacks of September 11th, has shifted the frame.  This new world does more than ignore the cries of the lunatic fringe -- it actively embraces the destruction of privacy under the guise of increased security.  What is lost in this process is the question of whom we might need protection from: the next Joseph McCarthy?  The next Hitler?

To the extent that one can push back, the time is now.  The incremental protections over which political battles are fought may not seem so incremental in retrospect.  In this context, the oft-quoted words of George Orwell, immortalized in Justice Brennan's dissent in Florida v. Riley, ring eerily true:

The black-mustachio'd face gazed down from every commanding corner. There was one on the house-front immediately opposite.  BIG BROTHER IS WATCHING YOU, the caption said . . . In the far distance a helicopter skimmed down between the roofs, hovered for an instant like a bluebottle, and darted away again with a curving flight.  It was the Police Patrol, snooping into people's windows.107

Endnotes

1 Kathryn Balint, Attack on America: Personal Technology, San Diego Union Tribune, Mar. 11, 2002.

2 See Richard Sobel, The Degradation of Political Identity Under a National Identification System, 8 B.U. J. SCI. & TECH. L. 37, 41 (arguing that even before September 11th, the intersection of a number of federally legislated databases was already pushing us toward a National Identification System and contending that the post-September 11th pressures will virtually ensure a widespread identification system).

3 See generally Thomas G. Donlan, "Secure in Their Persons": A National Identity Card is No Threat to Liberty, BARRON'S, Mar. 18, 2002.

4 See id. ("But the fear is misplaced in the 21st century.  Americans may fear tyranny, and oppose every hint or possibility that it may emerge.  But an identity card is not tyranny, it is an identity card.  Free citizens may very well wish to prove their identity, and a government composed of free citizens may even require them to do so at times.").

5  For instance, consider the remarks of a recent McLaughlin Group transcript:

MR. MCLAUGHLIN: To Americans, that chilling request is the image long associated with national identity cards.
But the image may be changing, as a result of September 11th. A Harris Poll conducted immediately thereafter found that 68 percent of Americans favored a national I.D. system. Even renowned civil rights influentials support a national I.D. Quote, "We need to distinguish between a right to privacy, which I believe in, and a right to anonymity, which I no longer believe in," unquote . . . . Under review are so-called smart cards, especially biometric cards like scans of the retina and fingerprints. The scans are linked to databases. The Department of Defense is already issuing smart cards to more than 4 million service members.

Senator Dick Durbin of Illinois, a Democrat, has introduced a bill to create a national standard for drivers' licenses, which could become a de facto national I.D. card. Representative Jim Moran, also a Democrat, will introduce a bill that carries it a step further, requiring biometric data.

The McLaughlin Group (broadcast Mar. 30-31, 2002).

6  Biometric identification systems measure and analyze the physical characteristics of the human body, including fingerprints, handprints, voice patterns, retinal patterns, brainwaves, and physiognomy.

7 See infra Part ______.

8 See infra Part ______.

9 SIMSON GARFINKEL, DATABASE NATION: THE DEATH OF PRIVACY IN THE 21ST CENTURY (2000).

10 Id.

11 SIMON A. COLE, SUSPECT IDENTITIES: A HISTORY OF FINGERPRINTING AND CRIMINAL IDENTIFICATION (2001).

12 See David Johnston, Computer Could Point Finger at Murderers: Automated Searches Through Fingerprint Files Could Substantially Increase Arrests in L.A., L.A. TIMES, June 28, 1985 (pointing out that the task of identifying a single fingerprint using a computerized system could do in five minutes what an individual technician would need 67 years to complete).

13 According to the Institute for Telecommunications Sciences, a distributed database is one that is "not entirely stored at a single physical location, but rather is dispersed over a network of interconnected computers."  Institute for Telecommunications Sciences, Definition of a Distributed Database, available at http://www.its.bldrdoc.gov/fs-1037/dir-012/_1750.htm (last visited April 2, 2002).

14 See generally Neil Munro & Elizabeth Frater, The Digital Dragnet, The Nat'l Journal, Mar. 23, 2002 ("There are between eight and 20 federal criminal databases, including the Justice Department's National Crime Information Center (which stores criminal records and arrest warrants) and the Combined DNA Index System (which stores convicted felons' DNA 'fingerprints'). The nation's 16,000-plus police jurisdictions also maintain an array of databases . . . . These store data on more than 59 million individual offenders -- far more than the federal agencies store . . . . The nonfederal databases are increasingly linked to local government databases created by municipal courts, parole services, and public defenders' offices.").

15 For instance, the state of Kansas has tentatively approved a change to the issuance of driver's licenses that would contain biometric identifiers.  See Chris Ochsner, Bill Would Require Fingerprint to Apply for Driver's License, Topeka Capital Journal, Mar. 13, 2002 ("Under the proposal, the Division of Motor Vehicles would keep a database of 'biometric identifiers,' which could include a thumbprint, retinal scan, face recognition or hand geometry.").

16 Id.

17 Robert W. Schumacher II, Expanding New York's DNA Database: The Future of Law Enforcement, 26 Fordham Urb. L.J. 1635, 1644 (1999).

18 See McGregor McCance, Device Keeps Problems in Check; Fingerprinting Frustrates Forgers, Richmond Times-Dispatch, Mar. 3, 2002, at E-8 (describing how convenience store owners are beginning to use fingerprint scans and biometric databases to catch check forgers).

19 Daniel J. Solove, Privacy and Power: Computer Databases and Metaphors for Information Privacy, 53 Stan. L. Rev. 1393, 1398 (2001).

20 One commentator has described the process of police fingerprinting in some detail, stating that
When the local police arrest a suspect, they normally fingerprint the suspect. They take at least one set of fingerprints each on an FD-249 and an R-84 and also record other relevant data on both cards. If the local police do not immediately resolve the offense (for example, the suspect must await trial), they send the fingerprint card (FD-249) to the CJIS, but keep the R-84 for future use. When the CJIS receives the FD-249, it enters the information in the NCIC. When the charges against the suspect are resolved (for example, by conviction), the police fill in the disposition on the retained R-84 and send it to the CJIS. The CJIS then matches the R-84 to the previously sent FD-249 and updates the information on the NCIC, including the conviction. If the CJIS, for whatever reason, cannot locate an FD-249 for the suspect for that particular offense, the CJIS returns the R-84 to the submitting agency. Once entered in the NCIC, the information about the suspect, including the conviction, is available to all other authorized agencies for their use.
Major Michael J. Hargis, Three Strikes and You Are Out - The Realities of Military and State Criminal Record Reporting, 1995 Army Law. 3, 5 (1995).

21 For instance, in order to join the bar, one must submit to fingerprinting.  See  Alaska Bar Association, Reciprocity Application and Instructions, available at http://www.alaskabar.org/526.cfm (last visited April 2, 2002).

22 See Smart Cards Could Cut Airport Wait Times for Frequent Flyers, Airline Industry Information, April 1, 2002 ("The cards would store personal information about the holder on a magnetic strip or computer chip and they would be used in conjunction with biometric measurements such as a scan of the user's iris, face, hand or fingerprint. At security checkpoints, the person would submit to at least one biometric measurement, the results of which would be compared to an image stored in a database or on the card itself.").

23 389 U.S. 347 (1967).

24 U.S. CONST. amend. IV.

25 Katz, 389 U.S. at 361 (Harlan, J., concurring).

26 Kyllo v. United States, 533 U.S. 27, 33 (2001).

27 Katz, 389 U.S. at 351.

28 Id. at 361 (Harlan, J., concurring).

29 442 U.S. 735 (1979).

30 Id. at 742.

31 Id. at 739 n.4.

32 Id. at 742.

33 Id.

34 Id.

35 Id.

36 The FBI e-mail surveillance system, Carnivore, works like a kind of Internet wiretap that tracks emails and other electronic communications. Agents use it only after obtaining a court order that allows them to intercept the communications of a criminal suspect.

The FBI would install the specialized computer on the networks of Internet providers, where it "sniffs" out all mail and records sent to or from the target of an investigation.

Chris Oakes, ACLU: Law Needs 'Carnivore' Fix, Wired, July 12, 2000, available at http://www.wired.com/news/politics/0,1283,37470,00.html (last visited April 2, 2002).

37 See generally Christian David Hammel Schultz, Unrestricted Federal Agent: 'Carnivore' and the Need to Revise the Pen Register Statute, 76 Notre Dame L. Rev. 1215 (2001) (discussing the need to revise the pen register statute so as to de-link Carnivore from Smith).

38 Christopher S. Milligan, Facial Recognition Technology, Video Surveillance, and Privacy, 9 S. Cal. Interdisc. L.J. 295, 299 (1999).

39 I use the phrase "numerical signature" rather than phone number to sidestep the issues presented by Carnivore as discussed in note 36.

40 Yet a pen register differs significantly from the listening device employed in Katz, for pen registers do not acquire the contents of communications. This Court recently noted:

"Indeed, a law enforcement official could not even determine from the use of a pen register whether a communication existed. These devices do not hear sound. They disclose only the telephone numbers that have been dialed -- a means of establishing communication. Neither the purport of any communication between the caller and the recipient of the call, their identities, nor whether the call was even completed is disclosed by pen registers."

Smith, 442 U.S. at 741 (quoting United States v. New York Tel. Co., 434 U.S. 159, 167 (1977)).

41 See infra Part _________.

42 476 U.S. 207 (1986).

43 Id. at 209.

44 Id. at 212.

45 Id.

46 Id. at 211.

47 Id. at 212.

48 Id. at 213.  Justice Powell's vigorous dissent poses an alternate view: "In my view, the Court's holding rests on only one obvious fact, namely, that the airspace generally is open to all persons for travel in airplanes. The Court does not explain why this single fact deprives citizens of their privacy interest in outdoor activities in an enclosed curtilage."  Ciraolo, 476 U.S. at 216 (Powell, J., dissenting).

49 Id. at 213-14.

50 See generally Kyllo v. United States, 533 U.S. 27, 33-34 (2001) (discussing the circularity of relying on an objective expectation of privacy that constantly shifts under the pressure of new innovations).

51 Cite to another piece.

52 Chief Justice Burger's opinion in Ciraolo expresses this notion explicitly when he remarks that

One can reasonably doubt that in 1967 Justice Harlan considered an aircraft within the category of future "electronic" developments that could stealthily intrude upon an individual's privacy. In an age where private and commercial flight in the public airways is routine, it is unreasonable for respondent to expect that his marijuana plants were constitutionally protected from being observed with the naked eye from an altitude of 1,000 feet. The Fourth Amendment simply does not require the police traveling in the public airways at this altitude to obtain a warrant in order to observe what is visible to the naked eye.
476 U.S. at 215.

Justice Scalia's majority opinion in Kyllo v. United States reiterates this point all too clearly:

[i]t would be foolish to contend that the degree of privacy secured to citizens by the Fourth Amendment has been entirely unaffected by the advance of technology. For example, as the cases discussed above make clear, the technology enabling human flight has exposed to public view (and hence, we have said, to official observation) uncovered portions of the house and its curtilage that once were private. The question we confront today is what limits there are upon this power of technology to shrink the realm of guaranteed privacy.
533 U.S. 27, 33-34 (2001)(citations omitted).

53 See generally Alyson L. Rosenberg, Passive Millimeter Wave Imaging: A New Weapon In the Fight Against Crime or a Fourth Amendment Violation?, 9 Alb. L.J. Sci. & Tech. 135 (1998).

54 See David A. Harris, Superman's X-Ray Vision and the Fourth Amendment: The New Gun Detection Technology, 69 Temp. L. Rev. 1, 20-22 (1996).

55 See generally United States v. Place, 462 U.S. 696 (1983).

56 533 U.S. 27 (2001).

57 Id. at 29.

58 Id. at 33.

59 Id. at 31.

60 Id. at 35-36.

61 Id. at 31.

62 The majority remarks that obtaining by sense-enhancing technology any information regarding the interior of the home that could not otherwise have been obtained without physical "intrusion into a constitutionally protected area," constitutes a search -- at least where (as here) the technology in question is not in general public use. This assures preservation of that degree of privacy against government that existed when the Fourth Amendment was adopted. On the basis of this criterion, the information obtained by the thermal imager in this case was the product of a search.
Id. at 34-35 (emphasis added)(citations omitted).

63 Id. at 32-33.

64 Justice Scalia notes that

[w]hile it may be difficult to refine Katz when the search of areas such as telephone booths, automobiles, or even the curtilage and uncovered portions of residences are at issue, in the case of the search of the interior of homes -- the prototypical and hence most commonly litigated area of protected privacy -- there is a ready criterion, with roots deep in the common law, of the minimal expectation of privacy that exists, and that is acknowledged to be reasonable.
Id. at 34.

65 Molly Ivins, Post Sept. 11: For Those Who've Lost Their Common Sense, CHI. TRIB., at 31; Rene Sanchez, Border Patrol Agents Answer Higher Call, WASH. POST, at A12.

66 Conor O'Clery, A Strange Turn-up for the Book Supreme Injustice, Irish Times, February 16, 2002, at 59 (citing Harvard law professor Alan Dershowitz as one such proponent of these schemes).

67 Mark Hollands, We Don't Need IT to Make Us Stupid, THE AUSTRALIAN,  Feb. 13, 2001, at 48.

68 Id.

69 Id.

70 Robert Trigaux, In Riskier World, Personal Security Trumps Personal Privacy, ST. PETERSBURG TIMES, Feb. 24, 2002, at 1H.

71 An October 2001 Harris Poll indicated that "86 percent of respondents favored the use of facial recognition devices to scan for terrorists in public places."  Biometrics: A Security Boon or Invasion of Individual Privacy, 7 CORRECTIONS PROFESSIONAL, Mar. 25, 2002.

72 Id.

73 Id.

74 Biometrics' Time Has Come, 7 COLLECTIONS AND CREDIT RISK 8, Feb. 2002.

75 Mary Kirby, More U.S. Airports Acquire Visionics Biometric Systems, AIR TRANSPORT INTELLIGENCE, Jan. 24, 2002.

76 Id.

77 Id.

78 Biometrics' Time Has Come, 7 COLLECTIONS AND CREDIT RISK 8, Feb. 2002.

79 Warren Fiske, House Panel Backs Face-Scanning Limit, The Virginian-Pilot, Feb. 8, 2002, at B4.

80 David Streitfeld & Charles Piller, A Changed America; Big Brother Finds Ally in Once-Wary High Tech, L.A. TIMES, Jan 19, 2002, at A1.

81 Id.

82 Edward Iwata, Stadium Security Gets Serious, USA TODAY, Mar. 18, 2002, at 3B.

83 Id.

84 Christopher S. Milligan, Facial Recognition Technology, Video Surveillance, and Privacy, 9 S. Cal. Interdisc. L.J. 295, 318 (1999).

85 See supra Part III.A.1.

86 Visionics, Inc., website, What is FaceIt, http://www.visionics.com/faceit/whatis.html (last visited Mar. 31, 2002).

87 See William Welsh, Facing Trouble, Washington Technology, at http://www.washingtontechnology.com/news/16_21/state/17781-1.html (last visited Mar. 31, 2002)(reporting that the Tampa system currently has only 900 entries in its database, but will soon expand to 45,000).

88 Visionics, Inc., ID Solutions, at http://www.visionics.com/faceit/apps/idsol.html (last visited Mar. 31, 2002)("There are estimated to be 1.1 billion facial images in identification databases around the world. No re-enrollment is required.").

89 Katz, 389 U.S. at 361.

90 See supra Part III.A.4.

91 Mark Boal, Spycam City, THE VILLAGE VOICE, Oct. 6, 1998 at 38 ("A hundred bucks at a computer store already buys face-recognition software that was classified six years ago, which means that stored images can be called up according to biometric fingerprints.").

92 Warren Fiske, House Panel Backs Face-Scanning Limit, The Virginian-Pilot, Feb. 8, 2002, at B4.

93 Visionics, Inc., website, What is FaceIt, http://www.visionics.com/faceit/whatis.html (last visited Mar. 31, 2002).

94 Id.

95 Visionics, Inc., ID Solutions, at http://www.visionics.com/faceit/apps/idsol.html (last visited Mar. 31, 2002).

96 Mark Boal, Spycam City, The Village Voice, Oct. 6, 1998 at 38.

97 See supra Part ____.

98 Terry v. Ohio, 392 U.S. 1, 20-21 (1968) ("[I]t is necessary 'first to focus upon the governmental interest which allegedly justifies official intrusion upon the constitutionally protected interests of the private citizen,' for there is 'no ready test for determining reasonableness other than by balancing the need to search [or seize] against the invasion which the search [or seizure] entails.'")(citing Camara v. Municipal Court, 387 U.S. 523, 534-35, 536-37 (1967)).

99 H.B. 454, 2002 Leg., ___ Sess. (Va. 2002).  The full text of the bill is as follows:

CHAPTER 6.1. ORDERS FOR FACIAL RECOGNITION TECHNOLOGY.

§ 19.2-70.4. Definition.

As used in this chapter, "facial recognition technology" means any technology or software system [ that identifies humans by using a biometric system to identify and analyze a person's facial characteristics and is ] employed for the purpose of matching a facial image captured by cameras placed in any public place, other than in a state or local correctional facility as defined in § 53.1-1, with an image stored in a database.

§ 19.2-70.5. Who may apply for order authorizing facial recognition technology.

A. Except as provided in subsection A of § 19.2-70.7, no locality or law-enforcement agency shall employ facial recognition technology prior to complying with all of the provisions of this chapter.

B. The Attorney General or his designee, in any case where the Attorney General is authorized by law to prosecute or pursuant to a request in his official capacity of an attorney for the Commonwealth in any city or county, or an attorney for the Commonwealth, may apply to the circuit court, for the jurisdiction where the proposed facial recognition technology is to be used, for an order authorizing the placement of facial recognition technology by any law-enforcement agency in the jurisdiction, when the technology may reasonably be expected to provide (i) evidence of the commission of a felony or Class 1 misdemeanor, (ii) a match of persons with outstanding felony warrants, (iii) a match of persons or class of persons who are identifiable as affiliated with a terrorist organization, or (iv) a match of persons reported to a law-enforcement agency as missing.

§ 19.2-70.6. Application for and issuance of order authorizing use of facial recognition technology; contents of order; introduction in evidence of information obtained.

A. Each application for an order authorizing the use of facial recognition technology shall be made in writing upon oath or affirmation to the circuit court and shall state the applicant's authority to make the application. Each application shall be verified by the applicant to the best of his knowledge and belief and shall include the following information:

1. The identity of the applicant and the law-enforcement agency;

2. A full and complete statement of the facts and circumstances relied upon by the applicant in support of his request that an order be issued, including, but not limited to, (i) details either as to the particular offenses that have been, are being or are about to be committed, or the event or appearance that would attract individuals affiliated with a terrorist organization; (ii) a specific description of the nature and location of the facilities where or the place from which the facial recognition technology is to be used; (iii) a description of the type of match being sought; (iv) the identity of any persons or class of persons sought by the use of facial recognition technology as provided in subsection B of § 19.2-70.5; and (v) a description of the type of facial recognition technology to be used and a description of the contents of the database;

3. A statement of the period of time for which facial recognition technology is required to be maintained. However, in no case shall any request for an order granting the use of facial recognition technology be for longer than a period of ninety days;

4. A full and complete statement of the facts concerning all previous applications known to the individual authorizing and making the application, made to the court for authorization to use facial recognition technology involving any of the same persons, facilities or places specified in the application, and the action taken by the court on each application; and

5. Where the application is for the extension of an order, a statement setting forth the results thus far obtained from the use of facial recognition technology, or a reasonable explanation of the failure to obtain the expected results.

The court may require the applicant to furnish additional testimony or documentary evidence in support of the application.

B. If the court determines on the basis of the facts submitted that the provisions of this chapter have been met, and upon submission of a proper application, the court shall enter an order, as requested or as modified, authorizing the use of facial recognition technology within the territorial jurisdiction of the court. The application and any order granted or denied may be sealed by the court.

C. Each order authorizing the use of facial recognition technology shall specify:

1. The identity of any persons or class of persons who are the object of the use of the facial recognition technology, or the expected evidence of the commission of felonies or Class 1 misdemeanors from the use of the facial recognition technology;

2. The nature and location of the facilities as to which, or the place where, authority to use facial recognition technology is granted;

3. A description of the type of facial recognition technology to be used;

4. A description of the contents of the database;

5. The name of the agency authorized to use the facial recognition technology;

6. The requirement that only the agency named shall use the facial recognition technology;

7. The period of time, not to exceed ninety days, during which the use of the facial recognition technology is authorized, including a statement that the use shall be terminated at the end of the time period specified, unless the agency applies for and is granted an extension;

8. If the court deems it appropriate, the submission of reports at specified intervals to the court that issued the order, showing what progress has been made toward achievement of the authorized objective and the need for continued use of the facial recognition technology; and

9. The requirement that any facial image captured that is not relevant to (i) evidence of the commission of a felony or Class 1 misdemeanor, (ii) a match of persons with outstanding felony warrants, (iii) a match of persons or class of persons who are identifiable as affiliated with a terrorist organization, or (iv) a match of persons reported to a law-enforcement agency as missing shall be disposed of as soon as possible, but in no event be retained for more than ten days.

D. No order entered under this section may authorize the use of facial recognition technology for any period longer than ninety days from the time the facial recognition technology is operational. Extensions of an order may be granted in accordance with subsection A. The period of extension shall be no longer than the court deems necessary to achieve the purposes for which it was granted and in no event shall the extension be for longer than sixty days.

E. Any violation of the provisions of this subsection may be punished as contempt of court.

§ 19.2-70.7. Certain exemptions from chapter.

A. The provisions of this chapter shall not apply to security measures undertaken at (i) public-use airports in the Commonwealth or (ii) harbors and seaports of the Commonwealth.

B. Any information acquired through facial recognition technology prior to July 1, 2002, shall be admissible in evidence in any suit, action or proceeding.

Id.

100 Facial Scan: Beach's Use Restricted Under Bill Approved by House, The Virginian-Pilot, Feb. 13, 2002.

101 H.B. 454, 2002 Leg., ___ Sess. (Va. 2002).

102 Id.

103 Id.

104 The text provides that "[a]ny information acquired through facial recognition technology prior to July 1, 2002, shall be admissible in evidence in any suit, action, or proceeding."  Id.

105 Most notably, the chances that the bill will be ratified by the Virginia Senate are slim.  See Warren Fiske, House Panel Backs Face-Scanning Limit, The Virginian-Pilot, Feb. 8, 2002, at B4 ("The measure is expected to be passed by the full House next week but may have a tough time in the state Senate.  Sen. Kenneth W. Stolle, R-Virginia Beach, vowed to vigorously oppose the legislation.").

106 There are two policies behind the use of the exclusionary rule to effectuate the Fourth Amendment. When there is a close causal connection between the illegal seizure and the confession, not only is exclusion of the evidence more likely to deter similar police misconduct in the future, but also use of the evidence is more likely to compromise the integrity of the courts. Dunaway v. New York, 442 U.S. 200, 217-18 (1979).

107 Florida v. Riley, 488 U.S. 445, 466 (1989) (Brennan, J., dissenting) (quoting GEORGE ORWELL, NINETEEN EIGHTY-FOUR 4 (1949)).
