Legality
Fourth Amendment protections and the role of local laws
"If you are in the public, you do not have an expectation of privacy. And that's been black letter constitutional law for a very long period of time," said then-New Orleans Mayor Mitch Landrieu in late 2017.
Landrieu was speaking at the grand opening of the city's Real Time Crime Center, the live monitoring hub for the city’s burgeoning surveillance camera network and the crown jewel of the city’s $40 million public safety plan.
A reporter had asked Landrieu to respond to privacy concerns surrounding a proposed ordinance — which was later dropped following fierce opposition — to force every alcohol vendor in the city to install outdoor cameras that would feed live footage to the Real Time Crime Center.
“It sounds like if this ordinance passes you won't even be able to go to a bar without going on the city's radar,” the reporter said.
“That's probably true,” Landrieu responded. “If you're out in public, it is highly likely in this day and age you're going to be filmed by some camera or someone holding a phone. That's the new day and age that we are in, and people should conduct themselves accordingly.”
As a general principle, federal law allows the police to observe public spaces without a warrant, including through public-facing cameras, according to Rafael Goyeneche, a former prosecutor and president of the Metropolitan Crime Commission.
“These are video feeds of people when they're out in public and there isn't an expectation of privacy in that,” Goyeneche told The Lens in an interview.
But the legal issues surrounding privacy rights in public are far more complicated than Landrieu’s blanket proclamation would suggest, especially as the courts have begun to grapple with the implications of modern technology.
The Fourth Amendment protects people’s right to privacy against unreasonable government searches, seizures, and arrests. It forces the government to establish probable cause and, in most cases, obtain a warrant before violating that “reasonable expectation” of privacy.
The Fourth Amendment, like many US laws, can’t be fully understood just by reading the 54 words ratified in the Bill of Rights in 1791. What the Amendment actually protects is largely defined by the subsequent 230 years of Supreme Court and lower court rulings that have allowed those theoretical protections to take shape in the real world.
But so far, the nation’s highest court has only scratched the surface of emerging privacy debates.
“The law is still catching up to technology in a lot of ways,” said Bruce Hamilton, former senior staff attorney for the ACLU of Louisiana, who now works for the Southern Poverty Law Center. “The Fourth Amendment as originally construed never imagined we'd have the technologies we have now.”
Privacy advocates like Hamilton argue that some new police surveillance technology is already violating the Constitution, at least in theory. Without explicit Supreme Court rulings, it’s hard to know for sure. But regardless of whether advocates think the Constitution should protect people from these practices, the uncertainty and lethargy of the court system raise doubts about whether it will.
“I'm sorry to be cynical about it, but people think of constitutional rights as being sacrosanct things that when they're violated it's this huge deal that will automatically get litigated,” Hamilton said. “But the sad fact is that constitutional rights get violated every day in myriad ways. And we don’t always know about it, and the people whose rights are violated don't always know that they're violated and they don’t always think to challenge them in court.”
Even if a surveillance practice is challenged in court, the Supreme Court may not choose to hear the case. Or if it does, it could take years before a ruling comes down, by which time the technology might already be outdated, or be applied by the police in new ways that would have to go through another court challenge.
“While technological improvements are following a kind of exponential curve upward, the law is a more gradual bell curve lagging far behind,” Hamilton said.
'Seismic shifts in digital technology'
The Supreme Court has ruled repeatedly that people don’t lose all expectation of privacy when they walk out their front door. The court has ruled, for example, that the police can’t just arbitrarily stop and frisk anyone standing in public.
In 2018, Fourth Amendment protections were put to the test in the context of modern technology in the landmark Supreme Court case Carpenter v. United States. The court ruled that the FBI violated the Fourth Amendment rights of the defendant, Timothy Carpenter, by obtaining 127 days of his cell phone location data without a warrant.
“A person does not surrender all Fourth Amendment protection by venturing into the public sphere,” wrote Chief Justice John Roberts in the majority opinion. “A central aim of the Framers was ‘to place obstacles in the way of a too permeating police surveillance.’ ”
In his opinion, Roberts highlighted how the court was being forced to “contend with the seismic shifts in digital technology” that have “enhanced the Government’s capacity to encroach upon areas normally guarded from inquisitive eyes.”
The location information that the police obtained through Carpenter’s phone records could have been, theoretically, collected without the use of modern technology. But it would have required officers trailing and staking out his locations for 127 days, a dedication of resources that has been traditionally limited by cost and the finite number of law enforcement personnel.
“The law has really failed to accommodate that advance, because it's built around this notion that surveillance is limited by human perception,” Hamilton said. “And now that we've augmented human perception with computers and technology, we need to account for that.”
As Roberts wrote, the technology used to track Carpenter was “remarkably easy, cheap, and efficient compared to traditional investigative tools.” And, he noted, the police were able to pinpoint Carpenter’s location retroactively.
“Whoever the suspect turns out to be, he has effectively been tailed every moment of every day for five years, and the police may—in the Government’s view—call upon the results of that surveillance without regard to the constraints of the Fourth Amendment. Only the few without cell phones could escape this tireless and absolute surveillance.”
Roberts noted that the ruling was “narrow,” and was only meant to address the issue of cell phone records. He explicitly wrote that it did not “call into question conventional surveillance techniques and tools, such as security cameras.”
There aren’t many clear-cut constitutional prohibitions on the warrantless use of mass public camera systems. But privacy advocates argue that they can present a form of monitoring that is just as intrusive, if not more so, than practices that are prohibited by the Fourth Amendment.
New software, for example, is exponentially increasing the insightfulness of video surveillance. One program used in New Orleans until 2019, called BriefCam, can instantaneously search through live and archived footage to track different people, cars, outfits, faces and other characteristics across all cameras and automatically show you every recorded instance where that person or object was caught on camera.
Advocates argue that mass surveillance hardware like cameras, license plate readers and drones — combined with software powered by artificial intelligence and machine learning — can replicate the kind of invasive and unconstitutional tracking that occurred in the Carpenter case.
And yet, although the Supreme Court ruled the police need a warrant to obtain cell phone location records, there is no such warrant requirement for the use of surveillance camera networks.
“We look at where we are now, we're talking about the advent of technologies that are collecting data on everyone all the time, regardless of whether you're suspected of being involved in a crime,” Dave Maass, head of investigations at the Electronic Frontier Foundation, told The Lens. “That to me is fundamentally at odds with our justice system, or the philosophy of our justice system that the government should not be investigating you unless they have good cause to.”
The Katz Test
In another landmark privacy case in 1967, Katz v. United States, the Supreme Court ruled that the FBI violated the Fourth Amendment when placing an audio recorder on the outside of a public phone booth where a man, Charles Katz, conducted illegal sports betting business.
The case established several Fourth Amendment principles that are still cited in modern-day rulings, such as the idea that Fourth Amendment protections don’t only apply within the boundaries of certain private places.
“The Fourth Amendment protects people, not places,” Justice Potter Stewart wrote in the majority opinion. “What a person knowingly exposes to the public, even in his own home or office, is not a subject of Fourth Amendment protection. But what he seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected.”
The ruling also established a two-part test, sometimes called the Katz test, to determine whether a person’s Fourth Amendment rights were violated.
The first question is whether the individual themself had a subjective expectation of privacy. When Katz was speaking in the phone booth, for example, he didn’t expect anyone to be listening in. That likely would not have been the case if an FBI agent had been physically present and pressing their ear on the phone booth glass.
The second, and trickier, question is whether that subjective privacy expectation is “one that society is prepared to recognize as ‘reasonable.’” But establishing the privacy values of society as a whole is no easy task, especially when the public doesn’t know or understand the ways the government is spying on them.
“From a Fourth Amendment context, society can only evolve our interpretation of reasonableness if we know the ways the government is watching,” Colin Reingold, former litigation director for the Orleans Public Defenders, told The Lens.
But with the secrecy shrouding New Orleans’ surveillance capabilities, Reingold argued that the public may not be equipped to determine where to draw the line.
“All of that is happening behind very closed, very locked doors such that we can't have an informed discussion about what's reasonable,” Reingold said. “We can't have a discussion about what it means to do something in private if we don’t know what the government can see.”
And while some government surveillance technologies are hidden through intentional secrecy, others are shrouded by their sheer complexity, argues Albert Fox Cahn, a New York-based privacy advocate.
“Part of why the surveillance state has been able to proliferate the way it has in recent years and decades is because of the technical learning curve, the legalistic learning curve, the barriers for much of the public to fully understand how these systems work,” Cahn told The Lens. “We definitely see the public understanding around these technologies is a decade behind for software where it is for hardware, and a decade behind for algorithmic systems where it is for software. So it really is a huge gap.”
Reingold told The Lens that he doesn’t think everyone is aware of how powerful modern police cameras are, and how their zoom capabilities and high vantage points allow the police to see into homes, through car windows and into back yards — areas that many people consider to be private spaces.
“We have to do some real thinking about what we as a community are willing to accept,” Reingold said. “What does it mean, essentially, to do something in public? If I have any window open does that expose me to a real time crime camera from two blocks away? Is it my fault I had my window open? Or do we say that's unreasonable that the government should be permitted to look in from that distance?”
“If we don’t have these conversations, then the Fourth Amendment loses a lot of its protective value,” he said.
Local New Orleans Laws
Some privacy advocates argue that given the uncertainty of the court system — as well as the relative lack of federal privacy legislation — it is vital to pass local laws to stop unwanted surveillance practices that could take years to stop through the court system, or that may not be covered by the Fourth Amendment at all.
“The intersection of technology and local government activity is one of the most potent threats to civil rights, and also one of the most potent opportunities for civil rights reform,” Cahn said. “Obviously we've seen these abuses on the federal level for years, but there's such entrenched opposition to privacy and civil rights protections against invasive federal practices that's made it much harder to make the kind of headway that's been made possible at the state and local level.”
In December 2020, New Orleans joined a short but growing list of cities actively regulating its surveillance technology when the City Council passed an ordinance to create an entire new chapter of the City Code titled “Surveillance Technology and Data Protection.”
The ordinance was written jointly by then-Councilman Jason Williams, who is now the Orleans Parish District Attorney, and the Eye on Surveillance Coalition, a privacy and anti-surveillance advocacy group.
The ordinance as originally written was based on model legislation provided by the ACLU through its Community Control Over Police Surveillance program. The ACLU’s template ordinance creates blanket oversight requirements, including City Council review and approval of surveillance technology, annual reporting, use policies and a citizen review board.
The core elements of the ACLU legislation are all about creating transparency as a baseline for future surveillance regulations — allowing the public to see what the government is doing so they can collectively decide what practices they’re ok with, and which ones cross the line into government overreach.
The ACLU says that at least 21 cities have passed laws based on the model ACLU legislation. But New Orleans still isn’t one of them.
The New Orleans City Council ditched almost all the elements of the ACLU model legislation before passing its ordinance.
“We've stripped extensive approval and reporting processes as requested by the NOPD,” Williams said during the December council meeting.
What remained was outright bans on four specific types of surveillance technology, including facial recognition, as well as new data sharing and collection rules. The ordinance also established a foundation for future privacy laws by creating a new chapter of the City Code and setting official definitions for terms like “surveillance” and “predictive policing.”
Advocates lauded the ordinance as an important first step, but said that without the transparency measures in the original ordinance, the public was still vulnerable to unknown and future surveillance practices that violate their sense of privacy. And without annual reporting, it’s difficult to know whether the city is actually abiding by the blanket technology bans.
“I think a lot of folks in our coalition were surprised that the more controversial components of the ordinance ended up being the oversight and transparency aspects rather than the outright bans,” said Chris Kaiser, Advocacy Director of the ACLU of Louisiana. “We approached the policy conversation focused on, 'look, we're not trying to say you can't have surveillance technology, we just think there should be guard rails in place and some democratic process to know what's going on.'”
There already appears to be an effort to roll back at least some of the ordinance’s new restrictions. Councilman Jay Banks told The Lens in March that the NOPD was working on a policy for using facial recognition. Banks said that it was his intention to revisit the ordinance once that policy was presented to the council and, if the policy was found to be acceptable, reverse the ban.
What the New Orleans surveillance ordinance did
Blanket ban on four technologies
The ordinance bans four types of technology, which the city cannot “obtain, retain, possess, access, sell, or use,” whether directly or through contractors and subcontractors:
Facial recognition: “An automated or semi-automated process that assists in identifying an individual, capturing information about an individual based on the physical characteristics of an individual’s face.”
Predictive policing: “The usage of predictive analytics software in law enforcement to predict information or trends about criminality, including but not limited to the perpetrator(s), victim(s), locations or frequency of future crime. It does not include, for example software used to collect or display historic crime statistics for informational purposes.”
Cellular communications interception technology (cell site simulator): “Any device that intercepts mobile telephony calling information or content, including an international mobile subscriber identity catcher or other virtual base transceiver station that masquerades as a cellular station and logs mobile telephony calling information.”
Characteristic tracking: “Any software or system capable of tracking people and/or objects based on characteristics such as color, size, shape, age, weight, speed, path, clothing, accessories, vehicle make or model, or any other trait that can be used for tracking purposes, including BriefCam and similar software.”
New rules for how the city collects, stores and shares data from residents
- The ordinance states that “the City shall collect only the minimum amount of personal information needed to fulfill a narrow, well defined purpose.”
- The city cannot inquire or collect data about someone’s immigration status unless it is required by law or in a few other select circumstances, like determining eligibility for city employment and connecting people to benefits or services.
- The law prohibits city contractors from cooperating or participating in the surveillance, detention or removal of “persons suspected of being noncitizens.” The prohibition can be waived by the Chief Administrative Officer in certain circumstances.
- The city is responsible for protecting data it collects, and must maintain policies to protect the data from unauthorized access.
- Any city department that uses surveillance technology, either itself or through a third party, has to designate a “Data Protection Officer” to make sure the department is in compliance with surveillance and privacy laws.
- The city is required to maintain procedures for using and evaluating automated decision systems, including artificial intelligence, “through the lens of equity, fairness, transparency, and accountability.”
- The ordinance gives people the right to opt out of certain automated decision making systems. The ordinance says, “wherever decisions are made based on the identity of an individual, rather than patterns of a general population, such as air traffic control, individuals must have the option to opt out of automated decisions.”
- The city is obligated to give public notice and provide an opportunity for public comment whenever it wants to buy, or receive a donation of, any surveillance data generated and owned by a private source. The same requirement exists if the city wants to sell, or donate, any city-owned data to a private source.
Official definitions related to surveillance regulation
Automated decision systems: “Any software, system, or process that aims to automate, aid or replace human decision making. Automated decision systems can include both tools that analyze datasets to generate scores, predictions, classifications, or some recommended action(s) that are used by agencies to make decisions that impact human welfare and the set of processes involved in implementing those tools.”
Surveillance: “The act of observing or analyzing the movements, behavior, or actions of identifiable individuals.”
Surveillance technology: “Any electronic surveillance device, hardware, or software that is capable of collecting, capturing, recording, retaining, processing, intercepting, analyzing, monitoring, or sharing audio, visual, digital, location, thermal, biometric, behavioral, or similar information or communications specifically associated with, or capable of being associated with, any identifiable individual or group.”
Removed from the final version of the ordinance
The original ordinance was significantly amended by the time it was actually made into law. The City Council ended up removing the law's most forward-looking provisions, including those that required ongoing regulation and oversight.
Council approval of surveillance technology: The original ordinance required the City to gain approval from the City Council for future and existing surveillance technology at the city’s disposal. Departments would be required to submit a “surveillance use request” for existing surveillance technology and whenever they add new technology. Council approval for a specific technology would last three years, at which time the department would have to apply for council approval again.
Surveillance use policy: For each approved surveillance technology, the city would have to produce a policy for its use that includes its authorized purpose and uses, as well as what data it collects and how that data is stored.
Annual surveillance reports: Every year, city departments and entities would have to submit an annual report for each piece of approved surveillance technology it’s using. The reports would include, “information about how often the surveillance technology was used, where it was used, and by which agency or department the technology was used. It should also include the demographics of the surveillance technology targets, including but not limited to race or ethnicity, gender, and socioeconomic status.” The council could amend or rescind approval for surveillance technology based on the reports.
Surveillance impact reports: Each annual surveillance report would include a surveillance impact report, which would include product information, cost and an assessment of the “potential or realized impacts on privacy and civil liberties, as well as plans to safeguard the rights of the public.”
- Project concept and production by Caroline Sinders and Michael Isaac Stein
- Writing, research and reporting by Michael Isaac Stein
- Creative direction, research and design by Caroline Sinders
- Data visualization, research and graphic design by Winnie Yoe
- Web development and technical guidance by Annabel Church and Thomas Thoren
- Editorial support by The Lens
- Project funded by the Fund for Investigative Journalism