The 'New World Order'
BIBLICAL END TIMES COMING: SOON!
Digital ID Or Digital Prison
 

Project N.O.L.A. National Crime Camera Program

Located on the campus of the University of New Orleans, Project NOLA is a 501(c)(3) nonprofit organization that operates the largest, most cost-efficient and successful networked HD crime camera program in America. Created in 2009 by criminologist Bryan Lagarde, the program helps reduce crime by dramatically increasing police efficiency and citizen awareness. A first-of-its-kind community-based crime camera initiative, Project NOLA provides HD crime cameras to residents, business owners, developers, associations and municipalities; the cameras are most often placed on private property facing a street or park. Via the Internet, cameras transmit video to the Project NOLA Real-Time Crime Information Center at UNO, where video may be live-monitored, stored and re-broadcast to local law enforcement. Those hosting a Project NOLA crime camera may also view live and recorded video via a smart device, phone or PC. For privacy purposes, Project NOLA maintains camera video for only about 10 days and provides camera footage only to law enforcement.

Communities that host Project NOLA Crime Cameras often see a reduction in crime. When felony crimes do occur, criminals are more likely to be arrested more quickly, confess and implicate others, and plead guilty, saving hundreds of investigative, prosecutorial and defense man-hours and millions in tax dollars. Witnesses and victims are less likely to relive tragic memories in court. At-risk juveniles are more likely to be identified and helped. Violent offenders transition more quickly into the state penal system.

Serving dozens of law enforcement agencies across America, we wish to bring the unprecedented success of our nonprofit program to other communities, and we encourage those wishing to learn more to call us at (504) 736-9187. With a proven track record, Project NOLA can quickly implement a successful crime camera system virtually anywhere for a small fraction of what other solutions commonly cost.


Also:


New Orleans may be first U.S. city to use live AI facial recognition camera network

New Orleans police paused their use of a privately run facial recognition camera network last month amid legal and privacy questions from The Washington Post.

Why it matters: It's likely the first AI-enhanced live surveillance system to be used in a major American city, the paper says of the investigation.

The big picture: New Orleans police have been using information from Project NOLA's 200-plus camera network to find wanted individuals for at least the past two years, Douglas MacMillan and Aaron Schaffer wrote in the story.

Project NOLA's facial recognition cameras monitor the streets and send alerts to officers' phones in real time through an app when the system finds a possible match.

Project NOLA, a privately run nonprofit, enters the details about whom to look for based on press releases and social media alerts from NOPD and other agencies, Project NOLA founder Bryan Lagarde tells Axios.

Case in point: NOPD Superintendent Anne Kirkpatrick said the department used the facial recognition technology Friday to identify one of the 10 inmates who escaped from Orleans Justice Center earlier that day.

"This is the exact reason why facial recognition technology is so critical," Kirkpatrick said.

Project NOLA said in a post that Louisiana State Police sent the escapees' information at 10:35 a.m. Friday. The cameras found two of the escapees less than 10 minutes later in the French Quarter and pinged authorities, leading to the arrest of one.

The group later released the video footage of the two escapees.

[Image: people riding bikes in the dark.]

Between the lines: Project NOLA doesn't have a formal contract with the city, Lagarde tells Axios. Individual officers download the app and sign up for alerts, he says.

NOPD officers are "regrettably" no longer participating in the alerts, he told Axios on Monday. However, he said alerts are still going to LSP and federal agencies.

Flashback: A 2022 City Council ordinance regulates how the city uses facial recognition software, according to reporting by The Lens.

Before then, it wasn't allowed, but police had been using it for years, according to The Lens.

Mayor LaToya Cantrell requested to reverse the ban, The Lens said.

The other side: The ACLU of Louisiana blasted the technology in 2022 and again this week.

"This is the first known time an American police department has relied on live facial recognition technology cameras at scale, and is a radical and dangerous escalation of the power to surveil people as we go about our daily lives," the ACLU said in a statement.

The organization called on the City Council to launch an investigation into NOPD's use of the system.

What's next: Kirkpatrick told The Washington Post the department is doing a formal review of how many officers used the alerts, what arrests were made and whether this violated the ordinance.

She also said she's in favor of the city running its own live facial recognition program.


Also:


Police secretly monitored New Orleans with facial recognition cameras

Following records requests from The Post, officials paused the first known widespread live facial recognition program used by police in the United States.

For two years, New Orleans police secretly relied on facial recognition technology to scan city streets in search of suspects, a surveillance method without a known precedent in any major American city that may violate municipal guardrails around use of the technology, an investigation by The Washington Post has found.

Police increasingly use facial recognition software to identify unknown culprits from still images, usually taken by surveillance cameras at or near the scene of a crime. New Orleans police took this technology a step further, utilizing a private network of more than 200 facial recognition cameras to watch over the streets, constantly monitoring for wanted suspects and automatically pinging officers’ mobile phones through an app to convey the names and current locations of possible matches.

This appears out of step with a 2022 city council ordinance, which limited police to using facial recognition only for searches of specific suspects in their investigations of violent crimes and never as a more generalized “surveillance tool” for tracking people in public places. Each time police want to scan a face, the ordinance requires them to send a still image to trained examiners at a state facility and later provide details about these scans in reports to the city council — guardrails meant to protect the public’s privacy and prevent software errors from leading to wrongful arrests.

Since early 2023, the network of facial recognition cameras has played a role in dozens of arrests, including at least four people who were only charged with nonviolent crimes, according to police reports, court records and social media posts by Project NOLA, a crime prevention nonprofit company that buys and manages many of the cameras. Officers did not disclose their reliance on facial recognition matches in police reports for most of the arrests for which the police provided detailed records, and none of the cases were included in the department’s mandatory reports to the city council on its use of the technology. Project NOLA has no formal contract with the city, but has been working directly with police officers.

“This is the facial recognition technology nightmare scenario that we have been worried about,” said Nathan Freed Wessler, a deputy director with the ACLU’s Speech, Privacy, and Technology Project, who has closely tracked the use of AI technologies by police. “This is the government giving itself the power to track anyone — for that matter, everyone — as we go about our lives walking around in public.”

Anne Kirkpatrick, who heads the New Orleans Police Department, paused the program in early April, she said in an interview, after a captain identified the alerts as a potential problem during a review. In an April 8 email reviewed by The Post, Kirkpatrick told Project NOLA that the automated alerts must be turned off until she is “sure that the use of the app meets all the requirements of the law and policies.” The Post began requesting public records about the alerts in February.

The police department “does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network of Project Nola crime cameras,” Reese Harper, a spokesman for the agency, said in an emailed statement.

Police across the country rely on facial recognition software, which uses artificial intelligence to quickly map the physical features of a face in one image and compare it to the faces in huge databases of images — usually drawn from mug shots, driver’s licenses or photos on social media — looking for possible matches. New Orleans’s use of automated facial recognition has not been previously reported and is the first known widespread effort by police in a major U.S. city to use AI to identify people in live camera feeds for the purpose of making immediate arrests, Wessler said.

The Post has reported that some police agencies use AI-powered facial recognition software in violation of local laws, discarding traditional investigative standards and putting innocent people at risk. Police at times arrested suspects based on AI matches without independent evidence connecting them to the crime, raising the chances of a false arrest. Often, police failed to inform defendants about their use of facial recognition software, denying them the opportunity to contest the results of a technology that has been shown to be less reliable for people of color, women and older people.

One of the few places where live facial recognition is known to be in wide use is London, where police park vans outside of high-traffic areas and use facial recognition-equipped cameras to scan the faces of passersby, and confront people deemed a match to those on a watch list. While the city says the program has never led to a false arrest since launching in 2016, Big Brother Watch, a London-based civil liberties group, argues that the practice treats everyone as a potential suspect, putting the onus on the people who were falsely matched to prove their innocence.

Real-time alerts

The surveillance program in New Orleans relied on Project NOLA, a private group run by a former police officer who assembled a network of cameras outside of businesses in crime-heavy areas including the city’s French Quarter district.

Project NOLA configured the cameras to search for people on a list of wanted suspects. When the software determined it had found a match, it sent real-time alerts via an app some officers installed on their mobile phones. The officers would then quickly research the subject, go to the location and attempt to make arrests.
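The alert loop described above — compare each detected face against a watch list and, on a strong enough match, push a notification with a name and location — can be sketched in a few lines. This is purely an illustrative toy, not Project NOLA's actual software: the function names, the similarity threshold, and the three-number "embeddings" are all assumptions standing in for a real face-recognition model's output.

```python
import math

MATCH_THRESHOLD = 0.92  # assumed similarity cutoff, not a documented value


def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def check_frame(face_embedding, watchlist):
    """Return (name, score) for the best watch-list match above threshold, else None."""
    best = None
    for name, ref in watchlist.items():
        score = cosine_similarity(face_embedding, ref)
        if score >= MATCH_THRESHOLD and (best is None or score > best[1]):
            best = (name, score)
    return best


def send_alert(match, location):
    # In the system described, this step would push a notification to officers' phones.
    return f"ALERT: possible match {match[0]} ({match[1]:.2f}) near {location}"


# Toy watch list: name -> reference embedding (hypothetical values).
watchlist = {"suspect_a": [0.9, 0.1, 0.4], "suspect_b": [0.1, 0.8, 0.6]}

frame = [0.88, 0.12, 0.41]  # embedding extracted from a live camera frame
match = check_frame(frame, watchlist)
if match:
    print(send_alert(match, "French Quarter"))
```

Note that everything downstream of the threshold is automatic in this sketch — which is precisely the property the ordinance's critics object to, since no human reviews the match before officers are paged.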

Police did not set up the program, nor can they directly search for specific people or add or remove people from the camera system's watch list, according to Bryan Lagarde, Project NOLA's founder.

Little about this arrangement resembles the process described in the city council ordinance from three years ago, which imagined detectives using facial recognition software only as part of methodical investigations with careful oversight. Each time police want to scan a face, the ordinance requires them to send a still image to a state-run “fusion center” in Baton Rouge, where various law enforcement agencies collaborate on investigations. There, examiners trained in identifying faces use AI software to compare the image with a database of photos and only return a “match” if at least two examiners agree.
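The ordinance's workflow is the opposite of automatic alerting: software proposes candidates, but a "match" is released only when at least two trained human examiners independently agree on the same identity. A minimal sketch of that agreement rule, assuming a hypothetical data shape (the fusion center's actual procedure is not public in this detail):

```python
REQUIRED_AGREEMENT = 2  # the ordinance's two-examiner rule


def fusion_center_review(examiner_verdicts):
    """Release an identity only if >= 2 examiners independently chose it.

    examiner_verdicts: list of identities (or None for "no match")
    picked by each examiner after reviewing the AI software's candidates.
    """
    counts = {}
    for verdict in examiner_verdicts:
        if verdict is not None:
            counts[verdict] = counts.get(verdict, 0) + 1
    for identity, count in counts.items():
        if count >= REQUIRED_AGREEMENT:
            return identity
    return None  # no match is released to investigators


print(fusion_center_review(["John Doe", "John Doe", None]))  # agreement: released
print(fusion_center_review(["John Doe", "Jane Roe"]))        # split verdicts: nothing released
```

The human-agreement gate is the guardrail: a software candidate that only one examiner endorses never reaches investigators, which is what makes the process slower than the automated alerts but also harder to get wrong.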

Investigators have complained that process takes too long and often doesn’t result in any matches, according to a federally mandated audit of the department in 2023. It has only proved useful in a single case that led to an arrest since October 2022, according to records police provided to the city council.

By contrast, Project NOLA claims its facial recognition cameras played a role in at least 34 arrests since they were activated in early 2023, according to the group’s Facebook posts — a number that cannot be verified because the city does not track such data and the nonprofit does not publish a full accounting of its cases. Without a list of the cases, it’s impossible to know whether any of the people were misidentified or what additional steps the officers took to confirm their involvement in the crimes.

Kirkpatrick said her agency has launched a formal review into how many officers used the real-time alerts, how many people were arrested as a result, how often the matches appear to have been wrong and whether these uses violated the city ordinance.

“We’re going to do what the ordinance says and the policies say, and if we find that we’re outside of those things, we’re going to stop it, correct it and get within the boundaries of the ordinance,” she said.

There are no federal regulations around the use of AI by local law enforcement. Four states — Maryland, Montana, Vermont and Virginia — as well as at least 19 cities in nine other states explicitly bar their own police from using facial recognition for live, automated or real-time identification or tracking, according to the Security Industry Association, a trade group.

Lawmakers in these places cited concerns in public meetings that the technology could infringe on people’s constitutional rights or lead police to make mistakes when they rush to arrest a potential suspect before taking steps to confirm their connection to the crime, as many people look alike. At least eight Americans have been wrongfully arrested due to facial recognition, The Post and others have reported.

The unsanctioned surveillance program in New Orleans highlights the challenge of regulating a technology that is widely available, at a time when some police see AI as an invaluable crime fighting tool. Even in some places where officials have banned facial recognition, including Austin and San Francisco, officers skirted the bans by covertly asking officers from neighboring towns to run AI searches on their behalf, The Post reported last year.

Violent crime rates in New Orleans, like much of the country, are at historic lows, according to Jeff Asher, a consultant who tracks crime statistics in the region. But city officials have seized on recent instances of violent crime to argue that police need the most powerful tools at their disposal.

Last month, an independent report commissioned after the New Year’s Day attack that left 14 people dead on Bourbon Street found the New Orleans police to be understaffed and underprepared. The report, overseen by former New York City police commissioner William Bratton, advised New Orleans to explore adopting several new tools, including drones, threat prediction systems and upgrades to the city’s real-time crime center — but did not recommend adding any form of facial recognition.

Kirkpatrick, the city’s top police official, and Jason Williams, its top prosecutor, both said they are in discussions with the city council to revise the facial recognition ordinance. Kirkpatrick says she supports the idea of the city legally operating its own live facial recognition program, without the involvement of Project NOLA and with certain boundaries, such as prohibiting use of the technology to identify people at a protest.

“Can you have the technology without violating and surveilling?” she asked. “Yes, you can. And that’s what we’re advocating for.”

5,000 cameras

Few people have as much visibility into the everyday lives of New Orleans residents as Lagarde, a former patrol officer and investigator who started his own video surveillance business in the late 1990s before launching Project NOLA in 2009.

Funded by donations and reliant on businesses that agree to host the cameras on their buildings or connect existing surveillance cameras to its centralized network, Lagarde said Project NOLA has access to 5,000 crime cameras across New Orleans, most of which are not equipped with facial recognition. The cameras all feed into a single control room in a leased office space on the University of New Orleans campus, Lagarde said in an interview at the facility. Some camera feeds are also monitored by federal, state and local law enforcement agencies, he said.

Project NOLA made $806,724 in revenue in 2023, tax filings show. Much of it came from “cloud fees” the group charges local governments outside of New Orleans — from Monticello, Florida, to Frederick, Colorado — which install Project NOLA cameras across their own towns and rely on Lagarde’s assistance monitoring crime. He’s experimented with facial recognition in Mississippi, he said, but his “first instance of doing citywide facial recognition is New Orleans.” New Orleans does not pay Project NOLA.

For more than a decade, Lagarde used standard cameras outside businesses to monitor crime and offer surveillance clips for officers to use in their investigations. Lagarde’s cameras became so widespread that police began calling him when they spotted a Project NOLA camera hovering near a crime scene they were investigating, according to police incident reports, interviews with police and emails obtained through a public records request.

Lagarde began adding facial recognition cameras to his network in early 2023, after an $87,000 bequest from a local woman. Lagarde used the money to buy a batch of cameras capable of detecting people from about 700 feet away and automatically matching them to the facial features, physical characteristics and even the clothing of people in a database of names and faces he has compiled.

Lagarde says he built his database partly from mug shots from local law enforcement agencies. It includes more than 30,000 “local suspected and known criminals,” Project NOLA wrote on Facebook in 2023. Lagarde can quickly identify anyone in the database the moment they step in front of a Project NOLA camera, he said. He can also enter a name or image to pull up all the video clips of that person Project NOLA captured within the last 30 days, after which Lagarde says videos get automatically deleted “for privacy reasons.”
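The 30-day auto-deletion Lagarde describes amounts to a simple retention sweep: any clip older than the window is purged. A minimal sketch under assumed data structures (the real system's storage model is not described in the reporting):

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # the stated retention window


def purge_expired(clips, now):
    """Keep only clips recorded within the retention window."""
    return [c for c in clips if now - c["recorded_at"] <= RETENTION]


now = datetime(2025, 6, 1)
clips = [
    {"id": 1, "recorded_at": now - timedelta(days=5)},
    {"id": 2, "recorded_at": now - timedelta(days=31)},  # past the window: purged
    {"id": 3, "recorded_at": now - timedelta(days=29)},
]
kept = purge_expired(clips, now)
print([c["id"] for c in kept])
```

Note that such a policy limits how far back footage can be searched, but it does nothing to constrain what is matched or alerted on in real time while the clips exist.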

Project NOLA found enthusiastic partners in local business owners, some of whom were fed up with what they saw as the city’s inability to curb crime in the French Quarter — the engine of its tourism economy that’s also a hub for drug dealers and thieves who prey on tourists, said Tim Blake, the owner of Three Legged Dog, a bar that was one of the first places to host one of Project NOLA’s facial recognition cameras.

“Project NOLA would not exist if the government had done its job,” Blake said.

While Lagarde sometimes appears alongside city officials at news conferences announcing prominent arrests, he is not a New Orleans government employee or contractor. Therefore, Lagarde and the organization are not required to share information about facial recognition matches that could be critical evidence in the courtroom, said Danny Engelberg, the chief public defender for New Orleans.

“When you make this a private entity, all those guardrails that are supposed to be in place for law enforcement and prosecution are no longer there, and we don’t have the tools to do what we do, which is hold people accountable,” he said.

Lagarde says he tries to be transparent by posting about some of his successful matches on Facebook, though he acknowledges that he only posts a small fraction of them and says it would be “irresponsible” to post information about open investigations. Project NOLA, he added, is accountable to the businesses and private individuals who host the cameras and voluntarily opt to share their feeds with the network.

“It’s a system that can be turned off as easily as it’s been turned on,” he said. “Were we to ever violate public trust, people can individually turn these cameras off.”

Banned devices

Lagarde declined to say who makes the equipment he uses, saying he doesn’t want to endorse any company.

Several Project NOLA cameras in the French Quarter look nearly identical to ones on the website of Dahua, a Chinese camera maker, and product codes stamped on the backs of these devices correspond to an identical camera sold by Plainview, New York-based equipment retailer ENS Security, which has acknowledged reselling Dahua cameras in the past. Project NOLA’s website also contains a link to download an app where police officers can view and manage footage. The app, called DSS, is made by Dahua.

Congress banned federal agencies from using products or services made by Dahua and a list of other Chinese companies in 2018, citing concerns that the equipment could be used by President Xi Jinping’s government to spy on Americans. Since 2020, the law has barred any agency or contractor that receives federal funds from using those funds on the banned products.

A Dahua spokesperson declined to comment on the New Orleans cameras and said the company stopped selling equipment in the U.S. last year.

The New Orleans Police Department has received tens of millions of dollars from the federal government in recent years and confirmed that some officers have installed this DSS app on mobile phones and police workstations. Kirkpatrick said she was not aware of who made the app or cameras but would look into it.

Lagarde said Project NOLA uses “American-made, brand-name servers to operate our camera program.”

Some city officials argue that police are not violating the city’s facial recognition ordinance because they do not own the cameras or contract with Lagarde; they are merely receiving tips from an outside group that is performing facial recognition scans on its own.

“If Bryan Lagarde calls an officer and says ‘I think a crime is occurring on the 1800 Block of Bienville,’ that’s no different than Miss Johnson looking out of her window and saying ‘I think a crime is occurring on 1850 Bienville,’” Williams, the Orleans Parish district attorney, said in an interview.

But in many cases, police have gone to Lagarde to request footage or help identifying and locating suspects, according to police reports, Project NOLA social media posts and internal police emails.

Tracking a suspect

In one case last year, a police detective investigating a snatched cellphone relied on Project NOLA to identify the perpetrator and track him down using facial recognition alerts, according to accounts of the investigation drawn partly from the police incident report and partly from Project NOLA’s Facebook post.

The detective contacted Lagarde “to assist locating the perpetrator on Project NOLA cameras,” according to the police report, providing still shots taken from the city’s surveillance camera footage. Lagarde used Project NOLA’s clothing recognition tool to find previous video footage of a suspect. With the new, better images of his face, Project NOLA used facial recognition to learn his possible identity and share that with the detective.

The detective took that name and found photos of a man on social media whose appearance and tattoos matched the phone-snatcher. Police got a warrant for his arrest. Lagarde added that name and face to Project NOLA’s watch list, and a few days later, cameras automatically identified him in the French Quarter and alerted police, who found and arrested him. The man was charged with robbery but pleaded guilty to the lesser offense of theft, court records show.

The police report mentioned that Lagarde helped identify the suspect, but did not mention that he used facial recognition to do so, or that live facial recognition and automated alerts were used to monitor for and locate him.

David Barnes, a New Orleans police sergeant overseeing legal research and planning, said officers are trained to always find probable cause before making an arrest. He said Lagarde sometimes overstates in Facebook posts the role his technology played in some of the cases. He said the detective investigating the phone-snatching case was only asking Lagarde to find videos of the suspect, not the location of the suspect.

On a rainy May morning outside the Three Legged Dog, a Project NOLA camera swiveled about, blinking red and blue lights, and twitching side to side as it followed cars and people based on an automated program. The camera is no longer pinging the police on an app — at Kirkpatrick’s request.

“Like you and everybody else, I do not want to lose any cases of violent criminals based on policy violations or violations of our ordinances,” Kirkpatrick said in her email last month to Lagarde.

But the alerts still go to Project NOLA staff, who Lagarde said convey the location of wanted suspects to the police via phone calls, texts and emails.


Also:


New Orleans used "Minority Report"-like facial recognition software to monitor citizens for crime suspects

New Orleans police have secretly been using facial recognition software to monitor citizens in an effort to identify crime suspects, according to a new report.

And while police across the country are increasingly using such technology, an investigation by The Washington Post has found that police in New Orleans have taken it a step further, potentially violating a city ordinance that was designed to prevent false arrests and safeguard their citizens' civil rights.

The cameras belong to Project NOLA, a nonprofit that operates a private network of more than 200 facial recognition cameras that scan the streets for wanted suspects and automatically send the information to cops’ phones through an app when a possible match is detected, The Post reported.

Bryan Lagarde, a former cop who founded the surveillance video company, had previously said he wanted to help police more closely monitor the city's “crime-heavy areas.”

Police were only supposed to use the software to find “specific suspects in their investigations of violent crimes,” according to the city ordinance.

But The Post’s investigation found, citing court records, that these cameras “played a role in dozens of arrests,” though most uses were never disclosed in police reports and were “not included in the department’s mandatory reports to the city council.”

If an officer wants to scan a face, the ordinance requires police to send a still image to a state-run “fusion center” in Baton Rouge, where various law enforcement agencies collaborate on investigations, The Post reported.

Examiners trained in identifying faces use AI software to compare the image with a database of photos, and return a “match” only if at least two examiners agree, before cops approach suspects.

“This is the facial recognition technology nightmare scenario that we have been worried about,” Nathan Freed Wessler, deputy director of the American Civil Liberties Union (ACLU) Speech, Privacy, and Technology Project, told The Post.

“This is the government giving itself the power to track anyone — for that matter, everyone — as we go about our lives walking around in public.”

The program was paused in April, according to New Orleans Police Department Superintendent Anne Kirkpatrick, who told The Post that she would be conducting a review and that all automated alerts would be turned off until she is “sure that the use of the app meets all the requirements of the law and policies.”

Project NOLA promoted its cameras for playing a role in the capture of one of 10 inmates who escaped from a Louisiana jail just last week.

But the ACLU is calling for the department to “halt the program indefinitely and terminate all use of live-feed facial recognition technology.”

“We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies. These individuals could be added to Project NOLA's watchlist without the public’s knowledge, and with no accountability or transparency on the part of the police departments,” said Alanah Odoms, Executive Director of the ACLU of Louisiana.

“Facial recognition technology poses a direct threat to the fundamental rights of every individual and has no place in our cities. We call on the New Orleans Police Department and the City of New Orleans to halt this program indefinitely and terminate all use of live-feed facial recognition technology. The ACLU of Louisiana will continue to fight the expansion of facial recognition systems and remain vigilant in defending the privacy rights of all Louisiana residents.”