How did Myrtle Beach, SC, police use facial recognition software?
The Myrtle Beach Police Department used facial recognition software in the course of its police work for more than a year, records obtained by The Sun News reveal. The software collects and stores massive amounts of photographs and other information posted online.
The Myrtle Beach police’s use of the software, created by the tech company Clearview AI, means that officers potentially had access to photos residents and visitors posted, or appeared in, on the internet but never intended to share with law enforcement. Though officers used the program on a trial basis and never paid for it, that trial period gave Myrtle Beach police access to Clearview AI’s search engine and investigative tools, which collect photos posted to Instagram, Facebook, Twitter and other parts of the internet and store them in a massive database along with other information. When police officers plug a photo of a person into the software, Clearview AI pulls up possible matches from its database, allowing officers to identify people they might not otherwise have been able to.
Records obtained by The Sun News via a Freedom of Information Act request, in addition to other reporting, reveal Myrtle Beach officers searched Clearview AI’s database dozens of times for people they suspected of crimes. In those cases, police officers were able to take a photo of a person they believed committed a crime, submit that photo to the software, and use information pulled from the database.
Cpl. Thomas Vest, a spokesperson for the police department, said no arrests were made after officers used the software.
Emails obtained by The Sun News show the department used Clearview AI’s program from February 2020 through April 2021. Vest confirmed the department used the program during that time period.
On the whole, Vest said, Myrtle Beach police used the Clearview AI software as part of their police work but ultimately decided not to buy it from the company.
“Our intelligence unit and officers who worked with them were authorized to have access to the program as a trial,” Vest wrote in an email in response to questions from The Sun News. “After the trial we chose not to purchase the program, and that was the end of our involvement. In our experience, investigative tactics utilizing technology we currently have has been successful at identifying and holding persons who commit crimes in our city accountable.”
Still, said advocates and others, the police department’s use of the technology raises questions about what constitutes privacy online, and whether law enforcement agents should have ready access to photos and other information people post on personal websites or social media accounts.
“We have certain rights in this country where we can go about our daily lives without surveillance…what this in effect does is just completely undermine that,” said Frank Knaack, the executive director of the American Civil Liberties Union in South Carolina. “We don’t know how they’re using it. There are no state checks on it, there are no court checks on law enforcement being able to use this technology. Officers could spy on people they want to keep an eye on.”
How Clearview AI works, and how Myrtle Beach police used it
In its marketing materials, Clearview AI claims that its software can be a powerful tool for tracking down suspected criminals when all a police officer has is a photo of the person. In one example touted by the company, police officers in Las Vegas were alerted that a man was sharing child pornography over the internet and had only a photo of the suspect to go on. Officers, the company says, were able to plug that photo into Clearview AI’s search engine, which collects massive amounts of photographs posted online, and got a match in return: The man appeared in the background of someone else’s Instagram post. From there, officers were able to track the man down and arrest him.
Clearview AI has also boasted that its technology was used by federal authorities to track down rioters who stormed the U.S. Capitol on Jan. 6. In those cases, officers could take screenshots of video recordings from inside the building, plug those images into Clearview AI’s search engine, and get results that could help identify the person.
It appears Myrtle Beach police used Clearview AI in a similar way.
The emails obtained by The Sun News show that Cpl. Chris Tyndall, a detective in the Myrtle Beach police’s intelligence unit, created an account with Clearview AI in February 2020 and began using the software then. About a month later, in March, Tyndall invited Lt. Chris Smith, an investigations supervisor, to create an account himself, which Smith appears to have done. Tyndall was the primary user of the Clearview AI software, Vest said.
Between February 29, 2020, and April 3, 2021, the emails show, a user within the Myrtle Beach Police Department logged into the Clearview AI program 13 times. The majority of the logins occurred in the first half of 2020, between February 29 and May 11, though there were two additional logins in November 2020 and April 2021.
Though the logins to the software are piecemeal, the emails also suggest officers used the program regularly and into 2021. In October 2020, for example, police detective Bryan Stillwell emailed photos of people to Tyndall and asked him to run the photos through Clearview AI’s software. Stillwell made a similar request in January 2021, the emails show.
Data obtained by BuzzFeed News, which first reported that hundreds of local police departments across the country had used Clearview AI, shows that Myrtle Beach police used the software between 101 and 500 times during its trial. In messages sent to users, Clearview AI encourages law enforcement officers to conduct as many searches as they can with the software.
“Don’t stop at one search. See if you can reach 100 searches. It’s a numbers game,” a company message sent to Smith when he set up his account said. “Our database is always expanding and you never know when a photo will turn up a lead.”
BuzzFeed News’ data shows that 30 other South Carolina law enforcement agencies used trials of Clearview AI’s software, including the Georgetown County Sheriff’s Department, the Charleston County Sheriff’s Department and the Beaufort police. The state attorney general’s office and SLED also used the technology. The Charleston County Sheriff’s Department and the Spartanburg police used the Clearview AI software most frequently, tallying between 501-1,000 and 1,001-5,000 searches, respectively.
“The program was used during investigations on persons involved or believed to be involved in crimes,” Vest said. “It was used alongside standard investigative methods, and the results were not used to make arrests.”
By early January, police officials began considering whether to purchase a subscription to the software, though the department ultimately opted not to, Vest said.
During the agency’s trial period with Clearview, Vest said, the chief of police was informed that officers were using the technology, but Mayor Brenda Bethune and members of city council were not. Bethune said it’s her understanding that the software was only used on a trial basis and was never purchased by the city.
“Had we continued the use of the program and intended to purchase the program, we would share the results of the trial and justification for purchase with the City Leadership Council and the Mayor,” Vest said. “We chose not to purchase the program, and our involvement with the company ended there.”
Ethical concerns about facial recognition software
Generally, artificial intelligence software works by collecting a massive amount of data into a centralized database, then deploying code to detect patterns in that data and draw conclusions based on those patterns. Speech recognition software, like the voice input feature in Google Translate, is a good example of how an artificial intelligence program can take a massive amount of data (many users all pronouncing the same words slightly differently) and conclude which words best match what a person is saying. The “conclusions” reached by the program are then added back into the central database, allowing the program to “learn” as it keeps running.
Artificial intelligence software can also be used to analyze photographs and find patterns in the images. Clearview AI’s facial recognition software falls under that category.
Feng Luo, a professor at Clemson University and the founding director of Clemson’s AI Research Institute for Science and Engineering, explained that facial recognition software works by having a computer program recognize patterns in the images, or faces, it’s reading. Those patterns could include everything from how a person’s eyes and nose are shaped, to their facial hair, to the bone structure of their face, based on shadows, creases and indentations visible in a photograph.
“The general idea is to do a classification tool, you want to classify the images into different categories,” Luo said. “We can easily separate the features…(but) it’s a tough task. You need a large training dataset.”
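Luo’s description of facial recognition as a classification task can be illustrated with a toy sketch. The feature names, numeric values and nearest-neighbor approach below are illustrative assumptions for explanation only, not Clearview AI’s actual method:

```python
import math

# Toy "face feature" vectors: each face reduced to a few numeric
# measurements (e.g. eye spacing, nose width). Values are made up.
known_faces = {
    "person_a": [0.42, 0.31, 0.77],
    "person_b": [0.15, 0.62, 0.30],
    "person_c": [0.80, 0.10, 0.55],
}

def euclidean(u, v):
    """Distance between two feature vectors: smaller means more alike."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def classify(query, threshold=0.2):
    """Return the closest known identity, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, features in known_faces.items():
        d = euclidean(query, features)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

print(classify([0.44, 0.30, 0.75]))  # near person_a's features
print(classify([0.0, 0.0, 0.0]))     # resembles no one in the set
```

Real systems replace these hand-picked numbers with features learned from millions of labeled photos, which is why, as Luo notes, a large training dataset is essential.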
Clearview AI, according to the company, gathers photos and other information publicly available on the web, from public social media accounts to mugshot websites, and then deploys a facial recognition program to categorize those images and match them to photos police officers submit. That means that if police have a somewhat blurry photo of a person they suspect of committing a crime, perhaps taken from a security camera, they can plug that photo into Clearview AI’s software and receive possible matches in return.
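The search-engine side of that workflow can also be sketched in miniature. The URLs, embedding vectors and cosine-similarity ranking here are hypothetical stand-ins, not a description of Clearview AI’s internals:

```python
import math

# Hypothetical index: precomputed face embeddings keyed by the public
# URL each photo was collected from. All values are invented.
index = {
    "https://example.com/post/1": [0.90, 0.10, 0.40],
    "https://example.com/post/2": [0.20, 0.80, 0.50],
    "https://example.com/post/3": [0.85, 0.15, 0.45],
}

def cosine(u, v):
    """Similarity between two embeddings: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norms = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norms

def search(query_embedding, top_k=2):
    """Rank every indexed photo by similarity to the submitted photo."""
    ranked = sorted(index.items(),
                    key=lambda item: cosine(query_embedding, item[1]),
                    reverse=True)
    return [url for url, _ in ranked[:top_k]]

# A blurry security-camera still, reduced to the same embedding space:
print(search([0.88, 0.12, 0.42]))
```

An officer reviewing the returned URLs, rather than the software itself, would decide whether any candidate is actually the same person.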
But that type of software, critics have said, is ripe for bias and misuse. Because there are so many minute differences between people’s faces, and because the quality, angle and lighting of a photo can vary, facial recognition software can easily mistake one person for another. And in the criminal justice system, that can mean people of color are more frequently flagged as matches by the software, especially if it is drawing from mugshot photos, which more frequently feature Black and brown individuals.
“If you don’t have enough data, your data is biased,” Luo explained. “If you only have a picture from one group, the tools can’t apply to the other groups.”
Hoan Ton-That, the CEO of Clearview AI, said in an email that his company’s software only gathers publicly available data and that independent studies have found “no racial bias” in the software.
“Clearview AI’s software only searches publicly available information that is available to any person with a computer and internet connection and does not search any private data,” he wrote. “As a person of mixed race, creating a non-biased technology is important to me.”
Critics of facial recognition software say that even if police use the technology with good intentions, its use essentially allows a government agency to surveil the public based on photographs that appear online.
“The issues with facial recognition are many. It’s a very powerful surveillance tool that allows for the mass surveillance of people which can be done without their knowledge,” said Jeramie Scott, senior counsel for the Electronic Privacy Information Center in Washington, D.C., an organization that supports digital privacy and engages in litigation against government agencies and other groups to prevent the spread and use of technologies like facial recognition software.
Facial recognition software, Scott said, “essentially creates a digital ID that government and law enforcement can control.” Even if a police department didn’t pay for a subscription to Clearview AI, Scott said, the agency may still have records from the program about people they searched for in the database.
“Presumably if they used it to identify someone they would have the hits that came back,” Scott said. “I assume they probably keep that information in the investigative file.”
In response to questions about individual privacy, Vest said Myrtle Beach police took those concerns seriously and didn’t use Clearview to surveil the public, only to search for people suspected of crimes.
“We take the safety of our community seriously and understand that individual privacy is part of that commitment,” Vest said. “We understand there are privacy concerns with emerging technology, and we do not take those concerns lightly. We hold our officers to a high standard of accountability and integrity and expect our community to do so as well.”
Ton-That said the company didn’t intend for law enforcement agencies to use its software to surveil the public.
“Clearview AI’s database contains only publicly available information, not any private information, and is used for the after-the-crime investigations, not for real-time surveillance,” Ton-That wrote in an email. “Clearview AI’s software only searches publicly available information that is available to any person with a computer and internet connection and does not search any private data.”
Could Clearview AI be banned?
While police agencies around the country regularly seek to use new technologies to aid their police work, those tools can cross privacy boundaries, Knaack, of the South Carolina ACLU, said. To illustrate that point, Knaack pointed to fingerprints: When a person goes out in public, they leave their fingerprints all over the place, but that doesn’t mean police officers can follow behind a person, gather up those fingerprints and keep track of every place they’ve been. Clearview AI’s mass collection of photos posted online does essentially that, Knaack said.
“This is the ability of law enforcement to take what we do in public and put it in a database and see everything we’re doing in public,” he said.
The ACLU nationally is currently involved in a lawsuit against Clearview AI in Illinois, alleging that the company’s technology is violating the public’s privacy rights.
But even though police using facial recognition software might raise some people’s concerns, the state legislature may not be willing to ban the use of the technology outright, said state Rep. William Bailey, R-Little River. Though some lawmakers may want to protect people’s privacy, he said, others may be wary of hamstringing law enforcement.
“I can honestly say that I see both sides of it,” Bailey, a former police officer and public safety director in North Myrtle Beach, said. “I can see it from the government using (your photos) for things you didn’t intend you to use (them for), but I can also see that you posted a photo and it’s out there.”
Short of state-wide bans on the use of facial recognition software, city and county councils may be more willing to bar that technology, Scott, of EPIC, said. In South Carolina, state Rep. Leonidas Stavrinakis, D-Charleston, introduced a bill earlier this year that would bar police officers from using biometric surveillance technology in conjunction with their body-worn cameras, though that legislation didn’t advance past a legislative committee this session.
Whether any cities or counties in South Carolina will bar the use of facial recognition software remains to be seen. Still, Bailey said, the public knowing police have access to a database that contains all the photos they’ve posted publicly online could cause some “blowback.”
“(Police) have been going on people’s Facebook pages to figure out who they are and who they’re running with. I think the rub is doing it with the database,” Bailey said. “I think that’s where you’re going to have a huge blowback.”