How Police Access Your Social Media Photos: A Comprehensive Guide

The Closing Argument is a weekly newsletter from The Marshall Project that explores a pivotal criminal justice issue.

You have likely encountered facial recognition software in your everyday life, as many social media accounts and mobile devices employ this technology. Recently, my phone’s photos app made a special slideshow with photos I had taken of myself and a close friend, all chosen by AI and matched to an upbeat music track.

Retailers are increasingly using facial recognition to deter shoplifting. Madison Square Garden in New York City has recently drawn criticism for using the same technology to deny entry to lawyers involved in litigation against the arena. The technology is no longer a novelty; it is being rapidly incorporated into public life.

Hoan Ton-That, CEO of facial recognition firm Clearview AI, recently told the BBC that law enforcement in the U.S. has conducted over 1 million searches through its platform.

Clearview’s technology is primarily marketed to local law enforcement. It combines facial recognition algorithms (which many companies provide) with a database of over 30 billion images taken from the web – mainly from social media sites – without permission from those in the pictures.

Last week, The New York Times reported on Randal Quran Reid’s ordeal. After being pulled over in Georgia in November 2022, he was arrested for a crime allegedly committed in Louisiana – a state he had never visited. Clearview’s technology had misidentified him. He spent six days in jail and thousands of dollars in legal fees before the matter was resolved.

Reid says:

“Imagine you’re living your life and somewhere far away says you committed a crime. And you know you’ve never been there.”

One of the most disturbing facts in Reid’s case is that the court documents used to take him into custody, including the judge-signed warrant, never mention that facial recognition technology was employed.

Historically, these algorithms have performed worse on darker-skinned people than on White people. Others argue that the technology is rapidly improving and that concerns about racial bias may be overblown.

Wired magazine has chronicled the growth of policing technology, and it recently featured the accounts of three African American men who were wrongfully arrested, an experience that caused them financial and psychological harm.

Michael Oliver says:

“Once I got arrested and I lost my job, it was like everything fell, like everything went down the drain.”

In Detroit, two such cases have prompted police to tighten their standards for using facial recognition in criminal investigations. Former Police Chief James Craig and other law enforcement officials typically justify the practice by saying officers rely on it only to generate investigative leads, not to decide who is taken into custody.

A recent report from the Georgetown Law Center on Privacy and Technology highlighted this technology’s implications:

“In the absence of case law or other guidance, it has in some cases been the primary, if not the only, piece of evidence linking an individual to the crime.”

Sidney Holmes of Florida was recently exonerated after spending more than 30 years in prison, after prosecutors concluded that a witness had wrongly identified him in a lineup. Mistaken identification has plagued the justice system for decades, and the Innocence Project cites it as the leading contributor to wrongful convictions. Facial recognition technology is only one piece of this puzzle.

A 2022 report in the Duquesne Law Review warned that wrongful convictions may follow if police fail to adopt adequate safeguards when relying on either identification method – eyewitness testimony or facial recognition.

In at least one instance, the technology was used to show that a person charged with a crime was innocent. After The New York Times covered that case, Clearview AI made its product available to public defenders, a move the company’s Ton-That described as restoring fairness to the justice system.

Critics doubt the technology can ever remedy wrongful arrests as readily as it produces them. The European Union, Australia, and Canada have all concluded that Clearview’s technology violates their privacy laws. As part of a 2022 settlement for violating Illinois state law, Clearview agreed not to sell its database to private businesses.

In Russia and China, the state has used facial recognition as a tool of control. The U.S. government is also exploring its use in law enforcement, through agencies such as the FBI and Immigration and Customs Enforcement. One result is an app that asylum seekers must use to track their applications.

A 2021 government watchdog report uncovered widespread use of facial recognition technology across federal agencies, many of which did not know which systems their employees were using. The finding prompted congressional hearings and proposed legislation, but no concrete steps have been taken.

The digital age has brought new challenges surrounding social media photos and law enforcement. Individuals should understand the risks and take steps to protect their privacy on social media platforms, and law enforcement agencies must be transparent and accountable in how they use these photos so that individual rights are respected. Staying informed and proactive is the best way to navigate this complex intersection.

Source: The Marshall Project
