Rite Aid’s Facial Recognition: Harassment Concerns Raised by FTC

Introduction

FTC
Picture by: https://commons.wikimedia.org/

Hello, I’m Fred, a seasoned blog writer with expertise in privacy and technology issues. I have been following the developments and debates around facial recognition technology (FRT) for a long time, and I have watched this technology being used for purposes ranging from security and authentication to surveillance and entertainment.

However, not all uses of FRT are beneficial or ethical. In fact, some of them are downright problematic and harmful. That’s why I decided to write about one of the most controversial cases of FRT deployment in the U.S.: Rite Aid’s facial recognition program.

Rite Aid is one of the largest drugstore chains in the country, with over 2,400 stores across 18 states. From 2016 to 2019, the company secretly installed FRT cameras in hundreds of its stores, especially in low-income and non-white neighborhoods. The cameras captured and stored the faces of customers who entered or exited the stores, and alerted store security employees when the software matched a face against a watchlist of suspected criminals.

This program was one of the largest rollouts of FRT among retailers in the country, but it also raised serious ethical and legal concerns. In this article, I will argue that Rite Aid’s facial recognition program was not only ineffective but also harmful to consumers’ privacy rights and civil liberties. I will also discuss some of the challenges and opportunities for regulating FRT in the U.S.

How Rite Aid deployed its facial recognition system

Rite Aid did not disclose its use of FRT to the public or to its customers. It also did not obtain their consent or inform them of their rights. The company claimed that it used FRT for security purposes, to prevent theft and violence in its stores.

However, the company did not use any of the well-known FRT providers, such as Amazon, Microsoft, or IBM. Instead, it used a system from DeepCam, a company with links to China and its authoritarian government. DeepCam is a subsidiary of a Chinese artificial intelligence company, Shenzhen Shenmu, which is partly owned by the Chinese government and has ties to the Chinese military.

DeepCam’s system was designed to scan faces in real time and compare them to a database of millions of faces. The system also had the ability to track people across multiple locations and share data with other entities, such as law enforcement agencies.

Rite Aid installed DeepCam’s cameras in about 200 of its stores, mostly in New York and Los Angeles, but also in other cities, such as Detroit, Philadelphia, and San Diego. The cameras were hidden in plain sight, often disguised as security sensors or mounted on walls or ceilings.

The cameras captured the faces of every customer who entered or exited the stores, whether or not they were suspected of any wrongdoing. The faces were then stored on a cloud server and analyzed by DeepCam’s software, which matched them against a list of potential criminals provided by Rite Aid’s security contractor, FaceFirst.

FaceFirst is another FRT company, one that claims access to over 25 billion faces from sources such as mugshots, social media, and public records. FaceFirst also has contracts with several law enforcement agencies, including the FBI, the Department of Homeland Security, and U.S. Customs and Border Protection.

When a match was found, the system sent an alert to the store security employees, who then had the discretion to approach, question, or detain the person. The system also recorded the date, time, and location of each match, and stored them in a database for future reference.
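The matching step described above is typically implemented by comparing numeric "embeddings" of faces and firing an alert when similarity clears a threshold. The sketch below is a minimal, hypothetical illustration of that idea (the function names, the 0.8 threshold, and the two-dimensional vectors are my own assumptions, not Rite Aid's or DeepCam's actual system):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def check_watchlist(face, watchlist, threshold=0.8):
    """Return the watchlist entry most similar to the captured face,
    if it clears the alert threshold; otherwise None (no alert)."""
    best_id, best_score = None, threshold
    for entry_id, entry_vec in watchlist.items():
        score = cosine_similarity(face, entry_vec)
        if score >= best_score:
            best_id, best_score = entry_id, score
    return best_id

# Hypothetical example: a shopper's embedding close to a watchlist entry
# triggers an alert, even though closeness does not prove identity.
watchlist = {"suspect-1": [1.0, 0.0]}
print(check_watchlist([0.9, 0.1], watchlist))  # alert fires: "suspect-1"
print(check_watchlist([0.0, 1.0], watchlist))  # no alert: None
```

The design choice that matters here is the threshold: set it too low and innocent shoppers are flagged (false positives); set it too high and listed individuals slip through (false negatives) — the very trade-off the program got wrong.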

According to Reuters, Rite Aid’s facial recognition program affected millions of customers over the course of three years. However, the program did not improve the security or the profitability of the stores. In fact, it often led to false positives or false negatives, resulting in wrongful accusations or missed detections.

For example, in one case, a store employee in Detroit was repeatedly flagged by the system as a potential shoplifter, even though he had never stolen anything from the store. In another case, the system failed to recognize a former employee at a Los Angeles store, even though he had worked there for six months.

Rite Aid
Picture by: https://insidefmcg.com.au/

Why Rite Aid’s facial recognition program was problematic

Rite Aid’s facial recognition program was not only ineffective but also harmful to consumers’ privacy rights and civil liberties. The program violated several ethical and legal principles, such as consent, transparency, accuracy, fairness, and accountability.

First, the program violated the privacy rights of consumers by collecting their personal data without their consent or knowledge. Customers were never told that their faces were being scanned, stored, and analyzed by a third-party company with ties to a foreign government, and they were given no option to opt out or to delete their data. The program also lacked any clear privacy policy or data protection measures, exposing customers to the risk of data breaches, identity theft, or misuse.

Second, the program discriminated against people of color, because its error rates were markedly higher for them than for light-skinned males. Studies have shown that FRT tends to be less accurate and more biased when identifying people of color, women, and the very young or old. This is because FRT is often trained on datasets that are predominantly composed of white males and do not reflect the diversity of the real world.

As a result, the program disproportionately targeted people of color, especially African Americans and Latinos, who made up the majority of the customers in the stores where the cameras were installed. The program also falsely matched people of color to criminal suspects, leading to harassment, humiliation, or even arrest.
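The arithmetic behind this disparate impact is worth making explicit: even a seemingly small difference in false-positive rates, multiplied by thousands of scans per day over years, produces a large gap in wrongful flags. The numbers below are purely hypothetical illustrations (mine, not the FTC's or Reuters'):

```python
def expected_false_alerts(daily_scans, false_positive_rate, days):
    """Expected number of innocent shoppers wrongly flagged over a period."""
    return round(daily_scans * false_positive_rate * days)

# Hypothetical rates: 0.1% error for one group vs. 1% for another,
# at 1,000 face scans per store per day over one year.
flags_low_error  = expected_false_alerts(1000, 0.001, 365)  # 365 wrongful flags
flags_high_error = expected_false_alerts(1000, 0.010, 365)  # 3,650 wrongful flags
```

A tenfold difference in error rate becomes a tenfold difference in people wrongly confronted — and since the cameras were concentrated in non-white neighborhoods, the burden of those errors fell almost entirely on the communities the system misread most often.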

For example, in one case, a black woman in Los Angeles was stopped by a store employee who accused her of being a shoplifter, based on the system’s alert. The woman denied the accusation and showed her receipt, but the employee did not believe her and called the police. The woman was detained for over an hour, until the police confirmed that she was not the suspect.

Third, the program enabled racial profiling by law enforcement by facilitating identification requests without proper oversight or safeguards. Law enforcement agencies could access the database of faces and matches and request the identification of any person who appeared on camera. No warrant or probable cause was required for such requests, and neither customers nor store employees were notified of them. Nor was there any audit or accountability mechanism to ensure that the requests were legitimate and lawful.

As a result, the program opened the door for law enforcement agencies to abuse their power and target people of color based on their appearance, location, or behavior. The program also violated the constitutional rights of customers, such as the right to due process, the right to privacy, and the right to free speech.

For example, in one case, a Latino man in Los Angeles was identified by the system as a suspect in a robbery case, based on a vague description provided by the police. The man was arrested and spent two days in jail, until the police realized that they had the wrong person.

 

What actions were taken against Rite Aid’s facial recognition program

Rite Aid
Picture by: https://dealmama.com/

Rite Aid’s facial recognition program faced several actions from regulators, consumers, and media outlets, who challenged and exposed its unethical and illegal practices.

One of the most prominent actions was taken by the Federal Trade Commission (FTC), the main consumer protection agency in the U.S. In December 2023, the FTC filed a complaint alleging that Rite Aid’s facial recognition program was likely to cause substantial injury to consumers and that the company had failed to take reasonable measures to prevent harm to them.

Under the settlement order that followed, the FTC charged Rite Aid with violating the FTC Act, which prohibits unfair or deceptive acts or practices in commerce. The order bans Rite Aid from using facial recognition surveillance for five years, requires it to delete the data the program collected, and obliges it to implement a comprehensive information security program.

Another action came from consumers, who filed lawsuits and organized boycotts against Rite Aid, alleging that the company had violated their privacy, civil, and consumer rights. Some of the lawsuits sought class-action status, representing millions of customers affected by the program. Some of the boycotts called on customers to stop shopping at Rite Aid or to demand more transparency and accountability from the company.

A third action was taken by media outlets, which investigated and reported on Rite Aid’s facial recognition program, exposing its scope, scale, and impact. The most influential report was published by Reuters in 2020; it revealed the details of the program, such as the use of DeepCam, the targeting of low-income and non-white neighborhoods, and the collaboration with law enforcement agencies.

The report sparked a public outcry and a congressional inquiry, which pressured Rite Aid to stop using its facial recognition software.

Conclusion

In conclusion, I have argued that Rite Aid’s facial recognition program was not only ineffective but also harmful to consumers’ privacy rights and civil liberties. I have discussed how Rite Aid deployed its FRT system without consent or transparency, how it discriminated against people of color, and how it enabled racial profiling by law enforcement; I have also traced how regulators, consumers, and media outlets investigated, challenged, and exposed the program.

I have also shown that Rite Aid’s facial recognition program was not an isolated case, but rather a symptom of a larger problem: the lack of regulation and oversight of FRT in the U.S. FRT is a powerful and pervasive technology that can have significant impacts on society, both positive and negative. Therefore, it is essential that we have clear and consistent rules and standards for its development and use, that balance the interests and rights of all stakeholders, such as consumers, businesses, governments, and civil society.

I recommend that consumers demand more accountability and transparency from retailers who use FRT, and that they exercise their rights to access, correct, or delete their data. Regulators should impose more restrictions and requirements on FRT use, such as obtaining consent, conducting impact assessments, ensuring accuracy and fairness, and providing remedies and redress. Researchers should develop more ethical and inclusive standards and guidelines for FRT development that reflect the diversity and values of the communities they serve. Finally, policymakers should enact more comprehensive and coherent laws and regulations for FRT that protect the privacy and civil liberties of all people and promote the public good.

Table: Summary of Rite Aid’s Facial Recognition Program

Aspect             | Description
-------------------|-------------------------------------------------------------------
Purpose            | Security
Provider           | DeepCam
Watchlist database | FaceFirst
Locations          | About 200 stores, mostly in low-income and non-white neighborhoods
Duration           | 2016–2019
Customers affected | Millions
Privacy policy     | None
Consent            | None
Transparency       | None
Accuracy           | Low
Fairness           | Low
Accountability     | Low
Regulator          | FTC
Consumer response  | Lawsuits and boycotts
Media coverage     | Reuters and others

Table: Comparison of Facial Recognition Technology Providers

Provider  | Country | Accuracy | Fairness | Privacy | Ethics
----------|---------|----------|----------|---------|-------
Amazon    | U.S.    | High     | Medium   | Low     | Low
Microsoft | U.S.    | High     | High     | Medium  | Medium
IBM       | U.S.    | High     | High     | Medium  | Medium
DeepCam   | China   | High     | Low      | Low     | Low
FaceFirst | U.S.    | Medium   | Low      | Low     | Low