FTC's Rite Aid Ruling Rightly Renews Scrutiny of Face Recognition

The Federal Trade Commission on Tuesday announced action against the pharmacy chain Rite Aid for its use of face recognition technology in hundreds of stores.
The regulator found that Rite Aid deployed a massive, error-riddled surveillance program, chose vendors that could not properly safeguard the personal data the chain hoarded, and attempted to keep it all under wraps.
Under a proposed settlement, Rite Aid can't operate a face recognition system in any of its stores for five years.
EFF advocates for laws that require companies to get clear, opt-in consent from any person before scanning their faces.
Rite Aid's program, as described in the complaint, would violate such laws.
The FTC's action against Rite Aid illustrates many of the problems we have raised about face recognition, including how data collected for these systems is often insufficiently protected, and how the systems are often deployed in ways that disproportionately hurt BIPOC communities.
The company collected tens of thousands of images of individuals, many of which were low-quality and came from Rite Aid's security cameras, employee phone cameras and even news stories, according to the complaint.
The system falsely flagged numerous customers, the complaint says, including an 11-year-old girl whom employees searched based on a false-positive result.
Even if Rite Aid's face recognition technology had been completely accurate, the way the company deployed it was wrong.
Rite Aid scanned everyone who came into certain stores and matched them against an internal list.
Any company that does this assumes the guilt of everyone who walks in the door.
As we have pointed out time and again, that assumption of guilt doesn't fall on all customers equally: People of color, who are already historically over-surveilled, are the ones who most often find themselves under new surveillance.
As a result, store patrons in plurality-Black, plurality-Asian, and plurality-Latino areas were more likely to be subjected to and surveilled by Rite Aid's face recognition technology.
The FTC's ruling rightly pulls the many problems with face recognition into the spotlight.
It also proposes remedies for the many ways Rite Aid failed to ensure its system was safe and functional, failed to train employees on how to interpret results, and failed to evaluate whether its technology was harming its customers.
Lawmakers must build on this action by enacting laws that require businesses to get opt-in consent before collecting or disclosing a person's biometrics.
This will ensure that people can make their own decisions about whether to participate in face recognition systems and know in advance which companies are using them.

