Rite Aid settles with FTC after its facial recognition system seemed to keep identifying black, Latino, and Asian people as "likely shoplifters"
· Dec 28, 2023 · NottheBee.com

From 2012 to 2020, drugstore giant Rite Aid used facial recognition software in its security system to try to cut down on shoplifting.

The system was supposed to flag repeat shoplifting threats and send a "match alert" to employees if any of those threats entered any Rite Aid store.

Employees were prompted to take one of several actions, depending on the individual:

(i) "Approach and Identify,"

(ii) "Observe and Provide Customer Service,"

(iii) "Pharmacy Patient - Escort to Pharmacy," and

(iv) "911 Alert" or "Potentially Violent - Notify Law Enforcement and Observe."

For enrollments with the instruction "911 Alert," employees were told to "call 911 and notify [the police that] a potentially violent or dangerous subject has entered the store."

Now all of that would be great, if it worked.

The trouble is that the facial recognition software was running off of Rite Aid's CCTV cameras, which aren't always crystal clear, and for some reason that led to a large number of misidentifications.

It turns out that law-abiding citizens get upset when they're accused of being criminals in front of their friends and family members. Those complaints then led to an FTC investigation.

"The result was sadly predictable: thousands of misidentifications that disproportionately affected Black, Asian, and Latino customers, some of which led to humiliating searches and store ejections," said John Davisson, EPIC's director of litigation.

While the monetary terms of the settlement will have to be approved by a bankruptcy court, since Rite Aid filed for Chapter 11 last fall, the FTC and Rite Aid have agreed that the company will be barred from using facial recognition software for five years and will have to train its employees on the responsible use of the technology in the future.

But short of some major technological advancements, I don't think that training will improve their outcomes at all.

Studies have repeatedly shown that facial recognition is not good at distinguishing between people with darker skin.

Some woke studies say it's because there are too many people with darker skin in the criminal databases, so it confuses the AI. Other woke studies say it's because there are too few people with darker skin in the training sets the AI uses.

The less-woke scientific studies say it's because of the properties of light that facial recognition systems rely on to compare features. It turns out that melanin absorbs light, allowing less to reflect back to a camera lens. That property is what gives skin its darker hue, but it also obscures subtle differences in facial features.

The non-woke libertarian studies are so freaked out by the thought of facial recognition AI that their researchers have all disappeared and are hiding off the grid somewhere.

The lawyers who brought the suit against Rite Aid hope this settlement will at least curb the use of the technology a bit.

"This is a groundbreaking case, a major stride for privacy and civil rights, and hopefully just the beginning of a trend," Davisson said. "But it's important to note that Rite Aid isn't alone. Businesses routinely use unproven algorithms and snake oil surveillance tools to screen consumers, often in secret. The FTC is right to crack down on these practices, and businesses would be wise to take note. Algorithmic lawlessness is not an option any more."

Maybe.

But my guess is that the woke tech bros will just overcorrect and pull a Better Off Ted, making it so their software doesn't see color at all.

