GasCope
Meta's Smart Glasses Eye Your Face for a KYC; Senators Flash the Privacy Bat-Signal

A trio of Democratic senators—Edward J. Markey (D‑MA), Jeff Merkley (D‑OR), and Ron Wyden (D‑OR)—fired off a strongly worded letter to Mark Zuckerberg this Tuesday, taking aim at Meta's alleged scheme to integrate facial-recognition tech into its Ray‑Ban smart glasses. The legislators cautioned that live identity-matching could pave the way for "serious risks of stalking, harassment, and targeted intimidation," a particularly chilling prospect given Meta's already massive data silo, which makes a blockchain's immutable ledger look like a disposable notepad.

The lawmakers contend the spectacles could casually photograph countless unsuspecting individuals, instantly doxx them by linking faces to names, jobs, or social profiles, and effectively shred the age-old notion of anonymity in public. It's the ultimate IRL sniping tool, turning every sidewalk into a potential privacy rug pull.

Meta's wearable gadgets are already facing intense scrutiny. Reports surfaced earlier this month that contractors in Nairobi had been sifting through highly personal video clips from the glasses—footage that included people visiting the restroom or changing clothes. One contractor bluntly told reporters, “In some videos, you can see someone going to the toilet, or getting undressed,” highlighting the glaring issue of whether subjects had any clue they were being filmed, let alone reviewed by a human halfway across the globe.

Privacy watchdogs argue that merging always-on cameras with AI-trained models transforms the glasses into a portable surveillance state. John Davisson of the Electronic Privacy Information Center noted that “the wearer of the glasses cannot consent on behalf of all of the people they are encountering,” and that using identifiable footage for model training just stacks more compliance debt onto an already shaky data-protection premise.

A data firm based in Nairobi confirmed it had analyzed such sensitive footage after Meta enlisted the Kenyan company for offshore AI training labor. An anonymous source doubled down on the creepy detail: “I don’t think they know, because if they knew, they wouldn’t be recording.” It's the digital equivalent of a hidden camera in an Airbnb, but with your face potentially feeding a global ad-targeting model.

Meta's response was a masterclass in corporate deflection, stating that some content might be filtered before human eyes see it and that a blend of automated and manual processes helps refine its systems. The company has offered no timeline for a facial-recognition feature launch, leaving everyone to wonder if this is vaporware or a very real dystopian product roadmap.

The senators are demanding clear answers on whether scanned mugs will be matched to Facebook or Instagram profiles, how—or if—bystander consent would be gathered, and if biometric data will be stored or sold. They framed it perfectly: “Americans do not consent to biometric data collection simply by walking down a public street.” After all, in the real world, there's no "I agree to the Terms of Service" pop-up hovering over your head.

Their letter, which also gestures toward wider AI-driven surveillance worries—name-checking tools from firms like Palantir—gives Meta until April 6 to craft a reply. As of now, Meta has remained radio silent on Decrypt’s request for comment, probably too busy training an AI to write the response for them.

Publisher: gascope.com
Updated: Mar 18, 2026, 06:05 UTC

Disclaimer: This content is for information and entertainment purposes only. It does not constitute financial, investment, legal, or tax advice. Always do your own research and consult with qualified professionals before making any financial decisions.

See our Terms of Service, Privacy Policy, and Editorial Policy.