Each pair of glasses is a real-time camera connected to Meta's servers.

Each activation of "Hey Meta" generates a video.

These videos go to Kenya, where human workers label what the algorithm can't process.

Read that again.

These aren't metadata. These aren't blurry clips.

These are real videos. People undressing.

Bathrooms. Sexual relations. Bank card data. Medical documents.

The workers described what they see:

→ "From living rooms to naked bodies"
→ Unintentionally captured card details
→ Cameras in doctors' offices
→ The anonymization blurring? It fails constantly

But here's what they're hiding from you:

You didn't agree to any of this.

If someone with these glasses enters your bedroom, your bathroom, your doctor's appointment, you are not a contracting party.

You didn't sign anything. You have no protection whatsoever. You are the product being annotated.

Meta's defense? "It's in the terms of service."

Technically correct. Practically irrelevant. The terms govern the data of those who use the product. Not yours.

And the company doing the annotation? Sama, the same one exposed by TIME in 2023 for paying workers around $2/hour to label graphic content for OpenAI while billing $12.50 per hour per worker. The employees themselves called the work torture.

Same workforce. Same rates. New contract: your intimate moments.

Google Glass died because people called users "Glassholes" and banned them from bars.

Meta solved the aesthetic problem.

It didn't solve the privacy problem. It hid it.

And the next generation? Facial recognition.

The same system that can't reliably blur faces will start purposefully identifying them.

The EU is already moving. European parliamentarians have submitted formal inquiries to the Commission demanding answers on GDPR compliance.

The problem is obvious: European law requires consent from data subjects. Spectators are data subjects. Spectators never consented. The entire architecture violates the regulation by design.

What to watch for: if Brussels formalizes enforcement, Meta faces an impossible choice.

→ Disable human review in Europe
→ Cripple its AI training pipeline
→ Accept fines that could reach billions
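The "billions" figure isn't rhetorical: GDPR Article 83 caps administrative fines for the most serious infringements at the greater of EUR 20 million or 4% of worldwide annual turnover. A minimal sketch of that calculation (the turnover figure below is a hypothetical placeholder, not Meta's actual reported revenue):

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR Art. 83(5) fine:
    the greater of EUR 20M or 4% of worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical EUR 150B worldwide turnover -> EUR 6B maximum exposure
print(f"EUR {gdpr_max_fine(150e9):,.0f}")
```

For any company with turnover above EUR 500 million, the 4% branch dominates, which is why the exposure scales into the billions.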

None of these scenarios is priced into the stock.

50 million installed units permanently change the calculation.

This isn't about privacy.

This is about who controls your data without ever asking your permission.