
This is a paywalled article; the best way to read it is to open it in Brave and, under Shields, turn on Block Scripts. That beats most paywalls. Cheers!
Byrne had joined Meta at a time when the company was transitioning “from content-based detection to profile-based detection,” said Byrne. Screenshots of team presentations Byrne participated in show an interest in predicting dangerousness among users. One presentation expresses concern with Facebook’s transition to encrypted messaging, which would prevent authorities (and Meta itself) from eavesdropping on chats: “We will need to move our detection/enforcement/investigation signals more upstream to surfaces we do have insight into (eg., user’s behaviors on FB, past violations, social relationships, group metadata like description, image, title, etc) in order to flag areas of harm.”
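To make the “upstream signals” idea concrete, here is a minimal sketch of what profile-based flagging could look like. This is not Meta’s actual system; the signal names, weights, and threshold below are all hypothetical, chosen only to show how profile metadata (past violations, group descriptions, social relationships) might be scored without ever reading message content:

```python
from dataclasses import dataclass


@dataclass
class ProfileSignals:
    """Hypothetical upstream signals available without reading message content."""
    past_violations: int = 0            # prior policy strikes on the account
    flagged_group_memberships: int = 0  # groups whose title/description tripped a classifier
    reports_from_contacts: int = 0      # reports filed by people the user interacts with
    account_age_days: int = 365


def risk_score(p: ProfileSignals) -> float:
    """Toy weighted score; a real system would use trained models, not hand weights."""
    score = (
        2.0 * p.past_violations
        + 1.5 * p.flagged_group_memberships
        + 1.0 * p.reports_from_contacts
    )
    # Newer accounts get a small bump, since history-based signals are sparse for them.
    if p.account_age_days < 30:
        score += 1.0
    return score


def should_flag_for_review(p: ProfileSignals, threshold: float = 4.0) -> bool:
    """Flag the profile for human review if the score clears an arbitrary threshold."""
    return risk_score(p) >= threshold


if __name__ == "__main__":
    example = ProfileSignals(past_violations=1, flagged_group_memberships=2)
    print(should_flag_for_review(example))  # True: 2.0 + 3.0 = 5.0 >= 4.0
```

The point of the sketch is only that once chats are encrypted, everything the classifier sees is metadata about the person rather than what they actually said, which is exactly the shift the presentation describes.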
While Byrne and her colleagues were supposed to be preventing harm from occurring in the world, they often felt like they were a janitorial crew responding to bad press. “An article would come out, all my team would share it, and then it would be like ‘Fix this thing’ all day. I’d be glued to the computer.” Byrne recalls “my boss’s boss or even Mark Zuckerberg just like searching things, and screenshotting them, and sending them to us, like ‘Why is this still up?’” She remembers her team, contrary to conventional wisdom about Big Tech, “expressing gratitude when there would be [media] leaks sometimes, because we’d all of a sudden get all of these resources and ability to change things.”
Is this an example of another person coming out of the fog and realizing that censorship is wrong and the people doing it are the “bad guys”? Her experience at Meta, even though she did not mention state interference in speech (which is happening), was a mental-health-ruining disaster. She woke up to the problem.
0 sats \ 0 replies \ @Roll 5 Dec
Each one of us needs to learn things in our own way...