Subtitle: A story of where I used to work. Power. Greed. Madness. Read by the author. Published in 2025.
This is a book about Facebook, but it starts so slowly, is narrated rather boringly, and is so wordy - why do I need to know the childhood story of the shark attack? - that I skipped ahead randomly after the detailed account of the New Zealand prime minister's visit to Facebook HQ for a photo op with Mark Zuckerberg. I landed in chapter 46 - Myanmar. Now this was shocking. I knew that Facebook had played a role in the genocide - I knew that those who incited it used Facebook - but I didn't know how important Facebook was in enabling it, starting with:
- Early on, Facebook made deals with local cell phone providers, making access to the Facebook website unmetered, which in practice meant that for most people in Myanmar, Facebook was the Internet.
- Facebook let local users use unofficial Facebook apps, which did not have a way of reporting hate speech.
- Facebook did not have any local physical presence.
- Facebook did not publish community guidelines in Burmese.
- In 2014, Facebook had only one Burmese-speaking content moderator, a contractor based in Ireland.
- The head of Facebook's content policy denied there was any problem with accurate and timely moderation.
- Facebook was not interested in developing Burmese language support, which would have helped with moderation, even after Facebook posts caused riots in 2014.
- In 2015, Facebook hired a second content moderator in Ireland, who, if I understood correctly, removed posts by peace activists and civil society groups rather than anti-Muslim hate speech, possibly in cahoots with the junta.
- A month before the November 2015 elections, Facebook removed opposition candidates' accounts; Sarah's team reinstated them after being alerted by NGOs from Myanmar (her team had set up clandestine meetings with civil society organisations there).
- To moderate Myanmar properly, Facebook would have needed to hire hundreds of moderators, as it does in other countries. It hired two extra for the election period, also based in Dublin.
...
Sarah writes that to have the best chance of being hired for a senior position at Facebook, you should be male, older, white, a Harvard graduate, and friends with the few people at the very top. An old Republican Capitol Hill veteran once told Sarah: "Sarah, you know your boss Joel? He's a Jew who went to Harvard. [...] And his boss, Elliot? A Jew who went to Harvard. And his boss? A Jew who went to Harvard. And her boss? [she answers:] A Jew who dropped out of Harvard? [...] You are not like these people, and you'll never be like them, and the sooner you grasp this, the better." Sarah took that advice to heart and tried to hire a very good candidate, a Harvard man, hoping to win senior management's approval for expanding the Myanmar operation. Facebook didn't hire him.
The military lost the 2015 election, but did not give up.
In 2017, Facebook's security operations team and civil society groups reported that verified accounts with large followings in Myanmar were being hacked and used to spread hatred and fear, so that people would demand protection by the military. Facebook's algorithm promoted these posts because they received a lot of engagement. Sarah's team escalated them to Facebook's content management and legal teams, but they refused to take them down, arguing that they didn't violate local laws.
In late August 2017, the military launched a campaign of atrocities against the Muslim population. At least 10,000 people were murdered, and over 700,000 Muslims fled the country. The UN later recognised it as a genocide. Reporter Paul Mozur wrote in The New York Times that the military had at least 700 people spreading misinformation and hate on Facebook. They took over verified accounts of celebrities, fans, a military hero, etc. to pump out false, inflammatory posts. Troll accounts run by the military helped spread the content, shut down critics, and fuelled arguments between commenters to rile people up.
Sarah's conclusion is that none of her bosses at Facebook - Joel, Elliot, Sheryl, Mark - gave a f*ck.
----------
Update:
I've now listened to the chapters that I originally skipped. I got used to the narration. The story with the shark is referenced later, so I understand now why it's there. The breastfeeding dramas are super weird to me, not because I'm a man, but because breastfeeding is a natural, harmless thing, and Facebook's policy prohibiting employees from breastfeeding in public is cruel. My wife breastfed our kids in public: on a plane, in a park, anywhere you can sit down. A few months ago I saw a woman breastfeeding her baby on a crowded train platform here in Australia. Nothing wrong with that.
It's a very good book overall, and very important for understanding Facebook and global politics. It also matters that the author reads it herself: it makes it feel more like a witness statement. If you need convincing to stay away from anything run by Meta, this book will help.