A new book digs deep into the world of the largest social network, its founder, and the company's struggles with privacy and freedom of speech
Facebook Chief Executive Officer and founder Mark Zuckerberg at a Dublin hotel in April 2019, after meeting with Irish politicians to discuss regulation of social media, transparency in political advertising and the safety of young people and vulnerable adults
Still, it was clear that Zuckerberg did not want the responsibility of policing the speech of more than 2 billion people. He wanted a way out, so he wouldn't have to make decisions on Alex Jones and hate speech, or judge whether vaccines caused autism. "I have a vision around why we built these products to help people connect," he said. "I do not view myself or our company as the authorities on defining what acceptable speech is. Now that we can proactively look at stuff, who gets to define what hate speech is?" He hastened to say that he wasn't shirking this responsibility, and Facebook would continue policing its content. "But I do think that it may make more sense for there to be more societal debate and at some point even rules that are put in place around what society wants on these platforms and doesn't."
As it turns out, Zuckerberg was already formulating a plan to take some of the heat off Facebook for those decisions. It involved an outside oversight board to make the momentous calls that were even above Mark Zuckerberg's galactic pay grade. It would be like a Supreme Court of Facebook, and Zuckerberg would have to abide by the decisions of his governance board.
Setting up such a body was tricky. If Facebook did it completely on its own, the new institution would be thought of as a puppet constrained by its creator. So it solicited outside advice, gathering a few hundred domain experts in Singapore, Berlin, and New York City for workshops.
After listening to all these great minds, Facebook would take the parts of the recommendations it saw fit to create a board with the right amounts of autonomy and power.
I was one of 150 or so workshop participants at the NoMad Hotel gathering in New York City's Flatiron district. Sitting at tables in a basement ballroom were lawyers, lobbyists, human rights advocates, and even a couple of us journalists. For much of the two-day session we dug into a pair of individual cases, second-guessing the calls. One of them was the "men are scum" case that had been covered a few times in the press.
A funny thing happened. As we got deeper into the tensions of free expression and harmful speech, there was a point where we lost track of the criteria that determined where the line should be drawn. The Community Standards that strictly determined what stood and what would be taken down were not some Magna Carta of online speech rights but a meandering document that had evolved from the scribbled notes of customer support people barely out of college.
The proposed board would be able to overrule something in that playbook for the individual cases it considered, but Facebook provided no North Star to help us draw the line—just a vague standard touting the values of Safety, Voice, and Equity. What were Facebook's values? Were they determined by morality or dictated by its business needs? Privately, some of the Facebook policy people confessed to me that they had profound doubts about the project.
I could see why. For one thing, the members of this proposed body—there will be forty members, chosen by two people appointed by Facebook—can take on only a tiny fraction of Facebook's controversial judgment calls. In the first quarter of 2019, about 2 million people appealed Facebook content decisions. Facebook would have to abide by the decisions on individual cases, but it would be up to Facebook to determine whether the board's decisions would be regarded as precedent, or simply limited to the individual pieces of content ruled on, because of expedience or because they were lousy calls.
One thing seems inevitable: an unpopular decision by a Facebook Supreme Court would be regarded just as harshly as one made by Zuckerberg himself. Content moderation may be outsourced, but Facebook can't outsource responsibility for what happens on its own platform. Zuckerberg is right when he says that he or his company should not be the world's arbiter of speech. But by connecting the world, he built something that put him in that uncomfortable position.
He owns it. Christchurch and all.
Excerpted with permission from Facebook: The Inside Story by Steven Levy, published by Penguin Random House UK
