After just over a year, Meta's Oversight Committee has finally published the results of its review of the cross-check policy. According to the Committee, the system ends up favoring companies, celebrities, politicians, and other far-reaching accounts, while harming ordinary users of Meta's social networks.
Cross-check is Meta's policy of separating a "high society" tier of influential accounts from regular users in automated content moderation. While useful for blunting coordinated troll attacks, it has proved ineffective at removing offensive content in a timely manner. The Committee's review began after revelations from whistleblower and former Meta employee Frances Haugen.
Cross-check is useful but needs improvement
The cross-check system has merit. It protects, for example, journalists in conflict zones and activists in countries ruled by dictatorships. Without cross-check, an Iranian journalist covering the country's protests could easily be suspended: a government-orchestrated flood of reports against the account would be enough.
Nonetheless, as the Committee's report points out, cross-check is too slow to remove genuinely inappropriate content. That is somewhat natural: the human moderators involved cannot stay awake 24 hours a day to judge whether a celebrity's post is inappropriate.
To illustrate the delay in moderation, the Committee cited the rape accusation against Neymar. In 2019, hours after the case came to light, the player published a video showing his entire WhatsApp conversation with the accuser, whose name appears on screen. At points, nude photos of the woman also appear.
The video stayed up for more than a day, and was of course downloaded far and wide, amassing 56 million views on Facebook and Instagram before it was taken down. Meta blamed the backlog of content awaiting review.
But if large accounts get special treatment in moderation, why are they not also prioritized in review? After all, an account with 500 followers posting nudes is quickly removed by the automated system, while illegal content posted by Neymar reaches a far larger audience.
Under Meta's policies, the punishment for Neymar's violation would have been deletion of his account. Instead, in December 2021, Meta signed an exclusive deal with Neymar for his game streams, which air on Facebook Gaming. That favoritism reinforces one of the Committee's points: the focus is not on the user.
Committee says cross-check protects business interests
The Committee's report reinforces Frances Haugen's point: Meta prioritized business over safety. And of course, Meta is a company with two social networks whose revenue comes mostly from ads. The problem is that, in the words of the company's own Committee, cross-check appears to be structured to protect commercial interests. In other words, double standards are acceptable when revenue is at stake. That sheds light on another point.
For years, Meta (and Twitter) ignored Donald Trump's violations, only taking action after the Capitol was stormed on January 6, 2021. Five people died in the attack, the most violent of the attempts encouraged by Trump.
The attack on American democracy, the most established in the Americas and the second-oldest in the world, made the social networks worry that advertisers and users might leave, since no one wants to invest in or remain in a toxic environment. So it made sense to permanently ban Donald Trump, a measure the Committee opposed.
The Neymar case, on the other hand, did not hurt Meta. Throwing away the chance to have him as an exclusive Facebook Gaming streamer would have.
And, to repeat, Meta is a company and is bound to think commercially. The problem is that content moderation of large accounts should be more efficient and decisive, especially since a human team is assigned to assess these cases. Even though the police investigation against Neymar was closed for lack of evidence, he still exposed the accuser's name and intimate images. Meta's policies are very clear on this, yet they were ignored.
Alan Rusbridger, a member of the Oversight Committee, told The Verge:
I really believe that there are many people at Meta who believe in the values of free speech, protecting journalism and protecting civil society. But the program they created isn't doing that. It's protecting a limited number of people who don't even know they're on the list [of cross-check members]
Meta is on the right track with cross-check, but the system needs to be restructured and to receive more investment. Social networks are still the company's flagship. However, this restructuring will take time, especially now that more than 11,000 employees have been laid off.