Africa shies away from reporting Facebook violations

Fadzai Madzingira, public policy associate manager for content at Facebook.

Social media giant Facebook says fewer people in Sub-Saharan Africa report violations of the social network’s rules of engagement.

Facebook says reporting violations helps the platform maintain its defined community standards; the company does not allow anything that goes against these standards.

It has in recent years been cracking down on content that violates its community standards and continuously encourages reporting of any violations on its platforms, Facebook and Instagram.

Additionally, the company says it has invested in technology, processes and people to help it act quickly so violations of the standards affect as few people as possible.

In Sub-Saharan Africa, however, reporting of such violations is lower than in other parts of the world, according to Fadzai Madzingira, public policy associate manager for content at Facebook.

Speaking to ITWeb last week on the side-lines of a Facebook content workshop in Nairobi, Kenya, Madzingira said: “Something that we have realised over the past years is that when it comes to reporting [violations], Sub-Saharan Africa has fewer numbers compared to the rest of the world.”

This, she noted, could be a result “of multiple reasons; people may not know about the community standards, or they don’t know how to report, and they could also be worried about what it means to report.

“When we engage with the people in the region, we let them know they can report anything that goes against our community standards, report posts, video, comments on Facebook and Instagram.

“We also let them know that reporting is confidential. The nature of our community standards is to create a safe platform. The safer the platform is, the more openly people will be able to express themselves.

“Something to keep in mind is that we have more than 35 000 content reviewers globally who are involved in ensuring the safety of our platform.”

Facebook, the world's biggest social network, earlier this year pledged stronger controls following a scandal-ridden 2018.

Some of the scandals that hit Facebook in 2018 include data privacy issues, election interference, as well as the spreading of fake news.

To stop the spread of harmful content, Facebook has also built artificial intelligence systems to automatically identify and remove content related to terrorism or hate speech, which Madzingira said helps enforce community standards.

Facebook also announced it had opened a content review centre in Nairobi, staffed by over 100 content reviewers.

Madzingira said Facebook has added new local language support for several African languages as part of its third-party fact-checking programme.

The initiative helps to assess the accuracy of news on Facebook and aims to reduce the spread of misinformation.

Facebook has partnered with Africa Check, Africa's first independent fact-checking organisation, to expand its local language coverage.

The company’s fact-checking programme relies on feedback from the social network’s community as one of many signals Facebook uses to surface potentially false stories to fact-checkers for review.

Local articles will be fact-checked alongside the verification of photos and videos. If one of Facebook’s fact-checking partners identifies a story as false, Facebook will show it lower in News Feed, significantly reducing its distribution.
