
How Facebook can become a better public forum

By Sibahle Malinga, ITWeb senior news journalist.
Johannesburg, 17 Jan 2019
Facebook has pledged stronger controls in 2019 following a scandal-ridden 2018.

Facebook should provide meaningful News Feed controls for its users and establish regular auditing mechanisms to aid democracy and free speech on its platform.

These are some of the recommendations made by academics at the University of Oxford in the UK and Stanford University in the US, in a new research report titled: "Glasnost! Nine Ways Facebook Can Make Itself a Better Forum for Free Speech and Democracy".

The report, which the authors describe as part of a process of "constructive engagement" with the social media company, identifies specific issues concerning political information and political speech, and offers recommendations on what more the company should do.

The report argues that the growing influence of Facebook - as well as other platforms such as Instagram, YouTube and Twitter - in the personal, cultural and political lives of billions of people has led to widespread concern about hate speech, harassment, extremist content, polarisation, disinformation and covert political advertising on social media.

In the last few years, the world's largest social media network, with around 2.2 billion users, has been embroiled in controversy relating to improper collection of user data and poor controls on malicious manipulation of public discourse.

Among the recommendations, the report suggests Facebook should tighten its community standards wording on hate speech; hire more content reviewers with greater contextual expertise; increase decisional transparency; expand and improve the appeals process; and create an external content policy advisory group.

Social media regulation

Facebook is facing calls for regulation from the US Congress and British privacy regulators after reports last year revealed political data firm Cambridge Analytica had harvested the personal data of millions of Facebook users without their consent and used the information for political advertising during the 2016 US presidential election.

The Cambridge Analytica scandal affected up to 50 million users and prompted several apologies from chief executive Mark Zuckerberg, who promised to take tougher steps to restrict developers' access to user information.

Lead author of the report, Timothy Garton Ash, explains that while industry-wide self-regulation should be actively pursued, attaining it will be a "long and complex task".

"In the meantime, the best should not be the enemy of the good. There is a great deal that a platform like Facebook can do right now to address widespread public concerns, and to do more to honour its public interest responsibilities as well as international human rights norms.

"Executive decisions made by Facebook have major political, social and cultural consequences around the world. A single small change to the News Feed algorithm, or to content policy, can have an impact that is both faster and wider than that of any single piece of national legislation," Garton Ash writes.

Adrian Schofield, ICT veteran and programme consultant at IITPSA, says the difficulty with regulating content on social media platforms is the near-impossible task of vetting everything before it is published.

"Even with sophisticated algorithms and massive processing capacity, not all offending material would be filtered out and the medium would lose its immediacy. This is why Internet service providers are not responsible for the content that passes through them, for example."

Hire more culturally diverse content reviewers

"Facebook still has too little content policy capacity to meet legitimate, and sometimes urgent, public interest and human rights concerns in many different countries. Similar problems have been reported in Sri Lanka, Libya and the Philippines, where content that was not just hate speech but dangerous speech was left up for too long, often with disastrous consequences," it adds.

In 2017, Facebook and Twitter faced pressure in the US and Europe to tackle extremist content on their platforms more effectively.

Facebook was accused of making it easy for thousands of Islamic State of Iraq and the Levant extremists to be introduced to one another via its 'Suggested Friends' feature.

This, according to reports, allowed militant groups to develop fresh terror networks and recruit new members to their terrorist cause.
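
The mechanism behind this accusation is easy to sketch: a friends-of-friends recommender ranks non-friends by their number of mutual connections, so members of an already tightly knit cluster keep being suggested to one another. The toy Python example below uses invented names and data and is not Facebook's actual algorithm.

```python
from collections import Counter

# Hypothetical friends-of-friends recommender -- invented data, not
# Facebook's actual 'Suggested Friends' algorithm.
friendships = {
    "amy": {"ben", "cat"},
    "ben": {"amy", "cat", "dan"},
    "cat": {"amy", "ben"},
    "dan": {"ben"},
}

def suggest_friends(user: str) -> list[str]:
    """Rank non-friends by the number of mutual connections they share."""
    mutuals = Counter()
    for friend in friendships[user]:
        for candidate in friendships[friend]:
            if candidate != user and candidate not in friendships[user]:
                mutuals[candidate] += 1
    return [name for name, _ in mutuals.most_common()]

print(suggest_friends("amy"))  # ['dan'] - amy and dan share mutual friend ben
```

Because suggestions are driven by mutual ties, a dense cluster of accounts tends to be recommended inward to itself, which is how, according to the reports, fresh networks were knitted together.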

Facebook responded to the accusations by saying it was removing 99% of content related to militant groups Islamic State and al Qaeda before users reported it.

Expand fact-checking facilities

In recent months, the social media giant has been accused of serving as a tool for spreading "fake news".

The research report suggests Facebook should continue to strive towards providing users with more contextual information about news stories.

"Research has shown that the effects of fact-checking can be complex, and some have even argued that it can be counter-productive. But an important piece of research has suggested that Facebook's recent fact-checking efforts have had success in reducing the amount of misinformation in the average user's feed.

"We welcome the fact that at the end of 2018, the little-information Context Button was launched globally and believe that significant resources should be dedicated to identifying the best, most authoritative and trusted sources of contextual information for each country, region and culture," it states.

Improvement efforts

While Facebook is seeking to implement much-needed processes for self-regulation and governance to help regain the trust of the public, politicians and regulatory authorities, there is still much room for improvement, according to the report.

Facebook told Reuters this week that it is better prepared to defend against efforts by users to manipulate the platform to influence elections which are expected to take place in India, Nigeria, Ukraine and the European Union this year. However, it did not mention the South African election, which will also take place this year.

Last year, Facebook announced Social Science One, a new project with the Social Science Research Council aimed at providing academics with Facebook data for research projects focused on democracy and elections.

Schofield explains: "As far as governance and compliance are concerned, Facebook will behave as all businesses do: pragmatically. They will do what is necessary to stay in business, or they can close their doors.

"There's no 'rocket science' behind these guidelines; they are at the heart of all media regulation structures. Regardless of the regulatory framework, all forms of media are at risk of being used to deliver propaganda, whether overtly or covertly."
