Kenya: Lobbies Raise Concerns as Facebook Moderation Mechanisms Fail to Tame Militants

Nairobi — Civil society organizations have expressed concern following recent findings by the Institute for Strategic Dialogue (ISD) indicating that Meta failed to stop terrorist groups, including Al-Shabab and the Islamic State, from using its social media platforms.

The study suggested the terrorist groups succeeded to a large extent in spreading hateful terrorist content in East Africa, and in Kenya in particular.

The ISD research, released Wednesday, revealed that Facebook content moderators missed extremist content on the platform between 2020 and 2022.

Kenya-based Amnesty International and HAKI Africa argued that the revelation is worrying, especially in a region that remains threatened by terrorist attacks and at a time when Kenya heads into the August 9 polls.

"These elections are already generating malicious and hateful content designed to disinform, divide and demonize political opponents and their supporters," the two rights groups in a joint statement said.

According to the lobby groups, the ISD report comes five months after the February 2022 whistle-blower report alleging "Meta's third-party content moderation contractor Sama fueled mental trauma, intimidation, suppressed staff unionisation and may have undermined quality content moderation."

"As far back as 2021, the Wall Street Journal and others revealed in the Facebook Files that Facebook significantly under-invests in content moderation in Africa, Asia, and Latin America and exposes millions of users to disinformation, hate speech, and violent content," they said.

They argued that the use of automated systems and machine learning to detect violent and extremist content on the platform is not enough, adding that Meta must publicly commit to increasing investment in human content moderation.

They further called for the training of moderators in identifying and preventing violent extremism and hate messaging.

"While community guidelines are now available in over 60 languages, these guidelines do not include the Somali language, a critical language of users in this region. Without prioritizing the Somali language. users will not be aware of community standards and not flag harmful content," Haki Africa and Amnesty International said.

They pointed out that the time had come for Facebook to become more transparent and accountable to the public.

They appealed to Facebook to regularly record and publicly share disaggregated data on the trends, levels, and types of abuse being reported, and on its responses.

"In addition, Facebook must publicly state how many moderators they will deploy to tackle terrorist content and the languages and regions monitored," they added.

The two lobby groups further called on the ICT Ministry and the Communications Authority of Kenya to actively encourage companies to develop and publicly sign a self-regulatory Code of Practice on Disinformation.

"The Code should contain explicit public commitments to take down illegal, malicious and hateful content and actively mitigate the risks of disinformation, and perhaps most importantly, make data available to independent researchers to verify that the Code of Practice is being enforced by the companies," they said.

They noted that failure to moderate content posted by extremist groups directly threatens human rights and democracy.
