Facebook employees have warned for years that as the company raced to become a global service it was failing to police abusive content in countries where such speech was likely to cause the most harm, according to interviews with five former employees and internal company documents.
For over a decade, Facebook has pushed to become the world's dominant online platform. It currently operates in more than 190 countries and boasts more than 2.8 billion monthly users, who post content in more than 160 languages. But its efforts to prevent its products from becoming conduits for hate speech, inflammatory rhetoric and misinformation — some of which has been blamed for inciting violence — have not kept pace with its global expansion.
Internal company documents show Facebook has known that it hasn't hired enough workers who possess both the language skills and the knowledge of local events needed to identify objectionable posts from users in a number of developing countries. The documents also show that the artificial intelligence systems Facebook employs to root out such content frequently aren't up to the task, and that the company hasn't made it easy for its global users to flag posts that violate the site's rules.