Facebook said Wednesday it will remove false information posted on the network that is likely to imminently incite violence, a measure the group has already tested in Sri Lanka, which was recently shaken by interfaith violence.
“We are starting to implement this new policy in countries where we see examples where misinformation has … led to violence,” said Facebook’s Tessa Lyons, citing the case of Sri Lanka.
Under the policy, the social network may remove inaccurate or misleading content, such as doctored photos, created or shared in order to contribute to or exacerbate physical violence.
Facebook will rely on local organizations or specialized agencies to determine whether such posts are likely to cause imminent violence and therefore need to be removed.
Hate speech and direct calls to violence already violate Facebook’s rules. The new policy targets a different type of content, less explicitly violent but still likely to incite violence, for review and removal.
Lyons, speaking at a meeting with reporters at the group’s California headquarters, said the policy had already been tested in Sri Lanka, which was shaken in March by anti-Muslim violence. Facebook had been harshly criticized for allowing rumors and false information to circulate that may have contributed to that violence.
The group said the change will be rolled out gradually to other countries in the coming months.