Facebook gets over a million reports of violations from users every day, according to Monika Bickert, its head of policy management. Bickert shared the figure while speaking about the fine line between free speech and hate speech at SXSW’s first Online Harassment Summit on Saturday. The panel’s main topic was how far tech companies should go in deleting potentially harmful content from their platforms.
The difficult part is enforcement
Speaking on social media’s limitations, Bickert said, “You can criticize institutions, religions, and you can engage in robust political conversation. But what you can’t do is cross the line into attacking a person or a group of people based on a particular characteristic.”
She further said that crafting a policy is “tricky,” mostly because 80% of Facebook users are not from the United States and probably have divergent views on offensive and threatening content. According to her, the most challenging part is enforcement. When reviewing, Facebook gives priority to posts inciting physical harm, Bickert told CNN Money.
All reports are reviewed by trained Facebook employees. A question she is frequently asked, Bickert said, is why the social media giant doesn’t let “its world-class engineers” tackle hate speech “perfectly and proactively.”
In answer, she said, “When it comes to hate speech, it’s so contextual … We think it’s really important for people to be making that decision,” and according to her, automation will possibly play an essential role in the future.
The number of violation reports has been rising steadily as Facebook has made it possible for users to flag content from any device, the head of policy management said. However, Bickert gave no figures on what percentage of reported posts are found to be genuine violations and removed from the site.
Views of other panelists
Deborah Lauter of the Anti-Defamation League, Juniper Downs of Google, the National Constitution Center’s Jeffrey Rosen and Lee Rowland of the ACLU were also on the panel. Rosen told listeners that tech companies would have to “adopt a more European” approach to free speech, under which anything insulting to a person’s dignity can be grounds for removal.
Rowland added that most people do not know who to contact if they want to find out why their content was removed. People should understand why their speech has been taken down, she said; if they don’t, they will never know where “free speech” stops.