Facebook now able to remove 90% of hate speech before it is reported
The company doubled the amount of drug content removed in the fourth quarter of 2019, removing 8.8 million pieces of content

Facebook says it is able to remove almost 90 percent of hate speech content before anyone reports it, thanks to the expansion of its proactive detection technology.
Its proactive detection rate for hate speech increased by more than eight points over the past two quarters, totalling almost a 20-point increase in just one year.
Other improvements to its detection technology also allowed the company to double the amount of drug content removed in the fourth quarter of 2019, taking down 8.8 million pieces of content.
The social media giant disclosed these figures in its May Community Standards Enforcement Report, which provides metrics on how well it enforced its policies from October 2019 to March 2020.
"We are now including metrics across twelve policies on Facebook and metrics across ten policies on Instagram," Facebook said.
"When the Covid-19 crisis emerged, we had the tools and processes in place to move quickly and we were able to continue finding and removing content that violates our policies," it added.
The report introduces Instagram data in four issue areas – Hate Speech, Adult Nudity and Sexual Activity, Violent and Graphic Content, and Bullying and Harassment.
Facebook said the report included data on its efforts to combat organised hate on Facebook and Instagram. It increased the amount of content it took action on by 40 percent and raised its proactive detection rate by more than 12 points since the last report.
In addition, improvements to the technology for finding and removing content that matches existing violations in its databases helped the company take down more child nudity and sexually exploitative content on Facebook and Instagram.
Over the last six months, the company has increasingly used technology to prioritise content for its review teams based on factors such as virality and severity. It also plans to rely on technology to take action on content, including removing more posts automatically.
The Community Standards Enforcement Report is published alongside the company's biannual Transparency Report, which shares figures on government requests for user data, content restrictions based on local law, intellectual property takedowns and internet disruptions.
Facebook said it would share Community Standards Enforcement Reports quarterly in the future, which means the next report would be released in August.