A Facebook spokesperson reached out to clarify that Facebook has removed 120,000 Facebook and Instagram posts for "attempting to obstruct voting," and has rejected 2.2 million ads for "failing to complete the authorization process."
A previous version of this article stated that Facebook had rejected the 2.2 million ads for the same reason that the posts were removed.
With the U.S. presidential election just weeks away, voter suppression efforts are in full swing on social media.
Among the prime targets for voter suppression attempts: the world’s largest social network, Facebook, and its sister platform, Instagram.
In an interview published on Sunday, Facebook’s VP of Global Affairs and Communications Nick Clegg said that the company had taken down 120,000 Facebook and Instagram posts for attempting to "obstruct voting." The company also rejected 2.2 million ads across its platforms because they attempted to target the U.S. without completing the authorization process.
That’s not all, either. Clegg also shared that the company attached warning labels to a whopping 150 million posts for spreading misinformation related to voting in the U.S. presidential election.
In his interview with the French outlet Journal du Dimanche, Clegg said Facebook employs 35,000 people responsible for platform security, which includes elections. The company also partners with 70 news media outlets to verify information.
Voter suppression campaigns have weaponized Facebook before. The Cambridge Analytica scandal is perhaps the most notable instance. The firm was hired by the Trump campaign for its digital efforts in 2016 but dissolved in 2018 following the fallout from its inappropriate data collection efforts. The firm used that data to deter Black voters from going to the polls in the 2016 presidential election.
In the years since the Cambridge Analytica scandal broke, Facebook has updated its policies to outright ban advertisements that discourage people from voting. The social media company began doing this in preparation for the 2018 midterm election and made the policy official in October 2019.
With just over two weeks until the 2020 election, Facebook has taken a number of steps to curb political misinformation. Earlier in October, the company removed a post from President Trump calling the coronavirus “less lethal” than the flu. The company also cracked down on Facebook Pages and Groups in support of the QAnon conspiracy, which claims the president is secretly waging war on a global satanic child-trafficking ring run by people in the Democratic Party.
However, while Facebook’s crackdowns on and removals of voter suppression efforts are a good thing, one can only imagine just how much of this type of content it is missing.
A recent study found numerous workarounds that evade Facebook’s fact-checking. Misinformation removed by Facebook is routinely re-uploaded with slight changes, allowing it to once again spread freely and go viral on the platform.