Facebook said it discovered and shut down four separate misinformation networks—three tied to Iran and one to Russia—as part of an ongoing effort to counter “foreign influence campaigns.”
Facebook said that the “coordinated inauthentic behavior” took aim at the U.S., North Africa and Latin America and included “proactive work ahead of the U.S. elections.” The company said it shared the findings with “law enforcement and industry partners.”
In a blog post Monday, the company also disclosed some adjustments to its policies meant to combat foreign influence and manipulation. Facebook will now identify posts from state-controlled media, provide more information about the country of origin for Facebook pages, and help further secure political candidates’ Facebook pages.
After questions about its failure to identify and fight disinformation and influence campaigns on its platforms during the 2016 campaign, the company has been under close scrutiny ahead of the 2020 U.S. presidential election. Facebook has gone to great lengths to assure voters and lawmakers that it’s taking misinformation more seriously—something it also did ahead of the 2018 U.S. midterms.
But the company has already drawn harsh criticism, particularly over political advertising. Facebook’s policy is that it won’t fact-check posts from politicians, including ads—a stance that has prompted complaints from Democratic candidates such as Elizabeth Warren and Joe Biden.
CEO Mark Zuckerberg said last week that he doesn’t believe it’s Facebook’s role to make decisions about posts from politicians.
“I don’t think most people want to live in a world where you can only post things that tech companies judge to be 100% true,” he said Thursday during a speech at Georgetown University. “People should be able to see for themselves what politicians are saying.”