Social media firms remained on high alert Tuesday against Election Day misinformation and manipulation efforts as polling places began closing in the US and focus turned to tallying ballots.
Seeking to avoid the problems of the 2016 campaign, Facebook, Twitter and Google-owned YouTube implemented policies aimed at heading off the spread of false information designed to sway the outcome of the election.
Facebook said it had activated a command center to monitor the platform in real time.
“Our Election Operations Center will continue monitoring a range of issues in real time — including reports of voter suppression content,” said a Facebook statement posted on Twitter.
Facebook said its election center was also tracking other issues, such as actions by supporters of President Donald Trump who surrounded campaign buses for Democrat Joe Biden.
However, some groups on Facebook were being used to share stories of going to polling places without face masks to “scare liberals away,” according to a post by Kayla Gogarty of nonprofit watchdog group Media Matters.
Meanwhile, a small group of Instagram users were shown a notice atop their feeds erroneously claiming “Tomorrow is Election Day,” in what the photo-centric social network said was a glitch.
Instagram said the notice was left over from the night before and was not cleared from a memory cache unless the app was restarted.
And a #stopthesteal hashtag was being used on social media posts tailored to cast doubt on the voting process.
“This is a targeted disinformation campaign against Black voters and the platforms are not doing anything to shut that down,” said social analyst Shireen Mitchell.
Facebook reiterated that it would place warning labels on any posts that seek to claim victory prematurely.
“If a presidential candidate or party declares premature victory, we will add more specific information in the labels on candidate posts, add more specific information in the top-of-feed notifications and continue showing the latest results in our Voting Information Center,” the social giant said.
– Loopholes, glitches –
Along with other social platforms, the company has promised to stem misinformation around the election, including premature claims of victory, seeking to avoid a repeat of 2016 manipulation efforts.
In recent days, Facebook and Twitter added disclaimers to Trump posts calling into question the integrity of mail-in ballots.
The Trump post in question on Twitter said a slow vote count in battleground state Pennsylvania could lead to “rampant and unchecked cheating.”
“It will also induce violence in the streets. Something must be done!” he tweeted.
Twitter last month updated its “civic integrity policy” aiming to prevent efforts to manipulate or interfere in elections. The policy calls for action against false claims of victory or any incitement to violence.
YouTube has also sought to limit the sharing of videos with election misinformation. Last month it began adding information panels to videos about voting by mail.
Mail-in voting was added to a short list of topics that YouTube considers prone to posts containing falsehoods, such as Covid-19 and the Moon landing, according to the Google-owned video sharing platform.
The panel appears under such videos regardless of who is speaking or who uploads them, according to YouTube.
Some activists noted that efforts by social platforms to curb the spread of false information were being undermined by loopholes and glitches.
The activist group Avaaz said it had found multiple examples of unverified election claims on Facebook in recent days.
Some comments claimed the “left” was planning a “coup” if Trump won, while others argued without any factual basis that Trump would need to win Pennsylvania by four to five points “to overcome voter fraud,” according to Avaaz.
Facebook acknowledged this week that some political ads banned for containing misleading information were resurfacing, with political groups copying the same content for new messages to slip through filters.