While U.S. lawmakers press Twitter and Facebook to better police their platforms against Russian social media trolls and ponder tougher sanctions against Moscow, American voters remain vulnerable to divisive messaging and misinformation before midterm elections in November, experts told VOA.

“All of us, left and right [politically], are all very susceptible to being fooled by disinformation,” said Claire Wardle, director of First Draft News, a project at Harvard University’s Shorenstein Center on Media, Politics and Public Policy that provides tools to fight false content on the Internet and social media.

“There are many people who are trying to spread misinformation. We all have to be much more skeptical of the information we are consuming and to be aware, particularly if it’s content that makes us have an emotional reaction,” Wardle added.

Last month, Facebook shut down 32 fake accounts that posted polarizing messages on race, gender, and fascism. In 2016, Russian trolls flooded Facebook, Twitter and other platforms with similar content, reaching millions of Americans.

One Russia-linked Twitter handle, 4MYSQUAD, now deactivated, posted: “White America Does The Crime, Black America Gets The Time. WTF? #BlackLivesMatter #racism.”

Another, TEN_GOP, posted: “Muslim bus driver throws all passengers out of the bus, so he has space and time to pray.”

“As humans we respond to fear,” Wardle said. “A lot of disinformation is driven by fear — other people you should be fearful of — and then wanting to protect yourself, your family and your community.”

Last week, social media researchers told the Senate Intelligence Committee that Russian efforts to polarize the American people are as pernicious as ever.

“Russian manipulation did not stop in 2016. After Election Day, the Russian government stepped on the gas,” New York-based Graphika CEO John Kelly said.

“Foreign actors will continue to aim future disinformation campaigns at African-American voters, Muslim-American voters, white supremacist voters,” Oxford University researcher Philip Howard told the panel. “I expect the strategy will remain the same: push disinformation about public issues and prevent particular types of voters from participating on Election Day.”

Political scientist Keneshia Grant may have had a firsthand brush with Russia’s use of social media to inflame racial tensions in the United States before the 2016 election, and believes American voters are still in Moscow’s crosshairs for malign messaging.

“There were minority communities targeted. I believe that targeting is still happening and that it has been getting more sophisticated over time,” said Grant, who teaches at Washington’s Howard University, a predominantly African-American institution.

In 2016, Grant noticed her Twitter account suddenly gained a group of mysterious and silent followers. She believed they were studying her posts to learn to craft messages to effectively target black Americans.

“There were 20-30 accounts of individuals who were trolling to see what I might say and, I suppose, to use that information to seem credible with other black users of Twitter,” she said. “I was one of the people who got an e-mail [from Twitter] saying you have interacted in some way with someone we believe to be fraudulent.”

Weeding out fake accounts

Social media companies have trumpeted their efforts to weed out fake accounts and bad actors. While commendable, Grant said it’s not enough.

“Americans have a responsibility to know that Russians are attempting to interfere in elections, and then to take the additional steps to figure out where information comes from that they are consuming. Not just consume it, but think about it,” she said.

Wardle concurred, but noted that social media trolls exploit a basic human tendency: giving credence to information or messaging that supports one’s outlook or ideology.

“People want to believe information that supports their worldview, whether that’s a belief on gun control or immigration or whether you’re more a dog person than a cat person,” she said, adding that counteracting that tendency will require holding people to account when they wittingly or unwittingly spread erroneous content.

“If we want to drive on roads that aren’t covered in garbage, we have to take responsibility for not throwing Coke cans out of the window,” she said.

“I want to see people recognize that when they click ‘share’ [on social media], they have a responsibility for the information they are putting out,” she added. “So when crazy Uncle Bob is sharing false information, rather than saying, ‘well, that’s just crazy Uncle Bob,’ we should call him out and say that it’s not healthy for us to live in a society where we are sharing false information.”

The Harvard researcher noted that other regions of the world, like Eastern Europe, have been grappling with false information campaigns for far longer than the United States.

“After the election of 2016, when Americans all of a sudden woke up to misinformation, I think the rest of the world did a slow hand-clap and said ‘welcome to the party, America.'”

Some American schools have introduced curricula to teach students to think more critically about the information they receive and to identify propaganda and malign messaging. Such classes should become standard, according to a member of the Senate Intelligence Committee.

“We are asymmetrically vulnerable [to disinformation campaigns] because of the First Amendment and democracy; our whole system is based on information,” Maine independent Senator Angus King said.

“Our kids are growing up with these [high-tech] devices,” he added, “but not necessarily taught how they can be manipulated by their devices. I think there ought to be standardized courses in high school called ‘digital literacy’ and increasing the public’s awareness that they are being conned.”