Facebook executive talks strategies for stifling fake news

Lessons from the 2016 election helped Facebook employees stifle fake news stories, and the accounts spreading them, during this election season.

That’s according to Katie Harbath, global politics and government outreach director at Facebook. She spoke last week at a WisPolitics.com event in Madison celebrating the close of this year’s election season.

“Everyone had been so optimistic about the internet up to 2016; everyone thought it was the great leveler, that it was going to be a great thing for democracy,” she said, pointing to the events of the Arab Spring as evidence.

But that was before Facebook announced that nearly 500 accounts likely originating from Russia bought around $100,000 worth of ads on the platform during the election. And in April, CEO Mark Zuckerberg appeared before Congress to address concerns related to online interference in the American political system.

“We’ve had to dramatically pivot to start to think about how we can mitigate that,” she said.

In doing so, Facebook has put data researchers and other scientists to work on some complex tasks, like verifying accounts more accurately, making it easier to tell where advertisements are coming from, and disrupting external “bad actors,” as Harbath puts it.

Facebook’s threat intelligence operations team investigates suspicious political content that may originate from another country. Earlier this month, Harbath said, the team carried out its fifth “takedown” of Iranian online actors targeting “a wide variety of countries” with over 500 different pages.

“These folks are sophisticated,” she noted, adding that the team got a tip “right before the election” about some accounts being run on Instagram by the Russia-based Internet Research Agency. Facebook owns Instagram, along with the popular messaging system WhatsApp.

“We’re seeing a move to other platforms, because of some of the work we’re doing,” she said. “We’re trying to do more across the platforms… We’re trying to fan out and find these networks as much as we can.”

In seeking out these actors and identifying how they were manipulating the system, Harbath says her team found one common thread: fake accounts. Automated accounts were the easier ones to detect, she says, because they tend to follow identifiable patterns.

The harder task, she said, was finding the fake accounts that weren’t created by automated code. There were “rooms full of people” creating fake accounts by hand and trying to make them seem like real people, she said. Some began doing this as far back as 2014, and they followed patterns of their own.

Many would add hundreds of loosely connected people as friends very soon after creating the account, while liking many pages and joining groups to spread their content as much as possible. Over time, she says her team got better at detecting these behaviors and taking down the accounts.
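Facebook has not published the signals it uses, but a minimal illustrative sketch, in Python, shows the kind of rule-based behavioral check Harbath’s description suggests. The thresholds and field names below are invented for illustration and are not Facebook’s actual criteria:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Account:
    created_at: datetime
    friend_count: int
    pages_liked: int
    groups_joined: int

def looks_suspicious(account: Account, now: datetime) -> bool:
    """Toy heuristic mirroring the behaviors Harbath describes:
    hundreds of friends added, many pages liked, and many groups
    joined very soon after account creation. All cutoffs here are
    illustrative guesses, not Facebook's actual values."""
    age = now - account.created_at
    if age > timedelta(days=7):
        return False  # this sketch only flags very new accounts
    return (
        account.friend_count >= 300
        and account.pages_liked >= 50
        and account.groups_joined >= 20
    )

# Example: a day-old account with 450 friends, 80 page likes, 30 groups
acct = Account(datetime(2018, 11, 1), 450, 80, 30)
print(looks_suspicious(acct, datetime(2018, 11, 2)))  # True
```

A production system would presumably weigh many more signals statistically rather than relying on fixed cutoffs like these.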

“When you think about right after 2016, the topic wasn’t foreign interference; the topic was false news,” she said. “Then it became, what did the Trump campaign do on Facebook? What did they do online? How did they beat the best Obama brains who had been working for Hillary?”

From there, the national conversation shifted to the Russian ads on Facebook, which drove the company to pursue ways to make advertising on the platform more transparent.

“We were far from perfect this time,” she admitted, but said the company’s efforts still deterred at least some of the dishonest political advertisers.

Some of the new advertising requirements include providing official identification, such as a Social Security number, and responding to a postcard sent to a physical address, “to make sure you live in the states.”

She says some of these requirements have been added over the past several weeks, and Facebook will be “getting stricter” and adding more by 2020.

With regard to fighting misinformation itself — the actual content many of these fake accounts are spreading — Harbath says the company’s policy is to “root it in free speech as much as possible.”

Any content that violates Facebook’s community standards will be taken down, she said. That includes hate speech, as well as false information about how to vote.

For example, if someone posted, “Republicans vote Tuesday and Democrats vote Wednesday,” that’s an easy call to remove. But it can get complicated, she said, when a political post mixes truth with falsehood.

“So Katie has a right to say the sun rises in the West; she does not have a right for us to amplify that,” she said. “That’s a controversial position to be in.”

–By Alex Moe
WisBusiness.com