Ex-Facebook employee says company has known about disinformation problem for years

Oct 25, 2021

Updated October 26, 2021 at 11:15 AM ET

Facebook is on the defensive after a whistleblower leaked thousands of documents showing how the company failed to control the spread of false information and lies about the 2020 election.

In an internal report, Facebook staff concluded that the company "helped incite the Capitol Insurrection." Leaked documents show how the company was unable to stop the growth of pro-Trump "Stop the Steal" groups and conspiracy theories.

Facebook denies that it was responsible for the Jan. 6 riot and said it "took steps to limit content that sought to delegitimize the election."

Yael Eisenstat spent six months at Facebook in 2018 as the global head of elections integrity operations for political advertising. She's now a Future of Democracy Fellow at the Berggruen Institute.

"This is something that was years in the making," she tells NPR's Morning Edition. She says experts, including people inside the company, have been warning for years about how Facebook's software incentivizes disinformation and the most extreme voices. She says Facebook executives were aware of the extent of the problem.

Facebook spokesperson Drew Pusateri told NPR that the company "took steps to limit content that sought to delegitimize the election, including labeling candidates' posts with the latest vote count after Mr. Trump prematurely declared victory, pausing new political advertising and removing the original #StopTheSteal Group in November."

The company says after the Jan. 6 riot, it "removed content with the phrase 'stop the steal' under our Coordinating Harm policy and suspended Trump from our platforms."

Facebook also says that politicians are required to follow community standards and advertising policies and that it has enforced rules against politicians including Trump, who is suspended until at least January 2023, and Brazil's Jair Bolsonaro, whose video with misinformation about the coronavirus was removed last year.

Interview highlights with Yael Eisenstat have been edited for length and clarity.

Interview Highlights

On how Stop the Steal and misinformation grew so quickly after the election

For years, people — whether it was people like myself, people who work in the company, researchers, journalists, outside academics — have been warning the company that the way their product is working, the way they amplify certain kinds of content, the way they recommend people into certain groups, was causing a situation that was already amplifying some of the most extreme voices.

But then you couple that with what I believe was one of the most dangerous political decisions the company made — this was actually around the time that I was working there — to not fact-check political actors, to allow some of the biggest voices with the biggest platforms to violate [Facebook's] own policies. These combined to make this perfect storm.

So when they talk about all the measures that they put in place to protect the elections, that doesn't override all the things that they were not doing, such as really making sure that groups weren't engaging in this kind of activity, making sure that certain political elites were not using the platform to spread lies about the election. This was going on for years. It's not something they could have stopped suddenly in three days after an election.

On how Facebook is designed for growth above all else

Content moderation — what to take down and what to leave up — is extremely difficult. And yes, that is something that is on the platforms to handle because that is definitely not government's area. But how your platform is designed, how you're monetizing it, what you're allowing and what you're not: basic guardrails, that's what's really missing here.

If you for years have allowed an environment where you're not holding politicians to the same standards that you're holding the rest of us to on your platform because really you want to preserve your power and you're not reining in these algorithms and recommendation engines that are pushing people to a lot of this content ... that's really about not wanting to harm your own growth. So you're not reining in how your recommendation engines are pushing people into some of these groups.

Those two things combine around a time where we have a really volatile situation in the U.S. around our election. It's a perfect storm. And let's be clear, Facebook had years to fix this. A lot of the documents are focusing on 2019 and 2020, but this is work that many people have been trying to get them to do — including my team when I was there — for years.

On what happened when she raised concerns

In 2018, the very first thing I tried to do was ask why we were not putting any sort of fact-checking standards into political advertising. It was very clear to me that advertising — maybe not the most important thing on the platform — but it's paid speech. We're putting labels on the ads to make them look even more credible, and we're giving the political actors targeting tools to target people with this messaging. And if we weren't even fact-checking that and we were allowing politicians to use advertising to sow different messages to different groups, that was dangerous. ...

I was pushed out for these kinds of ideas, and for trying to build voter suppression plans into political advertising. There just wasn't an appetite from leadership for that.

Lilly Quiroz and Jill Craig produced and edited the audio interview. James Doubek produced for the web.

Editor's note: Facebook is among NPR's recent financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

RACHEL MARTIN, HOST:

We're going to focus now on another allegation against Facebook - that the company didn't do enough to prevent extremists from organizing online and ultimately attacking the U.S. Capitol on January 6. According to the documents, Facebook did put measures in place to deal with potential violence around the 2020 election. But when a movement built on a lie got traction - it was called Stop the Steal - those measures fell short.

Yael Eisenstat was the global head of elections integrity operations at Facebook in 2018. I asked her how this misinformation was allowed to spread.

YAEL EISENSTAT: This is something that was years in the making and - maybe not Stop the Steal in particular, but just the fact that for years people, whether it was people like myself, people who worked in the company, researchers, journalists, outside academics had been warning the company that the way their product is working, which is to - I mean, there's multiple things here - the way they amplify certain kind of content, the way they recommend people into certain groups was causing a situation that was already amplifying some of the most extreme voices. But then you couple that with what I believe was one of their most dangerous political decisions the company made - and this was actually around the time that I was working there - to not fact-check political actors, to allow some of the biggest voices with the biggest platforms to violate their own policies, these combined to make this perfect storm.

So when they talk about all the measures that they put in place to protect the elections, that doesn't override all the things that they were not doing, such as really making sure that groups weren't engaging in this kind of activity, making sure that certain political elites were not using the platform to spread lies about the election. This was going on for years. It's not something they could have stopped suddenly in three days after an election.

MARTIN: So it seems like there's two issues, as you just laid them out. It's vetting the potential of any particular actor to spread lies and misinformation but also, once it's out there, taking it down, right?

EISENSTAT: So even more, I mean, content moderation - what to take down and what to leave up - is extremely difficult. And yes, that is something that is on the platforms to handle because that is definitely not government's area. But how your platform is designed, how you're monetizing it, what you're allowing and what you're not, basic guardrails - that's what's really missing here.

I mean, as you said, once - if you, for years, have allowed an environment where you're not holding politicians to the same standards that you're holding the rest of us to on your platform because, really, you want to preserve your power and you're not sort of reining in these algorithms and recommendation engines that are pushing people to a lot of this content, which some of the documents - I mean, this piece about Carol's journey to QAnon is the perfect example. And that's really about not wanting to harm your own growth. So you're not reining in how your recommendation engines are pushing people into some of these groups. Those two things combined around a time where we have a really volatile situation in the U.S. around our election, it's a perfect storm.

And let's be clear, Facebook had years to fix this. A lot of the documents are focusing on 2019 and 2020. But this is work that many people have been trying to get them to do, including my team when I was there, for years.

MARTIN: What were you met with when you raised these red flags, when you raised these concerns?

EISENSTAT: Sure. So in 2018, the very first thing I tried to do was ask why we were not putting any sort of fact-checking standards into political advertising. And why I did that was it was very clear to me that advertising, first of all, maybe not the most important thing on the platform, but it's paid speech. We're putting labels on the ads to make them look even more credible. And we're giving the political actors targeting tools to target people with this messaging.

And if we weren't even fact-checking that and we were allowing politicians to use advertising to sow different messages to different groups, that was dangerous. And there was no appetite. I was pushed out for these kinds of ideas and for the ideas to try to build in voter suppression plans into political advertising. There just wasn't an appetite from leadership for that.

MARTIN: You think that's likely to change now with all this pressure?

EISENSTAT: I think it can only change through regulation, to be honest. They have made it clear they will not self-regulate.

MARTIN: Yael Eisenstat, she's Facebook's former global head of elections integrity operations for political advertising. We appreciate all of your perspective. Thank you.

EISENSTAT: Thank you.

Transcript provided by NPR, Copyright NPR.