Facebook's Blind Spot: Connecting The World, For Better Or Worse

RACHEL MARTIN, HOST:

We are following breaking news this morning. NPR has confirmed that President Trump's former campaign manager, Paul Manafort, has surrendered to the FBI as part of the special counsel investigation into possible collusion between the Trump campaign and Russia. We will have more on this evolving story throughout the day.

Meanwhile, a different strand of the broader Russia story - executives from Facebook, Google and Twitter will be on Capitol Hill to talk to Congress this week about their social platforms and Russian interference in the 2016 campaign. NPR's Aarti Shahani joins us now.

AARTI SHAHANI, BYLINE: Hi.

MARTIN: You have focused a lot of your reporting in recent months on Facebook, which came under a whole lot of scrutiny after it came out that Russian agents had bought thousands of ads on Facebook during the presidential election. Can you talk a little bit about how Mark Zuckerberg has navigated all of this?

SHAHANI: Yeah, actually - you know, I think what he's being forced to come to terms with now is this big blind spot he's had from the beginning. Think of it this way - here's this young man who starts a company with a supremely techno-utopian goal - let's connect the world. And he does it...

MARTIN: Sounds great, yeah.

SHAHANI: Yeah. And he does it without the awareness of something that is quite obvious, which is connecting in and of itself is not an inherent good, right? It can be good or bad. We're human. And so this oversight - call it ambition or being naive - it gets built into the very bones of the platform. This is what insiders, people who work at Facebook or talk with senior employees at Facebook, tell me. From day one, Zuckerberg has been obsessed with measuring the positive stuff. And you can see this in the company's blog posts and research. You know, they track engagement, how much you like and click and share, up to what second you watch a video. And this obsession has grown a network to 2 billion users monthly. But the negative stuff - OK, Russian operatives, trolls, hate groups - that stuff he and his leadership have been far, far slower to even register on their radar.

MARTIN: OK, we should just note briefly NPR and other major news organizations do get money from Facebook to make videos. But, Aarti, you talk about this negative stuff that has surfaced around Facebook. Has the company done anything to change its processes to combat these criticisms?

SHAHANI: You know, yeah, I know from talking with current and former employees that they are now scrambling, trying to figure out how to monitor and quantify the bad stuff. One key problem is the main metric they've been relying on, which is what users report as bad content. It's garbage, according to people at Facebook who've had to analyze it. It's not reliable because users flag a lot of things that they shouldn't. You know, you can report an article as fake news because you disagree with it.

MARTIN: Yeah.

SHAHANI: Or you can flag a business page as offensive when really it's just a competitor you want to take down.

MARTIN: So you've talked to a number of people at Facebook. What do the rank and file make of this new moment for their company and these conflicts that they're trying to work out?

SHAHANI: Yeah, you know, what I'm learning from talking to current and former employees is that there's a deep conflict in Facebook. On the one hand, people feel really badly about Russian operatives and hate speech and extremists and God knows what else. The engineers and product managers are racing to fix it, working very long hours. At the same time, they're defensive or indignant, too. Like, hey, we're not the ones who created original sin. Humans are humans. And if humans are bad, that's not our fault. You know, if there's a crime happening, take it up with law enforcement.

MARTIN: But that gets to the real question about Facebook's identity, right? Like, if it is just the platform where the stuff goes out, then they can say we - we're not responsible for it. But are they - are they a publisher? Are they subject to the same standards and ethics as a journalism organization would be?

SHAHANI: You know, this is a really key point. Mark Zuckerberg maintains he's running a tech company, not a media company. And, you know, what I compare it to, at least in my own mind, is it's like a guy who builds a new, enormous office building that's got some whiz-bangy (ph) technology. And he says the rules that apply to regular builders, they don't apply to him. Then when a fire breaks out, because he didn't follow some old building code, he says, hey, it's not my fault. I didn't light the match.

MARTIN: NPR's Aarti Shahani. Aarti, thanks so much for sharing your reporting with us.

SHAHANI: Thank you.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Rachel Martin is a host of Morning Edition, as well as NPR's morning news podcast Up First.
Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.