Frontline’s ‘The Facebook Dilemma’ Details What Mark Zuckerberg Knew and When He Knew It (Review)

“The Facebook Dilemma,” a Frontline two-parter airing Monday and Tuesday (Oct. 29 and 30) on PBS, peels away the company’s outer layer of wanting to “bring the world closer” to reveal how it spiraled so wildly into a tool for sowing discord, winning wars, tilting elections and monetizing your privacy.

Early footage shows Mark Zuckerberg promoting his new venture, high on beer-fueled idealism and the company’s motto, Move Fast and Break Things. It’s quite clear that he fancied himself a put-upon hero of free speech, too caught up in his grand, altruistic vision to be inconvenienced by growing concerns about the company.

His attitude trickled down to those hired to execute his vision, as evidenced by the five current executives Facebook made available to Frontline for interviews. One after another, they spout variations on the company being “too slow to recognize” how it was being weaponized in places such as the Middle East and Russia. “We’ve been really focused on the good things” and “slow to really understand the ways in which Facebook might be used for bad things,” says one of the five, VP of Social Good Naomi Gleit.

Frontline lays out how the addition of the “like” button in 2009 turned users’ News Feeds into a selling point: each click signaled what mattered to a user, information that could then be sold to advertisers for direct marketing. COO Sheryl Sandberg leaned into this approach after she was hired away from Google in 2008 to help Facebook generate revenue. As she steered the company toward its 2012 IPO, her pitch to investors and advertisers was the potential profit in the treasure trove of personal data collected from users, a business model that both she and Zuckerberg downplayed in public interviews.

In fact, Frontline shows a clip of Sandberg telling one interviewer, “We are focused on privacy, we care the most about privacy, our business model is by far the most privacy friendly to consumers.”

Interviews with eight former employees, however, offer insight into how Facebook repeatedly brushed off red flags raised both inside and outside the company.

Part two of the series gets into Facebook’s role in the 2016 election of Donald Trump: his campaign’s use of Facebook micro-targeting, the unchecked burst of fake sites sharing News Feed space with legitimate publications, and the rise of hyperpartisan posts, which typically yield the most engagement and in turn float to the top of more people’s News Feeds.

Filipino journalist Maria Ressa details how Facebook ignored her pleas to address President Duterte’s use of the platform to viciously troll her for criticizing him. James Clapper, the former director of National Intelligence, weighs in on Russia’s use of Facebook to polarize voters during the 2016 election. “They had messages for everybody – Black Lives Matter, white supremacists, gun control advocates, gun control opponents, didn’t matter,” he said.

The iceberg that finally forced Facebook to confront these issues was British political consulting firm Cambridge Analytica and its mining of user data for political purposes. Cambridge Analytica whistleblower Christopher Wylie explains how Facebook was alerted to the problem in 2015 and promised to investigate, but took no real action until Wylie went public with his findings.

By the end of part two, Frontline has cut together a montage of this “we were slow to realize” pattern of denial from the five Facebook executives interviewed, a refrain so prevalent it becomes comical. We also see the team Facebook assembled to root out fake news posts ahead of the Nov. 6 midterms, one that will remain active on election day.

But is Facebook doing enough to address these abuses?

“They are a combination of unable and unwilling to grasp and deal with this complexity,” concludes Turkish writer and techno-sociologist Zeynep Tufekci. And because everyone you know has a Facebook page, she believes there is no incentive for the company ever to change its revenue-generating business model.

To Tufekci, it all boils down to this: “You might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you gonna do?”
