Facebook appears to be talking out of both sides of its mouth again.
On Monday, the nonprofit news organization ProPublica reported that Facebook had intentionally disabled its ability to monitor political advertising on the platform, which doesn't exactly sound like the transparency Mark Zuckerberg promised.
ProPublica uses a plugin that allows it to see which ads a consenting user sees on Facebook. It also tells the organization how those users were targeted. For example, if you’re someone who told Facebook that you are “liberal,” you might see ads for liberal causes. But this sort of targeting can also get much, much more granular. Cambridge Analytica targeted people with political ads based on their denim preferences.
Similar tools from Mozilla and UK organization WhoTargetsMe were also reportedly disabled thanks to changes made on Facebook’s end.
Facebook says the changes were made for privacy and security reasons, to prevent widgets and plugins from scraping users' personal information. Facebook pointed Mashable to the following statement from its director of product for further explanation:
We know we have more to do on the transparency front – but we also want to make sure that providing more transparency doesn’t come at the cost of exposing people’s private information.
— Rob Leathern (@robleathern) January 28, 2019
This isn’t about stopping publications from holding us accountable or making ads less transparent. This is about preventing people’s data from being misused – our top priority. Plugins that scrape ads can expose people’s info if misused.
— Rob Leathern (@robleathern) January 28, 2019
But ProPublica, Mozilla, and other watchdogs are skeptical that this is about security.
“It’s clear that Facebook is seeking to disable tools that provide greater transparency on political advertising than they wish to permit,” ProPublica President Richard Tofel told Mashable over email. “They claim this is because of potential abuse of or problems with such tools, but they have cited no evidence any such problems have resulted with our Political Ad Collector—and we know of none.”
“The ProPublica tool, much like ours, was aimed to provide users with valuable information about political ads to help shed more light on a process that has historically been very secretive,” Marshall Erwin, Mozilla’s head of trust and security, told Mashable over email. “Major tech companies need to provide more transparency into political advertising, and support researchers and other organizations, like Mozilla, working in good faith to strengthen our democratic processes.”
Facebook maintains its own searchable database of political advertising. But ProPublica points out that the tool is only available in three countries (the U.S., UK, and Brazil). And, crucially, it does not contain the targeting information associated with the ads.
Targeting has been a major source of controversy for Facebook. Cambridge Analytica whistleblower Christopher Wylie took issue with how his company’s ads were able to “psychographically profile” Facebook users. (The efficacy of microtargeting is still under debate.) But the ability to reach a highly customized group of potential customers is central to Facebook’s value proposition as an ad platform, and more scrutiny could threaten its current business model.
“This appears to be a deliberate attempt to obstruct journalism focused on Facebook’s platform,” Alex Abdo, a senior staff attorney with the Knight First Amendment Institute, told Mashable over email. “Facebook claims that it was a ‘routine’ update to prevent the exposure of people’s information in unexpected ways, but that explanation is hard to believe.”
Facebook has said that it’s making the archives available in more countries and creating new ways to enforce its rules about political advertising. But if transparency now rules the day at Facebook, experts wonder why it would not encourage more scrutiny of the platform.
Specifically, ProPublica and the Knight Institute take issue with the idea that blocking the plugins is about security. Certainly, Facebook has a right to be cautious; the exploitation of third-party apps is part of what got Facebook into trouble with Cambridge Analytica in the first place. But why not work with organizations acting in good faith?
“Facebook’s motives are, of course, unclear to us,” Tofel said. “What is clear is that Facebook was used by bad actors in 2016, and perhaps beyond, and that greater transparency—including revealing the targeting of political ads on Facebook—would seem a potentially critical safeguard for democracies, here in the U.S. and elsewhere.”
“We cannot trust Facebook to be its own gatekeeper”
To illustrate the security risks of browser plugins, Facebook pointed to a November 2018 incident in which hackers used an extension to scrape and sell users’ personal information. But comparing a malicious plugin that covertly scrapes all user data to a well-intentioned one that explicitly gets user consent may be comparing apples and oranges.
“The example Facebook is relying on strongly suggests their latest move had little to do with malicious browser plug-ins,” Abdo said. “It shows that malicious browser plug-ins can be used to scrape all sorts of sensitive information from Facebook’s site, and yet the latest change is limited to preventing the scraping of ads and ad-targeting information.”
Conflating the risks of malicious data scraping and ad monitoring could end up creating a blanket policy that prevents important research.
“There have been no problems from our [plugin], despite its use by tens of thousands of people over many months,” Tofel said. “[Facebook] could, for instance, whitelist our Political Ad Collector. But they seem determined to stamp out transparency in the targeting of political ads.”
According to Facebook, it is seeking to prevent plugins from accessing user data without consent. However, ProPublica specifically gets consent from any user who downloads its plugin so that it can see targeting data.
“ProPublica’s tool and others like it rely on explicit user consent,” Abdo said. “By disabling these tools, Facebook is not respecting its users’ privacy choices, but preventing them from voluntarily sharing information about the ads they see with researchers for research purposes.”
Rather than primarily a security measure, Facebook’s latest move appears to be part of a larger update that limits ad-blocking plugins, which would negatively affect its business. And as the Knight Institute points out, this move to safeguard Facebook’s profits could come at the expense of the public interest.
“At best, Facebook has prioritized its own commercial interest in blocking ad blockers over the public’s interest in learning how the platform has been used to spread misinformation,” Abdo said. “And at worst, Facebook is deliberately obstructing important journalism.”
Facebook is all about “transparency,” but its past initiatives — including controlling which researchers have access to certain data — as well as this recent action, show that it very much wants to be the arbiter of what is visible, and what is kept from view.
“Independent journalism focused on Facebook is essential if the public wants to understand how Facebook is influencing public discourse,” Abdo said. “We cannot trust Facebook to be its own gatekeeper.”