We learned last week that Facebook had disabled our Facebook accounts and our access to data that we have been using to study how misinformation spreads on the company’s platform.
We were informed of this in an automated email. In a statement, Facebook says we used “unauthorized means to access and collect data” and that it shut us out to comply with an order from the Federal Trade Commission to respect the privacy of its users.
This is deeply misleading. We collect identifying information only about Facebook’s advertisers. We believe that Facebook is using privacy as a pretext to squelch research that it considers inconvenient. Notably, the acting director of the F.T.C.’s consumer protection bureau told Facebook last week that the “insinuation” that the agency’s order required the disabling of our accounts was “inaccurate.”
“The F.T.C. is committed to protecting the privacy of people, and efforts to shield targeted advertising practices from scrutiny run counter to that mission,” the acting director, Samuel Levine, wrote to Mark Zuckerberg, Facebook’s founder and chief executive.
Our team at N.Y.U.’s Center for Cybersecurity has been studying Facebook’s platform for three years. Last year, we deployed a browser extension we developed called Ad Observer that allows users to voluntarily share information with us about ads that Facebook shows them. It is this tool that has raised the ire of Facebook and that it pointed to when it disabled our accounts.
In the course of our overall research, we’ve been able to demonstrate that extreme, unreliable news sources get more “engagement” — that is, user interaction — on Facebook, at the expense of accurate posts and reporting. What’s more, our work shows that the archive of political ads that Facebook makes available to researchers is missing more than 100,000 ads.
There is still a lot of important research we want to do. When Facebook shut down our accounts, we had just begun studies intended to determine whether the platform is contributing to vaccine hesitancy and sowing distrust in elections. We were also trying to figure out what role the platform may have played leading up to the Capitol assault on Jan. 6.
We are privacy and cybersecurity researchers whose careers are built on protecting users. That’s why we’ve been so careful to make sure that our Ad Observer tool collects only limited and anonymous information from the users who agreed to participate in our research. And it is also why we made the tool’s source code public so that Facebook and others can verify that it does what we say it does.
We strongly believe we are not violating Facebook’s terms of service, as the company contends. But even if we had been, Facebook could have authorized our research. As Facebook declared in announcing the disabling of our accounts, “We’ll continue to provide ways for responsible researchers to conduct studies that are in the public interest while protecting the security of our platform and the privacy of people who use it.”
Our research is responsible and in the public interest. We’ve protected the privacy of our volunteers. Essentially, our ad tool collects the ads our volunteers see on their Facebook accounts, plus information provided by Facebook about when and why they were shown the ads and who paid for them. These ads are seen by the specific audience the advertiser targets.
This tool provides a way to see what entities are trying to influence the public, and how they’re doing it. We think that’s important to democracy. Yet Facebook has denied us important access to continue to do much of our work.
One of the odd things about this dispute is that while Facebook has barred us from research tools available to users and other academic researchers, it has not blocked our Ad Observer browser extension by either technical or legal means. It is still operational, and we are still collecting data from volunteers.
Still, by shutting us off from its own research tools, Facebook is making our work harder. This is unfortunate. Facebook isn’t protecting privacy. It’s not even protecting its advertisers. It’s protecting itself from scrutiny and accountability.
The company suggests the Ad Observer is unnecessary, that researchers can study its platform with tools the company provides. But the data Facebook makes available is woefully inadequate, as the gaps we’ve found in its political ad archive prove. If we were to rely on Facebook, we simply could not study the spread of misinformation on topics ranging from elections to the Capitol riot to Covid-19 vaccines.
By blocking us from its platform, Facebook sent us a message: It wants to stop us from examining how it operates.
We have a message for Facebook: The public deserves more transparency about the systems the company uses to sell the public’s attention to advertisers and the algorithms it employs to promote content. We will keep working to ensure the public gets that transparency.