Last week, Facebook Inc said it suspended accounts and access of New York University’s Ad Observatory project researchers, stating their research practices were against the company’s terms of service. Critics called the move an attempt by Facebook to silence organisations questioning its ad targeting practices, especially those related to political ads.
What did the researchers do?
The Ad Observatory project at NYU was started by the Cybersecurity for Democracy group with over 6,500 volunteers at the University’s school of engineering in September 2020. The project’s aim was to collect information on how social networks, including Facebook and YouTube, used data to target political ads.
The team built a browser plugin that copied the ads users saw on social networks so they could be studied. It collected the advertiser’s name, the ad text, and the images and links in each ad. The plugin was launched as part of the project, months before the 2020 U.S. presidential election, to bring transparency to political advertising.
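Based on the fields described above, each observed ad reduces to a small structured record. The sketch below is hypothetical (the field names and class are illustrative, not the project’s actual code) and shows what such a record might look like:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AdRecord:
    """One ad observed in a volunteer's feed (hypothetical schema)."""
    advertiser_name: str
    ad_text: str
    image_urls: List[str] = field(default_factory=list)
    link_urls: List[str] = field(default_factory=list)

# Example: a record as a plugin like this might store it
record = AdRecord(
    advertiser_name="Example PAC",
    ad_text="Vote on November 3rd!",
    image_urls=["https://example.com/ad.png"],
    link_urls=["https://example.com/donate"],
)
print(record.advertiser_name)
```

Keeping the record limited to ad-level fields is what lets such a tool study targeting patterns without tying ads back to individual users.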
Why did Facebook shut them down?
Facebook said the researchers were involved in unauthorised data collection through a browser extension “programmed to evade detection systems and scrape data such as usernames, ads, links to profiles”.
The social network also claims the web extension collected data about Facebook users who did not install it or consent to the collection. This information was previously archived and is now available as an offline, public database, Facebook noted.
The California-based company had demanded that the project be shut down weeks after the tool was launched in 2020, stating the research methods would involve scraping of massive amounts of user data from its website, according to a Wall Street Journal report.
“While the Ad Observatory project may be well-intentioned, the ongoing and continued violations of protections against scraping cannot be ignored and should be remediated,” Facebook noted.
The company reportedly also offered the researchers “privacy-protected datasets”, an offer they declined.
Reaction to Facebook’s decision
Laura Edelson, the NYU Ad Observatory project’s lead researcher, said the access to Facebook’s ad data helped the team uncover systemic flaws in the ad library, and identify misinformation in political ads including “many sowing distrust in the election system”.
“By suspending our accounts, Facebook has effectively ended all this work. Facebook has also effectively cut off access to more than two dozen other researchers and journalists who get access to Facebook data through our project,” she tweeted.
Damon McCoy, professor of computer science at NYU, called it “disgraceful” that Facebook “is attempting to quash legitimate research that is informing the public about disinformation on their platform”. McCoy also said the platform is awash with vaccine misinformation and partisan campaigns to manipulate the public, and that it must be open to independent research.
Ad Observer’s website further clarifies that data collected by the tool does not include a user’s Facebook ID number, their name, birthday, friends list, or how they have interacted with ads.
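The exclusions Ad Observer describes amount to keeping only ad-level fields and never retaining user-level identifiers. A minimal sketch of how such a filter could work, assuming hypothetical field names (this is an illustration, not the tool’s actual code):

```python
# Allow-list of ad-level fields a tool like this would retain.
# User-level identifiers (Facebook ID, name, birthday, friends
# list, ad interactions) are simply never copied into storage.
ALLOWED_FIELDS = {"advertiser_name", "ad_text", "image_urls", "link_urls"}

def sanitize(raw: dict) -> dict:
    """Keep only allow-listed ad fields, dropping anything user-specific."""
    return {k: v for k, v in raw.items() if k in ALLOWED_FIELDS}

scraped = {
    "advertiser_name": "Example PAC",
    "ad_text": "Vote!",
    "user_id": "12345",          # would identify the volunteer; dropped
    "friends_list": ["a", "b"],  # never retained
}
print(sanitize(scraped))
```

An allow-list is the conservative design here: any new field a page exposes is excluded by default unless explicitly added.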
U.S. Senator Amy Klobuchar expressed concern about Facebook’s move, stating that the tech giant continues to sell millions of dollars’ worth of political ads without proper disclosures.
Mozilla, the not-for-profit behind the Firefox browser, also slammed Facebook for banning the researchers, saying its claims of privacy problems “do not hold water”. Mozilla said it had thoroughly reviewed Ad Observer to conclude it respects user privacy and supports transparency.
Why does this matter?
Micro-targeting of political ads lets politicians, or any agency, serve custom ads tailored to specific groups, so that different demographics and places see different information, which can spread misinformation widely.
Facebook stopped accepting political ads a week before the 2020 U.S. presidential election to prevent election interference. It also introduced labels and prompts that showed information related to the election.
Online ads, especially political ones, are usually hard to monitor as they are shown to targeted individuals or groups, Ad Observer notes. “While platforms have developed some transparency libraries for political ads, these libraries are missing many ads featuring political content and often don’t include vital information such as ad targeting,” it added.