Facebook has been the subject of intense scrutiny of late. What started with months of rumblings about rogue Russian advertisers and organised campaigns of misinformation escalated into a perfect storm when it was revealed that a single developer, Alexandr Kogan, had used the platform to obtain personal data of about 87 million users. He had then sold the data to Cambridge Analytica, a controversial political consultancy involved in elections in the US, the UK and India.
Those revelations earned Mark Zuckerberg, the CEO of Facebook, two summonses from American lawmakers, one from the British Parliament and even a vague threat from the Indian IT minister. As a tidal wave of bad PR engulfed the company, its stock price dipped and #DeleteFacebook began to trend.
The social networking platform moved to contain the damage: it announced a series of measures intended to shore up the privacy of user data. Among these measures was the shutdown or restriction of several crucial application programming interfaces (APIs), pipelines of sorts that third-party developers use to get data in and out of Facebook.
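To make the idea concrete, here is a minimal Python sketch of what such a pipeline looks like from a developer’s side: a single read request to the Graph API. The API version, field names and access token are illustrative placeholders, not the code of any real app.

```python
import requests

# Illustrative only: the general shape of a "read" call against Facebook's
# Graph API, the kind of pipeline described above. The API version, fields
# and token below are placeholders, not any real app's request.
GRAPH_URL = "https://graph.facebook.com/v3.0/me"

params = {
    "fields": "id,name",                  # profile fields the app asks Facebook for
    "access_token": "USER_ACCESS_TOKEN",  # issued when the user authorises the app
}

response = requests.get(GRAPH_URL, params=params)
profile = response.json()  # with a valid token, a JSON object describing the user
print(profile)
```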
Access to data
Kogan had used an API, shut down in 2015, to collect massive amounts of user data using the cover of a quiz app. Crucially, he was acting within his rights as defined by Facebook for third-party developers. He violated Facebook’s terms of service only when he sold the data to Cambridge Analytica. It is such potentially leaky faucets that the company hopes to plug by restricting API access now.
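How could one quiz app reach tens of millions of people? The rough sketch below, again in Python, imagines the kind of friends traversal the retired API permitted when a user had granted the since-removed friends permissions; the endpoint version and fields are assumptions for illustration, not Kogan’s actual code.

```python
import requests

# Rough, historical sketch: before Graph API v1.0 was retired in 2015, an app
# authorised by one user could also read profile fields of that user's friends,
# provided the user had granted the now-removed friends_* permissions.
# Endpoint shape and fields here are illustrative assumptions only.
FRIENDS_URL = "https://graph.facebook.com/v1.0/me/friends"

params = {
    "fields": "id,name,likes",          # friends' likes required the friends_likes permission
    "access_token": "QUIZ_APP_TOKEN",   # obtained when a quiz-taker logged in to the app
}

friends = requests.get(FRIENDS_URL, params=params).json().get("data", [])

# One quiz-taker's consent could thus fan out into records on hundreds of friends.
for friend in friends:
    print(friend.get("id"), friend.get("name"))
```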
“It’s not enough to just give people control over their information,” Zuckerberg had said, addressing the US Senate Commerce and Judiciary Committees a few days later. “We need to make sure that the developers they share it with protect their information too.”
While the new restrictions definitely benefit users, they have also shut out legitimate third-party developers, including academics who rely on these APIs to gather the datasets necessary for their research into social media and its effect on our lives.
An open letter written by internet scholars from around the world argues that the move is “likely to compound the real problem, further diminishing transparency and opportunities for independent oversight.” The letter, which has gathered over 250 signatories, contends that “the net effect of the new API restrictions is to lock out third parties and consolidate Facebook’s position as the main analytics and advertising broker. Contrary to popular belief, these changes are as much about strengthening Facebook’s business model of data control as they are about actually improving data privacy for users.”
Facebook knows that it can’t afford to shut out academia. Rigorous academic research is essential to understanding the many unintended and unforeseen effects the platform has had on the world, and to developing counter-strategies. So, the company followed up its API restrictions announcement by partnering with eight major non-profit groups to set up a commission tasked with promoting “independent, credible research” about the impact of social media on society. Facebook and its partners plan to appoint scholars to run the commission, which will have the power to call for proposals, order and fund studies and, most importantly, facilitate access to Facebook data.
The privacy of user data shared this way will be ensured through oversight provided by university review boards and a dedicated team that will work with the commission to see that only “aggregated, anonymised results are reported”. It is likely that access to data will only be available on Facebook’s premises or through a virtual network controlled by the company.
In the driver’s seat
The letter welcomes the move “in principle” but warns that “the engagement model Facebook has chosen for this initiative falls well short of what is required, and fails to provide sufficient support, for free and independent scientific research.”
According to Axel Bruns, the scholar who published the letter, the initiative is “one welcome step in the right direction, but only one such step – and it addresses only a small area of research, with a strongly US-centric focus. We would be concerned if this was the only form of engagement Facebook ends up pursuing, or if all other engagement with scholars was similarly addressing only the topics that Facebook deems worthy of research.”
Gary King, the Harvard political scientist who coauthored the paper on which the engagement model is based, agrees: “What we offered in our paper was a new political science idea, in the form of a structure, like a constitution, that convinced academic researchers, Facebook, and eight ideologically, politically, and substantively diverse foundations to sign on. I am pretty sure everyone views this as a step forward as it hasn’t happened before, but only a first step.”
However, beyond the broad consensus on the need for Facebook to engage with academia, there is more mistrust than agreement between the two sides. While Facebook has insisted that it will not have the right to review research findings prior to publication, any requests for data access will still have to be cleared by the company, not the commission. It isn’t clear whether Facebook will have a say in which projects are approved, but it is clear that the initiative will prioritise research that helps Facebook improve the effectiveness of its products. Unsurprisingly, the initial thrust of research will be focused on the platform’s impact on elections and democracy at large.
“We need better frameworks for how the two sides can talk to and engage with each other. But those frameworks shouldn’t put one of those sides in the driver’s seat,” says Bruns, the president of the Association of Internet Researchers.
King appears to understand these shortcomings, but in attempting to build a bridge between Silicon Valley and academia, he seems to be fighting an uphill battle. “A few decades ago, almost all the data in the world was either created by academics or offered to them by governments or sometimes for sale by companies,” he said. “Now, almost all the data in the world is tied up inside private companies. Given the change in our data sources, there is no way we can do our jobs as social scientists unless we figure out how to work with these companies.”
As things stand, Bruns believes that researchers are left with few options, all of which seem less than optimal. The path of least resistance would be to fall in line with Facebook’s agenda or partner with market research companies that have access to the same datasets, both of which require them to adjust to external commercial demands. Alternatively, researchers could choose to scrape data using software tools, an approach that is ethically suspect and yields potentially unreliable data. Worryingly, for researchers who want to stay on the straight and narrow while asking questions that Facebook isn’t interested in, the only option left seems to be to focus on other platforms.
Considering Facebook’s size and reach, this would result in a massive blind spot in scholarship about social media.