It’s Not About Privacy, It’s About Control: Thinking Beyond The Latest Data Mining Scandal

The recent news that a voter profiling company called Cambridge Analytica had obtained 50 million American Facebook users’ data and used it to formulate Trump campaign content in 2016 has the general public riled up anew about the risks that we all run by using free online platforms. The discourse in media (and on Facebook, of course) is heated, the outrage machine has been cranked back up, and once again pundits are asking what gives Facebook the right to sell our data to the highest bidder. How can we protect ourselves against these massive breaches of trust? Let’s all count to three and delete our Facebook accounts!

Fenwick McKelvey, co-director of Milieux’s Media History Research Centre, wishes that the media would start asking different questions about how data is being used by platforms like Facebook.

“We’re stuck in this informational paradigm that’s really outdated,” says McKelvey. “The media narrative still assumes that the goal of these platforms is to expose people to information. But it’s less and less about that — the goal is to manage and control people’s behaviour.”

Among the urgent questions media commentators should be asking, McKelvey believes, is how online advertisers are deploying user data to subtly nudge people. He offers Snapchat as an illustrative example: a company with relatively strong privacy settings in place that nonetheless feeds data to advertisers with a dizzying granularity that reflects the industry standard. Through Snapchat’s protocols, your phone informs advertisers how much time passes between the moment you’re served one of their ads and the moment you make a purchase at their business, either online or in person. Every time you walk into a retailer with your phone’s location services on, you are leaking data about your consumption habits. All of us with smartphones are Pied Pipers, but instead of children, we’re trailed by invisible wisps of data about what we do and how we feel.
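To make that mechanism concrete, here is a minimal, hypothetical sketch of the kind of ad-to-purchase conversion record an ad platform could assemble from those signals. This is not Snapchat’s actual API or data format; every name and value below is invented for illustration.

```python
# Illustrative sketch only: a hypothetical "conversion event" linking an ad
# impression to a later purchase, as described above. Names are invented and
# do not reflect any real platform's systems.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional, Tuple


@dataclass
class ConversionEvent:
    user_id: str                                   # pseudonymous advertising identifier
    ad_served_at: datetime                         # when the ad was shown on the phone
    purchase_at: datetime                          # when a purchase was detected, online or in store
    store_location: Optional[Tuple[float, float]]  # lat/lon from location services, if enabled

    @property
    def time_to_purchase(self) -> timedelta:
        """How long elapsed between seeing the ad and buying something."""
        return self.purchase_at - self.ad_served_at


# Example: an advertiser learns that a user bought something roughly a day
# after seeing their ad, and at which physical storefront.
event = ConversionEvent(
    user_id="anon-1234",
    ad_served_at=datetime(2018, 3, 20, 18, 5),
    purchase_at=datetime(2018, 3, 21, 20, 10),
    store_location=(45.4972, -73.5790),  # a hypothetical downtown Montreal storefront
)
print(event.time_to_purchase)  # 1 day, 2:05:00
```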

“We need to get some critical distance,” says McKelvey. “We’re seeing a lot of concern about practices that are standard across the industry – these are not surprising practices. The fact that we don’t really know what to make of this kind of data issue is really unacceptable at this point.”

Data leaks like the one between Facebook and Cambridge Analytica have happened before: in 2017, the public learned of a massive leak of 1.1 terabytes of voter data by a data firm employed by the United States’ Republican National Committee. “Cambridge Analytica is really a vindication of 10 years of research about what Facebook does with its data,” says McKelvey.

Every time one of these leaks is publicized, there’s public outcry and a sense of collective incredulity that this data is being collected in the first place. And that, McKelvey says, is where it’s time to shift the narrative away from “why?” and into the murky uncharted depths of “how?”

Martin French, an assistant professor of sociology at Concordia and a member of the TAG and Speculative Life research clusters at Milieux, is interested in the notion of “consent” underpinning Facebook’s policies around data collection. “Facebook reportedly changed its policies after 2015 to stop app developers from accessing information on app users’ networks (i.e. the policies that allowed Cambridge Analytica to access data without users’ consent). But, for me, the question is whether Facebook users, in the real world, are actually aware of the changing ways their data is being used, and of the policies that purportedly govern these uses. I would posit, judging from the research that has been done on who reads and understands social media privacy policies, that most users are unaware of how their data is actually being used.

“So, the ‘consent’ that Facebook is talking about is not really a kind of consent that conforms to any dictionary definition of that term,” says French.

French researches these issues in the context of free-to-play mobile social games. He believes that there are important questions yet to be asked about how people disregard, interact with, and make sense of data use policies in their daily lives.

The Milieux Institute is emerging as a place where speculative thinking about the future of AI is being facilitated at a critical moment. Milieux researchers are stepping outside the existing paradigm wherein Facebook is setting its own data use agenda with virtually no oversight. McKelvey and his students and colleagues are beginning to develop policy alternatives that could be used by government to bring about effective regulation over what Facebook can and can’t do with user data. Students in McKelvey’s Media Policy class will be presenting their ongoing research on AI policy to Global Affairs Canada on April 18, along with nine other institutions from across Canada that have been invited to advise officials.

Bart Simon, the Institute’s co-director, wonders if platforms like Facebook are concerned with people as users at all. “What they are trying to produce through data collection is actionable data,” he says. “The simulacrum of social order reduced to some semblance of behaviour/profile is so much more malleable than actual people.”

Meanwhile, Milieux is hosting weekly meetups on AI ethics, convened by Abhishek Gupta, an AI ethics researcher based at District 3. “These meetups are public consultations that are industry-led, so there’s the possibility of actual industry buy-in down the road,” says McKelvey. The meetups are working through the newly drafted Montreal Declaration for Responsible AI, developed through the Université de Montréal at the first-ever forum on responsible AI use, held last fall.

“There are calls for a national data strategy in Canada,” says McKelvey. “Canada doesn’t really have platform governance. Data has been undervalued and under-scrutinized in Canada’s regulatory system.”

Facebook’s position as a friendly behemoth that makes its own rules while crossing its fingers behind its back makes it ripe for forceful governmental intervention. But its influence is so totalizing that it’s going to take more than hot takes to bring about change. After all, Facebook is as much a product of academia as of the open marketplace, and its emissaries are insiders in many of our most trusted institutions. Just last month, Concordia’s own Speaker Series on Digital Futures hosted Kevin Chan, Facebook’s head of public policy in Canada, to talk about the good work being done by the platform. This is where work like that being done at Milieux comes in.

“We have a responsibility at Milieux, because it’s a place that allows us to ask questions that aren’t necessarily being asked elsewhere,” says McKelvey. “This is a real challenge as a researcher. We have a responsibility to push back against the narratives that are prevailing.”
