As US polls loom, backlash rises against behaviour profiling on digital platforms

New York: As angry protesters fill America’s streets, another kind of ferment is intensifying in the digital public square as the country hurtles towards a presidential election in a fraught moment: the demand for a new framework of “digital rights” for individual users caught in the crosshairs of behavioural micro-targeting on the world’s mightiest digital platforms.

“What is at stake here is not only the political autonomy of citizens in a democracy. What’s also potentially at stake is our life chances, so to speak, on the level of what economic opportunities we may or may not receive based on predictions made about our own productivity, our own value in a sense,” Dr. Ramesh Srinivasan, author of ‘Beyond The Valley’ and director of the Digital Cultures Lab at the University of California, Los Angeles, said in a video interview.

Tech firms are increasingly being forced to rethink what goes unchallenged on their platforms, especially their relationships with data brokers and other third parties that hold millions of data points on users, data that users themselves have no power over and no way to understand in the language of everyday life.

Researchers working at the intersection of media, technology and society are ratcheting up their call for individual users of digital platforms to be given granular information on the data aggregated across their digital touchpoints in terms that are intelligible, rather than the “nonsense transparency”, as Srinivasan describes it, of the terms of service currently in place.

According to Srinivasan, “what we’re having done to us is the stitching together of a composite aggregated identity we have absolutely no understanding, visibility, control or ability to do anything about”.

Scholars like Srinivasan are making the case that individual users of digital platforms should receive full disclosure of what has been collected about them, should be able to tell platforms what they do not want collected, and should know how their data is aggregated as well as the specifics of data retention policies.

“Transparency is not simply about quantitative access, it’s about intelligibility as well. And then we move from transparency and visibility to accountability, which is I think the key thing that’s completely missing.”

Soon after US President Donald Trump’s recent executive order escalated his war against social media companies, close watchers of platform technology analysed it through two distinct lenses: the overt issue of content moderation and the underlying stealth operation of behavioural targeting and uninhibited data collection.

While Trump’s gripe is content moderation, domain experts are warning that the Trump-led firestorm is merely a shiny object.

The real trouble, Srinivasan says, is stealth-mode behavioural targeting by the world’s biggest social platforms including Facebook, Twitter, Instagram and Google.

Behavioural micro-targeting is coming under intense scrutiny as digital platforms tighten their hold on societies during a time of multiple catastrophes, including a pandemic that continues to kill people across the world.

“Almost all of President Trump’s political campaign strategy in the next five months and the election will focus primarily on digital advertising which is actually a mask for behavioural modification,” Srinivasan said.

Speaking on the same theme, Dipayan Ghosh, author of the new book ‘Terms of Disservice: How Silicon Valley is Destructive by Design’, said digital platforms will be reined in only when privacy regulation strikes at their business model of “uninhibited data collection”, which in turn will erode their margins. Ghosh co-directs the Digital Platforms and Democracy Project at the Harvard Kennedy School.

“If you were to institute this privacy regulation that says, no Facebook, no Google, no longer can you take whatever data you want without checking with the user (in specific ways), you will see opt-in rates drop drastically,” Ghosh said.

“The burden shouldn’t be on us to protect privacy when we can’t even understand how privacy functions,” Srinivasan said.

“There should be disclosure, at least at a heuristic level, of how these algorithms work.”