Consider this on #WorldPrivacyDay. For more than 60 years now, organisations have been trying to understand and manipulate the way we think, as the first in the series of quotes below illustrates:

“Many of us are being influenced and manipulated — far more than we realize — in the patterns of our everyday lives. Large scale efforts are being made, often with impressive success, to channel our unthinking habits, our purchasing decisions, and our thought processes” (1957, Vance Packard)

“If these practices continue and grow, citizens will not be able to believe that the information they must entrust to others about their private and intimate affairs will be safeguarded. When that happens they will have lost not only a large hunk of privacy but also much of their sense of freedom.” (1964, Vance Packard)

“If it was true during the early dot-com days that “nobody knows you’re a dog,” it’s the exact opposite today. We are ranked, categorized, and scored in hundreds of models, on the basis of our revealed preferences and patterns.” (2016, Cathy O’Neil)

“It is of no collective social benefit to organize information resources on the web through processes that solidify inequality and marginalization.” (2018, Safiya Umoja Noble)

“… mathematics is only a tool, though an immensely powerful one. No equations, however impressive and complex, can arrive at the truth if the initial assumptions are incorrect.” (1962, Arthur C. Clarke)

Yet despite how long our governments have had the opportunity to understand and respond to this trend, we see failures of legislation and regulation everywhere around us – most notably in recent revelations such as the Facebook/Cambridge Analytica scandal. We seem to be stuck with Victorian regulators in a digital economy.

As Jamie Bartlett observes in “The People Vs Tech”, the hyper-personalisation of the messages we receive online, based upon the constant analysis of our personal data and behaviour, means that there is no longer any common public debate. This extends to political messages – not just junk ads for new soap powder.

Our horizons are being narrowed, not widened, by these manipulative uses of technology by unseen and unaccountable marionettists. So we end up with algorithms saying things like “People like you also read articles/blogs/opinions like this …”, which point us at things much like those we already find comfortable and familiar. Far better, surely, to challenge such confirmation bias by highlighting alternatives that aim to open our eyes to different content and perspectives, with algorithms that point us at new things – “People like you never read articles/blogs/opinions like this …”?

At a time when technology could be broadening debate, understanding and empathy, it is instead – by design – reinforcing tribal prejudice and narrow partisanship. Without common public debate – arguably the cornerstone of modern Western democracies – positions become polarised as people hunker down into their tribes. The whole purpose of politics – to find consensual compromise between different people, needs and perspectives to their mutual benefit – is undermined:

“Hyper-personalisation incentivises politicians to make different pledges to different ‘universes’ of users. Based on current trends in ad-tech, within ten years or so, every single voter could receive a completely unique, and personalised advert from the same candidate. But how can we hold anyone to account if there is no clear, single set of promises that everyone can see and understand?” (2018, Jamie Bartlett)

So if you were thinking #WorldPrivacyDay doesn’t matter to you – to the future of the type of society we want to live in and the type of technology we want to design, cultivate and regulate – it’s perhaps time to think again. After all:

“Technology is neither good nor bad; nor is it neutral … technology’s interaction with the social ecology is such that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves.” (1986, Melvin Kranzberg)

Original source – new tech observations from a UK perspective (ntouk)