In his speech at the end of July, the Chief Secretary to the Treasury talked about the need for a “faster, smarter” culture in government and “to make data a key part of policymaking”.

If it sounded familiar, that’s because it was. Similar sentiments have been repeatedly expressed by numerous governments over several decades. My article ‘Maggots, rats and a fork in the road’ explored some of the benefits of a better approach to data that I was discussing in Whitehall four years ago—perhaps it’s what inspired the Chief Secretary? 😉

Groundhog Day

Since at least 1996, both the UK Government of the day and opposition parties have periodically trotted out near-identical announcements about how technology will help provide better services and improve the efficiency of government administration.

A bit like election manifesto promises, these repeated grand ambitions remain largely undelivered, condemned to journey endlessly through an Escher-like landscape only to end up where they began and be announced all over again.

Yet the appropriate use of data has long been essential to designing better public services. The current pandemic has unfortunately demonstrated the lack of good quality data available at the right time to the right people to inform decisions. The National Audit Office recently reported that “despite years of effort and many well-documented failures, government has lacked clear and sustained strategic leadership on data”. 

Data over dogma—a political challenge

A politics based on data, evidence and continuous learning and improvement rather than ideology feels like the antithesis of our existing political system. In After Shock, published earlier this year, I observed that:

… no government has used technology to make open data and insights about its own operations part of their day-to-day processes. This failure is a major obstacle to the modernisation and improvement of democracy. It lets lies and half-truths be easily fabricated, circulated and amplified. It makes it difficult to rebut raw emotion and prejudice and bias with hard, objective evidence.

A move to a new form of politics enabled by technology and informed by transparency, open data and continuous feedback will doubtless fuel the lazy populist accusation that politicians do not know their own minds and allow themselves to be swayed by events. Constant corrections and improvements will be required to reflect facts over opinions, outcomes over dogma, reality over prejudice. 

Politicians who respond to ever-changing evidence, fine-tuning and improving policy on the fly, are likely to be caricatured as weak. Policies based on data instead of dogma offer the best way to meet human, cultural, social, environmental and economic needs but they will take patience and time to prevail. This is why governments need to capture and release reliable open data about the efficacy of political decisions and the performance of the state. It will help us distinguish between the fake and the real, between scientific evidence and irrational belief.

‘Our Future State’, in ‘After Shock’. Ed. John Schroeter.

Indeed, the current pandemic has demonstrated that attempting to ‘follow the science’ can cause frustration and cynicism. The availability, acquisition and use of relevant data has been erratic, both across the UK and the world, not helped by an apparent absence of agreed, consistent data standards, analysis and interpretation from one country to another.

More challenging politically has been the recognition that as new data becomes available, policy needs updating to reflect the changing landscape. A move towards continuous policy improvement would seem to be, in principle, a good thing. However, it can also create confusion when government advice and messaging changes (or, in media parlance, ‘flip flops’), particularly given that expert advisers can interpret the same evidence in different ways or suggest different ways of responding.

Any government flexing its approach based on emerging and evolving evidence is an easy target for those who oppose it: ‘The government doesn’t know what it’s doing!’, ‘Why can’t the government make up its mind!’, ‘Last week it said that the sun goes around us—now it claims that WE go around the SUN!’, and so on. Opposition parties need to be wary, however, of the law of unintended consequences. Tyre-kicking any administration feeling the stones towards a more evidence-based approach—however nascent, inconsistent and sometimes politically opportunistic it may also be—will undermine wider public support for a more objective, scientific and experimental approach to policymaking. This will only make it more difficult for future administrations of a different political colour to adopt more progressive, outcome-based approaches.

Over the past few decades, policymakers have been slow to take advantage of technology to develop a more evidence-based approach—part of a more systemic problem, including failing to deal well with related areas such as regulation. This is partly, I suspect, because there has been both a lack of understanding of technology and its implications, and a reluctance to use technology as a policy lever rather than simply to polish and automate the familiar old way of doing things.

Technology, if it has a place at the leadership table at all, is often relegated to being an implementation mechanism for decisions already made rather than being an integral part of policy discovery, design and continuous improvement. Most politicians have been noticeably absent from the decades-long discussions amongst technologists, lawyers, civil society and others about how engineering decisions can have profound policy implications (and indeed vice versa).

Much of the misnamed ‘digital’ training provided to public sector leaders and employees unfortunately reinforces the wrong narrative. It tends to emphasise tools and roles and infrastructure, such as ‘awareness of digital and agile’, the role of developers and designers, and the building of ‘products’. The more significant aspects, such as the role of technology as a policy lever and the issues of effective governance raised by an age increasingly defined by data and algorithms, are often nowhere to be seen.

The move to a more open, evidence-informed approach to policymaking is not helped either by the poor state of maturity of the current data landscape. Years on from commitments to make better use of data—such as the 1999 Professional Policy Making for the Twenty First Century or the 2007 Power of Information—the availability of robust, reliable, useful data remains patchy (and that’s putting it politely).

Without a move to better, open use of data and technology, with more appropriate and more robust forms of governance, the result is likely to be a growing sense of suspicion about the dodgy misuse of data and technology:

Many governments … walk not in the footsteps of … enlightened organizations, but in those of technology corporations and authoritarian regimes. They intrude into our personal lives, gathering and acting upon unprecedented levels of our data in both public and private spaces, online and offline. They use often unproven technologies that automate inequality and undermine human rights and the exercise of justice in the pursuit of their own financial or political goals.

‘Our Future State’, in ‘After Shock’. Ed. John Schroeter.

Data and technology are political anyway

Melvin Kranzberg’s much-quoted “Technology is neither good nor bad; nor is it neutral” is usually cited without its most important context:

Technology is neither good nor bad; nor is it neutral … technology’s interaction with the social ecology is such that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves.

Kranzberg, M. (1986). ‘Technology and History: Kranzberg’s Laws’, Technology and Culture, vol. 27, no. 3, pp. 544–560. Published by The Johns Hopkins University Press and the Society for the History of Technology.

There are countless examples of public sector uses of poor quality and inconsistent data, including the often biased algorithms and snake-oil ‘artificial intelligence’ systems that utilise them, producing inevitable and unacceptable outcomes. This week’s A Level results, apparently ‘adjusted’ by an algorithm that downgraded 40% of marks, will also raise questions about the quality, appropriateness, policy objectives and unconscious bias of the systems and assumptions being used to analyse and interpret data. Such examples chip away at public trust and hence undermine the potential value of a more progressive approach to policymaking—with some public sector applications even ruled unlawful.

Any move towards better use of data, and the statistical models and algorithms that act upon it, will require another big political change. Transparency: open data, interfaces, models and algorithms. Any publicly funded organisation should have transparency obligations placed upon it.

Imagine how different the outcome might have been had Ofqual openly published its entire model, algorithms and code (not just secondary documentation) for public scrutiny long before it announced the A Level results. Even better if there had been open interfaces to their system so that others could submit and model data to see how it operates in practice. Making all this open could have enabled public and expert review, improvement—and wider consensus about and trust in the model used.
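To make the point concrete, here is a deliberately simplified and entirely hypothetical sketch of what an openly published standardisation model might look like. The function name, parameters and weighting below are invented for illustration and bear no relation to Ofqual’s actual algorithm; the point is that once such code is public, the embedded assumptions become visible and testable by anyone.

```python
# Hypothetical sketch only: a toy grade-standardisation model published
# openly for scrutiny. The names, parameters and logic are invented for
# illustration and do not reflect any real exam regulator's algorithm.

def adjust_grade(teacher_grade: int, school_historical_mean: float,
                 national_mean: float, weight: float = 0.6) -> int:
    """Blend a teacher-assessed grade (0-100) with the school's
    historical performance relative to the national average.

    The `weight` parameter controls how much the school's history
    outweighs the individual assessment -- exactly the kind of
    assumption that open publication would expose to public review.
    """
    school_offset = school_historical_mean - national_mean
    adjusted = teacher_grade + weight * school_offset
    # Clamp to the valid mark range.
    return max(0, min(100, round(adjusted)))

# With the code public, anyone can test how the model treats a strong
# candidate from a historically weaker school...
print(adjust_grade(85, school_historical_mean=55.0, national_mean=65.0))  # → 79
# ...versus the same candidate at a historically stronger school:
print(adjust_grade(85, school_historical_mean=75.0, national_mean=65.0))  # → 91
```

Even a toy model like this shows why openness matters: the downgrading of the first candidate is not a mysterious output but a direct, inspectable consequence of a single weighting choice that reviewers could challenge before results were ever issued.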

As the 1980s Kranzberg quote above highlights, every technical decision has potentially profound political consequences—from which data to capture, to how to model and analyse it, and all the biases, assumptions and (potentially flawed) code that go with it. This is why it’s important that working in the open becomes the default—something that won’t be particularly comfortable for traditional politics, which all too often seems to proceed on the basis of ideological fervour and the imposition of dodgy, soundbite policies that have little basis in evidence. The growing interplay of politics, data, technology, statistical models and algorithms also illustrates why the so-called ‘digital’ training currently provided falls woefully short of what’s needed: such courses are often largely the equivalent of explaining the nuts and bolts of how a car factory assembly line works, rather than focusing on how to design a better form of transportation.

Towards a 21st century progressive politics

The failure to use technology to help disrupt, reshape and re-engineer the way political ideas are conceived, evaluated, implemented, monitored and improved is the very antithesis of ‘progressive politics’. The result is that much of our politics looks increasingly out of place and out of time, stuck in a pre-enlightenment world of superstition and top-down unproven ‘solutions’ to complex socio-economic issues, while around us and well within our reach is an age of science, technology and evidence.

There seems to me to be little ‘progressive’ in the adoration of ideological political dogma over reality. Nothing progressive in failing to explore better options rather than reciting the same tired, failed articles of faith from the past. Nothing progressive about sidelining ‘digital, data and technology’ as being about infrastructure and polishing what already exists rather than systematically rethinking and re-engineering better outcomes.

I’ve long hoped (naively, no doubt) that a political party would spend its years in opposition wisely (or even in government, although that’s a harder challenge), setting aside dogma and tired tribal prejudices to instead draw upon the existing extensive evidence base: researching in depth how and where technology could play a genuinely significant role in improving the quality of life for citizens and public employees alike – not focusing on websites, but applying the best lessons learned from modern organisational structures and processes, particularly their ability to respond to realtime feedback to make continuous service improvements; and using their time to gather essential feedback from the frontline, to map, test, learn, iterate and rethink the interplay of public policy, technology and society.

The political opportunity—and threat—of better public services

Of course, deciding how data and technology are used and applied will always ultimately be a political choice—but at least they offer the prospect of evolving a better form of politics that aspires to work on the basis of evidence, continuous learning and adaptation and outcomes. Anyhow, let me leave you for now with some words from Carl Sagan, who expressed related ideas much more succinctly than me:

Since there is no deductive theory of social organisation, our only recourse is scientific experiment – trying out sometimes on small scales (community, city, and state level, say) a wide range of alternatives …

… Science invites us to let the facts in, even when they don’t conform to our preconceptions. It counsels us to carry alternative hypotheses in our heads and see which best fit the facts. It urges on us a delicate balance between no-holds-barred openness to new ideas, however heretical, and the most rigorous sceptical scrutiny of everything – new ideas and established wisdom. This kind of thinking is also an essential tool for a democracy in an age of change.

The Demon-Haunted World: Science as a Candle in the Dark. Carl Sagan. Ballantine Books, 1996.


This post is a brief extract from a larger work in progress exploring the interplay of technology, politics and democracy. Final ideas, wording, case studies etc. may vary—or even disagree with this text 🙂
