Comment: Government algorithms are undermining democracy

Published on: 9 September 2020

Writing for The Conversation, Mhairi Aitken suggests that the design of government algorithms should be opened up to the people.

Algorithms appear to be in retreat in the UK, for now. Not only did national governments recently U-turn over the use of algorithms to assign the grades of school leavers, but numerous local authorities have scrapped similar technology used to make decisions on benefits and other welfare services. In many of these cases, this was because the algorithms were found to lead to biased and unfair outcomes.

But why didn't anyone realise these algorithms would produce such harmful outcomes before they were out in the real world doing real damage? The answer may be that the public are not adequately represented in the processes of developing and implementing new systems. This not only increases the chances of things going wrong but also undermines public trust in government and its use of data, limiting the opportunities for algorithms to be used for good in the future.

As algorithms are increasingly used in ways which change how people access services and restrict their life choices, there are risks that these new, opaque systems reduce accountability and fundamentally alter public relationships with government. At its core, this is a threat to democracy.

There's a widely held principle that says people's data should be used in a transparent way to maintain their trust. Yet when governments use algorithms to make decisions, if there's any transparency at all it typically doesn't go far beyond one-way communication. Simply telling the public about what is being done is not enough. Instead, transparency should involve asking the public what should and shouldn't be done. In this case, that means opening up real opportunities for dialogue and creating ways for the public to shape the role of algorithms in their lives.
This requires professionals working in this area to concede that they don't have all the answers and that they can learn something from listening to new ideas. In the UK's A-level case, had the exam authorities spent more time speaking to students, teachers and parents in the early stages of developing their algorithms, these bodies may have been able to anticipate and address some of the problems much earlier, and found ways to do things better.

How might this work in practice? The health sector, where there is a long tradition of patient and public involvement in delivering services and shaping research, provides some clues. Members of the public are included on boards determining access requests for medical data. Patient panels have played active roles in shaping how health bodies are governed and advising data scientists in developing research projects.

More broadly, across a range of policy areas, deliberative forms of public engagement can play important roles in shaping policy and informing future directions. Workshops and similar deliberative events can help identify and understand public attitudes, concerns and expectations around the ways algorithms are designed and used. These methods bring together diverse members of the public and emphasise collective, socially minded ways of thinking. This can be a valuable approach for addressing the many complex social and ethical issues relating to uses of algorithms which "expert" or professional teams struggle to resolve.

Even with the best of intentions, no government team is likely to have all the necessary technical and policy expertise, and understanding of the lives of everyone affected, to get things right every time. Public deliberation might consider whether an algorithm is an appropriate tool to use in a particular context, or identify which conditions need to be met for its use to be socially acceptable.
In the case of the A-level algorithm, public engagement could have clarified (in advance) what would constitute a fair outcome and which data should be used to realise that.

Trust the public

It might be argued that algorithms are too complicated or technical for members of the public to understand. But this just serves as a convenient excuse for keeping scientific and policy processes following business as usual. The growing body of research in this area consistently points to both the competence and enthusiasm of the public to actively engage in processes of developing, deploying and governing algorithms. There are even citizen science projects, such as one in Brazil, that bring members of the public together to develop algorithms for public good.

Given the appeal of algorithms to increase efficiency and provide an illusion of objectivity in complex decision-making, it is likely that governments' use of algorithms will continue to increase. If lessons aren't learnt from the A-levels fiasco, the recent protests will also become an increasingly regular feature of public life. Trust in government will be further eroded, and democratic society itself undermined.

Algorithms are not intrinsically bad. There are enormous opportunities to harness the value of data and the power of algorithms to bring benefits across society. But doing so requires a democratisation of the processes through which algorithms are developed and deployed. Fair outcomes are much more likely to be reached through fair processes.

Mhairi Aitken, Senior Research Associate in Digital Ethics. This article is republished from The Conversation under a Creative Commons license.