Earn trust and confidence

Build robust infrastructure for trust

Public services belong to the people.

We are not customers of the state but co-creators of it, so it is essential that public services move no faster than the speed of trust. However, the UK trust landscape is complex, and our next government will face a long list of urgent reforms. Enacting these while building trust and confidence is essential for effective delivery.

The state of trust in the UK

Research by the King’s College London Policy Unit shows the UK public has “internationally low levels of confidence in institutions”, with falling levels of confidence in Parliament and persistently low levels of confidence in government, particularly among millennials. Confidence in the civil service, meanwhile, has rebounded since 2017, with people being twice as likely to say they are confident in the civil service as in either parliament or government.

In addition to this, British Social Attitudes research shows that the combination of increased cost of living and high taxation has led more of the public, across the voting spectrum, to favour a bigger state in which government takes a clear role in providing universal services.

There is ambivalence, however, about data and AI. A tracker study by the Centre for Data Ethics and Innovation shows that less than half of those surveyed see data collection and analysis as “good for society”, and there is a clear preference among those surveyed for “clear and effective risk mitigation strategies”.

Reading these analyses together, it seems clear that the public wants a more effective, proactive, and transparent approach to government and public services. This is not a mandate to “move fast and break things”, however urgent the challenge at hand.

Trust and digital government

Trust is widely held to be a prerequisite for public use of digital government. This is enshrined in principle by the Public Sector Equality Duty and the Data Protection Act, but enforcement of these in the public sector is limited and infrequent. Within government, CDDO’s Digital Ethics Framework offers guidance but there are no meaningful routes for escalation; meanwhile, both the recent AI White Paper and the newly established AI Safety Institute are determinedly industry-facing and provide no red lines or auditing mechanisms for public services.

This lack of good governance around the use and misuse of data and data-driven decisions by public bodies has led to several recent miscarriages of justice, including the Windrush Scandal, in which poor data management contributed to thousands of British citizens being wrongly classed as illegal immigrants; the A-Level Algorithm, in which students from lower socio-economic backgrounds were given downgraded results; the Post Office Horizon Scandal, in which 700 sub-postmasters were wrongly accused of theft; and the failure of the International English Language Testing System, in which thousands of students were incorrectly found to have cheated. It seems probable that there are many other infractions that have yet to come to public notice.

Several contributors to these recommendations shared experiences of low-trust environments slowing down technology adoption. Trust in data use and data security is a particular blocker, with Dr Jess Morley noting that:

The NHS and the Government have badly damaged public trust in the use of healthcare data for secondary purposes (research and analytics). The national opt-out rate is >5%… This loss of trust, and therefore willingness to share data for secondary purposes, is a direct result of the repeated Government failures of 2013-2015, and more recently the two attempts at GPDPR during the summer of 2020/2021. The same mistakes were made in both instances: lack of communication with the public, lack of co-design with the public, no investment in technical infrastructure to enable privacy protection, no communication of clear governance rules setting out expectations (e.g., for private company use), assumptions that the public would be happy with being simply informed.

Meanwhile, the CDEI research notes that people with low levels of digital confidence, who may be subject to other kinds of social and economic exclusion, are also more likely to be wary of the use of AI and data.

How to rebuild trust and confidence?

Philosopher Onora O’Neill says that “To judge trustworthiness, we need to judge honesty, competence, and reliability”. For most people, these will be evidenced by a combination of factors including:

  1. personal experiences and those of friends and family
  2. reports in the media and on social media
  3. the behaviour and effectiveness of institutions.

Better stories and experiences for more people are a consequence of better institutions; while this may seem like a long-wave approach, it is not enough to run PR campaigns and seed positive media without also investing in underlying reform. Currently, public services produce a data ecosystem that is opaque to the public and their proxies; this is no basis for trust and must be urgently addressed.

Three steps to rebuild trust and safety

The next government should match the initial £100m investment in the AI Safety Institute to establish a robust approach to the safe and trustworthy delivery of public services. The following three steps could be taken in the first 100 days of a new government, laying the foundations for an improved delivery and regulatory regime.

1. Conduct an Independent Expert Review of Public-Service Regulation

The next government must acknowledge that the current regulatory environment is not fit for purpose and undertake an independent expert review that maps the current state of public-service regulation, identifies the gaps and inconsistencies, and gathers evidence from a wide range of stakeholders, including civil society organisations and people with lived experience of the outcomes of poor regulation. The findings of this review must form the basis of regulatory reform.

2. Establish mechanisms for transparent and ongoing public engagement - the social infrastructure for trustworthiness

Rather than one-and-done citizen juries or participatory panels, regulatory standards must be informed by an understanding of the ongoing and emergent ways public service delivery shapes people’s lives. In particular, it is important to understand that the social impacts of technologies do not emerge in a uniform way but “slowly, then all at once”, often affecting individuals or particular communities before they appear to be a more generalised harm.

To mitigate this and reduce the cost and prevalence of failure, we recommend:

3. Rolling out technical infrastructure for trustworthiness

Steps to begin building the infrastructure for trust: