RECOMMENDATION 6
Earn trust and confidence
Build robust infrastructure for trust
Public services belong to the people.
We are not customers of the state but co-creators of it, so it is essential that public services move no faster than the speed of trust. However, the UK trust landscape is complex, and our next government will face a long list of urgent reforms. Enacting these while building trust and confidence is essential for effective delivery.
The state of trust in the UK
Research by the Policy Institute at King’s College London shows the UK public has “internationally low levels of confidence in institutions”, with falling levels of confidence in Parliament and persistently low levels of confidence in government, particularly among millennials. Confidence in the civil service, meanwhile, has rebounded since 2017, with people twice as likely to say they are confident in the civil service as in either Parliament or government.
In addition to this, British Social Attitudes research shows that the combination of increased cost of living and high taxation has led more of the public, across the voting spectrum, to favour a bigger state in which government takes a clear role in providing universal services.
There is ambivalence, however, about data and AI. A tracker study by the Centre for Data Ethics and Innovation (CDEI) shows that less than half of those surveyed see data collection and analysis as “good for society”, and there is a clear preference among those surveyed for “clear and effective risk mitigation strategies”.
Reading these analyses together, it seems clear that the public wants a more effective, proactive, and transparent approach to government and public services. This is not a mandate to “move fast and break things”, however urgent the challenge at hand.
Trust and digital government
Trust is widely held to be a prerequisite for public use of digital government. This is enshrined in principle by the Public Sector Equality Duty and the Data Protection Act, but enforcement of these in the public sector is limited and infrequent. Within government, CDDO’s Data Ethics Framework offers guidance but there are no meaningful routes for escalation; meanwhile, both the recent AI White Paper and the newly established AI Safety Institute are determinedly industry-facing and provide no red lines or auditing mechanisms for public services.
This lack of good governance around the use and misuse of data and data-driven decisions by public bodies has led to several recent miscarriages of justice, including:
- the Windrush Scandal, in which poor data management contributed to thousands of British citizens being wrongly classed as illegal immigrants
- the A-Level Algorithm, in which students from lower socio-economic backgrounds were given downgraded results
- the Post Office Horizon Scandal, in which 700 sub-postmasters were wrongly accused of theft
- the failure of the International English Language Testing System, in which thousands of students were incorrectly found to have cheated.
It seems probable that there are many other infractions that have yet to come to public notice.
Several contributors to these recommendations shared experiences of low-trust environments slowing down technology adoption. Trust in data use and data security is a particular blocker, with Dr Jess Morley noting that:
The NHS and the Government have badly damaged public trust in the use of healthcare data for secondary purposes (research and analytics). The national opt-out rate is >5%… This loss of trust, and therefore willingness to share data for secondary purposes, is a direct result of the repeated Government failures of care.data in 2013-2015, and more recently the two attempts at GPDPR during the summer of 2020/2021. The same mistakes were made in both instances: lack of communication with the public, lack of co-design with the public, no investment in technical infrastructure to enable privacy protection, no communication of clear governance rules setting out expectations for (e.g., private company use), assumptions that the public would be happy with being simply informed.
Meanwhile, the CDEI research notes that people with low levels of digital confidence, who may be subject to other kinds of social and economic exclusion, are also more likely to be wary of the use of AI and data.
How to rebuild trust and confidence?
The philosopher Onora O’Neill says that “to judge trustworthiness, we need to judge honesty, competence, and reliability”. For most people, these will be evidenced by a combination of factors including:
- personal experiences and those of friends and family
- reports in the media and on social media
- the behaviour and effectiveness of institutions.
Better stories and experiences for more people are a consequence of better institutions. While this may seem like a long-wave approach, it is not enough to run PR campaigns and seed positive media without also investing in underlying reform. Currently, public services produce a data ecosystem that is opaque to the public and their proxies; this is no basis for trust and must be urgently addressed.
Three steps to rebuild trust and safety
The next government should match the initial £100m investment in the AI Safety Institute to establish a robust approach to the safe and trustworthy delivery of public services. The following three steps could be embarked upon in the first 100 days of a new government, laying the foundations for an improved delivery and regulatory regime.
1. Conduct an Independent Expert Review of Public-Service Regulation
The next government must acknowledge that the current regulatory environment is not fit for purpose and undertake an independent expert review that maps the current state of public-service regulation, identifies the gaps and inconsistencies, and gathers evidence from a wide range of stakeholders, including civil society organisations and people with lived experience of the outcomes of poor regulation. The outcomes of this review must form the basis of regulatory reform.
2. Establish mechanisms for transparent and ongoing public engagement: the social infrastructure for trustworthiness
Rather than one-and-done citizens’ juries or participatory panels, regulatory standards must be informed by an understanding of the ongoing and emergent ways public service delivery shapes people’s lives. In particular, it is important to understand that the social impacts of technologies do not emerge in a uniform way but “slowly, then all at once”, often affecting individuals or particular communities before they appear to be a more generalised harm.
To mitigate this and reduce the cost and prevalence of failure, we recommend:
- Introducing public-facing transparency features across the government estate, enabling the public and civil servants to understand what is happening to their data as they use services. This could be a two-step process, starting with clear and understandable content design that can be rolled out while the necessary technical infrastructure to support it is being developed (see step 3, below); the sketch after this list shows what such a record might contain.
- Creating a Civil Society Observatory that funds civil society organisations to co-ordinate early warnings of harms experienced by service users and surface them to policymakers and delivery teams.
- Introducing clear and effective mechanisms for monitoring, enquiry and redress in every public service, perhaps drawing on the model of the National Data Guardian in the NHS.
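To make the transparency recommendation concrete, here is a minimal sketch, in Python, of what a public-facing data-usage record might contain. Every field name, the example values, and the rendering format are illustrative assumptions, not an existing or proposed government schema.

```python
# A sketch of a citizen-readable data-usage record. All fields are
# illustrative assumptions, not a real government data standard.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataUsageRecord:
    """One citizen-readable entry: which service used which data, and why."""
    service: str        # the public service that accessed the data
    data_category: str  # plain-language description of the data used
    purpose: str        # plain-language reason for the access
    legal_basis: str    # the statutory basis relied on
    accessed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def as_plain_language(self) -> str:
        """Render the entry as the kind of content-designed text a citizen could read."""
        return (
            f"On {self.accessed_at:%d %B %Y}, {self.service} used your "
            f"{self.data_category} to {self.purpose} "
            f"(legal basis: {self.legal_basis})."
        )

# An invented example of the kind of entry a person might see in their account.
record = DataUsageRecord(
    service="a benefits service",
    data_category="income details",
    purpose="check your eligibility for support",
    legal_basis="social security legislation",
)
print(record.as_plain_language())
```

The point of the two-step process is that content like `as_plain_language()` can be designed and tested with the public now, while the verifiable infrastructure that guarantees such records are complete and untampered-with is built underneath it.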
3. Roll out technical infrastructure for trustworthiness
We recommend the following steps to begin building the technical infrastructure for trust:
- Updating the Service Standard to include rights-based outcomes, not only user needs, and frameworks for anticipating unintended consequences.
- Establishing continuous discovery teams that are tasked with understanding and responsibly delivering emerging technologies in the context of changing public expectations and a shifting technology landscape. These multi-disciplinary teams would embed a “test and learn” approach that shapes policymaking, shares learning across national, devolved and local government, and works closely with the Civil Society Observatory to understand and monitor outcomes. We propose funding these teams for five years in the first instance, with priority given to services that already use high volumes of automation and data-driven decision-making.
- Piloting verifiable data structures that bring transparency and oversight to the use and processing of data. This will give citizens, policymakers and civil servants better visibility of how data is used across the technology stack, improving accountability and demonstrating trustworthiness. Such approaches are already used in the private sector – for example, Google’s Trillian creates tamper-evident logs to offer assurance that data is unchanged – and are used to deliver data-usage trackers in public services delivered by the Estonian, Indian, and Chilean governments. The first sketch after this list illustrates the underlying idea.
- Separating efforts to link data from mechanisms for facilitating data reuse, helping departments deliver better outcomes for people. At present, each government department is incentivised to create its own mechanisms to link data, undermining progress and wasting resources. Departments often combine their efforts to link data with bespoke arrangements for making it available for reuse, with limited opportunities for public oversight and scrutiny, damaging trust. Labour should end this cycle of waste by investing in the creation of an Office for Data Linkage, under the banner of the UK Statistics Authority, whose sole purpose is to link datasets across government. Arrangements for access and reuse should be transparent and overseen by the Civil Society Observatory. The second sketch after this list illustrates one widely used linkage technique.
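The first sketch below shows the tamper-evidence property that verifiable data structures provide. Trillian itself uses Merkle trees; this simpler hash chain illustrates the same idea rather than describing Trillian’s API, and the log entries are invented examples.

```python
# A minimal hash-chained log: altering any past entry changes every
# subsequent hash, so tampering is detectable on verification.
import hashlib
import json

def entry_hash(prev_hash: str, entry: dict) -> str:
    """Hash an entry together with the hash of the previous entry."""
    payload = prev_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class TamperEvidentLog:
    def __init__(self):
        self.entries: list[dict] = []
        self.hashes: list[str] = ["genesis"]  # anchor for the hash chain

    def append(self, entry: dict) -> str:
        self.entries.append(entry)
        self.hashes.append(entry_hash(self.hashes[-1], entry))
        return self.hashes[-1]

    def verify(self) -> bool:
        """Recompute the whole chain; any edited entry breaks it."""
        h = "genesis"
        for entry, expected in zip(self.entries, self.hashes[1:]):
            h = entry_hash(h, entry)
            if h != expected:
                return False
        return True

log = TamperEvidentLog()
log.append({"service": "tax service", "action": "read income record"})
log.append({"service": "benefits service", "action": "check eligibility"})
assert log.verify()
log.entries[0]["action"] = "something else"  # simulate tampering
assert not log.verify()                      # the chain detects it
```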
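The second sketch shows one widely used technique an Office for Data Linkage might draw on: deriving a pseudonymous linkage key with a keyed hash (HMAC), so departments can have records matched centrally without exchanging raw identifiers. The secret, the identifier format, and the normalisation rule are all illustrative assumptions, not a proposed national scheme.

```python
# Deterministic, privacy-preserving record linkage via a keyed hash.
# The secret would be held only by the central linkage body.
import hashlib
import hmac

LINKAGE_SECRET = b"held-only-by-the-linkage-office"  # illustrative secret

def linkage_key(national_identifier: str) -> str:
    """Derive a stable pseudonym: the same person always maps to the same
    key, but the key cannot be reversed without the secret."""
    normalised = national_identifier.strip().upper()
    return hmac.new(
        LINKAGE_SECRET, normalised.encode(), hashlib.sha256
    ).hexdigest()

# Two departments submit the same person's identifier independently;
# the linkage office can match the records without either department
# ever seeing the other's data.
assert linkage_key("QQ123456C") == linkage_key(" qq123456c ")
```

Because the derivation is deterministic, every department’s submission for the same person yields the same key, which is what allows a single linkage body to replace each department’s bespoke arrangements while keeping raw identifiers out of circulation.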