A digital policy memo for the Minister's red box
Some personal thoughts on what should be a priority for the new Labour Government on digital policy and regulation (This blog represents my own views and none of the organisations I work with)
Well, the UK public have spoken: we have a new Labour Government, with a big majority and a mandate for change. It is a significant shift in the UK political landscape - we now have a centre-left government for the first time in 14 years. So what should the new Government focus on in terms of digital policy?
(**A few updates were made to this blog following further developments over the weekend after the election**)
The new Secretary of State for Science, Innovation and Technology is Peter Kyle. He doesn’t have a previous background in digital, so there is not much to go on in terms of what tech perspective he may bring to the role. We await details of other junior Ministers in DSIT.
So, in the meantime, I have penned a blog to get the ball rolling about what the Minister’s red box may need to contain on data and AI policy.
A Bill focused on AI seems likely to be a priority and this will also bring an opportunity for targeted data protection reforms as well.
The ongoing challenge is to strike the right balance between unlocking the economic and social benefits of innovating with data and safeguarding against risks and harms - and ultimately to create policies that direct regulation and other interventions towards building genuine trust and confidence in how data is used. I’d argue that this means we need to promote regulation that enables an outcome-driven approach and accountability, not mere technical compliance - plus effective human-centric impact assessments and safety-by-design solutions that support innovation.
Easier said than done in a world where the pace of change is frenetic, where the UK is part of a global digital economy, and where the horizontal nature of data in our lives makes policy and evidence complex. We also need to strengthen our approach to assessing the multi-dimensional benefits and risks.
As an overarching message, there is much to be drawn from the policy approach advocated for by Connected by Data: “We envision a new relationship with technology that is safe, inclusive, and empowering”. There are also many good points in their progressive vision document on skills and data infrastructure that are key areas of consideration as well.
The Tony Blair Institute for Global Change (not connected to government, but likely to be essential reading for Labour policy wonks) have produced “Governing in the Age of AI: A New Model to Transform the State”. This paper sets out a vision of how AI can transform the state’s delivery of public services. The paper also argues that up to £40 billion can be saved each year with the technology as it exists now. All these claims have to meet the cold, hard reality of public services now though, including data quality challenges and whether AI can meet the standards of accuracy and fairness needed for public sector deployment (where the public often have no choice). A high level of transparency and public engagement will be vital to make this switch (see more on this below). The paper also calls for an “AI Mission Control” to be set up in Number 10.
The Labour Government also quickly rejected Tony Blair’s call for digital ID cards when this emerged as an issue the weekend after the election. Plans for a new federated digital identity system were contained in the Data Protection and Digital Information Bill, which fell before the election. These plans did not include a compulsory element; they focused on consumer choice and on creating more trust and confidence in the UK market for identity, while drawing on some public sector data to help prove identity.
I haven’t covered cyber risk in this blog, but there is a great overview of what we can expect from the new government on cyber over on The Record.
Labour’s 2024 Manifesto
The Labour Manifesto was relatively light on detail related to digital policy, but committed to the following:
Creation of a new Regulatory Innovation Office - to help update regulation and co-ordinate this across sectors.
Binding regulation intended to ensure “the safe development and use of AI models”. To apply to a small core of companies developing the most powerful AI models.
Remove planning barriers for new data centres.
Create a National Data Library to bring together existing research programmes and help deliver data-driven public services (also see this great blog by Gavin Freeguard on the topic).
Legislate to ban the creation of sexually explicit deepfakes.
Harness the power of AI in the National Health Service.
Accelerate the introduction of the Online Safety Act and build on it.
Coroners would be given new powers to access information held by technology companies after a child’s death.
AI growth and regulation - top of the pile?
A top priority for Labour is economic growth, and realising the potential of the UK’s digital economy will be a key component of their strategy in this area. As the Manifesto indicates, we can therefore expect their approach to AI regulation to be cautious about the risks of prescriptive regulation. There will be points to draw from the product-safety approach of the EU AI Act, but also lessons to learn about how it will work in practice.
There may be pressure from trade unions to focus on the impact of AI in the workplace, with valid concerns covering a range of issues - including automated recruitment, workplace monitoring, performance management and use for “hire and fire” in the “gig economy” and other sectors. The Trades Union Congress (TUC) has also published a draft AI Bill to address the risks and harms of AI-powered decision making in the workplace, and to ensure that everyone benefits from the opportunities associated with AI at work.
Peter Kyle also indicated, back in February 2024, that a Labour government would replace a voluntary testing agreement between tech companies and the government with a statutory regime, under which AI businesses would be compelled to share test data with officials.
My key points on AI:
Principles. Build on the AI principles developed by the previous government (and the international standards developed by the OECD and the Council of Europe). There is a case to place AI principles on a statutory footing, with a duty for all regulators to have regard to them. There could then be a period of review and monitoring - to assess how regulators are applying the principles and striking the balance between innovation and safeguards, and whether further statutory regulation is needed. This review could also inform questions about how big the AI regulation gap is in the UK.
Regulation of high risk LLMs. There is a strong foundation of evidence and assessment from the UK’s world leading AI Safety Institute and a good opportunity to continue and develop its mandate to assess current and future risks posed by the most powerful AI models. The government will need to use this evidence to inform when and how regulation should be introduced, including what role independent conformity assessment and certification should play in the UK market. There is also a question as to whether the AI Safety Institute should have more independence from government.
Coordination of public sector AI regulation. There is a need for more coordination of AI regulation in the public sector - a companion to the Digital Regulation Cooperation Forum could be set up for the public sector. Use of AI in health, benefits, policing and migration could be transformational but also poses significant risks and intersections with other rights. A coordinating body could include the ICO, EHRC, National Audit Office, PHSO and the Local Government Ombudsman. A key priority should be safeguards and standards in the procurement of AI.
AI safety-tech, privacy enhancing technologies and governance. There is an opportunity to develop a new programme to accelerate the assessment of, and investment into, technologies that mitigate AI risks and harms - including retrieval-augmented generation, unlearning and the role of synthetic data. Policy interventions should also focus on the balance needed between technology-driven mitigations and the role of human-driven governance, safeguards and interventions.
AI and employment. There is an important opportunity to run a specific consultation and multi-stakeholder process on the role of AI in the workplace, to inform whether specific legislation is needed or additional focus from existing regulators. One option would be to create a statutory requirement for the Information Commissioner’s Office (ICO) to develop a GDPR code of practice on AI and employment, in consultation with other regulators, industry and trade unions.
Public engagement and consultation about data and AI
The last Conservative government had a poor track record on effective public engagement about data and AI, particularly with civil society.
Labour’s plans for using AI and other emerging technologies in the public sector, including the NHS, will create significant questions about data sharing and the role of commercial companies. Learning lessons of previous data sharing programmes, including GP data sharing, is vital. We can’t lose sight of the people and communities these programmes must serve.
Transformational technology programmes in the public sector must include enough time and space for public engagement, and for feeding this into the design of the programmes. While this can inform meaningful transparency and communication, it should also inform decisions about safeguards, independent audit and the role of third parties.
Complex issues like the use of AI to deliver public services require effective and deep public engagement. For major programmes citizens’ juries should be used and there is a positive indication that Labour are already considering this policy mechanism. As IPSOS note in this blog “Deliberative processes such as citizens’ juries and assemblies excel in addressing complex issues where trade-offs are inevitable and public opinion is paramount”.
There are also some important examples from the Citizens Juries CIC (a Manchester University based social enterprise dedicated to designing and running citizens’ juries). The ICO used this process in developing its guidance on explainable Artificial Intelligence.
Data Protection - stick or twist?
The previous Government’s Data Protection and Digital Information Bill (DPDI) ended up being messy and complex; a more focused reform package would probably have passed before the election. As the whole Bill fell, we can now pick through the bones and reassess what the key objectives would be.
The economic estimates of savings from the DPDI seemed wildly optimistic, given so many multi-national companies indicated they would retain global data protection governance programmes based on GDPR when operating in the UK.
While it was unlikely that the DPDI Bill would ultimately have led to the European Commission withdrawing the UK’s data protection adequacy decision under the EU GDPR, it created unnecessary concern in the EU - particularly on issues such as ICO independence and changes to automated decision-making protections. The bigger risk also lay further down the line, from a challenge in the CJEU.
There is also a good opportunity for the UK to take on further leadership in the international data policy space - continuing the work to plan for the UK’s membership of the Global Cross Border Privacy Rules system (CBPR) and advocating how the CBPR needs to reform, to bring the standards closer to UK GDPR and create effective interoperability.
As the new Government have talked about improved relations with the EU, there should also now be an opportunity to realise Article 769 of the EU-UK Trade and Cooperation Agreement, which says: “The Parties shall cooperate at bilateral and multilateral levels, while respecting their respective laws and regulations. Such cooperation may include dialogue, exchanges of expertise, and cooperation on enforcement, as appropriate, with respect to personal data protection.” This could include a cooperation mechanism between the ICO and the European Data Protection Board under Article 50 of the GDPR. This makes sense given the ongoing similarity between the UK and EU laws and could enable cooperation on key areas such as AI and children’s privacy.
With all that in mind, I can see value in a more targeted set of data protection reforms (including some elements that should be reintroduced from the DPDI):
ICO reform. Reintroduce the changes to reconstitute the ICO as the Information Commission, with a statutory board. The need for a modernised structure and governance for the ICO is still a key reform, to bring the ICO in line with other UK regulators and provide a more resilient and sustainable system of decision making.
Statutory codes of practice. Undertake a review of outstanding areas from the DPDI Bill - including legitimate interests, data protection impact assessments, automated decision making, subject access and research - to consider which areas would be better addressed by new ICO statutory codes of practice, to provide greater clarity and also allow evolution over time. In particular, creating clarity over the use of legitimate interests is still of key importance, given the challenge of using other lawful bases for AI training and other development activities with new technology.
International transfers. Reintroduce the DPDI Bill’s “data protection test” for international data transfers under UK GDPR, to create a more proportionate and sustainable regime for assessing the risks of personal data transfers from the UK to third countries. This may not create big savings for multinational companies, but it creates an important distinction between UK and EU GDPR, and between past and future cases that come from the CJEU. It also sends an important international signal that there is a different way to approach risk in international data transfers.
Children’s privacy. Update S.123 of the Data Protection Act to require the ICO to include the risks of AI to children in the Age Appropriate Design Code. Add a further requirement for a new statutory code of practice covering educational technologies in schools.
Consultation and impact assessments. Update Article 35 of the UK GDPR to require public authorities to publish data protection impact assessments. This is an important step to improve public accountability about the use of AI and other technologies in the public sector. The lack of transparency is evidenced in the work of the Public Law Project on the Tracking Automated Government register.
Privacy by design for processors. Update Article 28 of the UK GDPR to extend the requirements of privacy by design to processors, to address the increasingly blurred line between controllers and processors in the AI lifecycle and the challenges that exist for controllers in addressing data protection risks via procurement processes.
Cookies and adtech. Reintroduce the exemption in the Privacy and Electronic Communications Regulations (PECR) for use of cookies for analytics purposes - a proportionate step still needed to address user consent fatigue and allow organisations to benefit from analytics, though with safeguards on data re-use. Commit to leading an international policy initiative (through an organisation such as the OECD) to address the challenges of third-party cookie regulation via technical solutions that use internationally agreed standards and reduce the privacy intrusion of third-party AdTech models.
Marketing and charities. Reintroduce the marketing “soft opt-in” for charities for email/text marketing, to bring in line with the private sector.
Smart data and digital identity. Reintroduce the smart data and digital identity provisions in the DPDI, assessing whether independent statutory regulation for digital identity should be added.
Online Safety - plug the gaps and ensure evidence-based policy decisions
Labour have been supportive of the Online Safety Act (OSA), which Ofcom are now working hard to implement.
It could be very easy for a new Labour government to make some eye-catching announcements about online safety, focused on age requirements for social media and smartphone bans. Quick fixes should be resisted in favour of an engaged policy approach that looks at evidence from a wide range of sources on the relationship between technology use and children’s well-being. The research undertaken by Professor Sonia Livingstone at LSE is one example.
Key steps that the government should consider for online safety:
Plug the gap in the Online Safety Act to enable accredited researchers to apply for access to data held by major platforms, drawing on the approach in the Digital Services Act.
Undertake consultation and assessment of the ongoing risks of misinformation and whether the OSA needs to be updated to address these risks.
Fund a long-term research programme to systematically assess the impacts of the OSA, particularly focused on how it affects children’s digital lives. This research would need to start soon to establish an effective baseline.
Also essential reading - Professor Vicki Nash from the Oxford Internet Institute outlines the challenges around online safety that will need to be addressed by the next government.
Freedom of information, open data and transparency
The UK Freedom of Information Act (and its companion law in Scotland) has proved remarkably durable for the nearly 20 years it has been in force. Most notably, in 2015 a government review mooted many changes, such as request charges and removal of the public interest test from certain exemptions. In the end, a big media campaign and public engagement in support of FOIA resulted in a favourable report from Lord Burns about how the Act was working.
The right to request information from public bodies (generally without charge) is a well recognised cornerstone accountability mechanism - it can focus on issues ranging from the sale of school playing fields to multi-billion-pound decisions to reform public services. The key concern of requestors is still delay; in addition, the requirements for proactive disclosure are hard to police and may not reflect the needs of the public, the media or civil society.
With public sector finances tight, we clearly need to learn lessons from the open data push that took place under the Coalition Government from 2010, which ultimately fell back as publication of datasets became unsustainable over time. Open data needs a timely reboot, as it can also support the supply of quality datasets for research and innovation.
Labour’s new approach to government will include “Injecting more accountability into government” (see Mission Driven Government document). This document talks about new data transparency measures including a possible statutory requirement to report on progress to Parliament and new requirements to publish data on performance and delivery.
We also know that the Prime Minister’s Chief of Staff, Sue Gray, has extensive experience of FOI from her time as Director General of Propriety and Ethics at the Cabinet Office. I had a number of interactions with Sue during my time at the ICO, including on the use of private email in government (see the Michael Gove case from 2011). We can expect her advice about FOI to be tempered with this deep experience of how it interacts with collective responsibility in Cabinet discussion and the safe space for policy. But I would also hope there is insight about when it is important to be open about new government policy and when that safe space is needed - the FOI battle can often reduce trust and confidence and damage public engagement (I would cite the long-running FOI decisions we at the ICO made about the free schools policy in the early 2010s as a good example).
Given the Prime Minister’s commitment to public service in his early speeches, now would also be a good time to review and strengthen guidance for Ministers on the use of private messaging services for official business.
My key FOIA points:
Expand FOI and EIR to key public sector contractors. Undertake consultation about expanding FOIA and the Environmental Information Regulations (EIR) to contractors who deliver public services - based on the options proposed in the ICO’s 2019 report to Parliament. Also consult on what should be done to ensure any extensions are fair for SMEs and charities working with the public sector. One option is to start with an initial wave of FOIA designations under section 5 for the largest contractors.
Extend FOIA and EIR to key services that are essential to the public. To improve transparency and accountability - consult on extending FOIA and EIR to housing associations and other relevant services.
Time limits under FOIA. Amend FOIA to create a statutory time limit for conducting internal reviews, to draw a clearer line in the sand on timeliness and help reduce delays.
Reform FOIA appeals. Remove the Information Tribunal from the appeals process under FOIA. Appeals would go to the Upper Tribunal on points of law only (similar to Scotland) and similar to most appeals systems for FOI around the world. Allocate the costs saved to the ICO to invest in case work quality monitoring and proactive monitoring of public authority performance.
Proactive disclosure reform. Undertake a review of FOIA publication scheme requirements and how public bodies are performing against them, with a view to reforming the approach to ensure that all proactive disclosure requirements are feasible and achievable for public bodies, but then effectively enforced by the ICO. This would also involve public, media and civil society engagement about which datasets are essential for public engagement, discourse and accountability. What does meaningful and inclusive transparency mean in practice for the public?
This would also include the dataset requirements in FOIA and the Re-use of Public Sector Information Regulations. The review should also consider whether the publication scheme concept in FOIA is feasible and how proactive disclosure requirements should be set as a digital service obligation. A review should also consider the feasibility of the ICO requiring publication registers based on open standards and APIs to assess whether larger public authorities meet proactive disclosure obligations.
ICO reform. Include a requirement for the new Information Commission to include a Deputy Commissioner for Freedom of Information on the statutory board. This ensures that FOI functions, and the importance of transparency, have a strategic position in the direction of the ICO.
Steve Wood is Director and Founder of PrivacyX Consulting and former UK Deputy Information Commissioner.
(This blog represents my personal views and none of the organisations I work with)