Month: December 2025

CSEP Cross-Institutional Partnerships: The Centre for Emerging Technology and Security (CETaS)

What is the primary research theme guiding this collaboration?

This project analyses the current UK AI assurance marketplace specifically for the defence and national security (D&S) sector. A thriving AI assurance sector has the potential to enable AI adoption and become a key driver of economic growth in the UK. We set out to describe the current state of AI assurance within D&S organisations, highlighting strengths, challenges and possible mitigations to enable safe and effective AI adoption. Ultimately, the paper draws lessons from the D&S sector to support the growth of a robust AI assurance marketplace across the broader UK economy.

Can you outline the ways this project supports CSEP’s goals?

This research is aligned with CSEP’s mission of improving UK competitiveness, as focusing on the role of AI in boosting economic growth will be essential in the coming years. By partnering with CETaS and drawing on its expertise in AI and defence and national security (D&S) topics, this project will demonstrate both centres’ commitment to advancing robust economic growth and AI safety as mutually reinforcing goals. We also anticipate synergies with other CSEP projects, such as work on the UK’s cybersecurity, and future collaboration on a sector plan for growing AI assurance in D&S for national benefit.

How might the research influence policy, industry, or society?

The Department for Science, Innovation and Technology (DSIT) has identified growing the UK’s third-party AI assurance sector as a key priority within the AI Opportunities Action Plan. A thriving assurance marketplace would also help deliver the UK Government’s mission to kickstart economic growth through the adoption of AI technology, as set out in the UK’s Prosperity Mission. This research supports the UK Government’s AI ambitions by identifying: (i) key drivers of demand and approaches to AI assurance in this sector, (ii) supply and demand limitations, and (iii) recommendations to grow a thriving and robust AI assurance marketplace.

Are there any early findings or achievements that stand out?

We found it striking that there is strong recognition of the importance of AI assurance across the D&S sector, yet the level of maturity across organisations varies widely. We identified significant pockets of excellence where good practice is well established, but there remains an open question about how far organisations should rely on in-house versus external assurance moving forward. D&S also offers critical lessons for other UK sectors, such as articulating sector-specific requirements, developing initiatives to upskill key stakeholders, and creating certification schemes for AI assurance providers.