The CQC is Listening: Presentation given to the CQC Board on Artificial Intelligence (AI)

I recently had the honour of making a presentation to the CQC Board at its new offices just opposite the City of London Stadium in Queen Elizabeth Olympic Park. I was nervous, never having appeared before this august body, and was heartened, as I spoke, by the nods of agreement from Kate Terroni, the Chief Inspector for Adult Social Care, and from Chris Day, CQC’s Director of Engagement.

The subject of my talk was artificial intelligence (AI), and my presentation is available on the Care England website. In preparation, I had consulted various care providers and tech suppliers, as well as some futurologists. As I have said previously in this newsletter, while the process of digital transformation in care is in its early stages, this should not stop us from looking at the possibilities for digitally-assisted care in the future.

Here are some of the points I made:

  • The time when AI tools take over the direct delivery of care is a long way off, if it ever comes. The CQC Chair, Peter Wyman, asked about this, and there was no way of giving him a definitive answer.
  • AI is currently being incorporated into many tools, such as digital social care records, acoustic monitoring, and remote monitoring, in order to analyse patterns, especially when the data from these digital tools is triangulated. This analysis then informs AI-enabled decision-making tools for staff to interact with.
  • There is some work on logistic regression[1] and machine learning to identify risk factors and cohort characteristics, which can then be used to run models on the data (a sketch of the general idea follows this list).
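
As a rough illustration of that kind of modelling (not drawn from the presentation), the sketch below fits a logistic regression to a hypothetical cohort table to see which factors are most associated with a falls outcome. The file name, column names and outcome variable are all invented for the example, and scikit-learn and pandas are simply convenient tools for the sketch.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical cohort data: one row per person, "fall_within_90d" as the outcome.
df = pd.read_csv("cohort.csv")
features = ["age", "mobility_score", "medication_count", "night_time_activity"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["fall_within_90d"], test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# The fitted coefficients give a first indication of which factors raise or lower the modelled risk.
for name, coef in zip(features, model.coef_[0]):
    print(f"{name}: {coef:+.2f}")
print("held-out accuracy:", model.score(X_test, y_test))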

There is very little work going on in the care sector on what are called “neural net” type approaches (i.e. building an unsupervised care delivery tool), for many reasons. One of the main reasons is that, if something were to go wrong, regulating an unsupervised, AI-made decision raises issues which are too difficult to legislate for. An analogy often given is the question of why driverless cars are not more prevalent, even though they have been shown to be more reliable than humans in some respects and can reduce motor vehicle fatalities. The main reason is that the legal and insurance ramifications when anything goes wrong cannot be legislated for at present[2], so regulation and legal liability are big questions to tackle. We do know that an AI health tool would need to be classified as a regulated medical device, and we know that the CQC, NICE[3], the HRA[4] and the MHRA[5] are in very early discussions about this (see below).

One area in which there is much optimism is natural language processing (NLP), the process by which computers gain the ability to understand text and spoken words in much the same way human beings can. This can help interpret people’s expressions of what they want and give a voice to people who cannot express themselves easily to care and health professionals. It could be especially helpful in interpreting the voice of people with learning disabilities when they require health and care support. The organisation Coproduce Care (https://www.coproducecare.com/about), which is spearheaded by Care England member The Manor Community in Bristol, is carrying out some interesting work in this area. The CQC is looking to develop its understanding of how this work can support people with a learning disability in having their voices heard through AI.
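
To make that concrete, here is a very small, generic illustration of the kind of building block NLP provides: a pre-trained model scoring short utterances as positive or negative. This is only a sketch, not the Coproduce Care or CQC work; the example utterances are invented, and the choice of the Hugging Face transformers library is an assumption made purely for the illustration.

from transformers import pipeline

# Load a default pre-trained sentiment model (downloaded on first use).
classifier = pipeline("sentiment-analysis")

utterances = [
    "I don't like the noise at night, it keeps me awake.",
    "The new carer is kind and listens to me.",
]
for text, result in zip(utterances, classifier(utterances)):
    print(f"{result['label']:>8}  {result['score']:.2f}  {text}")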

Other areas which could be of interest include work being carried out in the NHS by Methods Analytics[6] on reviewing complaints made to the NHS. It is using NLP to assess complaint letters to try to understand why people complain and to be able to intervene so that the specific problem does not occur again. This sort of work could be replicated in adult social care as well. There is also potential in work being done by digital social care record supplier KareInn[7] with Ally Labs[8] on how acoustic monitoring and care recording can produce algorithms which help predict care needs.
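
For a flavour of how complaint text might be grouped by theme, here is a generic sketch using TF-IDF features and k-means clustering from scikit-learn. It is not Methods Analytics’ actual pipeline, and the input file (assumed to hold one anonymised complaint per line) is hypothetical.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical input: one anonymised complaint per line.
with open("complaints.txt") as f:
    complaints = [line.strip() for line in f if line.strip()]

vectoriser = TfidfVectorizer(stop_words="english", max_features=5000)
X = vectoriser.fit_transform(complaints)

kmeans = KMeans(n_clusters=5, random_state=0, n_init=10)
kmeans.fit(X)

# Show the most characteristic words in each cluster as a first cut at "themes".
terms = vectoriser.get_feature_names_out()
for i, centre in enumerate(kmeans.cluster_centers_):
    top = centre.argsort()[-8:][::-1]
    print(f"theme {i}: " + ", ".join(terms[j] for j in top))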

I also made the point that we in the care sector hope that CQC is going to be able to use some of the techniques described above to help refine its inspection processes. We also understand that the Multi-Agency Advice Service (MAAS)[9] might undertake further work on this issue and interview health and social care providers to understand their information requirements around AI. Care England intends to try to work with the CQC and the MAAS on this in the future.

On this area of development, the CQC is listening to us, and that in itself is very encouraging.

If you want to share your views on this article, please contact Daniel at dcasson@careengland.org.uk.

Daniel Casson, Care England’s Adviser on digital transformation (www.careengland.org.uk/digitalblog). 

Daniel is curating the March 2021 Laing Buisson Digital Care Tech 21 Conference (DCT21; Care England members can use the 15% discount code DCT1521) and is co-host of the Talking Care Podcast (https://talking-care.com/), as well as being one of the team at Digital Social Care (www.digitalsocialcare.co.uk).


[1] An algorithm produced from collated evidence, triangulating various outcomes.

[2] For an interesting discussion on this issue go to https://www.frickey.com/blog/pros-cons-driverless-cars (accessed 25 September 2021).

[3] The National Institute for Health and Care Excellence – https://www.nice.org.uk/ (accessed 25 September 2021).

[4] The Health Research Authority – https://www.hra.nhs.uk/ (accessed 25 September 2021).

[5] The Medicines and Healthcare products Regulatory Agency – https://www.gov.uk/government/organisations/medicines-and-healthcare-products-regulatory-agency (accessed 25 September 2021).

[6] https://methodsanalytics.co.uk/ (accessed 25 September 2021).

[7] https://kareinn.com/ (accessed 25 September 2021).

[8] https://www.allycares.com/ (accessed 25 September 2021).

[9] A collaboration between NICE, the MHRA, the HRA and the CQC. See https://www.nhsx.nhs.uk/ai-lab/ai-lab-programmes/regulating-the-ai-ecosystem/the-multi-agency-advice-service-maas/ (accessed 25 September 2021).