Agenda item

Social Care and Education quarterly dashboard

The Strategic Director for Social Care and Education updates the Commission on the new Social Care and Education quarterly performance dashboard.

Minutes:

The Strategic Director for Social Care and Education updated the Commission on the new Social Care and Education quarterly performance dashboard, which was designed to support scrutiny by offering improved access to data and enabling more effective oversight and questioning. It was noted that:

  • The dashboard was initially created in Excel format and included a range of financial, workforce and performance metrics across Children's, Adults and Education services. 
  • Although the data had not yet been fully verified, it provided a working example of what the dashboard would contain and how it might be used. 
  • Plans were in place to host the dashboard on a web page in due course to improve navigation and usability. 
  • The dashboard aimed to show direction of travel and included comparisons with national data and statistical neighbour groups to help contextualise performance. 
  • It was intended to be updated on a quarterly basis, with some time lag in data availability expected. 
  • Key content would include financial information, budget variances, and a series of selectable graphs to help interpret trends. 
  • The dashboard would also provide context for local data, comparative analysis, and actions being taken in response to trends. 
  • Further detail would be included on external providers, such as CQC ratings, usage and cost data relating to the most frequently used and most expensive providers. 
  • Information on volumes across different care settings would also be included, covering both adult domiciliary care and children’s services, including those leaving care. 
  • A new set of statistical neighbour comparators had been introduced, including areas such as Birmingham, Coventry, Luton, Manchester, Nottingham and Wolverhampton, although it was noted that not all were considered directly comparable. 


In discussions with Members, it was noted that: 

  • Members welcomed the transparency of the new dashboard and the opportunity it presented for improved scrutiny and questioning. 
  • Members noted the importance of having governance arrangements in place to ensure that patterns such as rising placement costs or increased use of unregulated settings were escalated and addressed. 
  • It was confirmed that the dashboard was intended to support strategic-level oversight, with operational data and early intervention continuing to be handled by service teams. 
  • The dashboard aimed to democratise access to information, allowing elected members and scrutiny bodies to examine trends independently and raise questions. 
  • Members asked whether a model similar to performance oversight panels used elsewhere, such as in Cambridgeshire, could be introduced locally to investigate red flag areas in more depth. It was explained that several forums were already in place, including departmental management meetings, lead member briefings and the Education, Health and Care Board, where performance data was scrutinised and turned into actions. 
  • It was acknowledged that historically, a wide range of performance information had not been made available on a regular basis. The dashboard aimed to change this and encourage broader challenge from different perspectives. 
  • Concerns were raised that the focus should not only be on monitoring but also on acting to improve long-term outcomes. Members asked whether outcomes, rather than outputs, would be measured and tracked. 
  • It was confirmed that outcome measures would be included where possible, and that the dashboard could evolve over time based on what data was available and what members wanted to see. 
  • Members highlighted the importance of ensuring the data supported a “triangulated” approach to understanding performance and was not treated as a standalone source of truth. 
  • The limitations of comparative data were discussed, with members noting that some statistical neighbours were not truly comparable to the local context. 
  • There was support for the use of a live, accessible dashboard, but members raised questions about how to encourage regular engagement with the data beyond formal meetings. 
  • It was noted that there was a risk of drawing incorrect conclusions by focusing too narrowly on data without the broader context. 
  • An example was shared from a past inspection in which unfamiliar data requests had revealed previously unconsidered issues, reinforcing the importance of diverse data perspectives. 
  • It was emphasised that the dashboard should be used to prompt questions and generate discussion, rather than as a tool to provide definitive answers. 
  • Questions were raised about agency staffing levels and whether there were plans to reduce reliance on agency staff in order to promote cost savings and improve continuity of care. It was explained that agency use was minimised wherever possible, though some reliance remained in hard-to-recruit areas such as Level 3 social work roles. Across adult social care and safeguarding, fewer than 20 agency staff were in post at any one time within a workforce of around 470. 
  • Members welcomed the inclusion of workforce data but requested further breakdowns, such as distinctions between children’s and adults’ staffing, and between domiciliary and residential care provider data. 
  • It was explained that children’s agency staffing had been prioritised due to higher levels of use, while it had not been a significant issue in adult services. However, members’ suggestions could be explored further through the Commission’s annual workforce item. 
  • A request was made for clearer time series data to avoid over-fixation on small changes. It was noted that the dashboard did contain time series graphs to highlight more statistically significant trends. 
  • Members supported the dashboard as a valuable starting point for improving scrutiny and emphasised the need to develop habits around year-on-year comparisons to better understand change over time. 


AGREED:


1.    That the report be noted and that Members welcomed the idea of the dashboard. 

2.    The Virtual Schools report to be circulated. 

3.    The ratio between residential and domiciliary care to be added to the work programme.  

4.    Agency rates to be added to the next workforce item. 

5.    Diverse by Design to be added to the work programme.  
