

PATIENT SAFETY

Quality Observatories: using information to create a culture of measurement for improvement

Samantha Riley and Katherine Cheema, NHS South East Coast Quality Observatory

Correspondence: Samantha Riley and Katherine Cheema, NHS South East Coast – The Quality Observatory, York House, 18–20 Massetts Road, Horley, Surrey RH6 7DE, UK. Email: samantha.riley@southeastcoast.nhs.uk

Clinical Risk 2010; 16: 93–97. DOI: 10.1258/cr.2010.010002

Abstract

In June 2008 High Quality Care for All was published, a blueprint for the delivery of NHS services in the future with quality of care put firmly at the heart of the principles guiding the NHS. Measurement of quality was identified as a crucial aspect of achieving delivery, enabling clinical teams to focus on where they need to improve most and to monitor the effects of interventions and initiatives. To date, the South East Coast Quality Observatory has developed over 80 benchmarking tools, products and analyses which contribute to the goal of delivering high-quality analysis in a way that can be understood and utilized effectively by all levels of staff in a variety of clinical and managerial settings, to evidence and drive improvement and innovation across organizations and local health economies. Patient safety, as a fundamental aspect of high-quality care, makes up a significant part of this output, focusing on key topic areas that can deliver the best, and safest, results for patients. The following sets out details of a selection of the patient safety areas that the Quality Observatory has focused on and describes the analysis undertaken and, crucially, the key outcomes linked to its publication and usage.

Introduction: what is a Quality Observatory?

In June 2008 High Quality Care for All [1] was published, a blueprint for the delivery of NHS services in the future. This milestone document put quality of care at the heart of the principles guiding the NHS and made clear that measuring this quality was a crucial aspect of delivery, enabling clinical teams to focus on where they need to improve most and monitor the effects of interventions or initiatives. It was explicitly acknowledged that high performance in all aspects of quality is nearly always present in organizations that proactively and effectively measure their activity and use the information gathered to drive improvements forward. One aspect of embedding this measurement agenda in the NHS as a whole was the requirement for each of the 10 strategic health authorities (SHAs) to establish a formal ‘Quality Observatory’ which built on existing analytical arrangements. Quality Observatories were expected to provide a range of functions, with the key stakeholders being clinical teams operating on the front line. These functions primarily included:

† To enable and support benchmarking across regions;
† To support the development of metrics to enable frontline staff to effectively monitor and improve their services;
† To identify opportunities for clinical staff to innovate and improve services, providing the quantitative evidence to support change where required.

In NHS South East Coast, this function had been in place for some time, providing a free service to NHS professionals within the region. Since 2007 there had been a substantive ‘Knowledge Management team’ focused on helping organizations and individuals get more from the wealth of data collected by the NHS. Based at the SHA headquarters, the team provided analysis and benchmarking to the NHS across Kent, Surrey and Sussex on a broad range of subjects including performance, efficiency and a variety of clinical services. In addition to the provision of analysis and benchmarking, the Quality Observatory has provided the following services:

† An education and training function both for analysts and ‘customers’ of information. An example of the training provided is the ‘de-mystifying data for clinicians’ training session;
† How to develop clinical indicators and combine them to understand more about a service;
† Identifying best practice and evidencing the variation between teams and organizations;
† Acting as a conduit for local issues/news to be raised with relevant national bodies and vice versa;
† A bespoke service to individual clinicians and teams requiring support with analysis/development of measures.




To date, the South East Coast Quality Observatory has developed over 80 benchmarking tools, products and analyses which adhere to the goal of delivering high-quality analysis in a way that can be understood and utilized effectively by all levels of staff to evidence and drive innovation and improvement across organizations and local health economies. Patient safety, as a fundamental aspect of quality care, has made up a significant part of this output, focusing on key topic areas that can deliver the best results for patients. The remainder of this paper sets out details of some of the patient safety areas that the Quality Observatory has focused on and describes the analysis undertaken and the key outcomes linked to its publication and usage.




Healthcare-associated infections (HCAIs)

Figure 1 ‘Old-style’ reporting for Clostridium Difficile Associated Disease (CDAD) cases. Source: NHS South East Coast (available in colour online)


When creating a suite of tools and analyses for this area of patient safety, a key issue was the varied audience that would wish to access the information. Directors of infection prevention and control, their nurse-led infection control teams, ward-based staff, performance managers and commissioning managers are just a selection of the range of staff groups and backgrounds that would need to be able to access what can be complex information in a simple way. The way in which information is presented is very important: it needs to be visually appealing and at the same time enable the user to quickly and clearly understand what the data are saying. Often, sadly, this is not the case.

Figure 1 shows a sample of how HCAI information was historically presented. Trends over time were difficult to gauge at a glance, and the variation in cases was hard to compare with other factors, for example the time of year. The majority of stakeholders viewing these data would look at a single figure for one month, and perhaps one either side of it, and draw conclusions and potentially make judgements. The use of colour coding also invited users to make an immediate judgement, often inappropriately, based on a limited amount of data.

Figure 2 Healthcare acquired and associated CDAD reporting dashboard for PCTs. Source: NHS South East Coast (available in colour online)




The Quality Observatory worked with clinicians to develop a ‘dashboard’ which allowed multiple views of data on one sheet. Figure 2 gives a simple example, where the user can see month-by-month cases against defined limits, the cumulative position for these cases against limits, and a rate-based view which allows comparison between organizations. The dashboard uses colour and graphs to make it easier to understand what the data are saying, whether the position is improving or declining, and whether thresholds are being met. The dashboard also features controls that allow users to select their specific organization, aggregate views across the counties or region, or other organizations they wish to compare their own performance against. This makes the dashboard simple and intuitive to use; users can access a wealth of data, including commentary, at literally the touch of a button.

Aside from the regular reporting for MRSA and Clostridium Difficile Associated Disease (CDAD), the application of specific improvement techniques has enabled stakeholders to gain an in-depth knowledge of the variation in cases. Initially this has been applied to CDAD, but it could be used for a multitude of patient safety events, such as catheter-related urinary tract infections (UTIs), in-hospital patient falls or drug administration errors. The concept of looking at a system over time and assessing the variation within it, rather than making a judgement on a single number or against a set target, is one that is well established in the manufacturing industry and can be applied to great effect in healthcare. By identifying unusual, or ‘special cause’, variation within a system, factors that affect it can be identified and acted upon. Figure 3 is an example of how the Quality Observatory applied this technique to CDAD data with the development of an ‘early warning system’. The graph is annotated to show points of unusual variation, in this case mostly linked to bank and school holidays. Again, the graphical presentation and the non-judgemental approach are far more conducive to engaging all stakeholders in the use of information to evidence their actions and be proactive in managing the system.

Figure 3 Annotated statistical process control chart illustrating weekly CDAD cases. Source: NHS South East Coast (available in colour online)
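To make the technique concrete, the following is a minimal sketch of how 3-sigma control limits can be derived for weekly case counts and ‘special cause’ points flagged. It is written in Python purely for illustration (the paper does not state which tools the Quality Observatory used), it assumes a c-chart, i.e. approximately Poisson-distributed counts, and the function names and data are hypothetical.

import numpy as np

def c_chart_limits(counts):
    # Centre line and 3-sigma limits for a c-chart of event counts
    # (assumes counts are approximately Poisson distributed).
    counts = np.asarray(counts, dtype=float)
    centre = counts.mean()
    sigma = np.sqrt(centre)               # Poisson: variance equals the mean
    upper = centre + 3 * sigma
    lower = max(centre - 3 * sigma, 0.0)  # counts cannot fall below zero
    return centre, lower, upper

def special_cause_weeks(counts):
    # Indices of weeks whose counts breach the control limits --
    # candidates for annotation on an 'early warning' chart.
    centre, lower, upper = c_chart_limits(counts)
    return [i for i, c in enumerate(counts) if c > upper or c < lower]

# Hypothetical weekly CDAD case counts for one organization
weekly_cases = [8, 10, 7, 9, 11, 8, 21, 9, 7, 10, 8, 9]
print(c_chart_limits(weekly_cases))
print(special_cause_weeks(weekly_cases))  # week 6 breaches the upper limit

The point of the chart is exactly as described above: a week is only singled out when it falls outside the limits the system itself generates, rather than against an arbitrary target.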

Serious untoward incidents

Perhaps the most explicit example of where effective display of information has been key to widening engagement with stakeholders is the area of reporting serious untoward incidents (SUIs). SUIs are a key facet of monitoring and improving patient safety. They are reported via a central system and can be viewed and analysed by the SHA. However, the information is largely textual and can be hard to review within and between organizations. The development of a ‘SUI dashboard’ (currently in draft) enabled a more coherent view of the incidents reported and includes a variety of measures concerning timely reporting and closure, all of which can be broken down by incident type. Similar to the HCAI dashboard already discussed, users can filter the information as they wish using drop-down menus, with the display updating automatically. One of the most useful aspects of the analysis is the inclusion of National Reporting and Learning Service (NRLS) information. This adds additional detail to the overall picture but also gives a comparison between the numbers of incidents reported via each route.
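As an illustration of the kind of breakdown the dashboard’s drop-down menus drive, the sketch below filters a table of incidents to one organization and summarizes reporting timeliness by incident type. The field names and records are invented for illustration and are not the structure of the actual central SUI system.

import pandas as pd

# Hypothetical extract of SUI records (the real system's fields
# and values are not specified in this paper).
suis = pd.DataFrame({
    "trust":          ["A", "A", "B", "B", "B"],
    "incident_type":  ["Fall", "Medication", "Fall", "Fall", "Pressure ulcer"],
    "days_to_report": [1, 4, 2, 9, 3],
})

# Equivalent of choosing a trust from a drop-down menu...
selected = suis[suis["trust"] == "B"]

# ...then breaking the picture down by incident type:
# how many incidents, and how quickly they were reported.
summary = selected.groupby("incident_type")["days_to_report"].agg(["count", "mean"])
print(summary)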





Figure 4 SUI dashboard including NRLS data. Source: NHS South East Coast (available in colour online)





The use of multiple data sources around a single subject is a powerful way to build a full picture of an organization or service and prevent bias towards one route or methodology of reporting. The dashboard also removes the need to examine individual reports to gain an over-arching view of a trust’s approach to patient safety incident reporting, thus saving time and engaging a wider audience in the process and importance of reviewing this kind of patient safety information. Figure 4 shows the SUI acute trust dashboard, with the NRLS data included in the boxes with the blue background. The figure shows all SUI incident types and has been anonymized. The use of multiple indicators from national data sources is well exemplified above, but as yet does not show how these can be triangulated with the wealth of local data available through acute and primary care providers. The Quality Observatory will be exploring this next phase of development with commissioners and providers.
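A minimal sketch of combining the two reporting routes: joining counts reported as SUIs with counts reported to the NRLS for the same trusts, so the routes can be compared side by side. The trust identifiers and figures are hypothetical.

import pandas as pd

# Counts by trust from each reporting route (invented figures)
sui_counts = pd.DataFrame({"trust": ["A", "B", "C"], "sui_reported": [12, 30, 7]})
nrls_counts = pd.DataFrame({"trust": ["A", "B", "C"], "nrls_reported": [15, 28, 25]})

# An outer join keeps trusts that appear in only one source
combined = sui_counts.merge(nrls_counts, on="trust", how="outer")

# A large gap between routes is a prompt for discussion about
# reporting culture, not a judgement in itself.
combined["gap"] = combined["nrls_reported"] - combined["sui_reported"]
print(combined)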

Safer, Smarter Nursing Metrics

The issue of patient safety is nowhere more crucial than on the ward, where staff at the ‘front line’ are directly responsible for ensuring their patients are treated to the highest quality standards. The ‘Safer, Smarter Nursing Metrics’ programme aimed to build a product that included measures that could be ‘owned’ by the staff actually providing care and that measured how their influence could tangibly improve outcomes for patients, for example drug administration errors, in-hospital falls and pressure ulcers. The programme had clinical sponsorship from Directors of Nursing and has subsequently been developed, in conjunction with the Quality Observatory, by individual trusts to add more indicators and produce analysis at the ward level.

Figure 5 shows the regional Safer, Smarter Nursing Metrics dashboard, which has six indicators of nursing quality, all of which were agreed by clinical stakeholders; two are sourced from national data-sets, the remainder from trusts’ clinical risk management systems. The simple presentation of the trust data on the blue bars against the regional rate on the red line allows busy clinicians to quickly assess their position and look at their trend over time. In general, across the region, indicators have shown a downward trajectory since the establishment of this programme. Whether or not this improvement has been directly influenced by having the data available, freely accessible and easy to use is not proven; however, as High Quality Care for All states, ‘we know that a defining characteristic of high performing teams is their willingness to measure their performance and use the information to make continuous improvements’. A key learning point of the Safer, Smarter Nursing Metrics programme is the importance of ensuring clinical teams are actively involved in developing and agreeing indicators. The success of this programme can be seen in the fact that organizations continue to submit local data to the SHA for analysis, something that would be unlikely to happen if they did not see the benefit.

Figure 5 Safer, Smarter Nursing Metrics dashboard. Source: NHS South East Coast (available in colour online)
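To illustrate the trust-versus-region comparison in Figure 5, the sketch below computes an indicator rate per trust and a pooled regional rate. The indicator (falls per 1,000 occupied bed days), the denominators and all figures are assumptions for illustration; the paper does not define how each metric is constructed.

import pandas as pd

# Hypothetical monthly falls data per trust
data = pd.DataFrame({
    "trust":    ["A", "B", "C"],
    "falls":    [14, 9, 22],
    "bed_days": [10500, 8200, 15300],
})

# Rate per 1,000 occupied bed days for each trust (the blue bars)
data["rate_per_1000"] = data["falls"] / data["bed_days"] * 1000

# The regional rate (the red line) pools numerators and
# denominators rather than averaging the trust rates, so larger
# trusts carry proportionally more weight.
regional_rate = data["falls"].sum() / data["bed_days"].sum() * 1000
print(data)
print(f"Regional rate: {regional_rate:.2f} per 1,000 bed days")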

Conclusions

The work described here provides just a sample of what a Quality Observatory can produce to support organizations and clinical teams across a region. The absolute requirement, particularly in the current financial climate, to focus on quality improvement and productivity in the NHS means that monitoring and evidencing good practice in patient safety effectively is of paramount importance. It is very important to make explicit that any analysis should provide a starting point for discussion rather than prompt an immediate (and often inappropriate) judgement. A key role for the Quality Observatory in South East Coast has been to educate our customers to move away from a more ‘judgemental’ approach towards a constructive approach which involves the use of measurement-for-improvement techniques and discussion between teams and organizations. This approach should ensure that the wealth of data that the NHS has at its disposal is turned into intelligence that will not only make practitioners’ lives easier, but patient treatment safer.

Different Quality Observatory models are being established by each of the regions; further information can be obtained from the Quality Observatory website hosted by NHS South East Coast (http://www.qualityobservatory.nhs.uk). If you are interested in any of the dashboards detailed in this article, or would like to learn about the other models and products which have been developed by the team, look at the South East Coast tab and register as a user.

Reference

1. Department of Health. High Quality Care for All. London: Department of Health, 2008.


