Knowledge Matters Volume 5 Issue 2

Page 1

Volume 5 Issue 2 June 2011

Welcome to Knowledge Matters

Hello everyone and welcome to another exciting edition! In this issue I am pleased to say that we introduce a number of new products to you, including dashboards for A&E, diabetes, fractured neck of femur and mental health. The last of these contains data for all providers in England who submit a Mental Health Minimum Dataset, so it will be of interest to readers outside of South East Coast. This dashboard has already been shared with the SHA Medical Directors and with Quality Observatories in other SHAs to cascade to local organisations.

Something else I am very pleased to report is that this issue contains our first international article: a piece from Norway which describes an approach to quality improvement that has delivered impressive results. A key component of the approach has (of course) been measurement: the establishment of effective measures which provide a baseline, are then tracked over time, and are shared with the teams making the changes.

The range of products that we are developing for GP Commissioning Consortia continues to grow. You'll learn more about two key products in this issue (pages 4 and 17). In addition we have developed an activity trend explorer tool which can provide a forecast (in terms of activity and the associated finance) to the end of the financial year. More information on this in August. We have also developed a course aimed at increasing the data and information skills and knowledge of staff working in consortia (clinical and non-clinical). We intend to deliver this course over five modules (face to face or via Webex), but are happy to deliver it as a full-day programme if requested. In the next issue we will describe what's on offer in more detail, and we will be keen to hear from teams who are willing to be 'guinea pigs' for the initial delivery of this exciting course.
Finally, we are currently identifying existing products which could easily be adapted to provide useful information to consortia. A good example of this is the dementia dashboard, which was developed some years ago and is now being re-designed with GP commissioning consortia in mind. More on this next time…

Inside This Issue:

The Quality and Outcomes Framework (QOF) Tool ... 3
The Diabetes Dashboard ... 4
The A&E Dashboard ... 5
Skills Builder—constructing a box plot in Excel ... 6
Healthcare Improvement in Norway ... 8
Ask An Analyst ... 10
Bad Chart of the Month & Meet the Quality Observatory ... 12
The British Lung Foundation ... 13
The Role of the Information Centre ... 14
The Fractured Neck of Femur Dashboard ... 16
GP Commissioning Consortia KPIs ... 17
Analysis Ancient and Modern ... 18
The Mental Health Dashboard ... 19
Dangers of Red, Green Amber ... 20
News ... 22

twitter.com/SECSHAQO issuu.com/SECQO

www.QualityObservatory.nhs.uk


Page 2

The Quality and Outcomes Framework (QOF) Tool
By Fatai Ogunlayi & Nikki Tizzard, Quality Innovation & Productivity Analysts

As many people will know, the Quality and Outcomes Framework (QOF) is a voluntary annual reward and incentive programme for all GP surgeries in the UK. It is not about performance management but rather about resourcing and then rewarding good practice, and although participation is optional the number of practices that take part is very high. Information is collected from participating practices via the Quality Management Analysis System (QMAS), a system that calculates achievement against a national set of evidence-based indicators. Practices aiming to deliver high quality care are awarded points based on their achievement against the various indicators, which are divided into four 'domains': Clinical, Organisational, Patient Experience and Additional Services.

As well as measuring overall performance, the QOF can also make it easier to monitor long-term conditions such as diabetes, stroke and COPD, and provides useful information about the prevalence of such conditions. It promotes good practice in areas such as record-keeping, staff training and general management of the surgery, as well as patients' access to a GP.

Another part of the QOF is exception reporting. This allows practices to pursue improvements in quality of care without being penalised in certain pre-defined situations, for example where a patient does not attend a review or where a particular medication cannot be prescribed for some reason.

The QOF Tool

The Quality Observatory have developed a new online tool which uses readily available published information to provide comparative analysis of QOF results across all four nations of the UK: England, Scotland, Wales and Northern Ireland. The tool will soon be available for all to use, and some of its features will include:

Various Measures

The tool will provide users with the flexibility to view QOF data at any level, from country down to practice. Results can be displayed at a total level, by domain, by a specific disease area or at individual indicator level, on any of the measures below:

- Points Achieved
- Underlying Achievement
- Exception Rate
- Prevalence Rate
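As a rough illustration of how these four measures relate to the published QOF data, the sketch below derives them for a single indicator. This is not the tool's actual code: the function name, field names and the linear points-scaling thresholds are assumptions for illustration (real thresholds vary by indicator).

```python
# Illustrative sketch only: deriving the four QOF-style measures for one
# indicator. Thresholds and names are invented for this example.

def qof_measures(numerator, denominator, exceptions, register, list_size,
                 max_points, lower=0.40, upper=0.90):
    """Return the four measures for one indicator (all as fractions,
    except points)."""
    underlying_achievement = numerator / denominator
    # Exception rate: excepted patients as a share of those otherwise eligible.
    exception_rate = exceptions / (denominator + exceptions)
    prevalence_rate = register / list_size
    # Points are commonly scaled linearly between lower and upper thresholds.
    fraction = (underlying_achievement - lower) / (upper - lower)
    points_achieved = max_points * min(max(fraction, 0.0), 1.0)
    return {
        "underlying_achievement": underlying_achievement,
        "exception_rate": exception_rate,
        "prevalence_rate": prevalence_rate,
        "points_achieved": round(points_achieved, 1),
    }

result = qof_measures(numerator=180, denominator=200, exceptions=20,
                      register=250, list_size=5000, max_points=10)
```

The same arithmetic applies at any level of aggregation; summing numerators, denominators and exceptions before dividing gives the domain- or organisation-level figures.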

Benchmarking

Users have the option to benchmark any level of any organisation against, for example:

- All organisations
- Country
- SHA / NHS Board
- PCT / LHB / CHP / LCG
- GP Consortia

Quality.Observatory@southeastcoast.nhs.uk



Page 3

- ONS Clusters
- Practice

Users can also select up to 3 different peers to compare against, and display benchmarks such as minimum, maximum, median, percentiles, etc.

Trends Analysis

A time-series analysis is also included, which allows users to select any practice and any indicator and monitor performance over several years, looking at any of the measures previously mentioned (exception rate, prevalence rate, etc.).

Overview

An overview across the entire QOF dataset, or across a particular domain or disease area, is also available at any level from country down to practice. Again, there is the option to select any of the measures (the example shown is a clinical domain overview, looking at prevalence rates for a particular practice).

Possible Future Development

- GPCC: currently only GP consortia in the South East Coast region have been included; however, as the picture of UK-wide consortia emerges, the QOF toolkit will be updated accordingly.
- Admission rates compared against QOF achievement: although this is by no means a direct comparison, it may provide a high-level picture of how GP practice behaviour might influence acute activity levels.
- Comparison of prevalence rate in QOF (clinical domain) vs estimated prevalence rate: where available, this would allow users to see the potential 'gap' between QOF observed prevalence and estimated population prevalence.

For further information, or to be informed when the new QOF toolkit is published, please contact quality.observatory@southeastcoast.nhs.uk



Page 4

Diabetes Dashboard
By Katherine Cheema, Specialist Information Analyst

Diabetes is a condition where the amount of glucose in the blood is too high because the body cannot use it properly. This is because the pancreas does not produce any insulin, or not enough, to help glucose enter the body's cells, or because the insulin that is produced does not work properly. There are 2.8 million people diagnosed with diabetes in the UK and an estimated further 850,000 people who have the condition but have not yet been diagnosed. Nine billion pounds a year is spent on treating diabetes and its complications, which accounts for about 10 per cent of total NHS spending.

The Diabetes Dashboard has been created by the Quality Observatory to help identify areas for improvement in diabetes care. A range of indicators is displayed from both primary and secondary care that together help describe the current status of diabetes services across practices and GP Commissioning Consortia (GPCC) in the South East Coast region. It can also help in finding other localities that are performing particularly well.

There are two parts to the dashboard: primary care and secondary care. The primary care charts display information relating to 6 key QOF diabetes indicators, including Underlying Achievement and Exception Rates. Using drop-down menus you can select your desired GPCC and, within that, your chosen practice. Results for the SHA are also shown as a benchmark. There are currently three years' worth of QOF data available in the dashboard. In the secondary care part, you can select a GPCC and see various performance measures relating to secondary care for diabetes, including admissions, complications and length of stay. Data is displayed either monthly or on a rolling 12-month basis, and there is also an option to see the SHA figures, again for benchmarking purposes.

The dashboard is designed to be used as a tool alongside local knowledge and understanding of the diabetes landscape. Included in the tool is a guide on how to use it, which also describes the data sources used and the methodology employed to calculate the indicators. For more information do get in touch with me at Katherine.cheema@southeastcoast.nhs.uk
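The "rolling 12-month" view mentioned above is essentially a windowed sum over monthly counts. A minimal sketch of that calculation is shown below; the admission counts are invented for illustration, and the dashboard itself is of course an Excel tool rather than Python.

```python
# Illustrative sketch: rolling 12-month totals from a monthly series.
# Once 12 months of data are in the window, each new month produces a total
# covering itself and the preceding 11 months.

from collections import deque

def rolling_12_month(monthly_counts):
    """Return (month_index, rolling_sum) pairs once a full window exists."""
    window = deque(maxlen=12)  # oldest month drops out automatically
    out = []
    for i, count in enumerate(monthly_counts):
        window.append(count)
        if len(window) == 12:
            out.append((i, sum(window)))
    return out

# 13 months of invented admission counts (Apr year 1 .. Apr year 2):
counts = [30, 28, 35, 31, 29, 33, 30, 27, 32, 34, 31, 29, 36]
totals = rolling_12_month(counts)
```

The rolling view smooths out seasonal variation, which is why dashboards often offer it alongside the raw monthly series.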



Page 5

The A&E Dashboard By Simon Berry, Specialist Information Analyst The new A&E dashboard has been developed to enable comparison between South East Coast trusts using the recently introduced A&E quality indicators.



- Time in department for admitted patients (max, median and 95th percentile)
- Time in department for non-admitted patients (max, median and 95th percentile)
- Time to initial assessment for patients conveyed by ambulance (max, median and 95th percentile)
- Time to treatment (max, median and 95th percentile)
- % patients who left without being seen
- Unplanned re-attendance within 7 days

In addition to these indicators, and recognising the somewhat variable quality of the A&E dataset, the dashboard also includes a number of data quality measures to highlight where issues may exist that can be resolved. These are:

- % with valid NHS number
- % with missing or invalid data for the timing point
- % where end time is the same as start time for the timing point (indicates where batching of data entry may result in inaccurate data)

[Charts: 'A&E Provider Clinical Quality Indicators: % Unplanned Reattendances within 7 Days', one panel per South East Coast provider (ASPH, BSUH, ESHT, EKH, Frimley, Kent D&G, M&TW, Medway, RSC, SASH, Surrey, Sussex, WSHT), April 2010 to March 2011]
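The data quality measures listed above are simple percentage checks over attendance records. As an illustration only (the dashboard itself is built in Excel, and the record structure and field names below are invented, not the SUS data model), they might be computed like this:

```python
# Illustrative sketch: the three A&E data-quality measures for one timing
# point (here the departure time). Record fields are invented for the example.

def dq_summary(records, timing_field):
    """Return the three percentage-based data-quality measures."""
    n = len(records)
    valid_nhs = sum(1 for r in records if r.get("nhs_number_valid")) / n * 100
    missing = sum(1 for r in records if r.get(timing_field) is None) / n * 100
    # End time equal to start time often indicates batched data entry.
    batched = sum(1 for r in records
                  if r.get(timing_field) is not None
                  and r[timing_field] == r["arrival_time"]) / n * 100
    return {"pct_valid_nhs_number": valid_nhs,
            "pct_missing_or_invalid": missing,
            "pct_equals_arrival_time": batched}

records = [
    {"nhs_number_valid": True,  "arrival_time": "09:00", "departure_time": "10:30"},
    {"nhs_number_valid": True,  "arrival_time": "09:15", "departure_time": "09:15"},
    {"nhs_number_valid": False, "arrival_time": "09:30", "departure_time": None},
    {"nhs_number_valid": True,  "arrival_time": "10:00", "departure_time": "11:45"},
]
summary = dq_summary(records, "departure_time")
```

Running the same function with a different `timing_field` gives the equivalent checks for initial assessment time or time of treatment.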

The dashboard comprises three elements. The first of these is the trust comparison sheet: using the drop-down box at the top of the screen you can select the measure of interest and compare relative performance across all South East Coast trusts.

If you have found an area of interest you can then delve down further to look at all indicators for a specific trust by using the remaining two elements. One provides an overview across all the quality indicators for the trust you have selected.

[Charts: 'A&E Provider Clinical Quality Indicators', West Sussex Hospital Trust. Panels: Unplanned Reattendances; Admitted Patients Time in Dept; Non-Admitted Patients Time in Dept; Left Without Being Seen; Time to Initial Assessment; Time to Treatment (max, median and 95th percentile where applicable), April 2010 to March 2011]


The other element shows the data quality measures for the selected trust.

[Charts: 'A&E Provider Clinical Quality Indicators: Data Quality', Brighton And Sussex University Hospitals Trust. Panels: % Valid NHS Number; % Departure Time Missing or Invalid; % Departure Time = Arrival Time; % Initial Assessment Time Missing or Invalid; % Initial Assessment Time = Arrival Time; % Time of Treatment Missing or Invalid; % Time of Treatment = Arrival Time]

The data is sourced from the trusts' own SUS submissions. It's available to download now on the Quality Observatory website:

http://nww.qualityobservatory.nhs.uk. It contains data from April 2010 to the end of March 2011. It will be regularly updated so keep an eye on the website for new versions.


If you have any questions you can contact Simon:


Simon.berry@southeastcoast.nhs.uk



Page 6

Constructing a box plot in Excel
By David Harries, Health Analyst

A box plot can tell us a lot of useful information about the distribution of a dataset: the median, two measures of dispersion (the range and inter-quartile range) and the skewness (from the position of the median relative to the quartiles), as well as the ability to view any outliers, all in a simple, concise chart. This makes it very useful when undertaking exploratory data analysis. Whilst anyone with access to a statistical package can very easily produce box plots, unfortunately Microsoft Excel does not have a built-in box plot chart type. However, it is possible to create a reasonable representation in Excel by using a combination of a stacked bar chart, error bars and an XY scatter chart.

This example uses the sample data opposite, simply labelled A and B, to produce a horizontal box plot. The first step is to set up some routine functions to calculate the Min, Max, Median and 1st and 3rd Quartiles of the dataset as shown below:

[Table: 'Box Plot Example' sample data, two columns of values labelled 'Sample data A' and 'Sample data B']

In order to create our box plot it is necessary to make some slight adjustments to the data that will be plotted. The first series of the stacked bar (1st Quartile) will be made invisible and end where the lower boundary of the 2nd quartile begins. The next series will consist of the 2nd quartile (Median – 1st Quartile). The third series is the 3rd quartile (3rd Quartile – Median). The lengths of the error bars representing the min and max values are calculated as 1st Quartile – min and max – 3rd Quartile respectively.
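The segment and whisker arithmetic just described can be sketched in a few lines. The version below is illustrative Python rather than the worksheet formulas themselves; `statistics.quantiles` with `method="inclusive"` uses the same quartile definition as Excel's QUARTILE function.

```python
# Illustrative sketch: the values needed to fake a box plot with a stacked
# bar chart, as described above. Quartiles use the "inclusive" method,
# which matches Excel's QUARTILE function.

import statistics

def box_plot_segments(data):
    q1, median, q3 = statistics.quantiles(data, n=4, method="inclusive")
    return {
        "series1_hidden": q1,           # invisible base segment, ends at Q1
        "series2_box": median - q1,     # visible box: Q1 up to the median
        "series3_box": q3 - median,     # visible box: median up to Q3
        "lower_whisker": q1 - min(data),  # minus error bar length
        "upper_whisker": max(data) - q3,  # plus error bar length
    }

sample = [10, 20, 30, 40, 50]
segs = box_plot_segments(sample)
```

Plotting `series1_hidden`, `series2_box` and `series3_box` as a stacked bar, hiding the first segment, and attaching the two whisker lengths as custom error bars reproduces the box-and-whisker shape.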

Use of the XY scatter provides the ability to add individual data points to the chart, such as the mean, the value for an area of interest, or outliers. This series will be added after first constructing the stacked bar chart. The data is now ready to plot: the three rows highlighted blue, containing the First Quartile, Median and Third Quartile, are selected and a stacked bar chart inserted, with the data range series set to 'rows'. Add the appropriate labels for the X axis, in this case our Box Plot Example headers Sample data A and B, as shown in the screenshot below:

To add the down error bars (often referred to as 'whiskers'), select the first series (the bottom segment of the stacked bar), then in Format Data Series click the Y Error Bars tab and select the Custom option. Choose the Minus direction and select the Min row from the data for the box plot (the row above the three rows highlighted blue). Click OK and Close to get back to Excel. These down error bars (whiskers) extend from the bottom (left) edge of the 2nd quartile box downward (leftward) into the first series. Repeat for the up error bar (whisker), but instead select the top segment and, in the custom Y Error Bars tab, choose the Plus direction and select the Max row (the row below the three rows highlighted blue).

The Tukey Box Plot
The box plot, and its name, were introduced by the American statistician John Tukey in 1977 in his text Exploratory Data Analysis.

Quality.Observatory@southeastcoast.nhs.uk

www.QualityObservatory.nhs.uk


Page 7

The chart will now look similar to the one shown to the left below. Simply deleting the bottom segment and modifying the formatting of the remaining series in our stacked bar should give us a chart that now resembles a box plot, as shown to the right below.

[Charts: intermediate stacked bar and resulting box plot for Sample data A and B, x-axis 0 to 300. Legend: First quartile (25th percentile), Median, Third quartile (75th percentile)]

The final stage is to add the selected data point(s) to the box plot. To do this, add a 4th series using the XY scatter x values (in our example 141.8 and 237.0). Excel will create another stack on top of the existing box plot. Return to the chart, select the 4th series by clicking on the top segment, and change the chart type to XY Scatter for this series only. Excel will draw both secondary axes. Next amend the x and y values for the 4th series in Source Data: the x values will be 141.8 and 237.0 and the y values 1 and 3. Double click on the secondary horizontal axis (top of the chart) and, on the Scale tab of the Format Axis dialog, uncheck the Maximum value and enter the same value as the maximum on the primary (bottom) axis. Similarly, double click on the secondary vertical axis and amend the Maximum value (in this case 4) and Major Unit (1). To finish off the chart, the secondary axes can be hidden from view by selecting 'None' for each of the tick mark options in the Patterns tab of the Format Axis dialog.

In the Box Plot Example opposite, looking at Sample data A you will notice that the median (the vertical black bar) is not in the centre of the box but is closer to the bottom than the top, and that the top error bar/whisker is longer than the lower one. This indicates that the dataset is positively skewed, with a longer tail at the high end of the distribution. A box plot like this would suggest that the data do not follow a normal distribution; in the normal case the box plot would be symmetrical, with the median in the centre of the box and the whiskers of even length.
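This visual skewness check has a standard numeric counterpart that the article does not use but which follows directly from the same quartiles: Bowley's quartile skewness coefficient, (Q3 - 2*median + Q1) / (Q3 - Q1). A positive value means the median sits nearer Q1, i.e. a longer upper tail, exactly the pattern described for Sample data A.

```python
# Illustrative sketch: quantifying the skewness visible in a box plot using
# Bowley's quartile skewness coefficient. Positive => longer upper tail.

import statistics

def quartile_skewness(data):
    q1, median, q3 = statistics.quantiles(data, n=4, method="inclusive")
    return (q3 - 2 * median + q1) / (q3 - q1)

right_skewed = [1, 2, 2, 3, 3, 4, 8, 15]  # invented data with a long upper tail
skew = quartile_skewness(right_skewed)
```

Because it uses only the quartiles, this coefficient is robust to outliers, which is in keeping with the exploratory spirit of the box plot itself.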

[Chart: 'Box Plot Example' for Sample data A and Sample data B, x-axis 0 to 300]

In terms of comparing the two sample datasets, the box plot shows that the median is greater in Sample data B than in Sample data A. Whilst the interquartile ranges (the lengths of the boxes) are reasonably similar, the overall range for Sample data B is greater, as shown by the distance between the ends of the two error bars (or whiskers) for each box plot. As already stated, Sample data A is positively skewed, whilst Sample data B looks more symmetrical, with the median closer to the centre and error bars of more even length. Shown opposite are some general guidelines for comparing box plots, taken from The Open University online Learning Space. As you can see, the box plot provides a very good visual overview of a data distribution and, following the steps outlined above, can be quite easily constructed within Excel.


Guidelines for comparing Box Plots

1. Compare the respective medians, to compare location.
2. Compare the interquartile ranges (that is, the box lengths), to compare dispersion.
3. Look at the overall spread as shown by the adjacent values (this is another aspect of dispersion).
4. Look for signs of skewness. If the data do not appear to be symmetric, does each batch show the same kind of asymmetry?
5. Look for potential outliers.

http://openlearn.open.ac.uk/mod/oucontent/view.php?d=398296&section=1.1.3 (accessed June 2011)



Page 8

Bottoms Up—Healthcare improvement in Norway
By Andy Hyde, Director of Quality Management, Diakonhjemmet Hospital

Diakonhjemmet Hospital in Oslo, Norway has been improving quality for the last five years by applying systems approaches to the quality management and improvement process. And it works, we can prove it! For the last five years we have measurably improved target achievement not just in quality management but also in personnel and financial management.

About us: Diakonhjemmet Hospital is one of three local hospitals in Oslo. Health services are divided between the three hospitals according to speciality. Diakonhjemmet has departments for internal medicine; surgery (mostly orthopaedics and rheumatologic surgery); rheumatologic medicine; an emergency department; and radiology, lab and other support services. The hospital also has psychiatric departments for children and adults. Our areas of speciality are elderly patients with hip fractures, rheumatologic surgery and rheumatologic medicine, where we are a European Centre of Excellence, and psychopharmacology. The hospital has about 180 beds and 1,500 employees. As in most hospitals in Norway, much of the health information has been digitalised. We have had a digital patient journal since 1994, digital lab and radiology, electronic error reporting, and digital finance and personnel systems, to take just the main ones.

The problem (sorry, challenge): About six years ago the national health authority started getting much tougher on quality measurement. Hospitals in Norway had been reporting numbers on a wide range of performance and quality indicators for a long time, but now the focus shifted to improvement. In 2005 the National Health Directorate published its "National strategy for quality improvement in social- and healthcare (2005 - 2015)": ... og bedre skal det bli! This translates as "... and it's going to get better".

The strategy explained the problem, identified some goals and outlined a number of initiatives hospitals could enact to achieve the goals and measure the results. The basic philosophy was PDCA, Plan Do Check Act (which is basically the same as 'Plan Do Study Act' or PDSA, which I understand is widely used within the UK). What the strategy didn't take account of was the level of understanding at the time of change and improvement management. At first, top management were looking for a solution over a rather short time period: months. It took a lot of persuasion to show that a three-year plan was more realistic if sustainable change were to be achieved, although results would be generated almost from the beginning.

The three-year plan had three clear phases: first, to identify, purchase and implement the necessary computer-based tools for performance and quality management; second, to start using the tools separately; and lastly, to integrate the tools into a holistic quality management system and use it to change the hospital's quality culture. Beyond this, the plan was then to build a management system around these tools that would support and enhance the use of the performance and quality management system as part of the overall hospital management system.

So, whether to take a top-down approach to change, or a bottom-up approach? That was the big question. I'll cover in a future article some of the pitfalls of the former approach, and will explain how we have made the 'bottoms up' approach work for us. But for now I think it's time to show you some proof that our approach to change has delivered impressive results which have been sustained over time. So, here's one example of our approach to improving one part of the system, and I hope that this is an example that Knowledge Matters readers can relate to. In 2006 Diakonhjemmet Hospital sent out about 40% of its medical summary documents within 7 days of patient discharge. The national goal was 80%. So what were we doing wrong? And more importantly, how could we improve?



Page 9

The first stage was to gather the data and quality control it. Rubbish in means rubbish out! And not surprisingly, data quality was indeed an issue for us. The next question was: where is the bad data coming from? Is one person entering poor quality data, or is the problem systemic?

There is nothing like sharing well presented, easy to understand data with a team to engage them in discussion. On first presenting data to teams and individuals (whether clinical or not), a first reaction is often to say that the data is wrong (whether or not it is!). People come with a wide array of possible reasons for why the data is wrong, and by the way these reasons are often due to someone else doing something wrong. We then rapidly prototype a solution that is published so that the users can see and test the hypothesis. For every 10 possible reasons for bad data only a few are real; many are assumed.

Once data quality issues are under control (not removed, just under control) the quality goal can be addressed. In this case we were trying to achieve the target of 80% of discharge documents being sent out within 7 days of discharge, and of course to understand why some summary documents were sent out beyond this timescale. By understanding the reasons for 'underperformance', and by collectively identifying issues, agreeing potential solutions and then testing these, it is possible to solve the underlying problems without alienating a large group of competent employees. A blame culture is certainly not conducive to achieving sustained improvements, and in my experience a top-down approach can encourage criticism and blame, which then results in the improvement effort failing. Different employee groups (doctors, nurses, lab staff, secretaries) all have a view of the problem that needs to be tested, and each clinical department has its own problems.

By involving as many employees as possible from different parts of the system, each for only a few minutes to articulate a hypothesis and test a solution, ownership of the results is easier to achieve than with a solution suggested from above. This makes change self-sustaining, so that when the final process change is achieved and new processes are implemented there is less chance that things will slip back to the old bad habits.

So, were we successful in our attempt to achieve 80%? Figure 1 shows what happened to this key indicator measured across all departments. It took 2 years to achieve the target, but once achieved the improvement has been sustained. Better to take the time to identify and properly solve the underlying problems than to implement a short-term fix which results in short-term improvement (which of course is demoralising for staff and makes change more difficult next time).

Figure 1: Summary after discharge for all departments

KEY

Epikrise = Clinical summary
Faktisk = Measured value
Mål = Goal

Of course, different departments had different issues which impacted upon their ability to achieve the target. The district psychiatric centre had completely different problems from the department of surgery. The key learning therefore is to involve the staff concerned and not assume that what worked in one department will work somewhere else. Results like this have been achieved with a wide range of indicators and parameters, and the hospital can boast that in almost all areas the national targets have been achieved and are maintained.

To foster a culture of continual improvement, employees need to be part of the change; after all, it is they who need to do things differently, and they are often fully aware of this need and of the know-how. They have just never been asked! In the next edition of Knowledge Matters I will tell you more about some of the other areas where improvement has been achieved, and how. I will also describe why, in my view, the old paradigm of managing a hospital through measurement and reporting of performance targets via traffic-light dashboard reports is outdated and ineffectual.



Page 10

Using a calendar to validate inputs
Application: Microsoft Excel 2003

Dear Quality Observatory

I want to use a calendar in my spreadsheet to validate the input that people can put into certain columns. I have got as far as Insert > Object > Calendar Control. I can't figure out what to do next?

Pauline Smith
Programme Manager, Patient Experience & Quality
Clinical and Workforce Development
South East Coast Strategic Health Authority

Solution: Complexity 4/5 — Uses Macros

Hi Pauline

We don't often use calendar controls, but this is a great way to make sure that people use the right date format in cells. It will, however, require adding some macro code to your sheet to get it to work! The calendar object that you have inserted is a control object. This means that we can control its properties with macros attached to certain event triggers: we can get/set its value, decide when to show the form and when to hide it, and we can even move it around depending on which cell has been clicked.

Event Triggers: Event triggers are actions that you can use to run code. For example, you can attach code to run when the workbook is opened, using the Workbook_Open() event. There are lots of triggers built into Excel: some are workbook events (open, close, newsheet, print) and some are worksheet events (activate, change, calculate, control_click); try not to get them confused! Control objects also have event triggers (click, dblclick); these are a subset of the worksheet events and will only be visible after you have added the control to the worksheet.

The first step is to think about what you want the code to do, and how you want to interact with the control.

Here are the action steps that I think we need to consider:

1. We don't want the control visible until we click on cells in certain columns
2. When we click on the date columns, show the calendar
3. Check to see if there is already a value in the cell; if yes, show that value, if no, show today
4. Set the calendar position to be near the cell that has been selected
5. When a date has been selected, put that value in the cell that was clicked and hide the calendar


Here is the code. This should be inside the SHEET that you want it to work on. We will be using the worksheet event triggers.

STEP 1: We can use the worksheet Activate and Deactivate events to hide the calendar when the user moves to or away from the sheet.

Private Sub Worksheet_Activate()
    Calendar1.Visible = False
End Sub

Private Sub Worksheet_Deactivate()
    Calendar1.Visible = False
End Sub

STEP 2: We can use the "Target" variable in the Worksheet_SelectionChange event to check which column the cell is in. We can use a Select Case statement to evaluate the target and an If statement to see if it is empty. We can also set the "Left" and "Top" properties of the calendar to be the same as those of the cell, so that the calendar is displayed where we clicked.

Private Sub Worksheet_SelectionChange(ByVal Target As Range)
    Select Case Target.Column
        Case 6 To 7
            ' Date columns: show the calendar at the selected cell
            If Target.Value = Empty Then
                Calendar1.Value = Now()
            Else
                Calendar1.Value = Target.Value
            End If
            Calendar1.Left = Target.Left
            Calendar1.Top = Target.Top + (Target.Height / 2)
            Calendar1.Visible = True
        Case Else
            Calendar1.Visible = False
    End Select
End Sub

STEP 3: We can use the Calendar1_Click event to put the date into the target cell and hide the calendar control.

Private Sub Calendar1_Click()
    ActiveCell.Value = Calendar1.Value
    Calendar1.Visible = False
End Sub


Tip of the month We at the South East Coast Quality Observatory love charts, and use them all the time to help present information in a clear, logical and understandable way. However, from time to time we come across charts that are misleading, unclear, or generally not that useful. Below is a chart that has recently been brought to our attention that definitely falls into the last category. Here is a full graphic that shows us that, of a given population, 40% were male and 60% were female. Certainly a useful piece of knowledge, but that information can be conveyed easily in just a few words, as I've just proved above. There is just no need to stick it in a graphic; it's taking up space that could be used for something more useful.

[Pie chart: 'Gender of Patient', labelled Male 40%, Female 60%]

Pie charts are a very limited way of presenting information, but they can have their uses; for example, if you had a one-off survey and each question had a range of answers from 1 to 5, then, yes, a pie chart could be used to convey that information. All too often, though, pie charts like this example are abused by people with a non-analytical background, often for little more than the fact that they look pretty, and there is a wide perception that the general public understands them.

If readers of Knowledge Matters come across charts like this, please e-mail them to us and we will feature them in a future issue: quality.observatory@southeastcoast.nhs.uk

Meet the Observatory—Dr Quality interviews Rebecca Matthews So Rebecca, can you please start by telling me what you've done in the past? Yes, Dr Quality—no problem. I started off with a degree in Medieval History, followed by a degree in Aeronautics and Astronautics a couple of years later (an unusual combination I know)… then I started applying for jobs. It was either the NHS or the job I was offered in a second-hand car dealership, so I decided on the NHS! I have a history of following Kate and Kiran about, having been at Southampton University with Kiran and having worked with Kate at Southampton University Hospitals, so it was only a matter of time until I ended up at South East Coast! I have also, by the way, covered Kate's maternity leave on both occasions (once here and once in Southampton); Kate assures me there won't be a third time! And what are your specific areas of responsibility within the Quality Observatory? I cover all performance related analysis, so basically all of the indicators listed in the Operating Framework (e.g. 18 weeks, activity, performance reporting for the board), although some areas are picked up by other members of the team. I'm also the overall lead for QIPP KPIs reporting and specifically lead on analysis for the Medicines Management work stream, so I can help with primary care prescribing data and queries. I also manage Unify2 accounts, so let me know if you need accounts, training or any queries answering. I attend the monthly meeting of PIRG (Performance Information Reference Group). This is a national meeting which involves representatives from all SHAs and the Department of Health and discusses issues related to existing reporting and forthcoming changes to national reporting. Finally, what does the future hold for you then Rebecca? Well, I am very much hoping to stay within the Quality Observatory (in whatever guise we end up) as I really enjoy working in the team.
I’m looking to further develop my existing areas of work, but also to lead on selected pieces of work to support GP consortia. I’m also learning to drive – so if you find yourself held up on a roundabout in the Lingfield area on a Saturday morning that could be me!


The British Lung Foundation Established in 1985, the British Lung Foundation (BLF) fights to help the eight million people affected by a lung condition in the UK. We provide support and information to improve the everyday lives of people with lung disease. We are also campaigning for better diagnosis, treatment and prevention, for now and the future. The information and resources we provide are available to patients, carers and healthcare professionals. We produce a variety of information and supportive leaflets which are available to order for free via our website. We also have a national Helpline with trained medical advisers, including a paediatric adviser, to provide confidential and impartial advice to anyone concerned about their lungs. A major aspect of the British Lung Foundation's work is campaigning to spread awareness and educate on a variety of different lung conditions. Each year the BLF holds a flagship awareness campaign called Breathe Easy Week, with the focus shifting to a different disease or condition each year. This year the focus was on obstructive sleep apnoea, a severely underdiagnosed and potentially dangerous condition that many people are not aware of. Obstructive sleep apnoea is when a person's airways block during their sleep, meaning they temporarily stop breathing. This often means the person is woken up frequently throughout the night. The result is that people's bodies and minds are put under real strain, through a combination of sleeplessness and the body being consistently starved of oxygen. As a part of our campaign to raise awareness, we commissioned a survey that looked into the sleeping habits of people whose partners snore heavily. The survey found that partners can lose an average of 1.5 hours of sleep per night. This amounts to nearly a whole month of sleep deprivation a year.
Most people brush off their partners' heavy snoring as an annoying habit, but for some people these findings could be more serious, as snoring is a common symptom of the sleep disorder. The findings of the survey were picked up by a lot of major media outlets, including the BBC and the Daily Mail. In addition, the British Lung Foundation's website has set up a page where people can take the Epworth Sleepiness Test, which can help an individual to assess their own levels of tiredness. A major survey was also available for people to take part in on the website. So far, 2,812 people have taken the Epworth Sleepiness Test, while 301 people have taken the survey. Meanwhile, all over the country, local Breathe Easy groups used the week as an opportunity to spread awareness about a condition that is often initially discounted as mere heavy snoring or tiredness. Please do take the time to have a look at our website: http://www.lunguk.org/ There is lots of useful information available, including a range of information leaflets, information on upcoming events and details of how to join the British Lung Foundation. You can also sign up to our electronic newsletter.
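The survey's headline figures above are easy to check with a little arithmetic. A quick sketch (only the 1.5 hours per night figure comes from the survey; the rest follows from it):

```python
# Sleep lost by partners of heavy snorers, per the BLF survey
# figure of 1.5 hours per night.
HOURS_LOST_PER_NIGHT = 1.5
NIGHTS_PER_YEAR = 365

hours_per_year = HOURS_LOST_PER_NIGHT * NIGHTS_PER_YEAR  # total hours lost
days_per_year = hours_per_year / 24                      # as full 24-hour days

print(f"{hours_per_year:.1f} hours lost per year, "
      f"or about {days_per_year:.1f} full days")
```

At roughly 22.8 twenty-four-hour days of lost sleep a year, the "nearly a whole month" claim stacks up.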


The Information Centre—our new role By Tim Straughan, Chief Executive, the NHS Information Centre

As the NHS changes, everyone working in and around the NHS will be challenged to improve the quality of services whilst making significant financial savings and efficiencies. Good quality, timely information will be essential for managers and clinicians to make informed decisions to improve the quality and efficiency of care services. It will also help avoid disproportionate cuts that may affect an organisation's capability, capacity and investment. I am committed to transforming the way information is collected, analysed and used by the NHS and adult social care services to achieve this.

The impact of the change to the NHS

In June the government announced key changes to its plans for reforming the NHS, including wider involvement in clinical commissioning with clinical commissioning groups, stronger accountability, safeguards on competition and support for integrated care. These commissioning groups will be informed by evidence and evaluated by their impact on health outcomes and the patient experience. The need to support these changes and inform a more phased approach means that information will be even more important. I anticipate the amended Bill will only reinforce the importance of information in improving the quality and efficiency of care services and supporting people to make the best possible choices about their health and care.

The role of the NHS Information Centre

The NHS Information Centre will remain England's national source of health and social care information. Working with a wide range of health and social care providers, the NHS Information Centre will carry on providing the facts and the figures to help the NHS and social services run effectively.
The NHS Information Centre will continue to collect, analyse and convert data into useful information to help providers improve their services and support academics, researchers, regulators and policy makers in their work.

Protecting confidentiality

The NHS Future Forum report highlighted concerns to protect patient confidentiality, and we will support and contribute to amendments to the Bill which do this in a way that also supports our plans to drive quality improvement through greater access to information, and to promote high quality research.

Our responsibility

My measure of success for the NHS Information Centre will be how well we "unlock the potential for making better use of information"*. The way we collect data and make it available will be critical in enabling other organisations to make the best use of information for improving care. The effective and efficient use of data depends on an operating principle of "collect once and use many times". The principle of 'no decision about me without me' relies on good information and a culture that enables people to make use of it. Our role now and in the future as a statutory body will be:

 To produce and publish national and official statistics, indicators and measures to ensure services can be properly audited and held to account for their quality and efficiency

 To truly become England's central, authoritative information source, developing a publicly-accessible national repository of health, public health and social care data as well as a searchable catalogue of nationally available information.

* Paragraph 3.20—Legislative Framework and next steps


 To continue to be the leading health and social care data source for the national transparency agenda (www.data.gov.uk)

 To ensure we process and link data from across health, social care and other sources safely and securely to help enrich the range of useful information available nationally.

 To develop a national framework for assuring data quality which will set out clearly the responsibilities of both local and national organisations.

 To work collaboratively with commercial, third sector and other information intermediaries to produce a vibrant information environment.

Rising to the challenge Our ability to rise to the challenges ahead will be influenced by how successfully we maximise the important contribution that quality information can bring to:

 Health and care professionals – by identifying areas for attention and helping them to design programmes to deliver improved outcomes for patients and service users

 The service as a whole – to help ensure that services are underpinned by principles of responsiveness, efficiency and effectiveness.

 Regulators – to provide indicators to monitor NHS and care organisations and to support Monitor in promoting integration of care

 The public – directly and through intermediaries, to help people understand the range of health and social care services available and the factors which are important to them, including accessibility, quality of care and safety

 Patients, service users, carers and their families – by empowering them to make choices about their health and wellbeing, and the care they require.

Over the next year, the NHS IC will have an essential role to play in collecting, assuring and providing good quality information to our customers. We will certainly need to continue to evolve our plans to fit in with our new role and the strategic requirements of the NHS and Care organisations in the years ahead. In the next edition of Knowledge Matters I will describe some of the exciting projects that we have underway currently that will make a positive difference to staff and patients of the NHS. See you next time!

Free web seminar—Expert on Call: Organising for Quality & Value Thursday 21 July 2011—4pm Expert on Call is a monthly online seminar provided by the NHS Institute for Innovation and Improvement, where leading thinkers in the NHS Institute and beyond share their insights from research or product development via WebEx (an easy to use web-based conference system). This free seminar is open to all NHS staff. In the next Expert on Call, Alice O'Neill, Lead Associate, and Jenny Bramhall, Associate, at the NHS Institute will talk about the Organising for Quality and Value: Delivering Improvement programme. The programme supports NHS organisations to develop the service improvement skills of their clinical and operational staff, providing them with a solid foundation in quality and service improvement methods. Alice and Jenny will also discuss the content of the programme and share how it has helped NHS organisations equip their teams with the skills and confidence to implement proven service improvement techniques on local projects where quality and safety in patient care could be improved. During the seminar, participants will have the chance to ask questions, exchange ideas and discuss how they can put these insights into practice. The seminar will take place at 4pm on Thursday 21 July 2011. To register for the seminar, copy this link into your browser: https://nhs.webex.com/mw0306ld/mywebex/default.do;jsessionid=pWpZTKpJ2hRRLmFhr9QTCqLCQLjGF694bFS6MNGBrMyG1H6JjJLd!-687987437?nomenu=true&siteurl=nhs&service=6&rnd=0.035111


Fractured neck of femur tool By Simon Berry, Specialist Information Analyst

A key clinical issue for South East Coast, with its significant ageing population, is the growing number of patients being admitted to hospital with a fractured neck of femur. With this in mind, a new addition to our suite of tools designed to improve outcomes is the fractured neck of femur benchmarking tool. Data used in the tool is all sourced from SUS; in other words, it uses the Trust's own submitted data. There are a number of views provided within the tool, allowing users to gain a quick overview of progress over time and comparison with other Trusts.

The first view provides a comparison of performance over time for each of the 11 Trusts within South East Coast for a selected measure. Three drop down boxes at the top of the screen allow you to select against a range of options as follows: the first gives you the option to select either all admissions, all patients undergoing a procedure from a specified basket (OPCS W191, W241, W46, W47, W48), patients for a specific procedure, or patients without a procedure in the basket; the second allows you to select the age band you are interested in; and the final box allows you to select the measure you want to look at. The options are admissions, bed days, length of stay, 30 day in hospital mortality and in hospital mortality.

[Chart: 'Fractured Neck of Femur Patients, All Admissions, Length of Stay, All Ages': quarterly trends from 07/08 Q1 to 10/11 Q3 for the 11 Trusts (ASPH, BSUH, D&G, EKH, ESHT, Frimley, M&TW, Medway, RSC, SASH, WSHT), grouped by county (Kent, Surrey, Sussex)]

The next view is the Trust Age Dashboard. Once you have identified a Trust of interest, you can drill down to compare, side by side by age band, numbers of admissions, length of stay and in hospital mortality. Drop down boxes allow you to change the Trust of interest and also which group of patients, in terms of operation type, you wish to focus on.

[Chart: 'Surrey And Sussex Healthcare Trust, Fractured Neck of Femur Patients 2010/11, All Admissions': admissions, length of stay and in hospital mortality by age band, from 18-30 up to 91+]

The final view provided within the dashboard is the Trust Procedure Dashboard. This cuts the data in the other direction: again, for a specified Trust you can view admissions, length of stay and in hospital mortality, but this time grouped by selected procedures. There are two drop down boxes at the top of the screen; the first allows you to select the relevant Trust and the second allows you to specify the age band you want to look at.

As always, the dashboard is available to download from the Quality Observatory website: http://nww.qualityobservatory.nhs.uk/
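The length of stay views in the dashboard show averages split at 30 days ('<= 30 Day Average' and '30 Day+ Average'). As a sketch, assuming the split is a simple mean either side of the cutoff (the spell lengths below are illustrative numbers, not SUS data):

```python
def split_los_averages(lengths_of_stay, cutoff=30):
    """Mean length of stay for spells at or below the cutoff, and above it."""
    short = [los for los in lengths_of_stay if los <= cutoff]
    long_ = [los for los in lengths_of_stay if los > cutoff]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(short), mean(long_)

# Illustrative spell lengths in days (not real data)
spells = [5, 12, 18, 22, 28, 35, 41, 60]
short_avg, long_avg = split_los_averages(spells)
print(f"<=30 day average: {short_avg:.1f}, 30 day+ average: {long_avg:.1f}")
```

Splitting the average this way stops a handful of very long stays from masking what is happening to the typical patient.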

[Chart: 'Surrey And Sussex Healthcare Trust, Fractured Neck of Femur Patients 2010/11, All Ages': admissions, length of stay and in hospital mortality by procedure (W191, W241, W46, W47, W48 and patients without a procedure), with averages split at 30 days]

If you have any queries or suggestions for enhancements to the dashboard, please do not hesitate to get in touch!

simon.berry@southeastcoast.nhs.uk



GP Commissioning Consortia KPIs By Katherine Cheema, Specialist Information Analyst As GP Commissioning Consortia develop apace throughout this year and next, the use of good information is crucial. This will not only help them plan their commissioning effectively, but also track key performance indicators. This can help us to better understand variation between GPCCs and identify where we might like things to improve at the individual GPCC and regional level. The QO has developed a suite of GPCC KPIs that aims to do just that. The tool itself comes in two parts: a tabular view (see below) gives a snapshot of key measures in four domains across all current GPCCs, with summaries at county, SHA and England level where available:



 Demography (includes number of practices, total population, proportion of women etc.)

 Secondary care, split into urgent and emergency care and planned care (includes A&E total attendances, emergency admissions, elective admissions, OP first appointments, first to follow-up ratios etc.)

 Primary care (including disease prevalence, prescribing costs, low cost statins and patient experience measures)

 Prevention (includes smoking advice and obesity prevalence)

To allow GPCCs to dig a bit deeper, the indicators are also shown in two separate dashboards in a time series format: one for primary care, demographics and prevention, and another for secondary care (see below). This enables GPCCs to see their individual progress over time, rather than against other GPCCs.

[Chart: 'GPC Consortia: secondary care KPIs' dashboard for a selected GPCC (Esydoc), monthly from Apr-09 to Mar-11. Panels include: number of A&E attendances with % referred from a GP; emergency admissions and % short stay (<2 days); elective admissions and % daycase; emergency admissions per 1,000 for ambulatory care sensitive conditions; emergency and elective average length of stay; number of first outpatients with % from a non-GP referral source; first outpatient to follow-up outpatient ratio; and did-not-attend rate (all appointments)]

There is a drop down menu to allow GPCCs to select their own information, or indeed have a look at similar consortia against whom they may wish to benchmark themselves.

These measures may be added to or changed as priorities become clearer and GPCCs develop their commissioning intentions. The tool is designed to be flexible, in that measures can be added or changed quickly.

If you would like more information on the GPCC KPIs please do get in touch at the address below.
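Two of the secondary care measures in the dashboard are simple derived rates: the first-to-follow-up ratio (the number of follow-up appointments for each first outpatient appointment) and the did-not-attend rate across all appointments. A minimal sketch, using made-up monthly counts for illustration:

```python
def first_to_follow_up_ratio(first_attendances, follow_ups):
    """Follow-up outpatient appointments per first outpatient appointment."""
    return follow_ups / first_attendances

def dna_rate(attended, did_not_attend):
    """Did-not-attend rate across all booked appointments."""
    total = attended + did_not_attend
    return did_not_attend / total

# Illustrative monthly counts for one consortium (not real data)
ratio = first_to_follow_up_ratio(first_attendances=1200, follow_ups=2760)
rate = dna_rate(attended=3800, did_not_attend=160)
print(f"first-to-follow-up ratio: {ratio:.2f}, DNA rate: {rate:.1%}")
```

A falling first-to-follow-up ratio or DNA rate over the time series is usually more informative than either number in isolation.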
Quality.Observatory@southeastcoast.nhs.uk



Analysis, Ancient and Modern By Katherine Cheema, Specialist Information Analyst

As a healthcare analyst, when you think long and hard about it, most of the information we analyse is pretty depressing; hospital admissions and disease profiles, for example, serve to remind us of our frail mortality (except maternity analysis maybe, that's usually a bit happier). Perhaps none is as stark as mortality statistics, but they have been and still are crucial in the development of our understanding of patterns of disease, and the outcomes of our attempts to combat them (is there a simpler outcome measure than mortality?). The first mortality statistics were published in 1662 by John Graunt (1620-1674); his book 'Natural and Political Observations Made upon the Bills of Mortality' used the 'bills' of mortality from parish records to analyse the onset and spread of bubonic plague, and eventually to develop a system to warn the city of an outbreak. Graunt was also credited with producing the first 'life table', which gave the probabilities of surviving to a certain age. In reality, the system Graunt hoped to create didn't really happen, but the exercise served to give us a framework for modern demography and population statistics.
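A life table of the kind credited to Graunt simply tracks how many of a starting cohort are still alive at each age, and reads survival probabilities off as proportions. The figures below are the commonly quoted reconstruction of Graunt's table (of 100 births, the number still alive at selected ages); a quick sketch:

```python
# Commonly quoted reconstruction of Graunt's life table:
# of 100 births, the number still alive at each age.
survivors = {0: 100, 6: 64, 16: 40, 26: 25, 36: 16,
             46: 10, 56: 6, 66: 3, 76: 1}

def prob_surviving_to(age):
    """Probability of surviving from birth to the given age."""
    return survivors[age] / survivors[0]

for age in (6, 16, 26):
    print(f"probability of reaching age {age}: {prob_surviving_to(age):.0%}")
```

By this table, a 17th-century Londoner had only a 40% chance of reaching age 16, which puts our modern complaints about data quality into some perspective.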

The 'bills' of mortality that Graunt used were really the only formal records of mortality that existed prior to the 19th Century. They were primarily designed to monitor plague deaths within London. You can see here the havoc that the plague wreaked in just one week, with over 7,000 deaths; 96% of London parishes were infected. It also shows early records of cause of death; consumption (TB), fever, childbirth and 'spotted fever' (probably smallpox) all feature heavily as common causes. Less common causes include 'burnt in his bed by a candle at St Giles Cripplegate' and 'killed by a fall from the belfry at All Hallows the Great'. And if anyone can advise how over 100 people died from 'teeth', we'd be very grateful.

In the 1830s a more comprehensive approach to recording mortality was taken, and included all the English regions as well as London. Weekly and quarterly returns on births, deaths and marriages from all major towns were compiled and published annually as the report of the registrar-general, and also as Medical Officer of Health reports. Central data returns have a long history! This system remains in place today, although the technology has caught up and there isn’t a medical officer of health any longer. The UK Statistics Authority are the keepers of mortality data and here at the QO we spend lots of time poring over in-hospital deaths, analysing patterns and variation across the region as a key indicator of quality. In essence, whilst our data may be better and easier to access, our purpose in assessing mortality statistics has remained the same for 4 centuries.

If you want to see a really fabulous visualisation of life expectancy against variables such as income per head, across the world, from around 1800, check out Gapminder World (http://www.gapminderworld.org/world). Hit the play button to see how things have played out for different nations over the past 200 years.

Bill of Mortality from : http://images.wellcome.ac.uk/indexplus/image/L0049749.html


Mental Health Dashboard By Adam Cook, Specialist Information Analyst The Mental Health Minimum Data Set (MHMDS) has been collected for several years, but often the data has been patchy and poorly used. The Information Centre for Health and Social Care (IC) has been releasing information from the MHMDS for a while, and much of it has been on an experimental basis, to be treated with caution. However, as of April 2010, the NHS Operating Framework contains a number of performance based indicators all sourced from the MHMDS. This data is published quarterly by the IC. The most recent quarter is always released as provisional, until it is finalised on the release of the next quarter's data. The South East Coast Quality Observatory has taken the data from the Information Centre and turned it into a dashboard. This dashboard contains data for every NHS provider in England who submits the Mental Health Minimum Data Set. There are 16 indicators in the dashboard, split into two domains: Activity and Data Quality. Activity Indicators: The activity indicators look at such things as accommodation and employment, HoNOS assessments, CPA reviews, absenteeism for Mental Health Act detainees, and bed days for the under 16s. Other activity indicators may be added in the future as MHMDS v4 is introduced in 2011/12; when this happens I will look to update the dashboard accordingly.

Data Quality Indicators: Data quality can be further subdivided into two categories: validity and data completeness. The four validity indicators show the percentage of valid codes entered for certain fields, and the completeness indicators show the percentage of completeness for six key demographic fields. As well as providing data for all providers within England, the dashboard also shows the all England rate with 95% confidence limits. There are also two more pages with detailed notes and indicator construction details. These pages are lifted directly from the original IC source documents and give details on how each of the indicators works and has been constructed.
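The 95% confidence limits around a rate like these can be reproduced with a standard interval for a proportion. A sketch using the Wilson score interval (the IC's exact method may differ, and the counts here are illustrative):

```python
import math

def wilson_ci(numerator, denominator, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = numerator / denominator
    z2 = z * z
    centre = (p + z2 / (2 * denominator)) / (1 + z2 / denominator)
    half = (z * math.sqrt(p * (1 - p) / denominator
                          + z2 / (4 * denominator ** 2))
            / (1 + z2 / denominator))
    return centre - half, centre + half

# e.g. 4,200 valid codes out of 5,000 records (illustrative counts)
lo, hi = wilson_ci(4200, 5000)
print(f"rate 84.0%, 95% CI {lo:.1%} to {hi:.1%}")
```

Plotting each provider against limits like these helps separate genuine outliers from ordinary sampling variation.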

As new data is released by the Information Centre I will update the dashboard. As with all of our products, to gain access to the dashboard, visit our website, go to the resource catalogue and select ‘mental health’ http://nww.qualityobservatory.nhs.uk/ If you have any queries or suggestions for improvement please do contact me adam.cook@southeastcoast.nhs.uk


Dangers of RED, GREEN, AMBER By Samantha Riley, Head of the Quality Observatory It's nothing new for CEOs and Boards to request one sheet of paper displaying a colour coded picture of a whole raft of indicators to assure them of the strengths, weaknesses and risks within their organisation or system. In the current climate, with financial pressures on the NHS along with the requirement to maintain (and improve) the safety, efficiency and quality of services delivered to patients, the desire for such representation of complex data sets is ever increasing. However, can one colour coded sheet of paper fulfil this aim? Is it really possible to assure yourself of the 'performance' of an organisation or system in this way? One of the frustrations that the team face on an almost daily basis is the strong desire by teams and individuals to assign a 'RAG' (i.e. red, amber, green) status to an indicator and to attach a judgment with regards to the trend that the data is displaying. Typically two or three data points are used to imply this judgment, and often these RAG statuses are displayed in a large table. Regular Knowledge Matters readers will, I'm sure, understand only too well that this is an extremely limited approach in terms of presentation, and one which does not support the measurement for improvement agenda the Quality Observatory has been an advocate of for many years. The large table of data often compares this quarter's performance to last quarter's, and the desire is to assign a colour to the most recent period of data. Often the RAG status relates to whether the required level of performance is being achieved: green would mean good, red bad and amber somewhere in between. I have mentioned in a previous edition of Knowledge Matters some of what I learnt from my friend and colleague Davis Balestracci. Davis is an independent consultant based in the USA and is also a senior member of the American Society for Quality (2003-2004 Chair of its Statistics Division).
For five years he was the monthly statistical columnist for Quality Digest and he continues to contribute to its daily e-version. (By the way, I would recommend both Davis' website www.davisdatasanity.com and the Quality Digest website http://www.qualitydigest.com/ as both have really useful articles on quality improvement and measurement.) So, credit to Davis for teaching me much of what I am about to relay in this article. Here goes……

If you have three data points, there are only so many combinations possible for those points if plotted on a graph. Here are a few examples of how three points can look, along with the 'judgement' that is often attached… I'm sure that you can work out the other combinations for yourselves……

Upward trend

Downward trend

Downturn

Those enlightened readers will already be aware of the phenomenon of 'natural variation', and that statistical theory tells us it takes a sequence of 6 consecutive increases or decreases to declare a trend (with 20 or fewer data points you can use 5 increases/decreases). For those of you interested in learning about this in more depth, get in touch and we will point you in the right direction for more detailed reading materials.

Now, this article is not an in-depth study into heavy statistics, however it is important that we regularly remind ourselves of what constitutes a trend and what does not. This is VERY important if the correct judgements and decisions are to be taken with regards to focus, investment and (when required) corrective action: we don't want to waste lots of energy and time on a 'downturn' which is ultimately natural variation. So whilst I can understand the requirement to know whether a target is being achieved, and acknowledge that colour coding can be a useful aid in understanding this, I suggest that this is not enough to understand the risks, opportunities and improvements occurring within a system.

Let's imagine that I'm a Sister working on a ward which is monitored on a monthly basis to see whether every patient has had a nutritional assessment. Let's say the target is 95%. Performance for the last month was 85%, and this indicator is therefore marked as RED on the report which goes to the management team. What isn't reflected on the management report (which is presented as a large grid of numbers, by the way) is that 5 months ago the performance on this ward was 37%, and every month since then there has been a sustained improvement. So we now have a run of 5 consecutive increases…… which statistical theory tells us is a trend…….. Although we're still some way off the target, indications are that within the next month or so we will be achieving it.
We've seen a sustained improvement which I would want to colour code as GREEN. The Sister on the ward feels dejected when she sees the indicator rated as RED on the management report: great things have been achieved, and the team have worked really hard to improve things…. A rating of GREEN would mean a lot to the staff and no doubt encourage them to improve even further.

So I'm not saying that it is always wrong to use a RAG status; I would suggest that we need to be more sophisticated in terms of how it is applied. Intelligent colour coding is possible, however statistical theory needs to underpin the colour coding, and the criteria used to rate an indicator a particular colour need to be transparent. Bearing in mind that it takes quite a number of data points to establish whether we are seeing a trend, the frequency of measurement is clearly something that we need to think carefully about.
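To make the run rule concrete, here is a minimal sketch in Python. This is my own illustration rather than a tool the Quality Observatory uses, and the ward figures are invented to match the story above (37% five months ago, rising each month to 85%).

```python
def run_length(values):
    """Length of the run of consecutive increases (or decreases)
    ending at the most recent data point, plus its direction."""
    run = 0
    direction = 0  # +1 rising, -1 falling, 0 no current run
    for prev, curr in zip(values, values[1:]):
        step = (curr > prev) - (curr < prev)
        if step != 0 and step == direction:
            run += 1          # the run continues in the same direction
        elif step != 0:
            direction, run = step, 1  # a new run starts
        else:
            direction, run = 0, 0     # a tie breaks the run
    return run, direction

def is_trend(values):
    """Apply the run rule: 6 consecutive rises or falls signals a trend,
    or 5 when there are 20 or fewer data points in the series."""
    threshold = 5 if len(values) <= 20 else 6
    run, direction = run_length(values)
    return run >= threshold, direction

# The ward example: five consecutive monthly increases in six data points,
# which the rule declares a (rising) trend.
ward = [37, 48, 59, 70, 78, 85]
trend, direction = is_trend(ward)  # trend is True, direction is +1
```

Two or three data points, by contrast, can never satisfy the rule, which is exactly why a RAG colour hung on the latest quarter tells you so little.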

Quality.Observatory@southeastcoast.nhs.uk

www.QualityObservatory.nhs.uk


The more frequent the measurement, the more quickly we can ascertain whether a trend is occurring. So when thinking about improvement, it's really important to ensure that key measures are tracked regularly, and that the data is presented in a meaningful way. Let's not forget that 'High Quality Care for All' explicitly acknowledged that high performance in all aspects of quality is nearly always present in the organisations that proactively and effectively measure their activity and use the information gathered to drive improvements forward.

Let's look at an example of how the same information can be presented in a more meaningful way. Several years ago, the SHA reporting for MRSA and C. difficile looked like this (apart from the fact that the table was much, much larger). Green and red indicated whether the target was being achieved, however it was very difficult to see whether the situation was improving or declining in each Trust. This was easily solved by the development of a range of graphical dashboards which provided an at-a-glance view of performance over time compared to both local and national limits. An example appears below showing summary cases for all Trusts within South East Coast.

The concept of looking at a system over time and assessing the variation within it, rather than making a judgement on a single number or against a set target, is one that is well established in the manufacturing industry and can be applied to great effect in healthcare. By identifying unusual, or 'special cause', variation within a system, the factors that affect it can be identified and acted upon. The Quality Observatory applied this approach to C. difficile data well over a year ago with the development of an 'early warning system'. An example of the early warning system appears below. The graph is annotated to show points of unusual variation; in this case the 'unusual' variation and high numbers of C. difficile cases appear to be linked to bank and school holidays.

Our experience here at the Quality Observatory is that this type of graphical presentation, the application of statistical theory to data and a non-judgemental approach to sharing the information is far more conducive to engaging all stakeholders in the use of information to evidence their actions and be proactive in managing the system.

So, in summary, I would suggest that we need to think carefully about how we present and interpret information if we are to focus on the really problematic areas, foresee imminent problems and take corrective action before these problems occur, foster a culture of continuous improvement and not waste time focussing on 'natural variation'. It is possible to utilise colour coding, however as I have already explained, there needs to be a sophisticated approach to this and the methodology needs to be transparent. This is something that we are of course happy to advise on. In addition (and I will cover this in a future article), we need to consider very carefully which are the critical indicators for us to track. Is it sensible to have a list of 100 or more indicators which are reviewed every month?
We need to be able to see the wood for the trees and focus on the leading indicators which will tell us whether our organisation or system is 'healthy'. You may want to take a look at the paper which Kate and I had published in the journal Clinical Risk last year; it expands on some of the concepts covered within this article and describes how information can be used to create a culture of measurement for improvement. The paper can be viewed and downloaded from http://www.issuu.com/secqo
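For readers who like to see the arithmetic, the limits behind this kind of early warning chart can be sketched as below. This is a generic individuals (XmR) control chart calculation offered purely as an illustration, not the Observatory's actual system: the weekly counts are invented, and 2.66 is the standard XmR scaling factor applied to the average moving range.

```python
def xmr_limits(values):
    """Process limits for an individuals (XmR) chart: the centre line is
    the mean, and the limits sit 2.66 average moving ranges either side."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

def special_causes(values):
    """Indices of points falling outside the process limits: candidates
    for 'special cause' variation worth investigating."""
    lower, _, upper = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lower or v > upper]

# Invented weekly C. difficile counts; only the spike in week 5 falls
# outside the limits, so only it warrants investigation.
weekly_cases = [4, 5, 3, 4, 6, 15, 5, 4, 3, 5]
flagged = special_causes(weekly_cases)  # [5]
```

The point of the calculation is exactly the one made above: the ordinary week-to-week wobble stays inside the limits and needs no explanation, while the genuine outlier is flagged for action.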




NEWS

NHS Innovation Challenge

The NHS Innovation Challenge Prizes are awards given to NHS organisations across the country to recognise and reward those who have proven that they can deliver more for less, and sustain and improve the quality of care for patients. The first set of Innovation Challenge Prize awards has been allocated, with three winning organisations being awarded £100,000, £50,000 and £35,000 respectively. The NHS Institute would really like to hear from teams and individuals who have found innovative solutions for delivering treatment and care in the following areas:

 Earlier cancer diagnosis;
 Increasing independence for those with kidney failure;
 Better management of pregnancy;
 The reduction of MSSA and/or E. coli bacteraemias across a health economy;
 The reduction of waste associated with prescribing practice;
 Ensuring that people seeking urgent and emergency care receive the best care at their first attendance;
 Reducing avoidable attendances at GP surgeries and other Primary Care settings.

For details of how to apply for an award, see the website below. Closing date: 14 August 2011. http://www.biginnovationsapply.institute.nhs.uk/

IAPT Dataset

The Information Standards Board has approved the IAPT (Improving Access to Psychological Therapies) dataset as a national operational standard. All IAPT services will be expected to return data to a central reporting system from April 2012. During 2011, IAPT services should make the required changes to information systems and processes in readiness for the dataset becoming mandatory in 2012. For further details follow the link below: www.dh.gov.uk/en/Publicationsandstatistics/Lettersandcirculars/Dearcolleagueletters/DH_126444

Integrated Performance Measures Return

Information regarding the new Integrated Performance Measures Return is now available from Unify2. This is the former Vital Signs Monitoring Return with unnecessary lines removed, and it is now a commissioner only return.


Bowel Cancer Audit

Bowel cancer care continues to improve in England and Wales, according to the latest annual report from the national audit of bowel cancer. The report was published recently on behalf of the Association of Coloproctology of Great Britain and Ireland and is based on information provided by all but three trusts for the one-year period from August 2008 to July 2009. The report is available at the following link: http://www.ic.nhs.uk/webfiles/Services/NCASP/audits%20and%20reports/NHS_Bowel_Cancer_2010_INTERACTIVE.pdf

Health Informatics Qualification Accredited

A specialist qualification in health informatics has been recognised by the University of Wolverhampton. The Avoca Higher Diploma in Health Informatics is the first qualification to allow health informatics professionals to gain, through workplace experience, a qualification formally recognised as being equivalent to degree level. For further details visit the website: http://www.avoca.co.uk/News_Accreditation.html

Updated Information Governance Toolkit available

Version 9 of the IG Toolkit is now live. There is some new functionality, including the rollover of all scores, evidence and comments from the previous assessment. Organisations are required to report in the following stages:

 Baseline assessment by 31 July 2011;
 Performance update by 31 October 2011;
 Final submission by 31 March 2012.

The following documents provide further details:

 IG Toolkit v9 Release Note, which provides a summary of the system changes;
 IG Toolkit v9 Change Notice, which provides details of changes to the requirements.

Both of these documents can be accessed from the following link: https://www.igt.connectingforhealth.nhs.uk/NewsArticle.aspx?tk=64&lnv=1&cb=16ff5b97-80bb-4f32-af93-fe37dcc54536&artid=72&web=yes

Information Centre Update

The IC has a new group portal allowing access to the SUS data quality dashboards and SUS KPIs. This gives easy online access to the interactive dashboards and the downloadable KPI spreadsheets. To get access, contact the IC enquiries desk (enquiries@ic.nhs.uk) and ask about access to the SUS Dashboards groups.




Team Update

HSJ Awards 2011

Applications for the 2011 HSJ Awards are encouraged from Knowledge Matters readers for each of the 18 categories. New for 2011 are the categories 'Data and Information Management' and 'Research Culture'. Entries to the Data and Information Management category should demonstrate the following elements:

 Evidence of excellence in data governance;
 A clear focus on the use of data and other information collected across the organisation;
 Data sharing with other organisations;
 Adherence to data protection and other information legislation;
 Use of data to inform strategic decision making;
 Work which utilises data and other information to improve patient outcomes and organisational efficiency;
 Demonstrable benefits to the patient;
 Engaging with clinicians.

The closing date for entries is 15th July. To see further details or to apply, visit the awards website: http://www.hsjawards.co.uk/HSJAwards2011HomePage

Cluster Unify2 Accounts

Unify2 accounts are now available for Sussex and Kent clusters. Cluster accounts can be requested by clicking on the 'Request a new User Account' link on the Unify2 front page and selecting NHS Support Agency as the organisation type and either Sussex Commissioning Support Unit or Kent and Medway HIS as the Organisation Name. Alternatively, email rebecca.matthews@southeastcoast.nhs.uk with details of the accounts required. As Surrey cluster only covers one PCT, no special arrangements are required.

Changes to Unify2 account administration

During 2011/12 it is planned that administration of Unify2 accounts will transfer to individual organisations. Initially each organisation will need to have one or two 'Advanced User Managers' with access to set up Unify2 accounts for their organisation; guidance and instructions will be provided to these individuals. If you think you should have this access for your organisation, please contact rebecca.matthews@southeastcoast.nhs.uk with your name, job title, organisation and contact details, copying in the Unify2 mailbox: unify2@dh.gsi.gov.uk

Since the last issue, there have been a number of Quality Observatory celebrations and exciting events. Simon celebrated his birthday at the end of May and was presented with a box of goodies by the team. Kiran celebrated his birthday in June with a birthday BBQ, which most of the Quality Observatory attended armed with umbrellas. Samantha's gift to Kiran was a photo cupcake centrepiece which featured chocolate and vanilla cupcakes with photos of Kiran wearing a pink cowboy hat and in his Shrek headgear from last year's birthday party.

It was with great sadness that the Quality Observatory said goodbye to David Graham (Informatics Graduate Trainee), who has left the Observatory to take up a post as a web developer at moneysavingexpert.com. The team organised a surprise goodbye picnic for David and Samantha arranged competitive sports…… David was of course awarded a personalised Quality Observatory mug as a memento of his time with the team.

A big thank you to NHS North West for producing a short film on the work of the South East Coast Quality Observatory. Filmed during the Innovation Expo back in March, the film is viewable on YouTube. Here's the link….. http://www.youtube.com/watch?v=o6b9XqGA8BQ

Development clinic sessions: if you need help developing your spreadsheet or database project, book a slot in one of our drop-in sessions on either 20th July or 31st August. Just email us to book your session.




The Listening Exercise…..

What did you do in the pause?
Did you examine every clause,
Study documentation
With due deliberation?
What did you do in the pause?

Did you consider competition and choice,
And make sure they heard your voice?
Will improvement grow
From what patients know,
And how they're offered their choice.

Is the challenge of being accountable
Small or insurmountable?
And patient involvement,
Did you give it acknowledgement,
And consider if it would be governable?

Is leadership of the NHS
Something you need to address,
Should clinical advice,
Be more precise,
How will you measure success?

Are the policies on staff education,
Important to the whole reformation,
And will staff training needs
Fall or succeed,
Dependant on your contemplation.

So, did the listening exercise
Make you stop, think and revise
Your thoughts on reform,
And how we perform,
And give you a chance to advise.

What did you do in the pause?
Did you examine every clause,
Did you have your say,
Or just look away
What did you do in the pause?

Did you know that……. One member of the Quality Observatory has quite a collection of toys and hand puppets? Currently within the collection are two meerkats, a rubber duck, a caterpillar, Sooty, Sue and (of course) Sweep. The big question is: who do they belong to???

The answer will be revealed in the next edition of Knowledge Matters. Please do e-mail us at quality.observatory@southeastcoast.nhs.uk with your guesses (then we’ll be able to publish a graph in the next edition)

Fascinating Facts…..

During 2010/11, there were 183 admissions to Trusts within South East Coast for the diagnosis M771 lateral epicondylitis (better known as tennis elbow). And for those of you who participate in track and field sports yourself, watch out: the commonest admissions for injuries in 2010/11, where the injury happened on a sports field or track, were:

 Other fall on same level due to collision with, or pushing by, another person: 288
 Hit, struck, kicked, twisted, bitten or scratched by another person: 224
 Striking against or bumped into by another person: 223
Knowledge Matters is the newsletter of NHS South East Coast's Quality Observatory. To discuss any items raised in this publication, for further information or to be added to our distribution list, please contact:

NHS South East Coast
York House
18-20 Massetts Road
Horley, Surrey, RH6 7DE

Phone: 01293 778899
E-mail: Quality.Observatory@southeastcoast.nhs.uk

To contact a team member: firstname.surname@southeastcoast.nhs.uk
