Knowledge Matters Volume 4 Issue 5

Page 1

Volume 4 Issue 5 December 2010

Welcome to Knowledge Matters

Hello and festive greetings! Over the past weeks and months, quite a number of important consultation and policy documents have been published—I’m not going to say anything more than have a look at our news section and make sure that you read and respond to the relevant consultations. As this is a fun time of year, I do want to draw your attention to our in-house poet—Adam Cook—who regularly contributes to Knowledge Matters. Having been challenged by a couple of members of the team to write a rap, Adam has surpassed himself this issue with ‘QIPP Hop’, which appears on the back page. Adam has not only created an amazing rap (with a serious message), but stars in a music video along with a number of other members of the Quality Observatory team. The link to the YouTube video appears on the back page. Have a look and rap along—it’s sure to make you smile. Thanks for reading as always. Merry Christmas and very best wishes for 2011.

Inside This Issue:

A stroll along the PROMs .. 2
Liberating the NHS—An Information Revolution .. 4
Winter Pressures Dashboard .. 5
Skills Builder—Forecasting .. 6
Public Sector Mapping Agreement (PSMA) .. 8
Introduction to NHS Atlas .. 9
Enhancing Quality Programme .. 10
Analytical Top Trumps .. 13
Ask An Analyst .. 15
Quality and Risk Profiles .. 16
Christmas Quiz !! .. 18
The IHI 22nd National Forum .. 20
Analysis, Ancient and Modern .. 22
News .. 23
QIPP Hop! .. 24

http://nww.sec.nhs.uk/QualiityObservatory
Quality.Observatory@southeastcoast.nhs.uk


Page 2

A stroll along the PROMs ……………….. By Adam Cook, Specialist Information Analyst and David Parkin, Chief Economist

In an NHS where outcomes and patient involvement are key elements, what better source of outcomes information than patients themselves? Patient Reported Outcome Measures (PROMs) show how patients rate their own health before and after treatment. This gives us a patient-focussed view of how much their health has improved. Data from the national PROMs initiative, including both pre-operation (Q1) and post-operation (Q2) questionnaires, have now been released. They cover four procedures: groin hernia repair, varicose vein repair, hip replacement and knee replacement. There are three different PROMs measures. Two different sets of scores from the EQ-5D questionnaire (the EQ-5D index and EQ-VAS) are available for all four procedures, and there are also condition-specific measures for hips and knees (the Oxford Hip & Knee scores) and for varicose veins (the Aberdeen Varicose Vein Questionnaire). More details on how these work are available from the IC website: http://www.ic.nhs.uk/statistics-and-data-collections/hospital-care/patient-reported-outcome-measures-proms

At South East Coast we have also had a look at interpreting the PROMs data. This is a very rich data source, but for a first pass at this we’ve looked at comparisons between providers and commissioners of their average scores for the EQ-5D index and the condition-specific measures. The Q1 data are captured before the patient’s operation and the Q2 data three months later for hernias and varicose veins, and six months later for hips and knees.

[Charts: Knee Replacement — Oxford Knee Score and EQ-5D score, Q1 vs Q2, by provider (England, ASPH, BSUH, D&G, EKH, ESHT, FP, M&TW, Medway, Q.Vic, RSC, SASH, WASH); Groin Hernia — EQ-5D score, Q1 vs Q2, by commissioner (England, SEC, B&H, ESD&W, E&CK, H&R, Medway, Surrey, W.Kent, W.Sussex)]

Every organisation, both providers and commissioners, in South East Coast showed improvement between Q1 and Q2. This was bigger for hips and knees than for hernias and varicose veins. Note that the smaller the Aberdeen Varicose Vein Questionnaire score the better, so a reduction from Q1 to Q2 means an improvement.


Page 3

The data can also be looked at by a health gain score, which is the difference between pre- and post-operation scores. This is also adjusted for casemix, taking account of factors such as age and local deprivation. There are fewer scores for this than for the Q1 data, because organisations with fewer than 30 completed patient questionnaires are excluded to avoid making unjustified statistical comparisons. The results for the health gain score include a confidence interval, the size of which also reflects the number of completed questionnaires in different organisations.

[Charts: Groin Hernia — EQ-5D index casemix-adjusted health gain by provider (England, ASPH, BSUH, D&G, EKH, ESHT, FP, M&TW, Medway, Q.Vic, RSC, SASH, WASH) and commissioner (England, SEC, B&H, ESD&W, E&CK, H&R, Medway, Surrey, W.Kent, W.Sussex); Hip Replacement — Oxford Hip Score casemix-adjusted and average scores]
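For readers who like to work with the patient-level extracts, the health gain calculation with small-number suppression can be sketched in pandas. This is purely illustrative: the field names, trust names and values are made up, and the national publication additionally applies casemix adjustment, which is omitted here.

```python
import numpy as np
import pandas as pd

# Hypothetical patient-level PROMs-style records (illustrative field names and data).
rng = np.random.default_rng(0)
records = pd.DataFrame({
    "provider": ["Trust A"] * 45 + ["Trust B"] * 35 + ["Trust C"] * 20,
    "eq5d_q1": rng.uniform(0.30, 0.60, 100),   # pre-operation EQ-5D index
    "eq5d_q2": rng.uniform(0.65, 0.90, 100),   # post-operation EQ-5D index
})

# Health gain = post-operation score minus pre-operation score.
records["health_gain"] = records["eq5d_q2"] - records["eq5d_q1"]

# Exclude organisations with fewer than 30 completed questionnaires
# before averaging, mirroring the suppression rule described above.
counts = records.groupby("provider")["health_gain"].transform("size")
gain = (records[counts >= 30]
        .groupby("provider")["health_gain"]
        .agg(["mean", "count"]))
print(gain)  # Trust C (20 questionnaires) is suppressed; only Trusts A and B appear
```

The same pattern extends naturally to the Oxford and Aberdeen scores by swapping the score columns.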

These first results just show the potential of PROMs data to add to other information on the process and outcomes of care for NHS patients. The data are continuing to be collected, so we will be able to build time series that show how organisations are improving. We will also be able to compare PROMs with other quality indicators, such as length of stay or re-admissions, to give a more complete view of what good quality clinical outcomes are. If you have any queries or would like to learn more, please contact us! adam.cook@southeastcoast.nhs.uk or david.parkin@southeastcoast.nhs.uk

Quality Observatory Christmas Celebrations….. The Christmas celebrations this year took place at Samantha’s house with a selection of delicious dishes provided for the delectation of the team. Secret Santa was of course on the agenda—he did really well this year. Fats (who is learning to play the saxophone) received a beautiful inflatable sax. Samantha received a beautiful handbag in the shape of a shoe (Secret Santa clearly knows her too well……) Several hours were spent playing Quality Observatory Monopoly—featuring popular landmarks in Horley (Waitrose, Collingwood Batchelor, York House and The Gatwick) and roads in the locality (including the M23 slip road, Massetts Road and Airport Way). The game was tense at times with arguments over who was going to buy The Gatwick—it might be a while before Quality Observatory Monopoly makes another appearance……..


Page 4

Liberating the NHS—An Information Revolution By Samantha Riley, Head of the Quality Observatory

In this article, I’m not intending to provide a summary of Liberating the NHS: An Information Revolution, as an excellent executive summary is available to download from the Department of Health website—here’s the link: http://www.dh.gov.uk/prod_consum_dh/groups/dh_digitalassets/@dh/@en/documents/digitalasset/dh_120601.pdf. However, as the current consultation nears its close (comments need to be provided to the Department of Health by 14th January), I thought that it would be worth pulling out some areas that may be of particular interest to Knowledge Matters readers. One of the things that I was very pleased to read about was the intention to release a number of data sets in a more timely fashion. There is of course lots of data available already which many of you will be familiar with and use (such as HES, data on infections and lots of data on performance). Additional data to become widely available by April 2011 includes:

 Inpatient information at provider level (annual, November 2010)
 NHS Choices - Provider Quality Indicators (November 2010)
 Outpatient information at provider level (annual, December 2010)
 Maternity information at provider level (annual, November 2010)
 A&E - additional tables (January 2011)
 Ambulance status reports (weekly, February 2011)
 Inpatient information at national level for procedures & diagnoses (by April 2011)

Additional data to become available from 2011/12 onwards includes:

 Cancer Registries and Cancer Datasets
 Information about mixed sex accommodation
 National Clinical Audit
 NHS Choices Directories
 NHS Reference Cost data
 Financial Information Management (FIMS)
 Reference data (eg. population and demographic data in more useable forms)

Something else which I was very pleased to read about was the requirement for information to be based on accurate data (which of course I am a big supporter of, being Chair of the Data Quality Board) and to be seamless and joined up to support effective commissioning. The Quality Observatory has on a number of occasions undertaken work to try to join up a range of data sets—it’s not as easy as it sounds! There are quite a number of barriers in the way of joining up data sets. Although the NHS number has been mandated for some years now, it is not yet universally used within the NHS. Social services will now be encouraged to use the NHS number (which may require information systems to be adapted). The pseudonymisation agenda is providing additional challenges to linking up data. This is a really important issue which will need to be resolved over the coming months and years. Being somebody who has tried for some years now to champion the use of information to inform discussion and decision making, I was overjoyed to read the following statement: ‘Information management and IT capability will be essential. Strong leadership is needed from CEOs and Boards, clinical leaders and leaders of the information and IT professions. They will need to create a culture within their organisations where information is seen as the lifeblood’. The document goes on to talk about the fact that information cannot be seen as something that is the sole responsibility of the specialist and that all staff will require expertise in the use and interpretation of information. This is something which the Quality Observatory has recognised for some years. We have developed a range of training materials and courses aimed at a variety of different audiences to help them with this very challenge. An example of this is our ‘de-mystifying data for clinicians’ module which we have delivered on a number of occasions to teams across the patch.

If you are interested in learning more about this and the support that we can provide, please do get in touch (quality.observatory@southeastcoast.nhs.uk). The final area that I wanted to mention is the fact that there is now a clear movement to increase the level of information available to the public, patients and service users. The needs of this audience will be varied, and a significant amount of thought, discussion and engagement will be necessary if we are to provide meaningful information to this diverse audience. Please do take time to read the consultation and provide feedback by 14th January 2011. Comments made as part of the consultation will inform the forthcoming information strategy which is expected in the spring.


Page 5

Winter pressures dashboard By Rebecca Matthews, Performance and Planning Analyst

Winter is definitely here, and to help the winter planning process the Quality Observatory, in discussion with the performance team, has developed a Winter Pressures dashboard. This is provider based and displays a number of weekly and daily data items on one page, aiming to show at a glance what pressures are being faced by acute trusts in the South East Coast region. The first chart on the dashboard looks at type 1 A&E attendances, A&E performance and emergency admissions from type 1 A&E departments, using data from the weekly sitreps return. The next three charts look at measures around beds available, occupied and closed, using data submitted daily by trusts on the weekly sitreps return. The fifth chart shows operational problems reported by trusts on the daily sitreps: the number of A&E diverts, A&E closures and trolley waits of 12+ hours reported. The remainder of the charts on the dashboard display information on cancelled operations, critical care transfers and critical care beds (all from the daily sitreps), weekly C-diff infections reported (sourced from provisional weekly data supplied by trusts), and ambulance activity and performance from the weekly sitreps return. The daily sitreps are only reported on weekdays, so weekend dates will show the same data as the previous Friday. The date above the charts indicates the latest daily sitreps return available—this will generally be a day behind the current date, as data is submitted for the previous day.

[Dashboard screenshot: South East Coast Winter Pressures Dashboard, acute trust total, latest daily sitreps for period 13/12/2010. Panels: Type 1 attendances, admissions and performance; Bed stock: G&A beds available (core beds, escalation beds, baseline submitted to DH in Oct 10); Bed occupancy; Beds closed due to norovirus symptoms or unavailable due to DToC; Operational problems (A&E diverts, A&E closures, trolley waits 12+ hours); Cancelled operations in previous 24 hours (including urgent ops cancelled and urgent ops cancelled 2+ times); Critical care transfers; Adult and paediatric critical care beds available and occupied; Infections requiring isolation: weekly numbers (unvalidated); Cat A and B ambulance calls and performance (SECAmb total; target Cat A 8, Cat B 19 mins). Data sources: Unify2 daily and weekly sitreps; weekly C-diff figures from trusts (unvalidated); escalation beds occupied is estimated. Refer to notes sheet for detailed definitions.]
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA.

We have also included a second sheet on the dashboard to show the national heat-maps as published by the HPA. These are published weekly on the HPA/QSurveillance website and show the weekly consultation rate by PCT for influenza-like illnesses. The maps for the current week and the previous two weeks are shown. The dashboard is updated each day with the latest daily sitreps data and will be available to download from the Quality Observatory website, nww.qualityobservatory.nhs.uk, from the Resource Catalogue. Please note that only people connected to an NHS server will be able to access this dashboard. Please contact me if you have any queries or suggestions on the dashboard: rebecca.matthews@southeastcoast.nhs.uk
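The weekend behaviour described above (weekend dates repeating Friday's figures) is, in effect, a forward-fill of a weekday-only series. A hypothetical sketch in pandas, with made-up figures and series name:

```python
import pandas as pd

# Daily sitreps are reported on weekdays only (Mon 6th to Fri 10th Dec 2010 here).
weekdays = pd.bdate_range("2010-12-06", "2010-12-10")
sitrep = pd.Series([3, 1, 0, 2, 4], index=weekdays, name="a_and_e_diverts")

# Reindex onto every calendar day and forward-fill, so Saturday and Sunday
# carry Friday's figure, as the dashboard does.
all_days = pd.date_range("2010-12-06", "2010-12-12")
dashboard = sitrep.reindex(all_days).ffill()
print(dashboard)  # Sat 11th and Sun 12th both show Friday's value of 4
```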


Page 6

Skills builder—forecasting By Katherine Cheema, Specialist Information Analyst

Last time we looked at how a thorough understanding of our data can ensure that we use the right kind of forecast. In this next skills builder we’re going to look at some simple time series forecasts which are best suited to stationary data rather than seasonal data. We’ll assume that all your data is organised appropriately and in electronic format; this is much easier in Excel! Let’s start with the really easy stuff:

1. Naïve forecasting is the easiest and probably the most overused method in the NHS; it basically means looking at the most recent time period and assuming that the outturn, or whatever outcome you’re using, will be the same. We use versions of this all the time in healthcare, such as forecasting January A&E attendances on the basis of last January’s A&E attendances. Whilst it is simple and reactive to the most recent change, it also reacts to special cause and extreme values (in other words, a knee-jerk reaction!). Having said that, a naïve forecast can act as a useful benchmark; if your chosen forecast method is less accurate than the naïve forecast then it is likely to be pretty bad!

2. Moving averages (MA) are a bit better than a naïve approach as they smooth out variation and so reduce the knee-jerk effect. To calculate a moving average you take the average value of the most recent set of data, for example the most recent three months, and use this as an estimate of month 4. The main advantages of the MA approach are that it is simple to calculate and understand, and that it smoothes out some of the ‘noise’ in the data. However, the further ahead you want to forecast the less accurate it becomes, because of the bigger time lag between actual data and the forecast period. It also gives equal weight to all data points….

3. ….unlike weighted moving averages, which address the weighting problem and can be adjusted to meet individual needs. So, for example, you might have a ten day weighted moving average where you would multiply the most recent value by 10, the day before that by 9 and so on, and then divide by the sum of the multipliers (10 + 9 + 8 + 7 etc.), in this case 55. This gives the most recent data point the greatest weight and the furthest away the least. This is much better than the basic moving average approach BUT you are STILL not using all that data you’ve so carefully collated. And we wouldn't want to be wasting data, would we?
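For readers who prefer code to spreadsheet formulas, the three methods above can be sketched in a few lines of Python (the attendance figures are purely illustrative):

```python
def naive_forecast(series):
    """Naive forecast: next period = the most recent observation."""
    return series[-1]

def moving_average_forecast(series, window=3):
    """Moving average: next period = mean of the last `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

def weighted_moving_average_forecast(series, window=10):
    """Weighted MA: newest point multiplied by `window`, the one before by
    `window` - 1, and so on, divided by the sum of the multipliers."""
    recent = series[-window:]
    weights = range(1, len(recent) + 1)   # oldest -> 1, newest -> window
    return sum(w * y for w, y in zip(weights, recent)) / sum(weights)

weekly_attendances = [32, 40, 36, 44, 34, 30, 38, 34]   # illustrative data
print(naive_forecast(weekly_attendances))               # 34
print(moving_average_forecast(weekly_attendances))      # (30 + 38 + 34) / 3 = 34.0
print(weighted_moving_average_forecast(weekly_attendances, window=3))
```

Each function returns a single next-period estimate, which is all these simple methods can offer; the weighting in the last one is exactly the "multiply by 10, 9, 8... and divide by 55" recipe described above.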

So, there are other options that deal with this issue for time series data, but before we deal with those we need to understand the concept of error (you may know it as residuals or deviations). As previously discussed, without the TARDIS it is tricky to forecast 100% accurately; in reality we’re just guessing in a rather educated way. So, by definition, any forecast will have an error in it insofar as it differs from what will actually happen. Error can therefore be defined as:

ERROR = ACTUAL - FORECAST (or et = yt - ŷt for purists, but we like plain English round here)

Error is important because it gives us an idea of how accurate our forecasting model is likely to be, which in turn affects the confidence we can have in its results. Understanding the error in a model can also help us judge which is best for our purposes. What is defined above for a single data point can be applied across a whole actual and forecast time series to see how accurate it has been. So, line up your data, apply a forecast to some of it and then compare it to the actuals. Then see what the error looks like; if it’s OK, then your method can continue, and if not then you might need to change it! There are three measures of error that are easy to calculate and simple to compare:

1. Mean absolute deviation (MAD!!!!!): sum up all the errors for each actual vs forecast comparison, expressed as absolute numbers (ignore +/- signs), and divide by the number of data points you have (see Table 1 for an example). Pretty similar to standard deviation!

2. Mean squared error (MSE): sum up the square of each error and divide by the number of data points; this adds weight to larger errors. You can turn this into root mean squared error (RMSE) by taking the square root of the result.

3. Mean absolute percentage error (MAPE): sum up the absolute values of each error divided by its actual, divide that by the total number of data points, and then multiply by 100 to give a percentage figure. It sounds hideous but it’s probably the best to use if you’re comparing models, although it is tricky if you have lots of zeros in your data set.

This example in the table uses a 3 week moving average as a forecast. You could calculate the same measures for a naïve forecast; using this data the MAD, MSE and MAPE for a naïve forecast would be 6.75, 50.5 and 17.9% respectively. All of these are higher than those shown in our example, so the 3 week moving average is a more accurate forecast method in this instance.
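The three measures are just as easy to check in code as in a spreadsheet. This sketch reproduces the 3-week moving average figures from the worked example (weeks 4 to 9, where a forecast exists):

```python
def error_measures(actuals, forecasts):
    """MAD, MSE and MAPE over paired values, where error = actual - forecast."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    n = len(errors)
    mad = sum(abs(e) for e in errors) / n                      # mean absolute deviation
    mse = sum(e * e for e in errors) / n                       # mean squared error
    mape = 100 * sum(abs(e / a) for e, a in zip(errors, actuals)) / n
    return mad, mse, mape

# Weeks 4-9 of the worked example: actual attendances vs the 3-week moving average.
actuals = [44, 34, 30, 38, 34, 42]
ma3_forecasts = [36, 40, 38, 36, 34, 34]
mad, mse, mape = error_measures(actuals, ma3_forecasts)
print(round(mad, 2), round(mse, 1), round(mape, 1))   # 5.33 38.7 14.5
```

Feeding in the naïve forecast instead (each week forecast by the previous week's actual) gives the higher error figures quoted above, confirming that the moving average is the better method here.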


Page 7

Table 1: 3-week moving average forecast with error measures

Week   | Actual attendances | Forecast (MA3) | Error | Absolute error | Percentage error | Error squared
1      | 32                 |                |       |                |                  |
2      | 40                 |                |       |                |                  |
3      | 36                 |                |       |                |                  |
4      | 44                 | 36             | 8     | 8              | 0.18             | 64
5      | 34                 | 40             | -6    | 6              | 0.18             | 36
6      | 30                 | 38             | -8    | 8              | 0.27             | 64
7      | 38                 | 36             | 2     | 2              | 0.05             | 4
8      | 34                 | 34             | 0     | 0              | 0.00             | 0
9      | 42                 | 34             | 8     | 8              | 0.19             | 64
Totals |                    |                |       | 32             | 0.87             | 232

(Error = Actual - Forecast; Absolute error = abs(Error); Percentage error = Absolute error / Actual; Error squared = Error x Error)

MAD = Σ|et| / n = 32/6 = 5.33
MSE = Σet² / n = 232/6 = 38.7
MAPE = Σ|et/yt| / n = 0.87/6 = 0.145 (14.5%)

Ideally you should calculate all the error measures to help determine the best method to use, although just a couple of extreme values will skew results for MSE as it penalises larger errors. You should already have plotted the dots, so you should be able to see this. If you plot the dots for each of your forecast methods as well, you’ll quickly see which is the best!

Now we understand the concept of error, we can stop wasting data and crack on with a method that can use it all. The exponential smoothing method uses all the data you have but places greater emphasis on more recent data through the use of an adjustable ’smoothing factor’ between 0 and 1; the higher this smoothing factor is set, the greater the emphasis on recent data. The next forecast is calculated as follows:

NEXT FORECAST = MOST RECENT FORECAST + (SMOOTHING FACTOR x MOST RECENT ERROR)

Therefore, if the previous forecast was too low the next is adjusted upwards, and vice versa. The smoothing factor (expressed as α) used depends on what characteristics you want in your forecast; if you want something very reactive with little smoothness then set α close to 1. If you want more smoothness, perhaps to account for extreme values, set α closer to 0. In reality, choosing α is a bit of trial and error; you want to find the α which results in the least amount of error, but there are some tools to help. Naturally, the QO has a spreadsheet with a slidey bar that can help you model a wide range of α to see its effect on the error terms, and within Excel there is the Solver function on the Tools menu which can find the lowest error value by adjusting α. Table 2 gives a worked example of how exponential smoothing works, with all the error terms calculated with α=0.1. If we changed α to 0.7 the MAPE would be 10.42%, which makes α=0.1 the more accurate in this case.

Table 2: Exponential smoothing with alpha (α) at 0.1

Week   | Actual attendances | Forecast (ES) | Error | Absolute error | Percentage error | Error squared
1      | 32                 | 32            | 0.0   | 0.0            | 0.0              | 0.0
2      | 40                 | 32            | 8.0   | 8.0            |                  |
3      | 36                 | 33            | 3.2   | 3.2            |                  |
4      | 44                 | 33            | 10.9  | 10.9           | 0.2              | 118.4
5      | 34                 | 34            | -0.2  | 0.2            | 0.0              | 0.0
6      | 30                 | 34            | -4.2  | 4.2            | 0.1              | 17.5
7      | 38                 | 34            | 4.2   | 4.2            | 0.1              | 17.9
8      | 34                 | 34            | -0.2  | 0.2            | 0.0              | 0.0
9      | ?                  | 34            |       |                |                  |
Totals |                    |               |       | 19.7           | 0.5              | 153.9

MAD = Σ|et| / n = 2.46
MSE = Σet² / n = 19.24
MAPE = Σ|et/yt| / n = 6.37%
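The exponential smoothing update is a one-liner in code. This sketch reproduces the α = 0.1 forecasts from the worked example, initialising the first forecast at the first actual value (other initialisations are possible and will give slightly different early forecasts):

```python
def exponential_smoothing(series, alpha=0.1):
    """One-step-ahead forecasts using
    next forecast = most recent forecast + alpha * most recent error.
    The first forecast is initialised at the first observation."""
    forecasts = [series[0]]
    for actual in series:
        error = actual - forecasts[-1]
        forecasts.append(forecasts[-1] + alpha * error)
    return forecasts   # one value more than the input: the last is the next-period forecast

weekly_attendances = [32, 40, 36, 44, 34, 30, 38, 34]
f = exponential_smoothing(weekly_attendances, alpha=0.1)
print(round(f[3], 2))    # 33.12, the week 4 forecast
print(round(f[-1], 1))   # 34.2, the week 9 forecast (34 to the nearest whole attendance)
```

Re-running with different values of alpha and comparing the resulting MAD, MSE and MAPE is exactly the trial-and-error search described above, and is what Excel's Solver automates.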

What we’ve covered here are relatively simple models for stationary data that shows no seasonal pattern and no trend up or down. We’ll cover these issues in the next (and final!) part of this forecasting skills builder. In the meantime, if you have any questions or want to have a look at the examples here in spreadsheet form (so you can play with the numbers) just let us know at the usual address.


Page 8

The Public Sector Mapping Agreement By David Harries, Health Analyst

NHS users of Geographic Information will be aware that the current provision of Ordnance Survey data to NHS organisations is licensed through the NHS® Digital Mapping Agreement (supplied through Dotted Eyes). From 1st April 2010 the Government made a significant amount of Ordnance Survey data available free of charge, and free of restrictive licensing, through the OS OpenData initiative; see http://www.ordnancesurvey.co.uk/oswebsite/opendata/index.html for further details. Where the products were included in the NHS DMA, any Ordnance Survey royalty fees were removed. It was then the decision of each individual organisation to assess the merits of taking raw data direct from Ordnance Survey or remaining in the NHS DMA (offering a value added service delivered by Dotted Eyes). In addition to OS OpenData, NHS organisations will further benefit from Ordnance Survey digital data being available free at the point of use with the introduction of the Public Sector Mapping Agreement (PSMA). The PSMA will be a centrally funded agreement for the provision of Ordnance Survey Geographic Information data to all of the public sector in England and Wales, and will come into effect from 1 April 2011. The Agreement will mean that all organisations will have access to the same data, so there will be no disparity between the datasets available to those in Health, Central and Local Government, something that has been a real problem under the current NHS DMA and Mapping Services (Local Government). The PSMA will be open to all NHS organisations who are current Members of the NHS DMA and those who are eligible to join but have not done so. From 1st April 2011 NHS organisations in England will have the following options:

1. Sign up for the Public Sector Mapping Agreement and access digital data from Ordnance Survey directly, free of charge.
2. Continue to source your digital mapping data from Dotted Eyes, either through the DMA or directly. This option will incur a charge, but it is likely to be less than you currently pay.
3. Source data through another 3rd party supplier, where you may incur a charge.

By introducing a new licensing framework, the agreement will enable more collaborative working with delivery partners and allow public-sector organisations to reuse the data for core and non-commercial public-sector activities.

Benefits

A key benefit of the PSMA is that the NHS will have access to a far wider and more detailed suite of digital mapping, including:

 OS MasterMap Topography Layer
 OS MasterMap Integrated Transport Network with Road Routing Information and Urban Paths themes
 1:10 000 Scale Raster and 1:10 000 Scale Black and White Raster
 OS VectorMap™ Local
 1:25 000 Scale Colour Raster
 1:50 000 Scale Raster
 Code-Point
 Code-Point with polygons
 Detailed addressing products

In addition, all NHS organisations will have unrestricted access to OS OpenData, where they can download and order a range of raster and vector mapping datasets. These include the detailed 1:10 000 scale OS Street View® and Boundary-Line™, which provides the electoral and administrative geography of the country.

*Please note that the PSMA does not provide the GI software.


Page 9

The new PSMA is expected to deliver significant efficiency savings and improvements in public service delivery, with the agreement supporting both the Government’s transparency and localism agendas by enabling more effective joint working between public-sector bodies, as well as widening access to location-based information and data. Efficiency savings are expected through:

 Reduced procurement and management costs. The PSMA will be a long term agreement (10 years) with all members having access to the same data, under the same terms, making cross-organisational working a reality and helping to reduce costs and duplication of effort.
 Additional savings by removing the need for paperwork when data sharing
 Reduced data translation costs when sharing

Plus, if you currently source digital mapping data through the NHS DMA, providing you sign up to the PSMA you will be entitled to a pro-rata rebate of the Ordnance Survey royalty portion of the product fee, depending on when your renewal is due within the year after the introduction of the PSMA. In summary then, the PSMA has many benefits: it will certainly make joint working much easier and should result in even greater use of GI to help improve public services. Organisations should be able to sign up to the new PSMA from 1 February 2011. For further information: http://www.ordnancesurvey.co.uk/oswebsite/business/sectors/government/publicpsmafaqs.html

An introduction to the NHS Atlas By Andrew Hughes, Principal Analyst, South East Public Health Observatory

Awareness is the first important step in identifying and addressing unwarranted variation; if the existence of variation is unknown, the debate about whether it is unwarranted cannot take place. In the recent White Paper, Equity and Excellence: Liberating the NHS, there is a commitment to providing better value from the resources available to healthcare. This requires the NHS to address variations in activity and spend. Such variations indicate the need to focus on the appropriateness of care, and to investigate the possibilities that there is overuse of some interventions and that some lower value activities are undertaken. In the Atlas, Right Care presents a series of 34 maps of variation selected from topics which National Clinical Directors and others have identified as being of importance to their clinical specialty. They have worked with a wide range of teams in the Department of Health, the Observatory network (including SEPHO’s contribution in mapping, design, figures and data) and primary care organisations to create this Atlas. The Atlas also contains a guide to the tools and data available for analysing health investment. The aim in publishing this Atlas is to stimulate, within all levels of the NHS, a search for unwarranted variation, defined as “variation in the utilization of health care services that cannot be explained by variation in patient illness or patient preferences”. Commissioners should consider the opportunities to maximise health outcomes and minimise inequalities by addressing unwarranted variation. It is hoped the Atlas will stimulate action to tackle the drivers of unwarranted variation, to improve quality for patients and increase value for the NHS. For further information and links to the Atlas, please visit the QIPP Right Care website http://www.rightcare.nhs.uk/ To find out more about SEPHO please visit our website http://www.sepho.org.uk/


Page 10

Enhancing Quality Programme
By Paul Carter, EQ Communications and Engagement Manager, and Fatai Ogunlayi, Quality Innovation and Productivity Analyst

Background
Enhancing Quality (EQ) is a PCT Alliance sponsored programme using an innovative and proven rapid improvement model which has been successfully applied to high-volume clinical pathways in the North West region. EQ reports into the SHA Quality Board to ensure alignment with the strategic quality agenda.

What is EQ? Essentially it is a clinical change Programme which triangulates information to drive quality improvements in clinical interventions, patient reported outcomes and patient experience. An incentive system through CQUIN recognises and rewards providers for improving their patient care consistently, just as QOF does in General Practice.

Programme Desired Outcomes
The vision for the Programme is to drive up the standards of hospital, primary and community care by accurate benchmarking, pin-pointing variation and then supporting clinical leadership to innovate and improve patient outcomes. It also recognises best practice and nurtures a collaborative approach whereby learning is shared and adapted within and between Trusts. The Programme aims to:

 Save lives and improve care
 Identify, recognise and then reduce variations in patient outcomes and patient experience between Trusts and within Trusts
 Increase efficiency and cost savings by reducing length of stay, complications and readmissions

The Programme will deliver this by:

 Clinicians using information they have generated to improve their results
 Developing improvement skills in clinicians and managers: using data to improve care; promulgating techniques for changing 'hands-on' practice; engaging patients in the change process
 Encouraging all units towards delivering current upper-decile performance through incentives and celebrating achievements
 Extending the scope of clinical metrics, performance and benchmarking driven by clinicians, patients and stakeholders

Des Holden, Clinical Director for EQ (and Medical Director of Brighton and Sussex University Hospitals NHS Trust) explains why the EQ programme is so important: ‘The aims of the EQ project are really exciting—to look at what steps add value to the patient journey, and improve outcomes, to reduce the variation in these steps being undertaken within individual units and between units, to learn from each other and above all to improve patient health. Over the next year I hope to see the project move from data collection to driving better process, and our challenge is to deliver even more impressive results than those of the AQ programme in the North West’.

Why is it important? The EQ Programme team has been working with the Quality Observatory to produce a range of dashboards aimed at showcasing the impact the programme will have on patients' clinical outcomes across the patch. These clinical outcomes are length of stay, 30-day re-admission rate, complication rate, mortality rate and admission rate. The initial set of analyses (looking at baseline data) supports the view that there are very significant variations in outcomes across the region (see Dashboard 1, which shows length of stay for heart failure). This type of analysis is useful in identifying trends and variations and reveals potential areas for investigation, something colleagues at the Quality Observatory like to refer to as "where we have been and where we are going". Trusts have been able to use these dashboards to compare themselves to other trusts in the patch and to challenge themselves. One trust noticed that it has a lower-than-average LOS and in-hospital mortality rate for most clinical areas, which is good; however, its 30-day re-admission rate is above the patch average (see Dashboard 2, from heart failure data for 09/10). This has prompted it to look at the reasons why and to see whether any action can be taken.
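As a rough sketch of the kind of comparison these dashboards support (the trust names and figures below are invented for illustration, not real EQ data), a patch average can be computed and trusts flagged against it:

```python
# Illustrative sketch only (all trust names and figures below are invented):
# the kind of comparison the EQ dashboards support, putting each trust's
# indicator alongside the patch average.

readmission_rates = {       # 30-day re-admission rate (%) by trust
    "Trust A": 8.2,
    "Trust B": 11.5,
    "Trust C": 9.8,
    "Trust D": 13.1,
}

# The patch average is the simple mean across the trusts in the patch
patch_average = sum(readmission_rates.values()) / len(readmission_rates)

# Trusts above the patch average are the ones worth a closer look
above_average = {trust: rate for trust, rate in readmission_rates.items()
                 if rate > patch_average}

print(f"Patch average: {patch_average:.2f}%")
for trust, rate in sorted(above_average.items()):
    print(f"{trust} is above average at {rate:.1f}%")
```

A trust sitting above the line is not necessarily performing badly; as the article stresses, the comparison is a prompt to investigate, not a verdict.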


Page 11 Dashboard 1: Length of stay trend all trusts

Dashboard 2: Trust Level indicators


Page 12

EQ aims to streamline care, improve documentation and generally make care provision more consistent and reliable: every time, for every patient. We know that reliable care yields higher-quality clinical care, better outcomes and lower costs. The data used in the analyses are sourced from the Secondary Uses Service (SUS), and over the course of the EQ Programme these outcomes will be measured quarterly, with the aim that improvements can be attributed with some degree of confidence to specific actions taken as a result of the EQ Programme.

The EQ pathways are: Acute Pathways : Hip and Knee Replacement, Acute Myocardial Infarction, Pneumonia and Heart Failure

Community and Primary Care Pathway: In an extension to the existing pathways adopted by the North West, the EQ Programme is looking at the entire heart failure pathway from Acute settings into Community and Primary Care.

Dementia Pathway: The EQ methodology is also being applied to the dementia pathway linking closely to the NICE Quality Standards.

How does it work?

The EQ Cycle

It's very similar to QOF! Using international evidence, a set of clinical indicator metrics is agreed with clinicians across SEC. These are the interventions known to have the greatest impact on optimising patient outcomes. NICE, National Service Frameworks and other initiatives have also provided us with evidence-based guidance, but EQ takes this a step further: it goes on to measure and report whether each and every patient received the appropriate care at the right time, each and every time.

[The EQ Cycle diagram: metric identification → clinical consensus → evidence base → data definition set → new pathways → data recording → data extraction → data analysis → benchmarking → service improvement loop]

Once the required metrics are defined and agreed, all providers must have an efficient mechanism to collect and submit the data and then review individual and comparative results. To do this, a standardised data collection and reporting tool is made available to all provider organisations. Patients meeting the criteria for each pathway can be identified from SUS data using algorithms, so each Trust can upload the data required for those patients into the standardised tool and review its performance in the tool at any time. The Community and Primary Care heart failure and dementia pathways don't have the benefit of SUS data, so we are developing other ways of collecting data.

Once the information is collected and validated, it is benchmarked against other providers, nationally and internationally, enabling clinical teams to see where they can improve their patient care. One year into the programme, it is already becoming clear that the best outcomes are achieved by a winning combination of motivated and innovative multi-disciplinary teams supported by rigorous documentation, data collection and analysis. The extent of clinical engagement across different disciplines on EQ is the envy of other programmes, and the amount of collaboration, espoused by the call not to be afraid to plagiarise what works elsewhere, has been welcomed and applauded by many.

For more information about the Enhancing Quality Programme, please contact: Paul.Carter@westsussexpct.nhs.uk
For the QO dashboards, please contact: fatai.ogunlayi@southeastcoast.nhs.uk
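The identification step above can be sketched in code. This is a hypothetical illustration only: the record structure and field names are invented, and although the ICD-10 I50 family does denote heart failure, the EQ Programme's real selection algorithms are not reproduced here.

```python
# Hypothetical sketch of the SUS-based identification step. The record
# structure and field names are invented for illustration; the ICD-10 I50
# family does denote heart failure, but the EQ Programme's real selection
# algorithms are not reproduced here.

HEART_FAILURE_CODES = {"I50.0", "I50.1", "I50.9"}

episodes = [
    {"patient_id": 1, "primary_diagnosis": "I50.9"},   # heart failure
    {"patient_id": 2, "primary_diagnosis": "J18.9"},   # pneumonia
    {"patient_id": 3, "primary_diagnosis": "I50.0"},   # heart failure
]

# Flag the patients meeting the pathway criteria for upload to the tool
pathway_patients = [e["patient_id"] for e in episodes
                    if e["primary_diagnosis"] in HEART_FAILURE_CODES]
```

In practice each pathway would have its own inclusion and exclusion rules; the point of the sketch is simply that a code-based filter over episode records is what makes the standardised tool possible.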


Page 13

Turn your team into your very own set of Analytical Top Trumps! Simply print out the template below, add in the pictures, fill in the ranking values and you are good to go!



Page 15

Managing IF() Formulas
Application: Microsoft Excel 2007

Dear Quality Observatory,

I have a spreadsheet that contains some very large formulas, as you can see from my example. Is there any way I can get Excel to do the same thing without making it so large and complex? I'm using Excel 2007.

Shirley Ashby, Information Analyst, Surrey PCT

=IF(ISBLANK($F42),0,IF($F42="A",VLOOKUP(L$5,Apportionments!$D$3:$O$140,3,FALSE),
IF($F42="B",VLOOKUP(L$5,Apportionments!$D$3:$O$140,4,FALSE),
IF($F42="C",VLOOKUP(L$5,Apportionments!$D$3:$O$140,6,FALSE),
IF($F42="D",VLOOKUP(L$5,Apportionments!$D$3:$O$140,7,FALSE),
IF($F42="E",VLOOKUP(L$5,Apportionments!$D$3:$O$140,8,FALSE),
IF($F42="F",VLOOKUP(L$5,Apportionments!$D$3:$O$140,9,FALSE),
IF($F42="G",VLOOKUP(L$5,Apportionments!$D$3:$O$140,10,FALSE),
IF($F42="H",VLOOKUP(L$5,Apportionments!$D$3:$O$140,11,FALSE),
IF($F42="I",VLOOKUP(L$5,Apportionments!$D$3:$O$140,12,FALSE),
IF($F42="J",VLOOKUP(L$5,Apportionments!$D$3:$O$140,13,FALSE),
IF($F42="K",VLOOKUP(L$5,Apportionments!$D$3:$O$140,14,FALSE)))))))))))))*E42

Solution: Complexity 2/5 — Uses logic and lookup formulas

This is one of the common issues we have seen recently. Recent developments in Excel 2007 mean that you are no longer limited to the 7-level nesting that you had to work within in Excel 2003. Whilst removing these restrictions can be very handy, it also makes it very easy to create very large formulas without thinking, and this has an impact on both the final size of the file and the calculation processing time. Your calculation above can be simplified very easily, using a VLOOKUP() function to return the column offset number and an IF() statement to handle the logic.

Main syntax used:
IF(logical_test,value_if_true,value_if_false)
VLOOKUP(lookup_value,table_array,col_index_num,range_lookup)

The first step is to create a table in your workbook to hold the lookup_value and the col_index. You can now adjust your formula as follows (the IF() statement handles the true/false evaluation via ISBLANK()):

=IF(ISBLANK($F42),0,VLOOKUP(L$5,Apportionments!$D$3:$O$140,VLOOKUP($F42,$A$1:$B$11,2) ,FALSE))*E42

ISBLANK() is used to test for blank values; ISERROR() could also be used to test for values not in the lookup list

Main Vlookup to return value to process

This nested Vlookup is used to return the Col_OffSet value to the main Vlookup
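For readers who think in code rather than cells, the same collapse of an eleven-branch nested IF chain into a single lookup table can be sketched in Python. This is a sketch, not the actual spreadsheet: the function and variable names are our own.

```python
# The same simplification in Python (a sketch, not the actual spreadsheet;
# the function and variable names are invented). One lookup table replaces
# eleven nested IF() branches.

col_offset = {"A": 3, "B": 4, "C": 6, "D": 7, "E": 8, "F": 9,
              "G": 10, "H": 11, "I": 12, "J": 13, "K": 14}

def apportion(code, lookup_row, multiplier):
    """Mirror the worksheet logic: 0 if the code cell is blank,
    otherwise the looked-up value times the multiplier (E42)."""
    if not code:                              # ISBLANK($F42)
        return 0
    return lookup_row[col_offset[code]] * multiplier  # the two VLOOKUPs
```

The design point is the same in both languages: data-driven dispatch (a lookup table) scales far better than branching logic, and changing an offset means editing one table entry rather than one arm of a thirteen-deep formula.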


Page 16

Quality and Risk Profiles—Tin openers, Neil Young & Digital Taylorism
By Richard Hamblin, Director of Intelligence, Care Quality Commission

The Quality and Risk Profile (QRP) for the NHS is now about six months old and has been shared with the NHS for three. I write this as Christmas approaches rapidly, and in a seasonal vein I will take the chance to reflect on progress so far. For those who don't know, the Quality and Risk Profile, or QRP, is our way of bringing together what we know about an organisation in one place, to assess the risk of that organisation failing to comply with the essential standards of quality that it signed up to when registering with CQC to provide health or social care. Since making the QRP (or at least a .pdf analogue of it) available at the end of September, we've had plenty of feedback from the NHS, the vast majority of it good. However, there are some concerns which I think spring from a prior era and which can get in the way of using the thing properly. Put simply, these are philosophical, so I'll need to make a quick trip around my rather cryptic heading before alighting on some practical suggestions for using the QRP successfully.

The QRP very definitely isn't a performance management tool. And this, I appreciate, is a major philosophical shift from the last ten years, which have been all about the micro-management of numbers to achieve particular goals, primarily around access times: eighteen weeks, four hours, eight minutes and so on. Now, this has achieved something of value. Access is better than ever before, and this often means better outcomes and experience for patients: just last week the British Social Attitudes survey reported the highest level of satisfaction with the NHS since records began. Yet we know there were negative consequences of this approach too, as could have been predicted from Peter Smith's seminal work on the subject of performance measurement [1].

[Line chart: people waiting more than six months for elective admission, June 2001 to December 2007]

Figure 1 : Performance management drove improvement in the 2000s

To my mind, one of these, though it has been little commented on, was the effect on professionalism, including that of health informatics professionals; something often referred to as "Digital Taylorism" [2]. This, with a nod to the works of F. W. Taylor, the American advocate of "scientific management", is the view that close measurement and management of very specific targets supersedes professional judgment; in other words, numbers risk becoming a replacement for thinking rather than an aid to it. This tendency is exacerbated when we have very simple yes/no, on/off threshold targets. It was Neil Young who deplored digital recording (which of course uses an on/off representation of music) when compared with analogue; this, of course, is why vinyl sounds better: music, and for that matter healthcare, happens in analogue. This is a guiding principle of the QRP: it is an "analogue" rather than a "digital" system. We don't use threshold targets, and we don't use single measures. What you see in the dials at the top two levels reflects the consistency of a pattern of variation, rather than the achievement of targets below. For this reason we don't use the QRP to make judgments. It tells us about an unusual pattern that we need to understand, rather than poor performance that we need to manage and make pronouncements on.
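To make the "analogue" idea concrete, here is a deliberately simplified illustration (not CQC's actual algorithm, and all numbers are invented) of how a dial reflecting the consistency of a pattern behaves differently from a single threshold flag:

```python
# An invented illustration, not CQC's actual algorithm: the "analogue"
# dial idea. The dial summarises how consistently a group of indicators
# deviates, so one extreme value moves it far less than a uniform pattern.

def dial_score(z_scores):
    """Mean standardised deviation across an outcome's indicators."""
    return sum(z_scores) / len(z_scores)

consistent_pattern = [1.4, 1.6, 1.5, 1.3]   # every indicator mildly adverse
single_outlier = [3.0, 0.1, -0.2, 0.0]      # one breach, the rest unremarkable

# The consistent pattern, not the single breach, produces the higher score
assert dial_score(consistent_pattern) > dial_score(single_outlier)
```

Under a yes/no threshold regime the second organisation would be the one flagged; under a pattern-based summary it is the first that draws the question "I wonder why that is?".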


Page 17

Figure 2 : The QRP dials identify unusual patterns which may point to failure to comply with registration

The use of the QRP to prompt inspectors to ask questions rather than make judgments is quite distinct from how information was used in the star ratings and annual health check regimes, but is scarcely a radical departure in thinking about how to measure the success of healthcare. Twenty years ago Carter, Klein and Day spoke of indicators having their primary use as "tin openers not dials" [3], and this has been very much a guiding philosophy in the development of the QRP. So what does this mean for how best to use the QRP? Some quick tips.

First, don't print out the .pdf!! Think of the trees! And the more the QRP feels like a report rather than an interactive tool, the less easy it is to use as a tin opener. The correct response to a red dial is "that's interesting, I wonder why that is?", not "that's terrible, we've got to address this!" or "that must be wrong, we've got to disprove this!"

Second, start at the top, with the dials at section and outcome level, and work downwards to the individual measures, rather than the other way round. The pattern is what matters, rather than any individual measure. Each dial has enough underpinning indicators that the effect of managing any one of them to a specific level cannot be predicted, not least because of how other measures may change in the meantime.

Third, genuinely use the QRP to prompt questions rather than to look for something to "fix". To give an example from real life, there is one trust with a red dial on the safe management of medicines, driven by a consistent pattern of outpatient survey results showing low numbers of respondents being given enough information about the drugs they were prescribed, how to take them, potential side effects and so on. A "fixing" approach would micro-manage an increase in the number of patients who answered these questions "yes".
However, this result should prompt questions about clinic organisation, links with pharmacy and so forth; answering and addressing these is more likely to lead to sustained improvement once the micro-manager's attention is distracted by the next problem. Finally, please keep giving us feedback; we've always said that no system is perfect, but we strive to keep improving it. Visit our website www.cqc.org.uk or e-mail us at enquiries@cqc.org.uk

1. P. Smith, 'The use of performance indicators in the public sector', Journal of the Royal Statistical Society, Series A, 1990
2. Aditya Chakrabortty, 'Why our jobs are getting worse', The Guardian, 31 August 2010
3. Neil Carter, Rudolf Klein and Patricia Day, How Organisations Measure Success: The Use of Performance Indicators in Government, Routledge, London, 1992


Page 18

Christmas Questions
1. The 26th of December is traditionally known as: (a) St Stephen's Day, (b) Boxing Day, (c) St Nicholas' Day
2. In which year was The Queen's Christmas speech first televised?
3. What is the definition of a white Christmas in the UK?
4. Which well-known actor died on Christmas Day in 1977? (a) Michael Bates, (b) Elvis Presley, (c) James Stewart, (d) Charlie Chaplin
5. Where was it always winter but never Christmas?

Sci-Fi Questions
1. What is the name of the ship's doctor in Star Trek Voyager?
2. Which character or characters have starred in all 6 Star Wars films (and have been played by the same actor/actress)?
3. Mr Sulu was the helmsman of the U.S.S. Enterprise, but which ship did he later go on to command?
4. In which Sci-Fi series/film would you find the character Slartibartfast?
5. An episode of which television series featured Rasputin, Queen Victoria, Gandhi and Noel Coward among others?


Page 19

Name the Dashboard: see the charts below and match them to the dashboards listed.

Specialty Dashboard Dementia Dashboard

QIPP Trust Dashboard HCAI Dashboard

NHS data and definitions:
What NHS programme do the initials Q.I.P.P. represent?

The following KPIs can all be associated with SEC QIPP programmes. Match the KPI to the programme:

KPIs: Day case rates; Number of NHS health checks delivered; Women:midwife ratio; IAPT practitioners; Catheter associated UTIs

Programmes: Maternity and Newborn; Safe Care; Staying Healthy; Planned Care; Mental Health

If a hospital episode had an unbundled HRG code of 'SB10Z', what treatment would you expect the patient to have had? A patient has a discharge code of '19'; where will they be going on leaving hospital? Which 3 waiting times targets were no longer monitored after 31st March 2010?

ANSWERS NEXT TIME……...


Page 20

The IHI 22nd Annual Forum
By Samantha Riley and Katherine Cheema

The Twitter update received at 4am on Friday 3rd December saying that Gatwick had finally reopened was a welcome one for Samantha and Kate, who were due to fly to the 22nd Annual Institute for Healthcare Improvement National Forum in Orlando, USA. Taking off from a very icy Gatwick, it was a relief to land in a relatively balmy 14 degrees only 4 hours overdue. But this wasn't a busman's holiday like last year; we were there to represent the Quality Observatory and NHS South East Coast in a more formal capacity. Samantha was both excited and at the same time alarmed at the prospect of presenting on the importance of effective measurement and on innovative methods of presentation to engage front-line staff in improving services. Samantha was one of three presenters for the full-day Minicourse, attended by 150 delegates, entitled 'Strategies to deliver exceptional care experiences, efficiencies, and outcomes all in one'. Despite the previous evening's food poisoning, Samantha's session went well, numerous copies of Knowledge Matters were taken away, and quite a number of teams got all of the quiz questions correct.

During the main conference, Samantha immersed herself in as many measurement modules as possible. Having worked with Davis Balestracci (with whom some of you may be familiar) 7 years ago, it felt like time for a recap, so Samantha's first module was 'Data Sanity: Statistical thinking for leaders'. Davis has degrees in chemical engineering and statistics and describes himself as a 'right-brained' statistician. His session was a great refresher in measurement techniques which really do drive improvement, and a very real reminder that effective measurement doesn't need to be complicated: in fact the key is to keep it simple. A really powerful technique that Davis focussed on in his session was the Pareto matrix.
For those of you who are unfamiliar with this technique, have a look on Google or wait until the next edition of Knowledge Matters. Davis is happy to provide a free SPC tool, and you can sign up to his extremely informative (and entertaining) fortnightly e-mail update, which I would encourage you to do if you are interested in measurement for improvement (e-mail davis@dbharmony.com).

Another session not to be missed was entitled 'Dashboards: Please no more green, yellow or red!'. A key presenter in this session was Lloyd P. Provost, a senior fellow with the IHI. No prizes for guessing what this session was about. Delegates were taught that whilst colour-coded tabular one-page summaries are desired by many leadership teams, they are of limited utility. Worse, used in isolation, RAG status summaries expose organisations and teams to real risks of not understanding the true level of performance, improvement or deterioration. An alternative (very achievable) approach was described in this session; again, more on this in a future edition of Knowledge Matters.

In the afternoon of the first day of the general conference, the Quality Observatory poster joined hundreds of other examples of innovative and exciting improvement stories in the poster exhibition; with its snazzy black background (thanks Kiran!) it certainly stood out from the crowd. We were fortunate to be spotted by hospital CEO and avid Twitterer and blogger Paul Levy (President and CEO of Beth Israel Deaconess Medical Centre in Boston), who added us to his forum blog; check out the short film at http://runningahospital.blogspot.com/2010/12/poster-session-ihi-annualforum.html (Kate appears at the very end).

Quality Observatory poster in situ

Now, let’s hand over to Kate for her thoughts……..


Page 21

There is no doubt that the movement for healthcare improvement has gained new momentum as the US gets to grips with its own healthcare reforms, and, as ever, it was inspiring to see so many people with a primary aim of quality improvement over profit and cost saving. There was a tangible shift in focus towards the experience of the patient at this forum, with patient champions in attendance and a raft of workshops, as well as a growing interest in community and population health (what we would term public health). Whilst this was primarily due to the nature of the system-wide reforms taking place over there, the message was received loud and clear that whole-pathway improvement, not just hospital improvement, is necessary to address the health needs of the population both of today and of the future. Sound familiar? So, whilst we may think that differences in the system of healthcare delivery preclude us from learning much from our global neighbours, we have far more in common than we know. As many nations struggle with financial issues and systemic changes in healthcare, it would be wise to look as far afield as possible, to learn as much as we can and to share what we know we already do well!

If you would like to know more, or to access any of the material from the forum (available from January), just drop us a line and we'll point you in the right direction (quality.observatory@southeastcoast.nhs.uk). We would also encourage you to have a good look at the IHI website; there is a wealth of useful tools and resources available to download which will help you in your efforts to improve the quality, safety and efficiency of services provided to patients. www.ihi.org

Quality Observatory mascots Clive and Barnie take in the IHI stand

Knee replacement comparison tool recently developed
By Simon Berry, Specialist Information Analyst

A new tool has been developed for analysing knee replacements by consultant, equivalent to the hip replacement tool covered in a previous issue of Knowledge Matters. The tool contains data for the 12 months up to the end of September 2010 (NB the hip replacement tool has also been updated to the same period). For further information on how to obtain this tool please contact Megan Beardsmore-Rust megan.beardsmorerust@esht.nhs.uk For any technical issues regarding the tool please e-mail simon.berry@southeastcoast.nhs.uk


Page 22

Analysis, Ancient and Modern

It's not a popular thing here at Quality Observatory HQ, but as they are so beloved by millions across the globe, a nod should perhaps be given to the pie chart, and a definite 'hurrah!' to the man credited with inventing it. Pie charts have over 200 years of history behind them, the first known pie chart being published in 1801. It described the proportions of the then Turkish empire located in Asia, Europe and Africa, and was published in the 'Statistical Breviary' by William Playfair. By all accounts, William Playfair was a bit of a character, renowned at the time as someone fond of 'get-rich-quick' schemes and often to be found in a sticky situation; in 1789 he took part in the storming of the Bastille in Paris, a far cry from his Scottish engineering background. Playfair published many statistical diagrams throughout his career, not least within the pages of his Commercial and Political Atlas of 1786. This was the first major publication to contain statistical charts, using time-series line charts and clear bar charts to display an array of financial information pertaining to exports, imports and trade balances, amongst other things. They were also quite attractive diagrams. The trade-balance time-series line chart below shows the balance of imports and exports between England and Denmark & Norway over 80 years; where the balance is in favour of England, the gap between the lines is coloured green; where against, red. Could this have been the birth of RAG rating as well?

Perhaps it is fair to say, as Beniger and Robyn did in 1978, that some of Playfair's graphics were inspired (if not driven) by a lack of data. For example, he was unable to source more than one year of import/export data for Scotland, and instead of leaving it out or bemoaning its loss, he produced a bar chart showing the imports and exports of each of 17 trading partners for the country. This provided his readers with reasonable and useful data, despite his difficulty in sourcing it. So, in many respects, we have a lot to thank Playfair for; where would the lowly analyst be without a good bar chart? And lessons can certainly be learned from Playfair's meticulous structuring of his charts in an age before computers; they are clear and easy to understand, despite the complexity of the data they are designed to show. Whether or not we should thank him for the pie chart is probably a matter of personal opinion, but this author has no doubt that the example of its misuse to the right of this column would have William storming the Bastille all over again.


Page 23

NEWS

Suicide Data Collection
The number of suicides of people being treated by mental health services has always been of great concern, so to investigate this further the Quality Observatory has set up a web-based tool to collect details of suicides from the three mental health providers in South East Coast. This data will make it possible to identify trends and clusters of patients that can then be examined in greater detail. For further details please contact Adam Cook (adam.cook@southeastcoast.nhs.uk)

Operating Framework published
The 2011/12 Operating Framework was published on 15th December. This year's Operating Framework sets out what needs to happen over the transition year 2011/12. All parts of the health service are required to work across organisational boundaries to respond positively to the reforms set out in Equity and Excellence: Liberating the NHS, whilst ensuring that service quality and financial performance are maintained and improved at a time of change. Further details are available from the Department of Health website:
http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_122738

Outcomes framework published
The outcomes framework has now been published after consultation. Both the framework itself and the technical documentation for the indicators identified are available from:
http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_122944
The framework outlines the overarching indicators and areas of improvement which will be used to hold the NHS Commissioning Board to account for NHS performance and improvement. Many indicators do not yet have full definitions or data sources attached to them; these will be developed throughout 2011/12, with a view to a full annual refresh and measurement against each indicator in the framework against an agreed trajectory from 2012/13.

Information revolution & Liberating the NHS consultations
The closing date for these important consultations is 14th January 2011. Further details are available from the Department of Health website:
http://www.dh.gov.uk/en/Consultations/Liveconsultations/DH_120080
http://www.dh.gov.uk/en/Consultations/Liveconsultations/DH_119651

Public health white paper published
The public health white paper was published on 30th November 2010, with consultation closing on 8th March 2011. Two further consultations, on funding and commissioning for public health and on the outcomes framework and relevant indicators for the public health system, will close on 31st March 2011. For details of these consultations, along with other live consultations, see:
http://www.dh.gov.uk/en/Consultations/Liveconsultations/index.htm

Normalising Birth video on YouTube
The Quality Observatory recently made a short film about the region-wide Normalising Birth programme. Here's the link: http://www.youtube.com/watch?v=PaQxUABOBIE

National Hip Fracture Database
The Quality Observatory now has access to the National Hip Fracture Database and will be undertaking comparative analysis on this data early in 2011. If you have any specific questions or ideas about what analysis would be useful, please contact Simon Berry simon.berry@southeastcoast.nhs.uk

Drop in Sessions 2011
Drop in sessions have now been arranged for the first part of 2011:
 26th January
 23rd February
 23rd March
If you have an analytical query, you need help interpreting data or need help with Excel, Access or SQL, the Quality Observatory can help you! Simply contact us at quality.observatory@southeastcoast.nhs.uk to book a one-to-one session.


QIPP Hop by Adam Cook

YO! Check this out...

QIPP hop QIPP hop, Q to the I to the double P,
QIPP hop QIPP hop, Q to the I to the double P

One - Q is for quality,
bringing better care to you and me,
better outcomes from yo' operation
all across the whole damn nation.
Adding years to life and life to years,
For quality we are the pioneers.
A whole health plan to make you well,
In this way we will excel.

QIPP hop QIPP hop, Q to the I to the double P,
QIPP hop QIPP hop, Q to the I to the double P

Two - I is for Innovation
and I'm sending yo' this invitation
for new ideas, better ways to work,
bring whole new flava and whole new quirk.
Don't be no square, work outside the box,
The old ways of working are on the rocks.
Small ideas have great potential,
and the changes they bring are exponential.

QIPP hop QIPP hop, Q to the I to the double P,
QIPP hop QIPP hop, Q to the I to the double P

Three - P's for productivity,
We're living in a time of austerity,
do more for less, save some dough,
tighten yo' belt 'cos here we go.
Making changes to release the money,
It ain't easy, it ain't funny.
Gotta make decisions that are mighty tough,
'cos times is hard and times is rough.

QIPP hop QIPP hop, Q to the I to the double P,
QIPP hop QIPP hop, Q to the I to the double P

Four - P for prevention,
stopping disease is our intention,
Rooting out causes of poor health,
regardless of age, race or wealth.
Stopping the spread of an epidemic
needs practical action, not academic.
To keep the country strong and fit
Gonna take effort, gonna take grit.

QIPP hop QIPP hop, Q to the I to the double P,
QIPP hop QIPP hop, Q to the I to the double P

We're the QIPP hop crew from the NHS,
for healthcare needs we're the best.

QIPP hop QIPP hop, Q to the I to the double P,
QIPP hop QIPP hop, Q to the I to the double P

To view the music video starring MC AC (Adam Cook) and his backing singers, type this address into your browser, turn up the volume and join in if you dare!! http://www.youtube.com/watch?v=N-WoVIbC2-M

Knowledge Matters is the newsletter of NHS South East Coast's Quality Observatory. To discuss any items raised in this publication, for further information or to be added to our distribution list, please contact:

NHS South East Coast
York House
18-20 Massetts Road
Horley, Surrey, RH6 7DE

Phone: 01293 778899
E-mail: Quality.Observatory@southeastcoast.nhs.uk
To contact a team member: firstname.surname@southeastcoast.nhs.uk

