Achieving great service is straightforward, if unconventional: give customers/patients/service users what they need. However, convention dictates that adopting this approach leads to expensive ‘gold-plating’ of services (quality service, but at higher cost).
Instead, leaders and organisations follow convention and manage their activities by protecting budgets, imposing access restrictions via criteria or eligibility rules, introducing service level agreements and focusing on efficiency through reducing transactional or unit costs.
Yet the paradox is that an explicit focus on managing activities increases the very thing organisations seek to reduce – cost. Convention decrees that the answers to problems are already known and that pre-prescribed solutions can be delivered through meticulous plans and reports. As a consequence, these change approaches fail to deliver in practice.
Take conventional healthcare commissioning in the NHS: a person has a health and/or social care need; their need is assessed and then, sooner or (most often) later, a service is first commissioned and then provided by different professionals. Service level agreement met, project milestone ‘green-lighted’ and ticked.
What happens is that, because services are not designed around the need(s) of the person/patient/service user, they re-present to the service in the vain hope that their need(s) may be better met. Professionally, the response to this problem is to repeat the process of assess, commission and provide. The outcome experienced by the person/patient/service user is that they continue to re-present, and costs rise. Why does this happen? It is because the mind-set is misguided.
Conventional business change needs to change
Conventional change or improvement relies on a wrong-headed ‘back-to-front’ perspective: a rear-guard focus on removing cost by reducing activity and expecting patients to change their behaviour. It results in an obsession with activity volumes and ‘bottom-line’ costs. The one problem with this approach is that it doesn’t work. Back-to-front thinking always leads to distortion of organisational performance and higher costs.
The mechanics of conventional change follow a typical path. Unvalidated hunches, opinions and/or data consisting of worthless aggregated activity, arbitrary benchmarking and/or cost volumes are used to identify problems.
Understanding the patient or service user (as opposed to non-user public) perspective in this process is rarely, if ever, sought. Agreeing appropriate governance arrangements typically looms large at this point and takes up not inconsiderable internal discussion, effort and time.
Much time is consumed completing a litany of project-management-induced paper-chasing reports such as project initiation documents, or PIDs. Once signed off, this document helps formulate a project plan which is established to solve the preconceived problem. A business case is then written which outlines time, costs and predetermined outcomes.
These outcomes are then ‘monitored’ through office-based completion of reporting documents such as highlight reports full of activity and cost volumes, with ‘traffic light’ systems – green for good, amber for somewhere in between and red for bad. None of this involves spending time in the work and empirically understanding why things are the way they are.
Improvement activity is often relegated to time-limited projects and people who sit outside of the actual work. Disproportionate time and effort is then spent on conducting public consultations (‘the blind leading the blind’), which replace the opportunity to generate empirical knowledge of what is actually happening and causing problems at the sharp end, in the work.
Performance metrics derived at the business case stage tend only to measure whether a project is completed ‘on time’ and ‘to cost’. Little regard is paid to how much operational improvement is achieved in resolving the real problem(s).
‘Off-the-shelf’, standardised solutions are mandated that usually involve automation or greater use of technology (sometimes referred to as ‘channel shift’ or ‘digital-by-default’); sharing or outsourcing services; restructuring to establish new ‘target operating models’; rationing buildings and service provision; charging and trading services and/or reducing staff numbers. Here abstract cost-benefit equations are the order of the day.
Unless consultants are engaged, ‘delivery’, ‘execution’ or ‘implementation’ is then contracted out wholesale to frontline staff, who are left to try to make the problem fit the predetermined standardised solution(s). Benefits realisation plans are written remotely and focus on the completion of activity tasks, not performance improvement.
Consequently, ‘change’ seeks to solve symptoms rather than address root causes. The predictable results of this way of approaching improvement are failing projects, higher costs and poorer service as experienced by patients.
Tomorrow I will outline a more intelligent way to conceive of change for the better and to undertake meaningful performance improvement.