1. Clinical Governance

Measurement and quality improvement

Action 1.8

The health service organisation uses organisation-wide quality improvement systems that:

a. Identify safety and quality measures, and monitor and report performance and outcomes

b. Identify areas for improvement in safety and quality

c. Implement and monitor safety and quality improvement strategies

d. Involve consumers and the workforce in the review of safety and quality performance and systems

Intent

An effective quality improvement system is operating across the organisation.

Reflective questions

How does the quality improvement system reflect the health service organisation’s safety and quality priorities and strategic direction?

How does the health service organisation identify and document safety and quality risks?

What processes are used to ensure that the actions taken to manage identified risks are effective?

Key tasks

  • Define quality for clinical services (for example, effectiveness, safety, consumer experience) and share this information with the workforce
  • Review the quality improvement system, including the vision, mission, values and objectives, to ensure that they reflect the organisation’s clinical safety and quality priorities, and strategic direction
  • Decide how feedback will be collected from the workforce, patients and consumers
  • Consider whether there is a coherent, planned and systematic schedule of audits of clinical and organisational systems, and reliable processes to capture findings and implement necessary improvements
  • Develop a schedule for reporting to the governing body and managing the design and performance of key clinical systems
  • Monitor and review progress on actions taken to improve safety and quality, and provide feedback to the workforce, patients and consumers
  • Provide information and training, where necessary, to the workforce, patients and consumers to encourage their involvement in the analysis of performance data.

Strategies for improvement

Hospitals

Develop a quality improvement system

The elements of a successful quality improvement system include:

  • A description of ‘high quality’ that is reflected through the organisation’s vision, mission and values
  • A definition of the organisation’s stakeholders
  • Clearly defined and aligned organisational objectives and clinical quality objectives
  • Clearly defined processes and responsibilities that are required to meet quality objectives
  • Training for the organisation’s workforce in safety and quality
  • Processes to verify that the quality improvement system is operating effectively
  • Mechanisms for monitoring consumer satisfaction, measuring quality and implementing improvements.

Define quality and how it will be measured

Define the elements of quality to be used by the organisation (for example, safety, effectiveness, consumer experience). Provide a common language and understanding for the design, implementation and monitoring of safety and quality performance throughout the organisation. An example of this is the national list of hospital-acquired complications (HACs).

A HAC is a complication for which clinical risk mitigation strategies may reduce (but not necessarily eliminate) the risk of that complication occurring. The national list of HACs includes 16 complications that were selected based on the criteria of preventability, patient impact, service impact and clinical priority. HACs are identified using routinely collected data extracted from patient healthcare records. Codes are used to identify the diagnosis, and a flag is used to indicate that the diagnosis arose during the episode of care.
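As a purely illustrative sketch (not the Commission's coding specification), the snippet below shows the general idea of screening routinely coded episode data for possible HACs: each diagnosis code is checked against a code list mapped to HAC classes, and a condition-onset flag is used to keep only diagnoses that arose during the episode of care. The code list, field names and flag convention are assumptions for illustration only.

```python
# Illustrative sketch only: screening coded episode data for possible HACs.
# HAC_CODES, the field names and the onset-flag convention are placeholders,
# not the national specification.
from dataclasses import dataclass

HAC_CODES = {"T81.4", "E86", "J95"}  # hypothetical diagnosis codes mapped to HAC classes

@dataclass
class DiagnosisRecord:
    episode_id: str
    diagnosis_code: str
    onset_flag: str  # assumed convention: "1" = diagnosis arose during the episode of care

def flag_hacs(records: list[DiagnosisRecord]) -> dict[str, list[str]]:
    """Return episode_id -> diagnosis codes flagged as possible hospital-acquired complications."""
    flagged: dict[str, list[str]] = {}
    for rec in records:
        if rec.onset_flag == "1" and rec.diagnosis_code in HAC_CODES:
            flagged.setdefault(rec.episode_id, []).append(rec.diagnosis_code)
    return flagged

if __name__ == "__main__":
    sample = [
        DiagnosisRecord("EP001", "T81.4", "1"),  # arose during the episode -> flagged
        DiagnosisRecord("EP002", "E86", "2"),    # present on admission -> not flagged
    ]
    print(flag_hacs(sample))  # {'EP001': ['T81.4']}
```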

The HACs list provides a succinct set of complications to support monitoring of patient safety in a hospital setting. Regular reports to clinicians, boards and other stakeholders on HACs can help identify areas that require attention, as well as areas of best practice. While the identification and reporting of HACs are important elements supporting patient safety, they are intended to complement and be used alongside other quality improvement processes.

The HACs list was developed through a comprehensive process that included reviews of the literature, clinical engagement, and testing of the concept with public and private hospitals. The list was agreed by the Commission’s Inter-Jurisdictional Committee in June 2016. In March 2017, the list was included in the National Health Reform Agreement for use in pricing and funding of Australian public hospitals.

The HACs list, and further information on how it was developed and tested, are available on the Commission’s website.

Involve the workforce, patients and consumers in defining quality, and in processes such as reviewing quality improvement systems.

Define the key indicators for safety and quality measures that will be routinely collected and reported to management and the clinical workforce, as well as the level of detail required to enable the governing body and workforce to fulfil their responsibilities. These may include data from incidents and complaints management systems, safety and quality audit reports, infection control reports, reviews of clinical practice, and clinical indicators relating to specific actions in the NSQHS Standards.
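One way to make the agreed indicator set explicit is to hold it in a simple register that records, for each indicator, its data source, reporting frequency and audience. The sketch below is a minimal, hypothetical example; the indicator names, sources and frequencies are assumptions, not a prescribed set.

```python
# Illustrative only: a minimal register of key safety and quality indicators,
# recording where each comes from, how often it is reported and to whom.
# Indicator names, sources and frequencies are assumptions for the example.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    source: str            # e.g. incident management system, audit report
    frequency: str         # e.g. "monthly", "quarterly"
    reported_to: list[str]

INDICATOR_SET = [
    Indicator("Hand hygiene compliance rate", "infection control audits", "quarterly",
              ["governing body", "clinical workforce"]),
    Indicator("Falls resulting in harm per 1,000 bed days", "incident management system", "monthly",
              ["governing body", "clinical workforce"]),
    Indicator("Complaints received and resolved", "complaints management system", "quarterly",
              ["governing body", "consumer advisory group"]),
]

def indicators_due(frequency: str) -> list[str]:
    """Names of indicators reported at the given frequency."""
    return [i.name for i in INDICATOR_SET if i.frequency == frequency]

print(indicators_due("monthly"))  # ['Falls resulting in harm per 1,000 bed days']
```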

Routinely measure and monitor patient experience by using national core common questions on patient experience developed by the Commission.

Conduct regular reviews and audits

Develop a schedule of reviews and audits that cover the variety of services and locations used for the delivery of care, to ensure that there is systematic oversight of safety and quality systems.

Conduct audits throughout the organisation, including the clinical, departmental, divisional and whole-of-organisation levels. Actively engage clinicians and consumers in the audit processes and analysis of results. Ensure that audits test the design and performance of the organisation’s clinical governance system.

Audits are effective if their outcomes are used for improvement and assurance purposes. Independent auditors or reviewers can give the governing body a high level of assurance that reporting is objective. Report audit outcomes throughout the organisation – to the governing body, the workforce, and patients and consumers.

Record the outcomes of clinical system audits on a register, together with proposed actions and responsibilities, and evidence of implementation and follow-up. These records can be used to show how risks and opportunities identified through the quality improvement system are addressed, to improve safety and continuously improve performance.
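A hedged sketch of what such a register entry might look like is shown below: each audit records its findings, the proposed actions with responsibilities and due dates, and a status that can be queried to follow up overdue items. The field names, statuses and example entry are illustrative assumptions only.

```python
# Illustrative only: a simple clinical audit register entry with proposed actions,
# responsibilities and follow-up status. Field names and statuses are assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AuditAction:
    description: str
    responsible: str
    due: date
    status: str = "open"   # assumed states: "open", "implemented", "verified"

@dataclass
class AuditRegisterEntry:
    audit_name: str
    audit_date: date
    findings: str
    actions: list[AuditAction] = field(default_factory=list)

def overdue_actions(register: list[AuditRegisterEntry], today: date) -> list[tuple[str, str]]:
    """Return (audit name, action description) pairs for actions past due and not yet verified."""
    return [(e.audit_name, a.description)
            for e in register for a in e.actions
            if a.status != "verified" and a.due < today]

register = [
    AuditRegisterEntry(
        "Medication chart audit", date(2018, 3, 1),
        "Documentation of allergies incomplete in 8% of charts",
        [AuditAction("Reinforce allergy documentation at handover", "NUM Ward 3", date(2018, 4, 30))],
    )
]
print(overdue_actions(register, date(2018, 6, 1)))  # flags the unverified, overdue action
```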

Examples of evidence

Select only examples currently in use:

  • Policy documents that describe the processes and accountability for monitoring the safety and quality of health care
  • Documented safety and quality performance measures
  • Schedule for internal or external audits
  • Audit reports, presentations and analysis of safety and quality performance data
  • Feedback from the workforce about the use of safety and quality systems
  • Feedback from consumers about their involvement in the review of safety and quality performance data
  • Quality improvement plan that includes actions to deal with issues identified
  • Examples of specific quality improvement activities that have been implemented and evaluated
  • Committee and meeting records in which reports, presentations, and safety and quality performance data are regularly reviewed and reported to the governing body or relevant committees
  • Training documents on the health service organisation’s quality improvement system
  • Communication with the workforce, patients and carers that provides feedback regarding safety and quality of patient care
  • Reports on the hospital-acquired complications indicator set (public hospitals only).

Day Procedure Services

Develop a quality improvement system

The elements of a successful quality improvement system include:

  • A description of ‘high quality’ that is reflected through the organisation’s vision, mission and values
  • A definition of the organisation’s stakeholders
  • Clearly defined and aligned organisational objectives and clinical quality objectives
  • Clearly defined processes and responsibilities that are required to meet quality objectives
  • Training for the organisation’s workforce in safety and quality
  • Processes to verify that the quality improvement system is operating effectively
  • Mechanisms for monitoring consumer satisfaction, measuring quality and implementing improvements.

Define quality and how it will be measured

Define the elements of quality to be used by the organisation (for example, safety, effectiveness, consumer experience). Provide a common language and understanding for the design, implementation and monitoring of safety and quality performance throughout the organisation.

Define the key indicators for safety and quality measures that will be routinely collected and reported to management and the clinical workforce, as well as the level of detail required to enable the governing body and workforce to fulfil their responsibilities. These may include data from incidents and complaints management systems, safety and quality audit reports, infection control reports, reviews of clinical practice, and clinical indicators relating to specific actions in the NSQHS Standards.

Routinely measure and monitor patient experience by using national core common questions on patient experience developed by the Commission.

Conduct regular reviews and audits

Record outcomes of clinical system audits on a register, together with proposed actions and responsibilities, and evidence of implementation and follow-up. These records can be used to show how risks and opportunities identified through the quality improvement system are addressed, to improve safety and continuously improve performance.

Actively engage clinicians and consumers in the audit processes and analysis of results. Ensure that audits test the design and performance of the organisation’s clinical governance system.

Examples of evidence

Select only examples currently in use:

  • Policy documents that describe the processes and accountability for monitoring the safety and quality of health care
  • Documented safety and quality performance measures
  • Schedule for internal or external audits
  • Audit reports, presentations and analysis of safety and quality performance data
  • Feedback from the workforce about the use of safety and quality systems
  • Feedback from consumers about their involvement in the review of safety and quality performance data
  • Quality improvement plan that includes actions to deal with issues identified
  • Examples of specific quality improvement activities that have been implemented and evaluated
  • Committee and meeting records in which reports, presentations, and safety and quality performance data are regularly reviewed and reported to the governing body or relevant committees
  • Training documents on the health service organisation’s quality improvement system
  • Communication with the workforce, patients and carers that provides feedback regarding safety and quality of patient care.

MPS & Small Hospitals

Identify the local governance arrangement for monitoring and improving safety and quality, including identifying local individuals or groups with responsibility for oversight of clinical safety and quality risk management.

MPSs or small hospitals that are part of a local health network or private hospital group should use the description of ‘high quality’ that is reflected through the network or group’s vision, mission and values.

Small hospitals that are not part of a local health network or private hospital group should develop a description of ‘high quality’ – for example, describe an effective and safe health service organisation in which consumers have a good experience of care.

The organisation then:

  • Shares this information with the workforce
  • Determines how feedback will be collected from the workforce, patients and consumers
  • Considers whether there is a coherent, planned and systematic schedule of audits of clinical and organisational systems, and reliable processes to capture findings and implement necessary improvements
  • Develops a schedule for reporting to the governing body and managing the design and performance of key clinical systems
  • Monitors and reviews progress on actions taken to improve safety and quality, and provides feedback to the workforce, patients and consumers
  • Provides information and training, if necessary, to the workforce, patients and consumers to assist their involvement in analysing performance data.

One safety and quality measure that could be used in public health service organisations is hospital-acquired complications (HACs). A HAC is a complication for which clinical risk mitigation strategies may reduce (but not necessarily eliminate) the risk of that complication occurring. The national list of HACs includes 16 complications that were selected based on the criteria of preventability, patient impact, service impact and clinical priority. Not all complications will be relevant to MPSs or small hospitals.

The HACs list, and further information on how it was developed and tested, are available on the Commission’s website.

Examples of evidence

Select only examples currently in use:

  • Policy documents that describe the processes and accountability for monitoring the safety and quality of health care
  • Documented safety and quality performance measures
  • Schedule for internal or external audits
  • Audit reports, presentations and analysis of safety and quality performance data
  • Feedback from the workforce about the use of safety and quality systems
  • Feedback from consumers about their involvement in the review of safety and quality performance data
  • Quality improvement plan that includes actions to deal with issues identified
  • Examples of specific quality improvement activities that have been implemented and evaluated
  • Committee and meeting records in which reports, presentations, and safety and quality performance data are regularly reviewed and reported to the governing body or relevant committees
  • Training documents on the health service organisation’s quality improvement system
  • Communication with the workforce, patients and carers that provides feedback regarding safety and quality of patient care
  • Reports on the hospital-acquired complications indicator set (public hospitals only).

Action 1.9

The health service organisation ensures that timely reports on safety and quality systems and performance are provided to:

a. The governing body

b. The workforce

c. Consumers and the local community

d. Other relevant health service organisations

Intent

Health service organisations provide accurate and timely information on safety and quality performance to key stakeholders.

Reflective question

What processes are used to ensure that key stakeholders are provided with accurate and timely information about safety and quality performance?

Key tasks

  • Endorse a schedule of reporting that outlines the topic areas, format and frequency of reporting on safety and quality performance, and the effectiveness of the safety and quality systems
  • Collaborate with the workforce, consumers, local communities and other health service organisations to identify the topic areas, format and frequency of reporting to these groups on safety and quality performance, and the effectiveness of the safety and quality systems.

Strategies for improvement

Hospitals

Routinely collecting process and outcome data, monitoring data for trends and reporting clinical alerts enables organisations to understand outcomes from service delivery, and to respond to deviations from the expected outcomes promptly.
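As one hypothetical way of turning routine monitoring into alerts, the sketch below flags a monthly indicator whose latest value falls outside its historical range (here a simple mean plus or minus three standard deviations rule). The rule, the threshold and the example data are assumptions for illustration, not a prescribed method.

```python
# Illustrative only: a simple trend check that flags when the most recent monthly
# value of an indicator sits outside its historical range (mean +/- 3 standard
# deviations). The rule, threshold and data are assumptions, not a prescribed method.
from statistics import mean, stdev

def deviation_alert(monthly_values: list[float]) -> bool:
    """Return True if the latest value falls outside the historical 3-sigma band."""
    history, latest = monthly_values[:-1], monthly_values[-1]
    if len(history) < 2:
        return False  # not enough history to estimate a baseline
    centre, spread = mean(history), stdev(history)
    return abs(latest - centre) > 3 * spread

falls_per_month = [4, 5, 3, 4, 6, 5, 4, 12]  # hypothetical counts; the last month spikes
print(deviation_alert(falls_per_month))  # True -> escalate for review
```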

Monitoring safety and quality performance data should include all clinical areas and cover all locations of service delivery to ensure a comprehensive picture of performance.

Clearly documented processes to ensure the accuracy, validity and comprehensiveness of information will increase the organisation’s confidence in data quality. Providing the governing body and the workforce with access to the organisation’s most important safety and quality metrics (indicators) will enable regular review of progress and will allow the organisation to respond to issues as they arise. Suitable metrics may include:

  • Key relevant national priority indicators and regulatory requirements
  • Indicators covering safety, clinical effectiveness, patient experience, access and efficiency across the organisation’s range of services and service locations
  • Trends in reported adverse events, incidents and near misses
  • Compliance with best-practice pathways.

Provide the governing body and management with regular, comprehensive safety and quality presentations and reports from managers and clinicians. Schedule data presentations following agreed criteria (for example, significance of risk, patient volume, organisational priority or focus).

Effective data presentations should cover:

  • The design of the systems and processes being used
  • Evaluation and management of risks
  • The effectiveness of the risk management system
  • Compliance with evidence-based practice
  • Safety and quality outcomes, including consumer experience and patient-reported outcome measures
  • Plans to improve safety and quality, and reduce risk.

In addition to providing data to the governing body, provide information to:

  • The workforce, who should review the data to identify emerging safety and quality issues, or assess the impact of safety and quality initiatives
  • Consumers and local community members as major stakeholders
  • Other relevant health service organisations that may use the information in planning for patients who are referred to or from the organisation.

Examples of evidence

Select only examples currently in use:

  • Reports on safety and quality performance data that are provided to the governing body, managers, committees, the workforce or consumers
  • Committee and meeting records in which information on safety and quality indicators, data or recommendations by the governing body are discussed
  • Committee and meeting records in which the appropriateness and accessibility of the health service organisation’s safety and quality performance information are discussed
  • Communication strategy that describes processes for disseminating information on safety and quality performance to the community
  • Communication with the workforce and consumers on the health service organisation’s safety and quality performance
  • Records of safety and quality performance information published in annual reports, newsletters or other local media
  • Reporting templates and calendars
  • Reports provided to external organisations.

Day Procedure Services

Routinely collecting process and outcome data, monitoring data for trends and reporting clinical alerts enables organisations to understand outcomes from service delivery, and to respond to deviations from the expected outcomes promptly.

Clearly documented processes to ensure the accuracy, validity and comprehensiveness of information will increase the organisation’s confidence in data quality. Providing the governing body and the workforce with access to the organisation’s most important safety and quality metrics (indicators) will enable regular review of progress and will allow the organisation to respond to issues as they arise. Suitable metrics may include:

  • Key relevant national priority indicators and regulatory requirements
  • Indicators covering safety, clinical effectiveness, patient experience, access and efficiency across the organisation’s services
  • Trends in reported adverse events, incidents and near misses
  • Compliance with best-practice pathways.

Provide the governing body and management with regular, comprehensive safety and quality presentations and reports from managers and clinicians. Schedule data presentations following agreed criteria (for example, significance of risk, patient volume, organisational priority or focus).

Effective data presentations should cover:

  • The design of the systems and processes being used
  • Evaluation and management of risks
  • Compliance with evidence-based practice
  • Safety and quality outcomes, including consumer experience and patient-reported outcome measures
  • Plans to improve safety and quality, and reduce risk.

Some organisations may choose to participate in benchmarking groups, in which they submit clinical indicator data and are provided with benchmarking reports. This enables them to assess their performance against data collated from similar peer groups.
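As an illustration of how benchmarking reports might be read, the sketch below places an organisation's own indicator rate within a hypothetical peer-group distribution; the peer values, the indicator and the percentile calculation are assumptions for the example.

```python
# Illustrative only: placing an organisation's indicator rate within a peer-group
# distribution returned by a benchmarking program. Peer values are hypothetical.
def percentile_rank(own_rate: float, peer_rates: list[float]) -> float:
    """Percentage of peer rates at or below the organisation's own rate."""
    return 100.0 * sum(r <= own_rate for r in peer_rates) / len(peer_rates)

peer_unplanned_return_rates = [0.8, 1.1, 1.3, 1.5, 1.6, 1.9, 2.4]  # per 100 episodes (hypothetical)
own_rate = 2.1
rank = percentile_rank(own_rate, peer_unplanned_return_rates)
print(f"Own rate sits at about the {rank:.0f}th percentile of peers")  # a high percentile suggests review
```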

Examples of evidence

Select only examples currently in use:

  • Reports on safety and quality performance data that are provided to the governing body, managers, committees, the workforce or consumers
  • Committee and meeting records in which information on safety and quality indicators, data or recommendations by the governing body are discussed
  • Committee and meeting records in which the appropriateness and accessibility of the health service organisation’s safety and quality performance information are discussed
  • Communication strategy that describes processes for disseminating information on safety and quality performance to the community
  • Communication with the workforce and consumers on the health service organisation’s safety and quality performance
  • Records of safety and quality performance information published in annual reports, newsletters or other local media
  • Reporting templates and calendars
  • Reports provided to external organisations.

MPS & Small Hospitals

  • Develop a reporting schedule that outlines the topic areas, format and frequency of reporting on safety and quality performance, and the effectiveness of the safety and quality systems
  • Involve the workforce, consumers, local communities and other health service organisations in identifying the information, format and frequency of reporting to these groups on safety and quality performance, and the effectiveness of the safety and quality systems
  • Routinely collect process and outcome data that include all clinical areas and cover all locations of service delivery, monitor data for trends, and report clinical alerts.

Suitable metrics may include:

  • Key relevant national priority indicators and regulatory requirements
  • Indicators covering safety, clinical effectiveness, patient experience, access and efficiency across the organisation’s services, and service locations
  • Compliance with best-practice pathways.

Some organisations may choose to participate in benchmarking groups, in which they submit clinical indicator data and are provided with benchmarking reports. This enables them to assess their performance against data collated from similar peer groups.

Examples of evidence

Select only examples currently in use:

  • Reports on safety and quality performance data that are provided to the governing body, managers, committees, the workforce or consumers
  • Committee and meeting records in which information on safety and quality indicators, data or recommendations by the governing body are discussed
  • Committee and meeting records in which the appropriateness and accessibility of the health service organisation’s safety and quality performance information are discussed
  • Communication strategy that describes processes for disseminating information on safety and quality performance to the community
  • Communication with the workforce and consumers on the health service organisation’s safety and quality performance
  • Records of safety and quality performance information published in annual reports, newsletters or other local media
  • Reporting templates and calendars
  • Reports provided to external organisations.