Feedback (debriefing and assessment)
Whenever an assessment instrument is being considered, or a tool is being developed, ensure that the principles of competency-based assessment are met:
Valid
the assessment must measure what it states it is measuring.
Reliable
the assessment must produce consistent results across assessors and occasions.
Fair
the assessment must be equitable to all learners.
Flexible
the assessment must provide for recognition of knowledge and skills, however acquired; it must cover both on-the-job and off-the-job components of training; and it must be accessible to learners through a variety of delivery modes.
Other principles of assessment include:
Feasible
in cost and practicability
Useful
to the learner and the teacher
Acceptable
in terms of equity issues
The rules of assessment should also be considered:
Valid
covers a broad range of knowledge and skills; aligns with the performance standard
Sufficient
enough evidence to deem the participant competent
Current
recent evidence that aligns with current industry standards
Authentic
genuinely the participant’s work; would they demonstrate the same skills if you weren’t watching?
Observational assessment
Observation checklists should be simple so that the facilitator can remain focused on the simulation event. These checklists are often completed in consultation with faculty and clinical experts at the conclusion of the simulation.
They should highlight both technical and non-technical skills where relevant.
Following are examples of templates that could be used, but requirements will change depending on the skills required and whether the scenario is multidisciplinary.
Some frameworks exist for assessing non-technical skills, which could be applied to varying disciplines. Refer to Manual of Simulation in Healthcare (Riley, 2008), section 22.5.2, p. 311–15, for further reading on non-technical skills assessment frameworks. The template below shows an example of a simple template for shorter or less advanced scenarios.
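A simple checklist of the kind described above can be sketched in code. The following is a minimal illustration only, assuming a basic scenario with a handful of yes/no items split into technical and non-technical skills; the item wording is hypothetical, and a real checklist would be developed with faculty and clinical experts against the scenario's learning objectives.

```python
# Illustrative observation checklist: each item is paired with a skill
# type so technical and non-technical skills can be summarised separately.
# Item wording below is a placeholder, not a validated instrument.
CHECKLIST = [
    ("Performs hand hygiene before patient contact", "technical"),
    ("Checks patient identity", "technical"),
    ("Communicates the plan clearly to the team", "non-technical"),
    ("Calls for help when appropriate", "non-technical"),
]

def record_observations(responses):
    """Pair each checklist item with the facilitator's yes/no observation."""
    return [
        {"item": item, "skill_type": kind, "observed": seen}
        for (item, kind), seen in zip(CHECKLIST, responses)
    ]

def summarise(results):
    """Count observed items per skill type as (observed, total)."""
    summary = {}
    for r in results:
        done, total = summary.get(r["skill_type"], (0, 0))
        summary[r["skill_type"]] = (done + int(r["observed"]), total + 1)
    return summary
```

Keeping the checklist to a short, flat list like this reflects the point above: a simple instrument lets the facilitator keep watching the simulation rather than the paperwork.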
Self-efficacy
This technique should be used as a self-assessment tool: the participant completes it prior to the simulation event and reflects on it during debriefing or at the conclusion of the event. It can take the form of a short-response questionnaire or a Likert scale.
It is used to help participants reflect on their beliefs about their skills compared with how they actually perform. It enables the participant to honestly reflect on whether they are over- or under-confident about their skills, and how this affects their clinical practice. The questions developed should relate directly to the desired outcomes and course objectives. Following are examples of how to develop self-efficacy tools.
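A Likert-scale self-efficacy tool of the kind described above can be sketched as follows. This is a minimal illustration assuming a 1–5 scale (1 = not at all confident, 5 = very confident); the question wording is hypothetical, and real items should map directly to the course objectives.

```python
# Illustrative self-efficacy questionnaire on a 1-5 Likert scale.
# Question wording is a placeholder, not a validated instrument.
QUESTIONS = [
    "I can recognise early signs of patient deterioration.",
    "I can prioritise tasks during an emergency.",
    "I can communicate effectively with the team leader.",
]

def mean_confidence(ratings):
    """Average self-rated confidence across all items (1-5 scale)."""
    if len(ratings) != len(QUESTIONS):
        raise ValueError("one rating per question is required")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("ratings must be on the 1-5 scale")
    return sum(ratings) / len(ratings)

def confidence_shift(pre, post):
    """Item-by-item change between pre-simulation ratings and the
    post-debrief reflection; negative values suggest the participant
    was over-confident on that item, positive values under-confident."""
    return [after - before for before, after in zip(pre, post)]
```

Comparing pre-event ratings with a post-debrief re-rating, item by item, is one simple way to surface the over- or under-confidence the debrief can then explore.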
Debriefing
Debriefing tools will vary depending on the technique chosen to run the debrief. For example, the scenario design may include a basic outline of the debriefing session, with a simple form that allows facilitators to take notes relating to learning objectives.
Regardless of the debrief tools and techniques chosen, a clear link should be made between how participants can improve or modify existing skills and implement new skills in their work environment based on their training experience.