Directive 2005/62/EC requires that in blood establishments, competence of personnel shall be evaluated regularly (Annex: 2.4). If this principle is to be extended to cover all staff involved in the clinical transfusion process, it will be necessary to consider the points that follow.
The purpose of assessment is to evaluate or measure the achievement of learning and competence, and to provide information for more effective teaching. An individual progresses through four stages of development, from acquiring knowledge to performing a task in clinical practice: “knows, knows how, shows how and does”. Each level must be assessed differently (see Figure 10.3).
Levels 1 and 2: theoretical competence
A number of methods can be used to assess the retention of theoretical knowledge following training. These can be paper based or part of an e-learning programme. The advantage of the e-learning approach is that assessments are scored and recorded online, avoiding time-consuming manual marking and record-keeping.
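The online scoring and recording described above can be sketched in a few lines. This is a minimal illustration only: the question bank, pass mark and record format are invented for the example and are not part of any e-learning programme referred to in this chapter.

```python
from datetime import datetime, timezone

# Hypothetical question bank: (question text, index of the correct option).
QUESTIONS = [
    ("Which check must be completed at the bedside before transfusion?", 0),
    ("How should an adverse transfusion reaction first be managed?", 2),
]

def score_assessment(answers):
    """Return the fraction of correct answers (0.0-1.0)."""
    correct = sum(1 for (_, key), given in zip(QUESTIONS, answers) if given == key)
    return correct / len(QUESTIONS)

def record_result(trainee_id, answers, records):
    """Score one attempt and append a timestamped record to the register."""
    score = score_assessment(answers)
    records.append({
        "trainee": trainee_id,
        "score": score,
        "passed": score >= 0.8,  # illustrative pass mark, not a recommendation
        "when": datetime.now(timezone.utc).isoformat(),
    })
    return score
```

In use, a training register is just a list that accumulates one record per attempt, e.g. `record_result("RN-042", [0, 2], records)` scores a fully correct attempt as 1.0 and stores it with a timestamp.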
Levels 3 and 4: practical competence
Formal assessment of clinical competence can be used to integrate theory with practice. Levels 3 and 4 are more difficult to assess. Issues identified in the UK during the introduction of competency assessment for the clinical transfusion process include:
- The large number of individuals to be assessed
- The dedicated preparation time required for the assessor
- The time that must be allocated for the staff being assessed
- Difficulty in finding suitable clinical situations for assessment
- Cost
Tools for assessing practical competency are available from several organisations; examples of English-language versions can be found at the sites below:
http://www.npsa.nhs.uk/patientsafety/alerts-and-directives/notices/blood-transfusions
http://www.skillsforhealth.org.uk
A description of the methods that can be used to assess theoretical and practical competency is provided in Table 10.2.
Table 10.2 Assessment of knowledge and competency
Method | Description |
---|---|
Background Knowledge Tests | Short, simple questionnaires for use prior to implementing a training programme or introducing an important new topic. |
Multiple-choice questions (MCQ) | Measures both simple knowledge and complex concepts. MCQ can be answered quickly and can be easily and reliably scored. |
True-false questions | Less reliable, because random guessing may produce the correct answer. However, they provide a method for testing recall and can be easily and reliably scored. |
Matching tests | An effective way to test learners’ recognition of the relationships between words and definitions and categories and examples. |
Checklist evaluation | Useful for evaluating any competency that can be broken down into specific behaviours, activities or steps that make up a task or procedure. Can also be used for self-assessment of practice skills. |
Objective structured clinical examination (OSCE) | Assessments are administered at a number of separate standardised patient-encounter stations, each lasting 10-15 minutes. |
Live simulated situation | Imitates but does not duplicate real-life situations. ‘Actor’ patients or mannequins can be used, and scenarios can be administered individually or in groups. However, these are resource intensive and require technical expertise. |
Computer simulation | Expensive to create, but provides an opportunity to assess skills without risk of harm to live patients. Learners are exposed to standardised training content and can receive immediate feedback. |
Direct observation of practice | Assessment takes place in a real practice setting. The learner must demonstrate the required level of proficiency in the specific behaviours that make up the skill. |
Videotaping a practice session | Of limited value as an assessment technique, as it captures performance rather than competence. |
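The checklist evaluation described in Table 10.2 breaks a task into specific steps that must each be demonstrated. A minimal sketch of such a pass/fail checklist is shown below; the steps listed are illustrative examples only, not an official transfusion competency checklist.

```python
# Illustrative checklist for a bedside-check assessment.
# Each step is either demonstrated or not; competence requires all steps.
CHECKLIST = [
    "Confirms patient identity against the wristband",
    "Checks blood component details against the prescription",
    "Inspects the unit for expiry date and damage",
    "Records baseline observations before starting",
]

def assess_checklist(observed_steps):
    """Mark each checklist step as demonstrated or not.

    Returns a per-step result and an overall verdict: the candidate is
    judged competent only if every step was observed.
    """
    results = {step: (step in observed_steps) for step in CHECKLIST}
    return results, all(results.values())
```

The same structure also supports the self-assessment use mentioned in the table: a trainee can score their own practice against the step list before a formal observation.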