D1b: Assessment Principles and Methods

Introduction

During your time as a student at ORMS you will be involved in some form of assessment. The specific type depends on your programme of study and on the assessment requirements for the module or unit you are studying.

Principles of Assessment

The assessment activities we use are based on the following principles:

  • Assessment should be valid. Validity ensures that assessment tasks and associated criteria effectively measure student attainment of the intended learning outcomes at the appropriate level.
  • Assessment should be reliable and consistent. This requires clear and consistent processes for the setting, marking, grading and moderation of assignments.
  • Information about assessment should be explicit, accessible and transparent. Clear, accurate, consistent and timely information on assessment tasks and procedures should be made available to students, staff and other external assessors or examiners.
  • Assessment should be inclusive and equitable. As far as possible without compromising academic standards, inclusive and equitable assessment should ensure that tasks and procedures do not disadvantage any group or individual. We also have a duty to ensure that all those who complete certain of our courses meet the required standards of proficiency.
  • Assessment should be an integral part of programme design and should relate directly to the programme aims and learning outcomes. Assessment tasks should primarily reflect the nature of the discipline or subject but should also ensure that students have the opportunity to develop a range of generic skills and capabilities.
  • The amount of assessed work should be manageable. The scheduling of assignments and the amount of assessed work required should provide a reliable and valid profile of achievement without overloading staff or students.
  • Formative and summative assessment should be incorporated into programmes of study to ensure that the purposes of assessment are adequately addressed. Many programmes may also include diagnostic assessment.
  • Timely feedback that promotes learning and facilitates improvement should be an integral part of the assessment process. Students are entitled to feedback on submitted formative assessment tasks, and on summative tasks, where appropriate. The nature, extent and timing of feedback for each assessment task should be made clear to students in advance.
  • All those involved in the assessment of students must be competent to undertake their roles and responsibilities.

Formative Assessment

  • The purpose of formative assessment is to facilitate learning during the delivery of a module. Formative assessments contribute to the student’s learning experience and their development, and subsequently to enhancing their performance in summative assessment.
  • Formative assessment enables students to give and to receive (where appropriate) individual, group or general feedback which identifies where they can improve their work and maintain achievement.
  • Formative assessment can also be described as ‘assessment for learning’ since an assessment that is entered into voluntarily, and on which no final qualification depends, can prompt learners to adjust their own performance.
  • Tutors will set formative assessment tasks during their teaching; this gives them (and you) feedback about how you are progressing.

Summative Assessment

  • Summative assessment demonstrates the extent of a student’s success in meeting the assessment criteria used to gauge the intended learning outcomes of a module or programme, and contributes to the final mark given for the module. It is normally, though not always, used at the end of a unit of teaching.
  • Summative assessment is used to quantify and reward achievement, and to provide data for selection (for the next stage in education or for employment). For all these reasons, the validity and reliability of summative assessment are of the greatest importance. Summative assessment can also provide information that has formative or diagnostic value.

Ipsative Assessment

  • This is assessment against the student’s own previous standards. It can measure how well a particular task has been undertaken against the student’s average attainment, against their best work, or against their most recent piece of work.
  • Ipsative assessment tends to correlate with effort, to promote effort-based attributions of success, and to enhance motivation to learn. An example of ipsative assessment is a student’s performance during practice placements where their skill develops with coaching and practice.

Assessment Methods

We use a range of assessment methods to give students greater opportunity to demonstrate their knowledge and skills across a range of contexts. By adopting a wider repertoire of assessment methods we can also support students who may, for one reason or another, be disadvantaged by the extensive use of particular assessment formats. Diversifying assessment methods, where appropriate and practical, therefore leads to a more inclusive approach to assessment design.

Your course may use any (or all) of the following methods of assessment:

Assignments; Closed and Open book assessments; Computer assisted assessment (MCQs; SBAs; SAQs; True/False; Sorting; Matching); Observation; OSCEs; Portfolios; Posters; Practical Tasks (see also OSCEs); Presentations; Reflective Journals; Simulations.

Assignments

Assignments are typically about 2,000 words long and are presented in the form of an essay. You are allowed a variation in word count of 10% (that’s 200 words). You are assessed not only on your content, but also on your ‘SPaG’ – Spelling, Punctuation and Grammar – and on your use of sources and how they are referenced. We use the Harvard Referencing system (more details here). Part of the skill in writing assignments is understanding the question and responding correctly. The Open University’s Guide on Essays is useful (more details here).

Closed-book Assessment

This is the traditional mode of assessment, in which students are not allowed to take notes, books or other reference material into the examination room and must rely entirely on their memory to answer the questions set.

Open-book Assessment

Here, students are allowed to refer to any material they wish to consult while carrying out the assessment, which can take place either in a formal examination setting or in a less formal setting. The object of such assessment is to see how students can use the information at their disposal to solve problems, carry out tasks and so on, with the memory factor largely eliminated. Open-book assessment is well suited to assessing higher cognitive and many non-cognitive skills, including practical skills.

Computer Assisted Assessment

Computer Assisted Assessment (also referred to as computer-aided assessment) is the application of computers to assessment processes, including delivery of tests, capture of responses and marking by either a computer or a human marker. Tests may be delivered through Moodle, or there may be a paper-based equivalent for students to complete. The types of questions found in this form of assessment are as follows:

MCQs

Multiple-choice questions (MCQs) are a form of assessment in which students are asked to select one or more of the choices from a list of answers. MCQs are used to assess both recall of factual knowledge and the application of that knowledge (e.g. making a diagnosis and deciding the correct treatment).

SBA

Single best answer (SBA) is a form of Multiple Choice Question (MCQ) in which the student is required to select one answer from a short list, usually of four or five items.

SAQ

Short Answer Questions (SAQs) are open-ended questions that require students to create an answer. They are commonly used in examinations to assess the basic knowledge and understanding (low cognitive levels) of a topic before more in-depth assessment questions are asked on the topic.

True / False

True/false questions are composed of a statement, and students respond by indicating whether the statement is true or false. For example: ‘The heart has four chambers.’ True / False (Answer: True).

Sorting

Sorting or ordering questions present a disorganised collection of words or statements and require the student to rearrange them in the correct order. For example: When blood first enters the heart from the systemic circulation, what path does it take? Options: Lungs, Right Ventricle, Left Atrium, Right Atrium, Left Ventricle. (Answer: Right Atrium, Right Ventricle, Lungs, Left Atrium, Left Ventricle.)

Matching

Students respond to matching questions by pairing each of a set of definitions with one of the choices provided on the exam. These questions are often used to assess recognition and recall and so are most often used in courses where acquisition of detailed knowledge is an important goal.
For example: match the following list of bones to their location (Axial Skeleton or Appendicular Skeleton).

Bones (unsorted): femur, fibula, frontal, humerus, lacrimal, metacarpals, metatarsals, nasal, occipital, parietal, phalanges, radius, tibia, ulna, vomer.

Answer:
Axial Skeleton: occipital, parietal, frontal, nasal, lacrimal, vomer.
Appendicular Skeleton: tibia, fibula, femur, metatarsals, phalanges, humerus, radius, ulna, metacarpals.

Observations

Direct observation is an assessment in which an assessor watches the student performing the assessment task to judge whether they can perform it correctly. Clinical areas often use direct observation to assess students. Group work such as problem-based learning may sometimes use direct observation to judge a student’s input. An oral assessment is often used as a follow-up task, allowing the assessor to ask supplementary questions. Sometimes, there is no effective alternative to direct observation.

OSCEs

Objective Structured Clinical Examinations (OSCEs) may be used as a summative or formative assessment and on their own or with another form of assessment. Summative OSCEs are frequently used at the end of courses or programmes, or on completion of a module to test students against set objectives and learning outcomes. Where they are used as a formative assessment, the feedback provided helps students to progress (Taras, 2005; Alinier, 2003). Formative OSCEs also help to prepare students for placements, encourage them to engage with their learning and help them to achieve their learning outcomes (Nulty et al, 2011).

An OSCE can consist of one skill station where students perform one or a variety of skills and are tested on the underpinning clinical and theoretical knowledge, or multiple stations, each testing a different skill or piece of underpinning knowledge (Mitchell et al, 2009).

Examples of practical skills include taking a patient’s vital signs and using an aseptic non-touch technique to perform a simple dressing change; an assessor is present during the procedure to mark each student on their skills. The underpinning knowledge, including anatomy and physiology, can be assessed as a paper-based or verbal exercise at a staffed or unstaffed station and marked afterwards. Verbal questions, multiple-choice or short-answer questions might be used.

Filming students’ performance in formative and summative OSCEs is common practice. The recording can be used to identify areas where students need to improve, by assessors to resolve a query regarding a student’s performance, and as a form of moderation. Recording should take place for summative OSCEs to reduce subjectivity, and external examiners should be involved in reviewing the content of the stations, checklists and marking criteria.

Students preparing for an OSCE should:

  • Be psychologically prepared;
  • Be familiar with how equipment works;
  • Know which procedures/guidelines are to be used in the OSCE;
  • Be familiar with checklist/marking criteria;
  • Rehearse the skills;
  • Know the timing of the OSCE;
  • Develop skills on practice placement;
  • Revise the underpinning theory of the skills being tested;
  • Use feedback from mock/formative OSCEs;
  • Use available resources such as guided study, quizzes and videos;
  • Check whether they should wear uniforms;
  • Confirm the date, time, venue and allow enough time to get there;
  • Practise answering questions verbally.

Marking OSCEs

Tavares et al’s (2013) Global Rating Scale (GRS) for the Assessment of Paramedic Clinical Competence is used in combination with a checklist for assessing OSCEs. The rating scale describes seven dimensions: Situation Awareness, Patient Assessment, History Gathering, Decision Making, Procedural Skill, Resource Utilisation, and Communication. Not all of the criteria are used for every OSCE; the selection depends on the station being conducted. For example, at an intubation station only Decision Making, Resource Utilisation and Procedural Skill might be examined, whereas in a simulated casualty examination all seven dimensions would be tested. The GRS requires the assessor to identify the level of the skill performance across the range “Unsafe / Unsatisfactory / Poor-Weak / Marginal / Competent / Highly Competent / Exceptional”. Students are required to achieve ‘Competent’ in all aspects being tested, with one retry being allowed.

Extended (or Justified) OSCEs

Extended or justified OSCEs test knowledge in a deeper, more applied way. They are most commonly used in a clinical or medical assessment to examine diagnostic reasoning. This usually takes the form of additional questioning in which the assessor presents a range of options and the student is invited to present and justify their choice of actions for the case in question.

Portfolios

A practice portfolio is a collection of a student’s work which provides evidence that the student can meet the specified learning outcomes during operational duties. A typical portfolio consists of work completed by the student, evidence of competence recorded by the Practice Placement Educator and self-reflection on the learning process. Compiling a portfolio is a developmental process which shows how the student has developed during the given period. A portfolio is therefore an assessment method that monitors the growth and development of student learning.

Posters

A poster (or poster presentation) shows the content and findings of a topic to an audience, or to different audiences at different times. Posters are often used to assess student learning in group research projects. Peer and tutor assessment can be used as part of the grading process.

Practical Tasks

Similar to OSCEs, practical tasks may be set formatively or summatively to develop the student’s general and specific skills to perform a clinical activity. A practical task may be a discrete skill that forms part of a complex series of skills that would be tested in an OSCE. A series of practical tasks may be tested at a series of skill stations.

Reflective Journals

A reflective journal is a means of recording ideas, personal thoughts and experiences, as well as the reflections and insights a student has during the learning process of a course. A reflective journal requires students to think more deeply, to challenge their old ideas with new incoming information, to synthesise the course materials they have learnt into their personal thoughts and philosophy, and to integrate these into their daily experiences and future actions. The benefits of the reflective learning process usually accumulate over a period of time, during which students typically show a series of developmental changes, personal growth and changes in perspective.

Simulations

Simulation education is a bridge between classroom learning and real-life clinical experience. Students may learn how to give injections by practising on an orange with a real needle and syringe. Much more complex simulation exercises may rely on computerised mannequins that reproduce dozens of human functions realistically. Training simulations do not put actual patients at risk. Simulation-based assessment uses clinical aids and computer simulations to test competency.

Conduct of Examinations

  1. You must:
    1. produce photo ID (Passport, Driving Licence, Student ID) to identify yourself to the Invigilator before the exam starts;
    2. be on time for all your examinations and any required periods of supervision;
    3. provide what you need, e.g. pens, pencils and rulers;
    4. follow the instructions of the invigilator;
    5. write in blue or black ink;
    6. switch off mobile phones and all other electronic devices.
  2. You may:
    1. use a calculator unless you are told not to do so. You must not use the calculator function of another device.
  3. You must not:
    1. use correction fluid;
    2. make any marks on the examination paper;
    3. have access to items other than those stated in the instructions on the question paper, the stationery list or the specification for that subject in the examination room;
    4. have access to mobile phones, electronic communication or storage devices, including:
      1. iPods and iPads, MP3/4 players, wrist watches which have a data storage device, or any other products with text or digital facilities;
    5. become involved in any unfair or dishonest practice before, during or after the examination;
    6. sit an examination in the name of another candidate;
    7. have in your possession any unauthorised material, including electronic devices and mobile phones;
    8. have in your possession any equipment which might give you an unfair advantage;
    9. talk to, attempt to communicate with or disturb other candidates once you have entered the examination room.
  4. Possession of unauthorised material breaks the rules, even if you do not intend to use it. If you are found in possession of unauthorised material you will be subject to penalty and possible disqualification.
  5. If you are in any doubt speak to the invigilator.
  6. If you leave the examination room early (with the exception of those who leave temporarily accompanied by a member of Centre Staff) you will not be allowed back into the room.
  7. You must, when leaving the examination room, leave behind the question paper, your answer book or answer paper, rough work and any other (used or unused) materials.
  8. As a candidate, if you are not satisfied with the outcome of an assessment, you should submit your appeal to us in writing within 20 working days of the feedback being given.
    See ORMS D6: Academic Appeals Policy for further information.

References

Alinier G (2003) Nursing students’ and lecturers’ perspectives of objective structured clinical examination incorporating simulation. Nurse Education Today; 23: 6, 419-426.

Chan C (2009) Assessment: Reflective Journal. Assessment Resources@HKU, University of Hong Kong. Available at: http://ar.cetl.hku.hk (accessed 26 October 2016).

Liddle C (2014) The objective structured clinical examination. Nursing Times; online issue. Available at: https://www.nursingtimes.net/roles/nurse-educators/the-objective-structured-clinical-examination/5074066.article

Mitchell M et al (2009) The OSCE: optimising its value in the undergraduate nursing curriculum. Nurse Education Today; 29: 4, 398-404.

Nulty D et al (2011) Best practice guidelines for use of OSCEs: maximising value for student learning. Nurse Education Today; 31: 2, 145-151.

Taras M (2005) Assessment – summative and formative – some theoretical reflections. British Journal of Educational Studies; 53: 4, 466-478.

Tavares W et al (2013) Global Rating Scale for the Assessment of Paramedic Clinical Competence. Prehospital Emergency Care; 17: 1.