Controlling Response Shift Bias: Using a retrospective pre-test model to measure student change in postgraduate taught programmes.

Location:

When did you first introduce the innovation?

Between 12 and 24 months ago

Please describe the innovation you have developed

This innovation uses the retrospective pre-test model to measure student change following the completion of specialist modules (Diagnostic Assessment and Decision Making; Independent and Supplementary Prescribing) as part of a master's in advanced clinical practice at the Centre for Innovation and Leadership, Faculty of Health Sciences, University of Southampton. This innovative method provides an accurate assessment of the extent to which students change over the course of a postgraduate educational programme.

What prompted you to develop this innovation?

There is an increasing focus on providing evidence of the impact that postgraduate education for healthcare professionals has on student outcomes and on students' ability to apply what has been learned in clinical practice. The ultimate outcome is that specialised postgraduate healthcare education will positively impact on patient care. Traditionally, the design used to evaluate the impact of an educational programme or module is the measurement and comparison of students' self-reported pre-test scores with their post-test scores. Traditional pre-test/post-test measures work on the assumption that the respondent's assessment and understanding of the concept being measured will not change from the pre-test to the post-test. However, the respondent's perception of the construct under evaluation may change as a result of the educational intervention, leading the respondent to under-report any real change occurring between the pre-test and the post-test. This change in perception of the construct being measured between pre-test and post-test is known as response shift. One way to reduce the confounding effect of response shift is to use retrospective pre-tests when evaluating student self-reports of change.
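As a minimal illustration (the ratings below are hypothetical, not data from the modules), response shift can be thought of as the gap between a student's original pre-test rating and their retrospective (then-test) rating of the same starting point. A simple sketch in Python:

    # Hypothetical 1-7 self-ratings of a single capability for one student.
    pre_test = 5.0    # rated before the module, with a limited understanding of the construct
    post_test = 5.5   # rated after the module
    then_test = 3.5   # pre-module ability, recalled and rated after the module

    traditional_change = post_test - pre_test     # +0.5: change appears small
    retrospective_change = post_test - then_test  # +2.0: change judged from a single frame of reference
    response_shift = pre_test - then_test         # +1.5: shift in the respondent's internal standard

    print(f"Traditional pre/post change: {traditional_change:+.1f}")
    print(f"Then-test based change:      {retrospective_change:+.1f}")
    print(f"Estimated response shift:    {response_shift:+.1f}")

In this invented case the traditional pre-test/post-test comparison would suggest almost no gain, even though the student, judging their starting point with hindsight, reports a substantial improvement.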

In your view, what is it about this innovation that makes it different/important?

The use of a traditional pre-test/post-test method may have resulted in an underestimation of the impact of the master's level modules on student outcomes. The retrospective pre-test model provides a much better indicator of change. In addition, this approach facilitates the collection of high-level and accurate information on the impact of an educational programme; this information is highly relevant to key stakeholders such as patient representative groups, policy makers, healthcare funders and healthcare managers who want to know whether specialist education programmes for nurses, midwives and allied health professionals are effective in developing capabilities that impact on the delivery of quality patient care.

To what extent does your innovation make use of existing approaches, resources or technologies?

The innovation builds upon the pre-existing pre-test/post-test model of measuring student change; however, it adds a retrospective pre-test design. The retrospective pre-test method differs from the traditional pre-test/post-test design in that both the post-test and the retrospective pre-test perceptions of respondents are collected at the same time. Data are collected on students' capabilities at the beginning of the programme; following completion of the module, students are then asked first to report their ability as a result of the module (post-test) and then, at the same time, to recall their capabilities at the beginning of the programme and compare them with where they are now; this is known as the then-test. Collecting the then-test and post-test ratings at the same time reduces response-shift bias because the respondent makes both ratings from the same perspective, as sketched below.
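As a purely illustrative sketch (the scores and the analysis choices below are assumptions, not the modules' actual data or methods), the then-test design reduces to a paired comparison of each student's then-test and post-test ratings, both given on the same occasion:

    # Hypothetical then-test vs post-test ratings (1-7 scale) for one capability.
    from statistics import mean
    from scipy.stats import ttest_rel  # paired t-test

    then_test = [3, 4, 2, 3, 4, 3, 2, 4]   # pre-module ability, recalled after the module
    post_test = [6, 6, 5, 5, 6, 5, 4, 6]   # ability reported after the module

    change = [p - t for p, t in zip(post_test, then_test)]
    print(f"Mean then-test rating: {mean(then_test):.2f}")
    print(f"Mean post-test rating: {mean(post_test):.2f}")
    print(f"Mean change:           {mean(change):.2f}")

    # Both ratings come from the same (post-module) frame of reference,
    # so the paired difference is not confounded by response shift.
    t_stat, p_value = ttest_rel(post_test, then_test)
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

Any other paired analysis (for example a Wilcoxon signed-rank test for ordinal ratings) could be substituted; the essential point is that the pair of ratings shares one frame of reference.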

To what degree has this innovation led to changes in education or clinical practice?

Postgraduate programme and module evaluation has traditionally used student satisfaction as an indicator of the quality of educational provision. Introducing the retrospective pre-test design has resulted in a more comprehensive understanding of the impact that specialist modules at master's level have on students' professional and clinical capabilities. This allows education and clinical stakeholders to target areas in specialist postgraduate programmes that need to be amended or improved.

What evidence do you have of the impact of the innovation?

We have used this innovative approach to measuring student change in a number of modules (Diagnostic Assessment and Decision Making; Independent and Supplementary Prescribing). What was evident is that course participants made substantial gains in a number of capabilities as a result of completing specialist modules at master's level.

To what degree has the innovation been disseminated in your organisation or elsewhere?

The results of the innovation have been disseminated to key stakeholders involved in postgraduate provision within the Wessex region, including policy makers, clinical staff and fundholders. Previous publications using this approach include:

Drennan J. (2012) Masters in nursing degrees: an evaluation of management and leadership outcomes using a retrospective pre-test design. Journal of Nursing Management, 20(1), 102-112.

Drennan J. & Hyde A. (2008) Controlling response shift bias: The use of the retrospective pretest design in the evaluation of a master's programme. Assessment and Evaluation in Higher Education, 33(6), 699-709.

Please provide details of any plans you have to disseminate the innovation in the future.

It is proposed that the results of this innovative approach to measuring student change will be published in leading healthcare journals and presented at national and international conferences.