When did you first introduce the innovation?
Less than 12 months ago
Please describe the innovation you have developed
This is a piece of coursework designed to promote student engagement (both attendance at lectures and practical sessions and full engagement with the material) and to build laboratory skills that students can then apply. It comprises themed ‘portfolios’ mapping to nine of the twelve practical/lecture fortnights across the academic year within one module. The best eight of the nine scores count towards the assessment.
Each portfolio consists of five parts, each of which carries a grade. Each portfolio is ‘open’ for the duration of the fortnight that its lecture/practical covers, and the parts within it must be completed in a particular order, within smaller timeframes inside that period. Each portfolio follows a similar structure:
- Pre-practical quiz
- Pre-practical video
- Engagement with class
- Post-practical quiz (a score of 40% or better is required to access the next step)
- Task – this varies from practical to practical, being tailored to the skills developed, but often includes online scenarios or data analysis.
What prompted you to develop this innovation?
Student attendance can sometimes be poor, in practical sessions as well as in lectures. Much faith is placed in lecture notes and practical handouts being available online, which students may therefore feel need only be read when exams approach. With the majority of coursework assignments taking the form of essays or similar, students expect their coursework to focus on a single topic, and it is that one topic they will learn. Further, in science-based degrees practical sessions are crucial in developing new scientists, yet student engagement with these tasks is in steady decline; many prefer to turn up, work through the handout/protocol, and leave as soon as possible. This lack of engagement is reflected in exam answers based on previously completed practical tasks, where students often appear not to have understood why they were doing the task or what the results mean.
The portfolio is designed to reward engagement and to ensure students maintain that engagement (with their notes as well as their attendance and involvement) all year long, rather than only revisiting material at revision time, or forgetting the previous week’s lecture as soon as it has ended. The solution had to be mostly automated, however: with class sizes of 200 or more every year, the workload of continuously marking the number of questions set per student would be unsustainable. Recording attendance, though, allowed for an engagement with each student that is sometimes missed, provided it was done in a way other than a register.
In your view, what is it about this innovation that makes it different/important?
To my knowledge, the use of continuous mini tasks all year round is entirely new. Many programmes have portfolios of work, or lab books that are read at the end of the year, but none allow a gradual build-up of marks to show each student where they need to engage more. Nor do they link together the different types of learning provided, or spread the knowledge assessed to cover all topics within the module. What has been highlighted to me in exploring the grades attained by students across the year is that they pay more attention to the answers they provide in the quizzes. They understand that by not paying attention in the practical, they will do less well on the post-practical quiz, and when they realise they are losing easy marks here and there, they really up their game.
The most crucial part of this innovation, however, is the overall design and the use of online resources. The entire year is set up before it begins, with each portfolio set to appear and disappear according to the dates it covers. Each part within a portfolio is also built ahead of time, and incorporates adaptive release so that it becomes available only under certain conditions (being in a particular group, achieving a particular grade in a quiz, between specific dates, etc.). The quizzes are automatically marked and drawn from larger banks of questions covering eleven different question types, and the order of questions is randomised, giving every student a slightly different experience. Aside from recording attendance and placing those records into a group for post-practical quiz access (a task I personally enjoy, as I get to know my students better by going around the lab and engaging with them while recording their names), the entire assessment is automated, leaving academic staff more time to focus on assessments at higher levels (such as level three, where manual marking is unavoidable) and to engage more fully with their classes and students.
To what extent does your innovation make use of existing approaches, resources or technologies?
The resource is hosted in the Virtual Learning Environment of the university and makes use of the adaptive release function and ‘learning module’ features it contains.
To what degree has this innovation led to changes in education or clinical practice?
As this is the first run of this type of assessment, it is difficult to quantify changes to educational practice. However, colleagues have expressed significant interest in using this model. It may well change the educational practice of many science- and clinical-based modules and programmes where practical skills can be assessed through coursework (at level one at least – higher levels require more freedom of expression in providing evidence of understanding and application) and theoretical knowledge through exams. The model is highly adaptable, as demonstrated by a colleague who asked for my help in developing a conditional-access assessment in which the student must complete one part before accessing the next, thereby guiding them through a clinical case study; this design received a student-led teaching award at the university in 2015. In terms of educational practice, it has encouraged more consideration, when developing or updating lectures, of whether they provide information that informs the upcoming practical sessions. This works both ways: the design of new practicals now considers what students will have learned from the lecture and how that might shape what is accomplished in the related practical.
What evidence do you have of the impact of the innovation?
The engagement time each fortnight is in reality only about 45 minutes per student: each quiz is timed (with plenty of room for inclusivity and reasonable adjustments), the video is only a couple of minutes long, and the fifth task varies but is typically expected to take only 15–20 minutes. Even an hour a fortnight is little to ask, and the approach has been met with positive feedback. Students have reported the advantage of being kept on their toes all year, and of finding the material less foreign when they come to revise. The average coursework marks for the module are up (though as this is the first run of this type of assessment the figures are not directly comparable with previous assessments), and the exam marks are also up, due to continued engagement with the learning materials as well as the obvious practice element of completing weighted online quizzes all year round.
To what degree has the innovation been disseminated in your organisation or elsewhere?
I have been asked to present this innovation at various meetings around the university, including at the internal UWE Learning and Teaching conference in 2015. I have also run an additional workshop for departmental colleagues after significant expression of interest in adapting the model for other modules.
Please provide details of any plans you have to disseminate the innovation in the future.
I will be presenting this innovation at the Association for Learning Technology conference in Manchester in September 2015.