The evaluation of CPD and its impact is one of the most difficult tasks for CPD leaders. Sue Kelly demonstrates some far more interesting ways of evaluating CPD than simply handing out a questionnaire
Evaluating the impact of CPD has to be the one task that gives CPD leaders most anxiety. We know it is important to recognise the complex nature of impact evaluation, and we also appreciate that few have really cracked it. Most of us are still struggling to measure impact effectively, let alone work out how to use this information once we’ve got it, to feed back into the process of whole-school and individual planning for CPD. Having said that, there are exciting opportunities to develop practice in this area, to create innovative tools which will move our schools forward and to share these through viable networking communities in the true spirit of collaborative CPD approaches.
The research document entitled Evaluating the Impact of Continuing Professional Development, put together for the DCSF by researchers from the Universities of Newcastle and Warwick, provides a fascinating overview of the way that impact is measured across a range of schools. However, I intend to focus on more practical ideas which will, I hope, prompt consideration of how we measure and monitor the impact of CPD across a school or institution, as well as the key elements of recognised good practice in this area. If we know what these are then we can develop systems and practices to fit the context of our own schools, confident in the knowledge that we are making progress in the right direction.
Simple steps forward
Before any formalised CPD activity takes place, such as coaching partnerships, external training sessions, use of Inset time, workshops, conferences or developmental meetings, it makes sense to engage the participants in the process of considering the following:
- What targets or objectives the CPD activity is designed to meet.
- What the expected impact of engaging in the CPD activity will be.
- What the outcomes will be in terms of the impact on classroom skills/strategies/knowledge and how this will affect the learning of students.
- How the above can be measured.
The consideration of impact should begin well in advance of any CPD activity if the whole experience of engaging in CPD is not to be merely an ad hoc process or a one-off training opportunity, with little consideration given to how it might benefit individuals, students or the school.
Yet research has shown that, all too often, evaluating impact comes in the form of a participant satisfaction questionnaire. While this can be valuable in gauging positive or negative attitudes to CPD – the latter obviously having an adverse effect on staff motivation and commitment to their ongoing development – there are far more diverse and exciting ways of evaluating impact. For instance, Thomas Guskey’s five levels of impact evaluation (Guskey 2000) will help schools to develop effective evaluation tools rather than merely gauge what participants thought of any particular CPD activity. Dr Guskey’s five levels are: participant reaction, participant learning, organisational support and change, participant use of new knowledge and skills, and pupil learning outcomes.
I advise starting with contemplation of the following statements, questioning how each statement relates to what is happening in your own context:
- Dissemination: sharing/cascading of CPD activities is not the same as evaluating the impact. What distinctions, if any, are made in your school to avoid falling into this trap?
- Completing a participant satisfaction questionnaire following a CPD activity should not be seen as an end in itself. If this is the case in your school, what other evaluation tools are available/could be used/might be relevant to the CPD taking place in your workplace to evaluate impact across all the levels outlined by Dr Guskey?
- How far are you, or colleagues in your school, fully aware that the main purpose of CPD is not just to change teacher behaviour but, more fundamentally, to improve students’ learning in the classroom? What should your role as CPD coordinator be in changing the mindset?
- ‘The evaluation of the impact of CPD activities is only relevant for teaching and support staff.’ Given that this is a common misconception, how will you begin to challenge this way of thinking in your school? How can you begin to build in evaluation tools which consider the impact of CPD on all learners in the school and how this relates to students’ learning outcomes in classrooms?
You will also need to consider different approaches for measuring and evaluating both formal and informal outcomes of CPD. Formal outcomes can include:
- the analysis of statistical data to measure the impact of CPD on student achievement through internal and external assessments
- the analysis of staff and student feedback questionnaires to provide quantitative evidence of positive impact on development of learning, knowledge and skills
- statistical analysis of the observation criteria strands based on Ofsted criteria to judge teaching and learning in the classroom
- the analysis of impact of CPD on staff retention and recruitment in the school/workplace
- the analysis of staff and student absence rates and how this can be related to a positive learning culture and CPD activities in the school
- formalised outcomes through performance review procedures, where evidence of progress in a particular aspect of teacher performance or student progress can be provided.
Informal outcomes can include:
- an increased feeling of positive general wellbeing in the school – staff and students feel valued and that their learning needs are catered for
- staff being able to articulate the culture of CPD within the school, ‘This is how we do it here’, and being proactively engaged in their own development
- a sense of ownership
- informal dialogue and feedback at performance management reviews
- positive staff attitudes, varied and innovative approaches to CPD taking place at individual, team and whole-school levels
- feedback from students generally being sought and valued, which can feed into the ongoing CPD of a colleague or department or be used to evaluate the outcomes of it.
Practical ways forward
I have included below a practical, ready-to-use (or adapt) proforma to inspire you to take your impact evaluation practice forward. As always, this is only one model, which may need adapting to suit your current context.*
Evaluation of the impact of planned CPD activities
Using the information from your Individual Staff Development Proforma, please complete the following table for each planned, formalised CPD activity before you participate in it. You will refer to this after the activity has taken place to record additional comments and outcomes.
The proforma table includes the following columns:
- Planned CPD activity
- Venue/Date/Time allocated
- PEC: please indicate the number of your Preferred Evaluation Choice (PEC) in this column, using the key below
- Agreed future CPD needs linked to outcomes
- Agreed cost-effectiveness score
*The ‘Evaluation of the impact of planned CPD activities’ proforma can be used in tandem with the ‘Individual staff development proforma’ from Chapter 2 of The CPD Co-ordinator’s Toolkit to ensure that your practices reflect a ‘joined-up thinking’ approach.
As previously mentioned, much impact evaluation carried out in schools involves teachers filling out questionnaires based on how satisfied they were with the CPD in question, rather than considering the learning gained from it and how this in turn affects students in classrooms. Yet, in my experience, teachers rarely see the value of completing such questionnaires; they feel it is a perfunctory task which little benefits their ongoing development and is therefore a waste of their valuable time.
Completing a questionnaire may not be the most meaningful or valuable way of considering the impact of CPD on teaching and learning in classrooms. Other more imaginative methods may help to engage colleagues in the ongoing learning process rather than seeing the activity as a ‘one off’ with little or no value in the long term.
Allowing staff to choose the most meaningful mode of evaluation for themselves should help to engage them in the longer-term outcomes of CPD and may be more allied to the nature of the CPD in which they participated. Dr Guskey’s five levels are reflected in the proforma as part of the drive for meaningful evaluation. There is some scope for evaluation of other factors such as the impact of venue, cost-effectiveness and incidental learning, which can be shared as an integral part of participant reaction to the CPD and which will provide important feedback at this level to the coordinator.
Good practice in impact evaluation considers the potential impact before an activity takes place and is an integral part of the planning for it. The method of evaluation used here completes the cycle of good practice by monitoring outcomes with a focus on student learning and progress, informing future planning.
Evaluation of CPD over time, constructed to run alongside ongoing CPD activities, should be dovetailed carefully with the cycle of performance review and the interim review meetings timetabled into this process, so that evaluation is not just a ‘bolt on’ extra.
It is inevitable that you will need to spend time working alongside staff to implement this new approach. One of the best ways I have found when introducing new ideas is to ask those staff who are most resistant to change, or most vociferous in their lack of support for new systems and procedures, to give you feedback on how these new processes work and what could be done to improve them. Use opportunities like this to engage staff who may not always appear to be on side; you will usually find them supportive, flattered at being asked and keen to work with you. In the rare cases where the outcomes are less positive, remain focused and confident in your leadership.
Make sure you are clear about the systems you are introducing to communicate your ideas clearly to staff. The key features of the proforma below have already been outlined but in addition you will need to explain that formal evaluation in this detailed way should only be applied to two or three planned CPD activities of a more formal nature – for example, attending a regular network group; planned external input from the local authority or other CPD provider; attending an external course, conference or series of workshops; planned peer or line manager observations with a particular focus; an action research project undertaken over a period of time; coaching or mentoring partnerships over time with an agreed pedagogical focus, perhaps using the Pedagogy and Practice materials issued by the DCSF.
Also worth considering is that for each planned CPD input, the member of staff should complete one of the forms above. The form is designed to avoid repetition when considering impact before CPD, immediately afterwards, and again after a period of time, all of which are best practice ideas identified in the research document mentioned above. Similarly, the review process can be built into existing performance management procedures to avoid impact evaluation becoming a ‘bolt on’ extra.
Finally, as always, encourage staff to keep their professional portfolios updated with this powerful evidence, not only of impact evaluation outcomes, but also of a proactive approach to their ongoing professional development.
What the proformas do not provide for, and what also needs to be considered, is how we evaluate informal learning, which takes place on a day-to-day basis in a variety of ways and which we must encourage staff to value.
To help staff capture the ‘nuggets’ of learning that take place on a day-to-day or more informal basis, consider introducing a simple online tool or asking them to design their own, which can be completed throughout the year. The more formalised planned CPD activities that have been chosen for detailed evaluation of impact can be copied from the proforma and any additional incidental or informal learning recorded. In this way, colleagues will build a powerful learning journal which can be referred to at any time.
The journal can also form the basis of professional learning conversations on a formal or informal basis to keep the learning dynamic. The key here is that individuals are completing this for their own benefit and are therefore far more likely to feel positive about doing so.
In addition to this, most teaching staff are passionate about improving their practice in the classroom and are far more likely to engage with something that helps them in this way. What we must all consider far more thoroughly are the measurable outcomes in terms of what our students learn as a result.
Sue Kelly is author of The CPD Co-ordinator’s Toolkit
- Guskey, T. (2000) Evaluating Professional Development. Thousand Oaks, CA: Corwin Press
- Kelly, S. (2006) The CPD Co-ordinator’s Toolkit. Sage Publications