Cliff Jones reviews a report revealing the positive impact of postgraduate professional development

Evaluating the impact of professional learning is not straightforward but it can be rewarding. It is always likely to bring to the surface unexpected evidence of unintended outcomes and to make use of a variety of perspectives. Sometimes we have to wait for the impact to develop. The recent report on the impact of postgraduate professional development (PPD) written by Peter Seaborne for the Training and Development Agency for Schools (TDA) provides support for those who argue that the concept of impact is very rich with meaning and should in no way be confined to the simple-minded notion of ‘finish course on Friday and get better examination results on Monday’.

We now have a basis for critical and well-grounded professional learning, leading to an articulate professional voice. This comes at a time when the new CPD strategy for schools is beginning to emerge, including a requirement to self-evaluate the impact of CPD. There is, therefore, much that schools, colleges and all of us engaged in the education enterprise can learn from this report.

At the end of last November the 56 institutions that had successfully applied for funding towards the cost of providing postgraduate professional development for teachers were required to submit reports on their self-evaluation of the impact of their programmes. Peter Seaborne’s report is based upon a comprehensive examination of these individual reports.

The self-evaluation reports
Among other things, each evaluation report had to gather operational data and to monitor and evaluate the programme’s impact on practice in schools.

Before applications for funding were submitted to the TDA, the Universities Council for the Education of Teachers (UCET) provided guidance for its members on how to plan for evaluation from the outset. This paper formed the basis for the self-evaluation proforma used by the TDA.

The guidance was divided into nine components.

1. Needs analysis.
2. Context.
3. Intended professional impact.
4. Expected evidence for impact.
5. Activities.
6. Monitoring arrangements.
7. Review of evidence for impact.
8. Impact claim.
9. Follow-up plans.

Many providers of PPD used this framework of components in order to plan their evaluations. Regular readers will notice that these components have also formed the basis of a number of articles in CPD Update on whole-school, individual and collaborative evaluation of professional learning. In our issue of November 2006, for example, I presented a critical professional learning framework derived from the UCET guidance paper.

All of this remains work in progress because as we learn more about professional learning so we try out new techniques, make use of different perspectives, test our values and generate a critical professional voice.
The TDA required a template, or proforma, for providers of PPD to complete and submit, and so the self-evaluation reports were divided into six questions derived from the original nine components. It might be a good idea for those of you contemplating systematic whole-school evaluation of the impact of professional learning to begin experimenting with a similar template. I have, therefore, very slightly modified the template questions to make them more relevant to schools.

Looking at the evidence
It is clear that Peter Seaborne regards needs analysis as of paramount importance in beginning the process of making sense of professional learning. He is also clear that evidence of impact upon the learning experiences of pupils is sometimes indirect, and that the factors affecting attainment can be many and varied and can take time to show: the longer the time, however, the greater the number and complexity of the variables.

Seaborne is very careful to distinguish between reports that relied upon assertion and those that explored the nature, strength and significance of evidence. He also understands that this is the first year that such a detailed nationwide attempt at evaluation has taken place. Over the next two years the perspective of time will deepen our sense of what is happening. Leaving evaluation until the end is far less effective than building it in from the start, based upon good quality needs analysis. Schools that wait until the end of the performance management review cycle, for example, to ask questions about its usefulness and impact will have a weaker basis for the evaluation of professional learning than those schools that have designed their approaches to professional learning systematically. It’s about learning from learning.

The report provides an example of how a provider created a typology for impact which identified the following.

  • Changes in subject/process knowledge base of participants. 
  • Changes in confidence and self-esteem of participants. 
  • Changes in classroom practice of participants and/or the practice of colleagues. 
  • Improved reflection on practice. 
  • Improved motivation of pupils. 
  • Improved achievement of pupils.

‘The provider goes on,’ says Seaborne, ‘to observe that “one of the striking features of this typology is the further down the list, the greater the distance between the PPD activity and the impact, and the greater number of other variables come into play.” Providers also point out that changes in teachers’ knowledge, skills and behaviour are more likely to be evident during or soon after the PPD, whereas impact on pupils’ achievements… might not be evident for months or years, by which time other factors may also have had an effect.

‘It is also interesting to note that most providers of PPD identified teachers’ “improved capacity to reflect on their practice” as a key positive outcome of PPD, with a claimed associated benefit in the school and classroom. As one provider stated, “it is difficult not to believe that teaching (and learning) is better in the hands of a reflective professional than one who teaches by numbers”.’

Providers of PPD have made use of a variety of sources of evidence. This seems to have been very wise, particularly since we are in the early days of this kind of evaluation. It follows my adage: ‘looking for evidence, wrong; looking among evidence, right’. The adage is really about the sterility of setting targets and then ignoring all evidence that does not show how close to your target you managed to get. Real learning and real evaluation recognise professional penicillin when they see it.

A systematic approach
Peter Seaborne draws attention to how some providers made close links ‘with the individual’s performance management targets, school priorities or, in some cases, the school improvement plan. An increasing number of PPD providers have incorporated ‘shell (content-free) modules’ that allow the provider to validate research/enquiry projects tailored to the specific needs and priorities of the participant’s school. This opportunity has also seen a growth in groups of teachers from a single department or school undertaking PPD study together and conducting linked enquiries under the guidance of tutors.’

He notes that a couple of providers attached sample documents including the requirement that a line manager should countersign each teacher’s proposal for a school-based PPD project. This included in one case a statement supporting the proposal and expressing interest in the impact on practice or policy of the research. ‘This clearly puts impact at the heart of PPD,’ says Seaborne, ‘and the structure offers the potential for good evidence about the impact of the provision in schools.’ However, he notes that a provider with a similar policy found that the administrative burden made the objective difficult to monitor.

This last point shows how important it is to fit PPD into a whole-school systematic approach to professional learning. 

Seaborne was very much taken with the participation of small-scale research teams and the attention given to exploring ‘pupil voice’. He also found encouraging the number of perspectives employed in making sense of professional learning because they helped to ensure that evidence was collected and examined from more than one source. But he is aware that while collaborative professional learning is very valuable it is not possible for all.

Corroboration from senior managers and others is, says Seaborne, very useful and helps to discover the deeper impact of this kind of professional learning. It also relates to how such learning becomes embedded and sustained.

Positive impact

The report also draws attention to positive impact upon:

  • teachers’ self-esteem and confidence 
  • the value of school-based provision where possible 
  • the learning experience of pupils 
  • the professional learning of tutors.

Illustrating recurring themes of the impact evaluation reports, Seaborne identified: 

  • ability to give a clearer rationale for one’s actions 
  • more confidence in managing and influencing colleagues 
  • greater willingness and ability to contribute productively to debate in staff meetings 
  • greater ability to question alternative viewpoints  
  • teacher participants becoming more confident in advocating and defending their claims to new knowledge… [sometimes even] in school networks 
  • ability to lead change initiatives linked to pedagogy 
  • ability to empower themselves, and redefine their professionalism, by disseminating key outcomes to professional audiences.

It will be fascinating to see how the government reacts to this endorsement of critical professional learning. I think that it should conclude that if you put together UCET, the TDA, local authorities, schools, teachers and a few others within a supportive critical professional learning framework, what they produce will be worth far more than the very small amount of money that the government contributes to PPD.

Self-evaluation of PPD

Q1: How well are you achieving the objectives as identified in your plans?


  • Have you addressed pupil learning experiences? 
  • What evidence do you have to support this judgement? 
  • How did you collect and analyse the evidence? 
  • Whom did you consult?

Q2: How far were your original objectives realistic?


  • What evidence do you have to support this judgement?
  • How was this evidence collected and analysed?

Q3: Has your evaluation led to any reprioritisation of your objectives?


  • Are all your objectives ongoing? 
  • Have certain objectives become more significant and others less so? 
  • How and on what basis have these decisions been reached?

Q4: Are there areas of impact that you did not originally anticipate?


  • What evidence do you have to support this judgement?
  • How did you collect and analyse this evidence?

Q5: What is changing about your provision as a result of your evaluation?


  • What evidence do you have to support this judgement? 
  • How did you collect and analyse the evidence? 
  • What changes have you made/are you making to the way your consortium functions?

Note: you may wish to attach an action plan as part of your answer to this question.

Q6: Please provide a summary of the activities that collaborative funding has supported.


  • How effective do you feel these activities have been in promoting partnership and collaboration?

The Training and Development Agency for Schools PPD Impact Evaluation Report by Peter Seaborne was published in March 2007.