I fear you may be about to read this and want all the answers to making sense of and implementing the Personal Learning and Thinking Skills (PLTS) framework. There are two particular reasons why I cannot do that:

  1. Experience from and research about the National Curriculum and National Strategies shows that teachers need to engage with innovations and do their own work on them in order to make sense of them and take ownership of them. In that way they can better reflect local understandings and context.
  2. There are no obvious answers because the PLTS framework is a confection. There is no underpinning logic to the six groups of skills or the framework, or if there is, it is being kept a well-guarded secret. There is certainly no logic in research terms.

I do not regard the second point as bad news, although some rationale would have been comforting. The advantage of this mishap is that it provides an opportunity to make the framework your own – I do commiserate if you see this as just more work. My analogy is this: you may have wanted a road map of how to get from A to B and you can’t have one, partly because B does not exist so there cannot be a map to get there. Instead you need to make a map and work out where you want to go and how to get there.

What not to do
Please don’t do an audit of PLTS with subject leaders, finish with a matrix of what is already done in which subjects, and draw up a small action plan to plug a few gaps. You need to remember that there is usually a big difference between the curriculum as written and the curriculum as experienced, so it is very unlikely that such an audit bears much relation to reality. Audits are not useless, but use them with extreme caution. I appreciate that this is what many schools have done (see Graham Watts’s ‘Starting point 1′), but it is unlikely to lead to deep thinking about PLTS.

What to do
If you are already using a generic learning framework which makes sense to you and your students then carry on using it. If it makes sense to teachers and learners then it is valid in a way that the PLTS framework may not be. The most obvious examples of such frameworks are the 5Rs (Building Learning Power), the Opening Minds competences and Habits of Mind. You could try mapping your established framework across into the PLTS (an exercise encouraged on the QCDA website). So, for example, the Relating to People competences from Opening Minds are mapped onto PLTS in this table.

This comparison should give you some confidence that there is nothing magic or ultimate about the PLTS framework. It is a form of words that may be better or worse than the form of words you are currently using with students to describe more generic learning outcomes. Personally I think that the Opening Minds competences are more coherent and offer headings that I can relate to more readily. You will find in the case study from Durham Gilesgate School that they have made connections to SEAL (Social and Emotional Aspects of Learning) outcomes. You could worry that an Ofsted inspector might be critical of such a stance, but if you argue your case well then you are on solid ground. Several of the schools featured as PLTS case studies on the QCDA website are relating PLTS to Opening Minds competences. One has integrated much of their work on SEAL.

QCDA guidance
It is sobering to reflect that the QCDA, which has for 20 years (in various incarnations) provided guidance for teachers, is now in jeopardy. That should lead to much reflection in the profession. Nonetheless the QCDA website provides a number of case studies of schools using PLTS as a pillar of curriculum planning. In several of the schools the impetus has come from an understanding that students are not skilled or confident in working independently. One of the most interesting comments comes from an anonymous teacher as follows: ‘Is me sticking an aim and an outcome on everything we do stopping them from going where they really can go?’

It is significant to reflect that the P in PLTS stands for personal and the framework was conceptualised as part of the personalised learning agenda. Now I suspect that personalised learning really had more to do with marketising education and creating an image for parents that education would concentrate on the needs, understanding and interests of individual pupils as consumers. It had less to do with constructivist learning theories that see learning as uniquely created by each individual through the lens of existing experience and understanding. Nonetheless, the genie is out of the bottle and PLTS is a gateway to thinking seriously about learning again. We have had 20 years of a growing dominance of objective-led planning and the teacher comment puts the spotlight on the growing tension in the curriculum created by the behaviourist learning model which states what pupils are expected to learn en masse, despite nods in the direction of differentiation. This tension is reflected in different models of assessment for learning. Harry Torrance and John Pryor have distinguished between convergent and divergent assessment and the latter seems to be a concept that should inspire the development of PLTS.

Convergent and divergent assessment
Convergent assessment is a model in which the teacher sets out to assess if a student has learned some pre-specified outcome. The curriculum is to be mastered by the learner. Therefore teaching tends to be linear – a plan to be followed step by step – and questioning probes students’ understanding of the subject to be learned. Feedback is therefore authoritative, judging whether the student has reached the specified level.

Divergent assessment starts from a very different place – finding out what a student has learned. In these circumstances the curriculum does not dominate the learner, but to some extent reflects and follows emerging learning with the consequence that questioning is more open and planning is more responsive as lessons can take unexpected turns. There are more open forms of recording outcomes (such as observation), and questions are often aimed at challenging or prompting reflection, so that students arrive at some metacognitive outcomes. Assessment is at least a joint responsibility between student and teacher and students’ questions are significant stimuli for guiding the course of learning. At present we don’t have explicit models of lesson planning which accommodate such an approach, but the teacher quoted earlier can clearly feel the need for it. It is such flexibility and responsiveness that lies at the heart of teaching thinking, where teachers have to decide ‘on the hoof’ what are the most valuable outcomes to pursue.

The case study on pages 9-10 outlines peer group sessions in which pupils take the bullet points from the ‘creative thinkers’ section of PLTS and reflect on them with a critical friend in order to identify success criteria and targets. While this has the advantage of handing responsibility to students and developing a language around creativity, it does carry some risk – that students’ understanding of creativity is rooted in and limited by the PLTS vocabulary. However, the case study stresses that teachers work with pupils to define creativity, rather than imposing definitions on them, which has resulted in engaged and curious pupils.

I believe that curiosity is a real gem in terms of pupils’ dispositions that needs careful nurturing. At one of the schools we have worked with, we introduced the students to the idea of ‘brain radar’, encouraging them to be conscious of when they noticed something, a thought struck them, or they enjoyed or perhaps worried about something. Once the radar had ‘bleeped’ they could decide to ignore it as unimportant (‘I’m too busy with other stuff’), or they could slow down and look more seriously at this blip on the radar – is it friend or foe? It is too early to say whether this is a practical approach, but it is a marker of what education is failing to do – pay attention to the learner and their unique sense-making.

Another very significant issue for PLTS is how they are planned for in the curriculum. The QCDA website features Top Valley School in Nottinghamshire, whose approach has been to devote some specific lessons to PLTS and to develop a toolkit of skills. This school, like many others, has opted to work in large groups in the school hall where teachers can team teach as pupils embark on a new ‘mission’ every two to three weeks. It is important to remember that just creating a more flexible working environment does not guarantee better outcomes – such work can be routine and unchallenging and it needs to be planned carefully so that there are new challenges and opportunities to reflect and find meaning and pattern in the experience.

The QCDA case studies have that slightly eerie feel of a deserted ship: they do not seem to be occupied by people (not the fault of the schools). They are presented in an almost universally positive light, yet I am sure that the schools would be happy to acknowledge the problems that they have had along the way – that is life. In truth, you experiment, make mistakes and encounter the unexpected, so you adjust and tinker. Sometimes things just turn out wrong and you abandon them because they represent no improvement. The Durham Gilesgate case study in this edition is a good example of a ‘warts and all’ account.

The last important point to make is about the professional learning of staff. Your school will have enthusiasts who take to generic learning outcomes, as in PLTS, with real energy. Others will hesitate, hover on the edge or indeed prove hostile. To create positive learning experiences for most students, most staff will have to engage and find a way of seeing value in the approach. PLTS is as much about staff learning as it is about students’ learning.

David Leat is professor of curriculum innovation and executive director of the Research Centre for Learning and Teaching (CfLaT), Newcastle University