ResearchEdFE – Oliver and me

Last week (03.12.16), Oliver and I delivered our ‘Choose Science, Not Myths’ presentation at the first ResearchEd devoted to Further Education.

Below are the slides from the presentation, and Oliver kindly put together the presentation notes on his blog here and here.

The first part of the presentation explored a range of myths. While the jury is still out on some of these, it is worth remembering that we were being deliberately contentious in order to spark debate. The second part explored a range of effective learning strategies that are supported by both classroom experiments and cognitive science.


You don’t need to…

Over the last few weeks I’ve been working with a new group of unqualified, practising teachers. It’s fascinating to hear some of the things they’ve been indoctrinated with by others – often colleagues who did their teacher training many years ago.



Over a series of sessions, I’ve dispelled several myths about things they think they need to do and asked them to question their current practice. It’s not that this practice is necessarily wrong or ineffective, but the view of a ‘model lesson’ is, in my opinion. I’m going to explore some of these myths in this post and hopefully reassure readers that you don’t need to do any of them:


1. You don’t need to… start all lessons with a ‘starter activity’.

While it might be beneficial to grab the attention of the learners, a lesson needn’t start with an activity that has little relevance to the content. If you’re going to use one, I’d suggest a quick recap quiz for retrieval practice and initial assessment. Having said this, sometimes you might just fire straight into the main body of the lesson and that’s fine; there isn’t a ‘right’ way to do this.


2. You don’t need to… write your learning objectives on the board.

It’s so frustrating that people think this in itself makes a difference to the learners. Often the language on the board is written in learner-unfriendly educational jargon. In most cases it is important to share the intentions with learners, so that they know what they’re doing and why, but sometimes you might reveal the intention as the lesson progresses. Whether you write it down, tell learners or mime it doesn’t matter. Having said this, I often write intentions on the board so that learners have a point of reference should they wish to clarify what they’re aiming for, though I normally write this in the form of a question.


3. You don’t need to… make your learning activities fun, engaging and relevant to learner interests.

Two of my favourite quotes should be considered here: ‘learning happens when people have to think hard’ (Coe, 2014) and ‘memory is the residue of thought’ (Willingham, 2009). All learning activities should give the learner the opportunity to think about the content. If fun, engagement and interest are a byproduct, then fine, but we should ensure that the focus is on content first. It doesn’t really matter how you do it, though some methods have been shown through research to be more effective than others (see here and here).


4. You don’t need to… worry about having enough time to teach the qualification.

This is something I hear a lot, particularly in recent years as the guided learning hours of qualifications are stripped back and every minute of a teacher’s contract is accounted for. For a start, you’re probably spending too much time writing lesson objectives and doing starters and fun activities where the learners aren’t actually learning. If 5 minutes are wasted in each lesson doing this and the learner has 12 lessons per week for 36 weeks, my maths says that’s 36 lost hours that I’ve just found you. In reality though, we are short of time, so let’s not waste the precious time we have on nonsensical, ineffective tick-box exercises.
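If you want to check my maths, a quick sketch does it (the figures are the illustrative ones from the paragraph above, not real timetable data):

```python
# Time reclaimed by dropping a 5-minute tick-box ritual from every lesson.
minutes_wasted_per_lesson = 5
lessons_per_week = 12
weeks_per_year = 36

lost_minutes = minutes_wasted_per_lesson * lessons_per_week * weeks_per_year
lost_hours = lost_minutes / 60  # 2160 minutes = 36 hours

print(f"{lost_minutes} minutes, i.e. {lost_hours:.0f} hours per year")
```

Swap in your own timetable numbers and the lost hours mount up just as quickly.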


5. You don’t need to… develop learners’ English, maths, soft skills etc in every lesson.

Whilst I am a huge advocate of developing literacy and numeracy through subject lessons, I don’t believe this should be at the expense of the content. Nor do I think we should force something in to ‘tick a box’. Natural opportunities should be taken, and opportunities to develop the skills around the subject should be considered where appropriate. For example, if a learner uses subject-specific terminology incorrectly, I would look to explore their understanding of the term and help them put the word into context through use of a glossary.


6. You don’t need to… have a lesson plan.

Of course, you’d be foolish to think that you can teach without some sort of a plan, but you certainly don’t need to complete a particular lesson plan template. I’ve seen people plan to the exact minute in their lessons, but if learners don’t get something, rather than moving on because it is 9:23 and your plan says you should be giving learners an activity, stop and respond to what the learners need. Having a broad aim, an idea of how you’ll achieve it and a way of monitoring learner progress towards it will allow for a more responsive approach to the learners – you might even be able to squeeze all of this information onto your fag packet.


7. You don’t need to… do what’s always been done.

New teachers, old teachers, teachers with no label – there’s an obsession with doing things how they’ve always been done. You pick up a new unit to teach, so you follow the scheme planned by the teacher who taught it in 2007, because that’s how it’s always been done. You include a learning styles inventory within your induction period and write the results on the group profile with no intention of using them, because that’s how it’s always been done. Hey, you nodding your head to this! Take control of this situation and your professionalism.


There are probably many more myths that I dispel in every session, but these are a few that I have challenged and will continue to challenge. You don’t need to do any of the above, and don’t let anyone tell you otherwise; but equally, if you want to do them, that’s your call.

Stop recycling learning styles

“You need to include learning styles to show how you’re going to differentiate for each learner”


This is a comment that a learner on my course received from their employer last week. I often hear comments about learning styles and still have to challenge the beliefs of teachers and learners about them… IT’S 2016, god damn it!! It’s almost as if the whole notion is being recycled rather than trashed.


It was over a decade ago that Coffield et al (2004) critically reviewed the literature on learning styles, examining in detail 13 of the most influential models. Apart from the fact that they could find no credible evidence for the utility of any model, during their research they stumbled across a whopping 71 models of learning styles! Let’s imagine that every learner had each of these learning styles and that each was dichotomous. According to De Bruyckere et al, that would give 2 to the power of 71 combinations of learning styles, or in other words:

Two sextillion, three hundred sixty-one quintillion, one hundred eighty-three quadrillion, two hundred forty-one trillion, four hundred thirty-four billion, eight hundred twenty-two million, six hundred six thousand, eight hundred forty-eight.

Let me remind you that there are only around 7.2 billion people on earth, so the numbers don’t quite add up. Moreover, if you wanted to cater for every individual need in your classroom, it would be a pipe dream.
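The combinatorics is easy to verify; a quick sketch, assuming as above that each of the 71 styles is an independent either/or:

```python
# 71 dichotomous learning styles => 2**71 possible combinations.
combinations = 2 ** 71
world_population = 7_200_000_000  # roughly 7.2 billion people in 2016

print(combinations)  # 2361183241434822606848

# How many distinct 'style profiles' per person on the planet?
print(combinations // world_population)
```

That second figure works out at over three hundred billion possible profiles for every single person on earth, which rather makes the point.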


Apart from the aforementioned, there are two other main problems with learning styles according to De Bruyckere et al:

  1. Most people do not fit one particular style – we’ve all done the tests, haven’t we… eagerly waiting to find out what our label is, only to find that we are a bit of everything.
  2. The methods used to assign learning styles are inadequate – because all learning styles inventories are self-reported, how you feel on the day determines the result. If I’ve just read a cracking article in the paper, might I rate reading activities more favourably?


Despite the distinct lack of evidence for them and the problems associated with determining so-called ‘styles’, they are still at large in teacher training textbooks and on teacher training qualifications. I kid you not. In some of the literature there has been a name change to ‘learning preferences’, but this has done nothing to reduce the myth; rather, the terms are conflated.


This is ACTUALLY a picture in a teacher training textbook published last year. Learning styles are so popular that they’re included twice.


Why are they still so prevalent? Well, I believe it is often those in decision-making positions who are not in touch with the research. This lack of understanding, combined with a position of power, propagates the myth. As mentioned above, I have a trainee teacher who was informed only last week by her manager that she needed to plan to meet individual learning styles. When she challenged this, she was shot down with my favourite comment of all time… “because Ofsted like it”. How did the manager know this? Well, it turns out that she once worked as an inspector. Whilst Ofsted’s recent attempt at myth busting has been a positive thing, there are many ill-informed inspectors (current and ex) who continue to perpetuate myths and reinforce poor practice.


In general, those who believe in learning styles believe in making learning easier and more accessible for learners (note: I am not against inclusion). However, while easier learning activities improve immediate performance, they can be detrimental to long-term learning. Pashler et al offer more on this:

‘Conditions that make performance improve rapidly during instruction or training, such as blocking or temporal massing of practice, can fail to support long-term retention and transfer, whereas conditions that introduce difficulties for learners and appear to slow the learning process, such as interleaving different types of problems, or employing temporal spacing of practice on what is to be learned, often enhance long-term retention and transfer.’

Bjork’s work on desirable difficulties has also demonstrated vast improvements in long-term retention when learning is made more difficult. With this in mind, were learning styles to exist, one would be better off teaching with an approach that contradicts the style, e.g. using an auditory mode of learning for kinaesthetic learners, as alluded to by the Learning Scientists in their recent blog post.


So there we have it, no one really likes learning styles. They don’t help learners, so, rather than just recycling them, let’s stick them in the trash once and for all and focus on stuff that works.



Superman likes Kryptonite – my response to Howard Scott

Last Friday the TES featured an article of mine based on 5 myths that I believe exist in Further Education. I really thought that in writing it I had put the cat amongst the pigeons, yet the vast majority of those who have read the post have willingly shared it within their own networks, often writing positive comments in agreement with my points. However, I was always expecting a reply from one individual who was sure to challenge my thinking… He’s like my kryptonite, but in a good way. He questions my views and often provides a well-informed and considered counter-argument. Unlike Superman, I like this. Why? Well, it makes me read more to find out what the hell he is talking about, but it also keeps me from having a blinkered view of education, namely learning. Here is Howard’s reply.


I would now like to respond to each of Howard’s points in turn in order to answer his questions:

Myth 1

With regard to making a choice between knowledge and generic skills, there shouldn’t be one, and that’s the point I’m trying to make. We can’t (as far as the research I’m aware of is concerned) explicitly teach generic skills and should instead focus on knowledge. We need domain-specific knowledge to enable us to teach the domain-specific skills you refer to in the English lesson (now, I appreciate that there’s a wealth of research debating what a ‘domain’ is). Attempting to teach a skill to be used across domains is absurd in my opinion (based on my understanding of the research), and it is often encouraged/prescribed in FE institutions through teaching in novel ways.


Tricot and Sweller (2014) cite Geary’s (2008) evolutionary psychology, which asserts that we have two types of knowledge: biologically primary knowledge, which we have evolved over many generations to acquire, and biologically secondary knowledge, which has become culturally important but which we have not specifically evolved to acquire. Essentially, the secondary stuff is the domain-specific information we need to teach (subject knowledge) and the primary stuff is that which is naturally acquired (skills). The more secondary knowledge we have, the better we are able to draw upon the primary – being able to problem solve or be critically analytical (skills). This may include both generic and domain-specific skills, but it is naturally acquired, not achieved through instruction.


With regard to ‘instruction’, I didn’t use that term and have been misquoted, but that’s beside the point. I am happy to read evidence to the contrary, rather than an anecdote about the Beatles in a studio. Here’s what Tricot and Sweller say about biologically primary knowledge:

‘While biologically primary knowledge may be unteachable, it does not follow that it is unimportant to instruction. It can be important in at least two respects. (1) People may learn the different contexts in which an already acquired generic skill can be applied. Learning the contexts in which a generic skill can be applied provides another example of acquiring domain-specific knowledge. In other words, general problem-solving strategies are “teachable” in a very restrictive sense, i.e. indicating to learners that a primary, general problem-solving strategy, already acquired by the learner, is usable to solve a specific academic problem. (2) In addition, biologically primary knowledge may facilitate the acquisition of biologically secondary information that provides the subject matter of instruction. Pointing out to learners that a biologically primary skill that they have can be used to assist in a biologically secondary task may be useful. Similarly, instruction that is organized in a manner that facilitates the use of primary skills in the acquisition of secondary skills may be beneficial (Paas & Sweller, 2012). In other words, while primary skills may be unteachable because they have already been acquired, they may be useful in leveraging the acquisition of secondary skills.’


I concede that we can develop skills within a domain, as Hattie, Biggs and Purdie tell us. Let’s use my mechanic and problem solving as an example. We can support them to follow a particular process, but (1) it still requires them to have knowledge, and (2) it isn’t something they can easily go and apply in another context – their maths lesson, for example. I may be completely wrong and am open to the research, but I am yet to find anything convincing. This is corroborated by the National Research Council (2012):

‘Research to date provides little guidance about how to help learners aggregate transferable competencies across disciplines. This may be a shortcoming in the research or a reflection of the domain-specific nature of transfer’.


Myth 2

If there is an overlap between units delivered, then that’s great. It’s when a module is taught and not revisited that I take issue. This is particularly prevalent on BTECs and the like.


Myth 3

Firstly, to this point: ‘don’t OFSTED approve of multiple activities taking place in classrooms, rather than one steamrollered practice?’ Ofsted approve of many things that lack a sufficient evidence base and continue to be debunked (grading lessons, progress etc.), so I’m not going to go there with this one.


Having read more around this subject, I think the term individualised instruction is better suited. There is a bit of a ‘jangle fallacy’ here: in FE the term personalised learning is used as opposed to individualised instruction. My understanding of it is giving learners different learning activities to do within the same lesson to reach either the same or different outcomes. Essentially, the teacher is spread far more thinly than when the group moves at a similar pace with whole-class instruction. Yes, you’re right about the research being in schools, but there is no evidence to support its use in any setting, and that is my point. Why is it so prevalent with so little evidence? I think in some cases it may be beneficial, but it seems to be the fad of the moment – the ‘in’ thing – and we need to be careful with it.


Myth 4

Again, in some cases this may be beneficial, but current research shows that student control over learning is good for student motivation but not so effective for learning. It will be interesting to see the outcome of research using technology in this area. To clarify, student control over learning is not too dissimilar to the above: choice over methods and over content. Have you ever given learners the choice of something to go away and research, only for them to come back with a poor or complete misunderstanding of it? I certainly have. We (the teachers) know what is needed, so why waste valuable learning time ‘facilitating’ learners to get there when we can be more direct, using methods that have been shown to be more effective?



I’d like to quote your point about the opportunity for innovation: ‘Are we as professionals afforded some opportunity for innovation that allows that to happen – rather than subscribing to or being prescribed with what is tried and tested?’ Yes and no. I’m getting at that point somewhat in the article, but I feel that it is far safer to use ‘tried and tested’ methods in order to do best by our learners. Can we continue to trial and prescribe methods with little evidence, at the risk of damaging the young people we serve? Sometimes I feel that we may be widening the gap for our learners, not narrowing it.


The essence of the article isn’t about me being right, because I am open to different ideas and accept that there isn’t a best method – everything works for someone and something works for everyone. The article is really a reflection of my feelings about the FE sector: a sector that buys into the latest fads and gimmicks and rarely challenges the top-down prescription of them. We need to be able to explore the evidence and rationale for the methods and, if we decide that a method isn’t right for our learners in our contexts, dump it – not follow the crowd.


The purpose of FE is always going to be debatable, as it has become a ‘Jack of all trades and master of none’ (are we that different to schools, by the way? I’m not so sure). I think you make some salient points about being the bridge to employment and that it may be better ‘looking different’, but it is what it is and we have to do the best we can for the learners that we serve. Let’s use what we know to our advantage.


Thanks for being my kryptonite and taking the time to read the article and post a well-articulated reply which has challenged me greatly.


The TES and I

I feel very fortunate to have been asked to write for the TES. Every few months I write an article and, with every piece, try to diversify my writing in an attempt to be critical of and challenging to the views and opinions of others. This, coupled with my blog, allows me to develop my own views and hopefully helps others to develop theirs too, rather than being constrained to views imposed within their institutions.


I am passionate about giving practitioners a voice in their establishments. They need to be provided with opportunities to explore research, challenge their own beliefs and question the views of others – not just be puppets. I am currently working in an environment where I feel liberated, where I can write freely without fear of being disciplined, and (I feel) I am developing at an exponential rate as a result.


Here are the TES articles written over the last 14 months for your perusal. For some you may need to be a TES subscriber, but most are accessible. Enjoy!
08.04.16 – Let’s explode a few myths about teaching methods

17.01.16 – Why catchphrase could hold the secret to formative assessment

30.10.15 – The three essential ingredients of truly expert teaching

02.10.15 – Freedom is key in lesson observation

09.07.15 – Diving into the ‘pool of development’

28.05.15 – The elephant in the room

20.02.15 – Banging your head against bad CPD

Let’s Flip the Focus – Knowledge v Skills

I’ve been contemplating this post for a number of months now. The paradox of teaching generic skills in Further Education (FE) to enhance prospects. These are developing thoughts only and I welcome feedback.


The majority of learners recruited in FE are those who didn’t necessarily achieve the highest grades at school. This lack of achievement may be the result of many factors but, focusing purely on cognition, is likely the result of not remembering or knowing enough (about the curriculum). Interestingly, rather than focusing on providing these individuals with knowledge, FE has seen a huge emphasis on delivering sessions that are learner centred and that focus on the development of generic employability skills such as critical thinking, problem solving, independent thinking and creativity (perhaps due to external pressure?). These learner centred practices may involve experiential learning activities (role play), project based learning (group work), inquiry based learning or something of a similar ilk. Many of these practices are informed by the work of Vygotsky, Rogers and Dewey, to name a few.


I’m not going to lie: I actually like and make use of some of these approaches from time to time, but I recognise that they focus on generic skill development rather than explicitly aiming to increase knowledge. Although Sweller recognises generic skills as essential, he asserts that they can’t actually be taught:

‘Generic skills are far more basic and far more important than domain-specific knowledge, but they do not need to be taught because we have evolved over countless generations to acquire them effortlessly and unconsciously simply by membership of a society.’

This quote identifies that the aforementioned generic skills are formed and developed naturally, and cites Geary’s (2012) definition of these skills as ‘biologically primary knowledge’. In contrast to primary knowledge, biologically secondary knowledge is that which can only be acquired through instruction, according to Sweller:

‘Biologically secondary knowledge is knowledge we have not specifically evolved to acquire but that we need for cultural reasons. We will not acquire such knowledge automatically and indeed, we invented schools and other educational institutions precisely in order to teach biologically secondary knowledge because otherwise it tends not to be learned.’

This is the knowledge contained within the curriculum: the core ‘domain specific’ information needed by learners (or perceived to be needed by those writing the curriculum). Without acquiring it, learners may be unable to problem solve or think critically about the subject.


A wealth of research has explored the differences between experts and novices and found that their cognition is different. Experts are able to draw upon a wealth of knowledge from their long-term memory, enabling better problem solving and critical thinking (Willingham, 2009). I often use myself as an example when discussing this with my trainee teachers. If you were to ask me to solve a problem with an individual’s exercise programme or diet, I could make use of my prior knowledge to provide a suitable solution. Ask me to solve a problem with a car engine and you’ve lost me. I can only just remember how to open the car bonnet!


By trying to teach learners to be experts through learner centred, guided discovery, higher order activities, or whatever else you want to call them, we are not improving their knowledge as well as we might through explicit instruction or methods that have been shown to improve long-term memory. I always ask ‘how do learners know what they need to know?’ when talking about learner choice and inquiry based learning, and the research summarised by Hattie appears to support this notion (though it only looks at achievement outcomes: effect sizes of 0.04 and 0.31 respectively).


In summary, teaching lessons through learner led methods in FE may not be as productive as we think. In fact, maybe we should flip the focus and concentrate on ensuring that learners really have learnt the knowledge needed, giving them ample opportunity to reinforce it. I have written about how you might do this here. Then, and only then, might we find that their ‘generic skills’ develop.




If you’ve been following my blog and Twitter, you’ll have noticed I am gaining more interest in the research that underpins our practice as teachers, particularly around memory. My latest find is perhaps not as powerful as some of the other research available, but it is something to consider nonetheless.

  • Have you ever been in a meeting and doodled on the agenda?
  • Have you ever had feedback from an observer saying that learners weren’t learning as they were doodling on their notepads?
  • Have you ever been bored in a lecture and doodled in your margins?


Well you may be surprised to hear that doodling can actually improve short term memory and concentration!


In research conducted by Andrade, 40 participants monitored a monotonous mock telephone message for the names of people coming to a party. Half of the group was randomly assigned to a ‘doodling’ condition, in which they shaded printed shapes while listening to the telephone call. The doodling group performed better on the monitoring task and recalled 29% more information on a surprise memory test.


Look, I’m under no illusion that the small number of participants and the methods of assessing concentration and short term memory are open to criticism, but it’s worth noting this:

Just because someone is doodling whilst information is being shared with them, it doesn’t mean that they’re not learning – equally, it doesn’t mean that they are. But at least if anyone challenges you about your own, or your learners’, doodling, you can point them in the direction of this.


Why do we ignore the evidence in FE?

Evidence based practice has been somewhat of a revelation to me and my practice. I don’t take everything as gospel, but I do look at the strategies that time and time again have been shown to be effective. If I think they could work for me in my setting, then I will try to adopt them – why wouldn’t I?


The problem is that the Further Education and Skills sector, and moreover external organisations (Ofsted), agencies and training companies, promote practice that is not always informed by evidence. In fact, they often promote quite the opposite. Let me give some examples:


Example 1: Individualised Instruction – On so many occasions I have heard comments like this: “there was not enough personalised learning in the session” or “learners were working at the same level and pace so the lesson did not meet their needs”. I’ve even uttered similar things myself (more to conform with expectations than out of actual belief). I regularly hear of top-down expectations for learning in sessions to be differentiated to meet all learner needs through learning outcomes and learning activities, but in terms of opportunity cost, the evidence shows that this is largely ineffective (special education aside):

‘Individualising instruction does not tend to be particularly beneficial for learners…the average impact on learning tends overall to be low, and is even negative in some studies, appearing to delay progress by one or two months.’

This is not to say that differentiation isn’t important. I have blogged my views previously and agree with a lot of Amjad Ali’s post on differentiation. Both posts show the importance of teaching to the top and supporting all to get there. For this to occur, you need to respond to what is in front of you at that point in time. No amount of planning for individualised learning activities will do this in my opinion.


Example 2: Student Control Over Learning – ‘Learner autonomy’ is another term bandied around freely without considering the evidence. Do learners really know what they need to know? I suggest not, and the evidence supports this, with Hattie finding an effect size of 0.04 – negligible. This links with the aforementioned, really: giving a range of task choices is probably not going to add much value to the session, despite what you may be told.


Example 3: Raising Aspirations – If I hear ‘aspirational target’ one more time…! I get it, I totally do. Those who are disadvantaged should be supported to overcome these dreadful statistics:

‘33.5% of pupils eligible for FSM achieved at least 5 A*- C GCSEs (or equivalent) grades including English and mathematics compared to 60.5% of all other pupils. This is a gap of 27.0 percentage points.

36.5% of disadvantaged pupils achieved at least 5 A*- C GCSEs (or equivalent) grades including English and mathematics compared to 64.0% of all other pupils, a gap of 27.4 percentage points’

However, trying to raise aspirations isn’t the answer. Though the evidence here is limited, it shows no causal link between aspiration and attainment. I’ve said before that we’ve gone target-setting mad. A key comment from the report, which certainly applies to FE and Skills, is:

‘The attitudes, beliefs and behaviours that surround aspirations in disadvantaged communities are diverse so generalisations should be avoided.’

I am not saying don’t encourage learners to aspire to be better, but be wary of any cross-school/college interventions or strategies, particularly when there is a new ‘buzzword’ attached.


To summarise, the above is not presented as fact, but the evidence suggests we need to be wary of these common and encouraged practices, which appear to have little impact. My next post will focus on what we should pay more attention to: the strategies that have demonstrated a positive impact on learner achievement.


Coming out of my ‘secret teacher’ closet

Around 18 months ago, disgusted by the Ofsted Annual Review (2014), I decided to write back to Ofsted to put a few things straight about English and maths in the FE and Skills sector. The Guardian immediately contacted me to see if I would like to write for them, and I duly obliged. At the time, I wasn’t publishing under my real name, as I had recently been in a bit of bother at work over my social media exploits (or not). For this reason, I remained anonymous and wrote the article under the ‘Secret College Tutor’ pseudonym.


At the time, I must have written what many in the FE and Skills sector were thinking, as the views and shares went through the roof. I decided to remain quiet for the most part (only telling my nearest and dearest). It wasn’t until Geoff Petty (an education inspiration of mine) re-tweeted the article yesterday that I realised how proud I am of what I put together, and it is for this reason that I am coming out and telling everyone that it was me all along.


You can view the article here. If you like it, continue to share. Also feel free to comment on this blog post.



If you can’t read minds, choose words carefully.

I wrote a post a couple of years ago about learning and performance, inspired by David Didau’s own blog and a workshop I attended that really opened my eyes. That blog post was removed after causing a bit of controversy with a previous employer. I have decided to repost it for a couple of reasons: 1. I feel that as a sector we are at a point where challenging beliefs is more acceptable; 2. I have read/heard a lot of comments lately that were made with authority but reinforce messages that are wrong (particularly on LinkedIn).

I mean no disrespect to any professional colleagues, but I feel we all need to reflect on this post and select our language carefully when talking about teaching and learning.

This post was written in May 2014:

I have come across several comments lately and I have come to the decision that we (educators) need to be careful about the language we use when we talk about learning. Now I am no linguist, but hear me out…

“Learning was clearly taking place as all learners were engaged”
“They were learning well in the session which was evident in the amount of written work”

Do comments like these sound familiar? 

If ever a definition of learning could be agreed, it would certainly involve something about knowledge acquisition and probably something to do with long term memory and being able to retrieve information (that’s me drawing upon numerous information sources and amalgamating them). 

Of course, we cannot see what knowledge has been acquired and stored in the long term memory in a snapshot observation. How can we? How could we possibly see the amazing (or not so) neural connections that learners are making between their environment and long term memory? 

At Pedagoo London earlier this year I had the pleasure of listening to David Didau (@learningspy). He gave a great presentation about learning and performance. One example emphasised how we see learner performance, rather than learning, in the classroom. He began by asking:

“Who knows what the capital of Poland is?”

Most of the audience put their hand up. To one lady who didn’t know, David informed her that it was Warsaw (teaching). He then asked her to tell him what the capital of Poland was. She of course immediately answered Warsaw. She had performed! 

Had she learnt? Well, it isn’t clear. I don’t suppose David ever tracked her down to find out if she can still answer that question, but when I have used this exact example, the people I ask later down the line have nearly always forgotten. They didn’t learn from me, despite my thinking at the time that they had.

This simplistic way of looking at learning and performance is really quite useful. It brings me back to the comments we see on observation reports. Whether graded or ungraded, learning walk or formal observation, we cannot make a judgement on ‘learning’ without continued observation over time, discussion with learners, examination of learner work, and so on. In all honesty, we may never fully understand what learning has occurred. With so many other variables in a young person’s life – TV, family, friends, the internet – how could we possibly know?

What observers tend to use in lessons as ‘evidence’ for learning are some of the things listed below. Prof Robert Coe (2013) tells us that these are poor indicators of learning:

Poor Proxies for Learning 

(Easily observed, but not really about learning) 

1. Students are busy: lots of work is done (especially written work) 

2. Students are engaged, interested, motivated 

3. Students are getting attention: feedback, explanations 

4. Classroom is ordered, calm, under control 

5. Curriculum has been ‘covered’ (ie presented to students in some form) 

6. (At least some) students have supplied correct answers (whether or not they really understood them or could reproduce them independently)

Coe (2013) states that ‘learning happens when people have to think hard’. This is corroborated by Nuthall (2007), Willingham (2009) and Brown, Roediger and McDaniel (2014). So when learners appear ‘stuck’ in lessons, rather than doing all or some of the above, there may indeed be a lot more learning taking place.

Learning is a complex beast. As observers, we need to use clear language and avoid comments such as those above. For all the observers out there: do not say that you saw learning taking place. Choose your words wisely.

Learning is invisible, and what we think helps learning probably does not.

In summary, when we talk about teaching and learning, we have to be careful about assuming that learners are or are not learning. Until we develop some sort of ability to read minds, we will never truly know what has been learnt. In the meantime, we need to make do with collecting evidence over time to come close to making such claims, and be mindful of the fact that even then, we may be wrong.