Evidence-based practice has been something of a revelation to me and my practice. I don't take everything as gospel, but I do look at the strategies that have been shown, time and time again, to be effective. If I think they could work in my setting, then I will try to adopt them – why wouldn't I?
The problem is that Further Education and Skills, as well as external organisations (Ofsted), agencies and training companies, promote practice that is not always informed by evidence. In fact, they often promote quite the opposite. Let me give some examples:
Example 1: Individualised Instruction – On so many occasions I have heard comments like this: “there was not enough personalised learning in the session” or “learners were working at the same level and pace, so the lesson did not meet their needs”. I've even uttered similar things myself (more to conform with expectations than out of actual belief). I regularly hear of top-down expectations for learning in sessions to be differentiated, through learning outcomes and learning activities, to meet all learner needs. Yet in terms of opportunity cost, the evidence shows that this is largely ineffective (special education aside):
‘Individualising instruction does not tend to be particularly beneficial for learners…the average impact on learning tends overall to be low, and is even negative in some studies, appearing to delay progress by one or two months.’
This is not to say that differentiation isn't important. I have blogged my views previously and agree with much of Amjad Ali's post on differentiation. Both posts show the importance of teaching to the top and supporting all learners to get there. For this to occur, you need to respond to what is in front of you at that point in time. In my opinion, no amount of planning for individualised learning activities will achieve this.
Example 2: Student Control Over Learning – ‘Learner autonomy’ is another term bandied around freely without considering the evidence. Do learners really know what they need to know? I suggest not, and the evidence supports this: Hattie found an effect size of 0.04 – negligible. This links with the previous example: giving a range of task choices is probably not going to add much value to the session, despite what you may be told.
Example 3: Raising Aspirations – The attainment gap for disadvantaged learners is well documented:

‘33.5% of pupils eligible for FSM achieved at least 5 A*-C GCSEs (or equivalent) grades including English and mathematics compared to 60.5% of all other pupils. This is a gap of 27.0 percentage points.

‘36.5% of disadvantaged pupils achieved at least 5 A*-C GCSEs (or equivalent) grades including English and mathematics compared to 64.0% of all other pupils, a gap of 27.4 percentage points’
However, trying to raise aspirations isn't the answer. Though the evidence here is limited, it shows no causal link between aspiration and attainment. I've said before that we've gone target-setting mad. A key comment from the report, which certainly applies to FE and Skills, is:
‘The attitudes, beliefs and behaviours that surround aspirations in disadvantaged communities are diverse so generalisations should be avoided.’
I am not saying don’t encourage learners to aspire to be better, but be wary of any cross-school/college interventions or strategies, particularly when there is a new ‘buzzword’ attached.
To summarise: none of the above is fact, but the evidence suggests we should be wary of these common, widely encouraged practices, which appear to have little impact. My next post will focus on what we should pay more attention to – the strategies that have demonstrated a positive impact on learner achievement.