Principle to Practice 2: The Worked Example Effect, Generation Effect and Element Interactivity

Introduction:

My second principle to practice blog post focuses on a paper by Chen, Kalyuga and Sweller:

The Worked Example Effect, the Generation Effect, and Element Interactivity

What was the paper about?

Research exploring the worked example effect has demonstrated that model examples providing full guidance on how to solve a problem typically result in better test performance than providing no guidance during problem solving. Before we go on, it is important for us to define (in lay terms) some of the key terms used in this paper:

  • Worked examples are model step-by-step processes that provide full guidance to learners on how to solve a problem.
  • The generation effect refers to the benefit of learners actively generating their own understanding of material with little guidance from a teacher (e.g. through a problem-solving task).
  • Element interactivity refers to the complexity of new learning material in relation to prior knowledge and the external environment. For example, if given the problem x − 3 = 5, novice learners may need to handle x, −, 3, =, and 5 as separate components in working memory, so for them the material has high element interactivity. More expert learners are likely to already know the procedure for solving this type of equation (add 3 to both sides, and why), so for them the material has low element interactivity (example adapted from the article; sketched below).
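Laid out step by step (my own illustration building on the article's x − 3 = 5 example, not taken from the paper), the difference looks something like this:

```latex
% My own sketch of the x - 3 = 5 example; not reproduced from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
x - 3 &= 5          && \text{a novice must juggle $x$, $-$, $3$, $=$ and $5$ as separate elements}\\
x - 3 + 3 &= 5 + 3  && \text{add 3 to both sides to isolate $x$}\\
x &= 8              && \text{an expert retrieves the whole procedure as a single schema}
\end{align*}
\end{document}
```

For the novice, every line has to be held and manipulated in working memory; for the expert, the whole procedure is effectively one element pulled from long-term memory.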

The authors use cognitive load theory as a lens for their research. They assert that information is stored in the form of schemas in long-term memory, and that working memory has a limited capacity but draws heavily on long-term memory to ease its burden. When learners are presented with novel and complex problems to solve (high element interactivity) and there is insufficient prior knowledge in long-term memory, working memory can easily become overwhelmed, so it relies on the ‘borrowing principle’ (e.g. expert instructions and/or worked examples) to ease the burden. In contrast, when the problem can be solved by drawing on long-term memory resources (low element interactivity), the borrowing principle becomes redundant.

 

What was the aim of the paper?

The authors sought to explore the benefit of worked examples with increasing expertise. They suggest that the advantage of worked examples may decrease or even reverse to a disadvantage because with increasing expertise, studying worked examples becomes a redundant activity. Furthermore, increases in expertise should have the same effect as decreases in element interactivity.

 

What did they do?

This research involved two experiments:

Experiment 1 investigated the relationship between levels of guidance and levels of element interactivity using 33 Year 4 primary school learners studying geometry topics that were either high or low in element interactivity for these students. High-element interactivity materials were used to test for the worked example effect by comparing studying worked examples (high guidance) with problem solving (low guidance). Low-element interactivity materials were used to test for the generation effect by presenting learners with answers to memory questions (high guidance) or having them generate answers themselves (low guidance).

It was hypothesized that high guidance (worked examples) would be superior to low guidance (generated problem solving) using materials high in element interactivity, whereas low guidance was predicted to be superior to high guidance with materials low in element interactivity. The results of Experiment 1 confirmed this hypothesis.

 

Experiment 2 also tested for an interaction between guidance and element interactivity with older, more expert learners using similar materials to those of Experiment 1. It was hypothesized that the interaction should be reduced or eliminated using students who had a reduced requirement for worked examples (high guidance). 36 Year 7 students were randomly assigned to groups using the procedure of Experiment 1. All students had previously studied the area and perimeter formulae used in this study to test for the generation effect. Similarly, all students had been taught to solve the problems used to test for the worked example effect approximately a year previously. Therefore, Year 7 students were regarded as relative experts with respect to the formulae as well as the problems used in Experiment 2.

Results of this experiment supported the hypothesis that the worked example effect reversed with increases in expertise. Increased guidance had a similar negative effect on both higher and lower element interactivity material. In other words, in contrast to Experiment 1, low guidance (generation) was superior for both lower and higher element interactivity material.

 

What is the key principle of the paper?

Low guidance during instruction (the generation effect) is more effective for knowledgeable learners with expertise. High guidance (the worked example effect) is more effective for novice learners.

 

What does this look like in practice?

Novices need more guidance and are more likely to benefit from worked examples to chunk new learning, thus supporting their understanding. The example below shows what a typical ‘non-worked example’ task sheet is like, compared to a ‘worked example’, with notes:


[Image: a typical ‘non-worked example’ task sheet compared with a ‘worked example’, with notes]
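As a rough illustration of the contrast (my own, using a simple perimeter problem like those used in the study, rather than the paper's actual materials): a ‘non-worked example’ would simply ask learners to find the perimeter of a rectangle 8 cm long and 3 cm wide, whereas the worked-example version models every step:

```latex
% My own illustrative worked example, not taken from the paper:
% "Find the perimeter of a rectangle 8 cm long and 3 cm wide."
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
P &= 2(l + w)                     && \text{state the perimeter formula}\\
  &= 2(8 + 3)                     && \text{substitute the length and width}\\
  &= 2 \times 11 = 22\,\text{cm}  && \text{add, then double}
\end{align*}
\end{document}
```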


*This is a simple example in maths. Worked examples can be used across a range of subjects, and a quick Google search will reveal them in subjects such as English, PE and Geography, to name a few.

 

To conclude, when you initially assess learner knowledge and find that learners are relative novices, consider how you will support them to learn new content more effectively by using worked/model examples. If dealing with relative experts, consider using more problem-based tasks (more to follow on this).

 

Once again, if I have misunderstood anything, feel free to let me know. If you have any examples you’d like to share, please leave them in the comments below.

 

 

 


Principle to practice: the split-attention effect

Introduction

After Mike Tyler’s excellent presentation at the recent #FEShowcase18, where he explained Soderstrom and Bjork’s work and made clear links between their principles and practice, I was inspired to explore a range of research articles on learning and memory and present them in a similar fashion. I’ve been off the blogging scene for a while and hope to reinvigorate the blog by producing at least one of these short posts per week over the summer period, using research to make clear and practical suggestions to support teachers.

Here’s my first one, on a paper by Chandler and Sweller (1992):

THE SPLIT-ATTENTION EFFECT AS A FACTOR IN THE DESIGN OF INSTRUCTION

https://pdfs.semanticscholar.org/d81d/3e4514b69578aaa769d888adade75355b563.pdf

What’s the paper about?

This paper centres on the impact of instructional design. Using cognitive load theory as a basis for the work, the authors argue that many methods of instruction are ineffective because they impose greater extraneous load (irrelevant demands on our very limited working memory, e.g. fancy presentations). When information is presented in two different sources, e.g. text and diagrams, learners have to split their attention between them. This can create extra ‘load’ in working memory, as learners have to make sense of two separate sources of information. The authors call this the ‘split-attention effect’.

What was the aim of the paper?

Essentially, the aim was to determine the impact of the split-attention effect on learning.

What did they do?

Their research involved two experiments:

  • Experiment 1 compared conventional text-and-diagram instructions (where the text instructions sat above the diagram they related to) with physically integrated instructions (where the text instructions were integrated into the diagram) for 20 engineering apprentices learning a milling process. Post-test results for the integrated group were considerably higher than for the conventional group.
  • Experiment 2 focused on 20 psychology students who had to answer questions on a traditional research paper versus a paper with its methodology and results sections integrated. Once again, the integrated format was far more effective than the conventional format.

 

What is the key principle of this paper?

Not only should diagrams and text be integrated; the evidence is strong that learning can also be enhanced by integrating mutually referring sources of purely textual information (e.g. the method and results sections of a paper).

What does this look like in practice?

The example below shows the flow of blood through the heart. Image A is an example of a typical worksheet that one might find in a classroom. It requires learners to split their attention back and forth between the image and the text. Image B, on the other hand, is integrated; the text accompanies the diagram, which reduces the unnecessary load on working memory because learners do not have to switch between the text and the diagram.

[Image A: conventional worksheet – text and diagram separate]
[Image B: integrated worksheet – text placed within the diagram]

So… how might you use the split-attention effect research to support your teaching?

*I’m happy to be corrected on any misunderstanding. Feel free to comment.

Interpolated Testing – What is it exactly and what are the benefits?

It seems I may have misunderstood interpolated testing in a recent blog post. I assumed, by definition, that interpolated testing meant switching between new and old learning when testing (or quizzing) learners. For example, a typical starter quiz where a teacher asks questions on previous learning whilst also assessing learner understanding of the new.


This understanding was corrected (or confused further?) in a recent lecture on interpolated testing by Dr Philip Higham of the University of Southampton. The talk was fascinating and raised several more questions I wish to consider, particularly in relation to the conflict between desirable difficulties and cognitive load theory (more to follow on this).

 

The aptly titled ‘PowerPointless’ began with Philip espousing desirable difficulties and the “metacognitive illusions” that exist in learning – massed practice, fluency, lecturer style, the belief that testing is bad, and so on.

 

Philip then explained a series of experiments that he has been working on with a PhD student, each investigating the impact of slide handouts during lectures. Six lab-based experiments were conducted, with pre-recorded lectures given to various groups:

  • Group A – The control group where learners were asked to observe the lecture without taking notes
  • Group B – This group were provided with lecture slides and could annotate these as they wished
  • Group C – This group took notes from the slides for themselves
  • Group D – This group were asked to take notes as if they were for a friend (it was suggested that the notes would be clearer and better organised through doing this)

The various experiments changed variables such as the speed and fluency of presentation, and covered various topics. Learners were tested immediately after each experiment and then sat a delayed test one week later.

 

Results consistently revealed that taking your own notes (of any kind) was significantly better than not taking notes or relying on the provided lecture slides (Groups A and B). This is significant for all teachers who provide a copy of slides to their learners. Think about how you can encourage learners to take their own notes during sessions – I wrote a blog about the Cornell method a few years back that may be of use for this.

 

The experiments progressed further, and a Group E was added; these individuals were the ‘interpolated testing’ group. This group experienced the lecture in short intervals of around ten minutes, with each interval followed by generating their notes in a retrieval-type manner (this is what is referred to as interpolated testing, and the literature I have read to date typically uses this approach: a short introduction to new learning immediately followed by retrieval of that new learning). As a side note, is this not just part of what teachers do for formative assessment? (Perhaps formative assessment is so effective due to the retrieval aspect rather than the feedback?)

The results showed that the retrieval and generation of notes (Group E) led to better immediate and delayed (1 week) test results compared to the other groups.

 

All groups were then given 8 weeks of revision time using the same lecture handouts, which contained all the answers. Following this, they were tested on the material. Results showed no significant difference between those who took their own notes and those who did the interpolated testing. However, the results did show that the amount of time learners spent revising during the 8 weeks was significantly lower for those who did the interpolated testing (Group E).

These findings are significant:

  • Taking your own notes is highly effective for improving long term retention of information
  • To reduce study time, learners are better off learning via interpolated testing

 

It is worth noting that much of the literature on interpolated testing is positive (see Szpunar et al.), specifically for improving long-term retention and learner motivation. The reasons Davis et al. suggest for interpolated testing not being as effective as first thought are:

  1. Learners spend more time thinking about and correcting the prior learning, rather than moving on to the new learning
  2. The more often there is switching between old and new learning, the more the task-switching effect occurs, thus impeding the new learning

In spite of this, there is a suggestion that ‘test-potentiated learning’ (recalling prior knowledge) is actually beneficial and supports the acquisition of new information. This obviously needs further study, but it suggests that teachers need to be mindful of how often they switch between delivering new learning and retrieving it during sessions.

 

*NB. These were some of my notes and inferences from the lecture. The data and information are my interpretation of what was shared. A huge thanks to Dr Philip Higham for sharing this information and challenging my thinking.

I would like to explore my initial thoughts on interpolated testing a little further, as I expected a delayed retrieval rather than retrieval immediately after encoding. A delay allows for greater forgetting, which one would think is better… anyway, more thought needed on this.

Retrieval and encoding: getting off to a good start

Since developing my understanding of formative assessment and retrieval practice, I have always attempted to ensure that my lessons begin with a clear recap of prior learning (retrieval), coupled with questions about forthcoming learning which allow me to identify gaps in knowledge to support my delivery. It makes sense to kill two birds with one stone, but alas, I may have been doing this wrong for some time.

Before I continue, I feel the need to distinguish between two distinctly separate processes that I will be talking about in this post:

‘Encoding is the process of moving information into your long term memory (LTM) via your working memory (WM) i.e. learning new things

Retrieval is the process of pulling information out of your long term memory and into your working memory. This strengthens the memory in the long term.’

(Definitions courtesy of Adam Boxer)

Last week educational psychologist Daniel Willingham shared a research paper (here) arguing that frequent switching between retrieval practice and encoding impairs new learning – even Willingham himself appeared shocked in his tweet.


It is clear from the paper that whilst retrieval practice strengthens learning, ‘interpolated’ testing is reported to sometimes impair new learning. In essence, a starter quiz with mixed learning (prior and new) may not actually be as useful as one might have thought.

I have been able to make some sense of this thanks to a blog by Adam Boxer (here), who reminds me of some of the key differences between encoding and retrieval:

[Image: Adam Boxer’s summary of the key differences between encoding and retrieval]

Now, whilst a quiz at the start of a session isn’t really going to support encoding while the questions are being asked, it will create extra cognitive load when the teacher clarifies the responses. Moving between retrieval of old information and encoding of new information creates an undesirable extraneous load, due to the effort required to switch between the two distinctly separate processes (encoding and retrieval). Furthermore, Davis et al. (2017) suggest that mixing retrieval practice with encoding might bias learners’ attention towards relearning the old information, thus impeding the learning of new information.

As a result of this information, it is suggested that retrieval practice (i.e. recapping prior knowledge) be done separately from the delivery of new learning. Your learners may benefit from a separate recap quiz and initial assessment, or from omitting the initial assessment altogether and just concentrating on retrieval prior to the delivery of new information… I’ll certainly be revisiting my practice.

Intro to Working Memory and Cognitive Load

I don’t get the time to update the blog much these days, but here are some of my slides from a session I did last week with trainee teachers. If they are of any use to you, feel free to use them.

 

Special thanks to Oliver Caviglioli for his design work that inspired the design of these slides.

Think about thinking hard

I recently stumbled across this statement in Coe’s excellent ‘Improving Education’ publication and it really hit home:

Some research evidence, along with more anecdotal experience, suggests that students may not necessarily have real learning at the top of their agenda. For example, Nuthall (2005) reports a study in which most students “were thinking about how to get finished quickly or how to get the answer with the least possible effort”. If given the choice between copying out a set of correct answers, with no effort, but no understanding of how to get them, and having to think hard to derive their own answers, check them, correct them and try to develop their own understanding of the underlying logic behind them, how many students would freely choose the latter? And yet, by choosing the former, they are effectively saying, ‘I am not interested in learning.’

Coe goes on to inform us that ‘learning happens when people have to think hard‘. But how do we ensure that learners are both thinking hard, and putting effort into their learning? Easier said than done eh?


Here are some ideas for you to think about using with learners at the start of the academic year:

  1. Teach students about the importance of hard work and effort: Now this is no easy feat. Marzano informs us that this can have a high effect on achievement and suggests sharing examples of personal experiences, or experiences that learners can relate to. He also suggests that learners self-assess their effort in lessons when self-assessing achievement against success criteria – not something I have tried myself, but certainly one to consider.
  2. Establish routines early: For those working in an FE college, most learners are joining your class with no idea of what to expect. They will be in new surroundings, with new people, and this is a great opportunity to establish high expectations in the classroom – start as you mean to go on! If you have learning activities that require little effort, or if learners are allowed to put little effort in, then guess what? Yes, that will be the routine for the year.
  3. Find out what learners know and use the information: Initial assessment is crucial, but I’m not talking about sticking learners on a computer to complete a maths and English IA that determines… well, not a lot. What I’m talking about is finding out what the learners know about your subject. Give them an advance organiser to help them identify current knowledge and how it fits with the information they’re going to learn. Use what they know to help them make sense of new information, to challenge misconceptions and to give a clear direction to the learning that they’re about to embark on.
  4. Organise information: Building on from the above, the more organised the information that learners are dealing with, the better. Provide a range of concrete examples to explain abstract concepts and use both verbal and visual information simultaneously (dual coding) to reduce cognitive load. Cognitive science research also indicates the benefits of revisiting information on several occasions over the term/period of learning (distributed practice) to enhance retention. There are many other strategies that have shown time and time again to be effective – summarised clearly for teachers by the learning scientists (every teacher needs this in their life).
  5. Test learners regularly: As with the above, memory traces are strengthened when we have to work hard to retrieve information from long-term memory, thus improving retention. Therefore, we should aim to test learners frequently through mini quizzes and self-testing. This not only supports retrieval practice, but also allows both teacher and learners to identify strengths and any misconceptions that learners have, thus allowing for appropriate intervention.

All of the above are simple ‘off the shelf’ strategies that may help to increase the effort and ensure that learners are working and thinking hard in your classrooms. They are not silver bullets and may work better in some situations than others, but all are worth considering – particularly as the new term is about to begin.

 

Learning my Craft pt 1.

I’ve been reflecting on where it all began for me as a teacher. At 16, I left school with six GCSEs above grade C and didn’t think that further study was for me, so I embarked upon a career in the leisure industry. I worked for a couple of years as a lifeguard, swimming teacher and fitness instructor before going back into education. When I think about it, it was during this period that I learnt most about the craft of teaching. Let me explain why:


Like many activities, both gym-based exercise and swimming involve a range of motor skills. From the breaststroke technique to performing a bench press, both involve complex motor skills, and for novices both can be difficult to master. Whilst learner confidence is an important ‘affective’ characteristic in both environments (particularly in swimming, which I might blog about at a later date due to its relevance to FE learning), once a level of confidence is developed, the teaching of a new skill can be done with efficiency and impact. However, the teaching of a skill can also be very inefficient and ineffective. In this post I hope to share some of the theories/strategies that I learnt early in my career, which have helped me to hone my craft and which I’d like to think are the more efficient/effective approaches.

 

Further Education (FE) caters for a diverse group, which makes it challenging to recommend particular teaching strategies. Last year I blogged about the different approaches one might take with 3 learners. There are many technical subjects where the vast majority of learning is skill based (procedural knowledge, to the cognitive scientists). When one learns a practical (motor) skill – welding, sewing, cutting, drilling and so on – there are, according to Fitts and Posner (1967), certain stages that one goes through in order to develop ‘automaticity’. A summary can be found in the table below:

[Table: Fitts and Posner’s (1967) stages of motor skill learning]

STAGE 1: Cognitive Stage

Huber (2013) states that the cognitive stage is:

‘verbal–cognitive in nature (Schmidt & Lee, 2005) because it involves the conveyance (verbal) and acquisition (cognition) of new information. In this stage, the person is trying to process information in an attempt to cognitively understand the requirements and parameters of motor movement.’

In other words, this involves the learner making sense of how to perform a skill. In order to do this, they need to see what ‘good looks like’ (blog to follow). To see this, they require explicit instruction from a competent individual. In the case of a teacher, the most effective way of doing this is to accurately model the skill and explain each step clearly. This is supported by research in the fields of fitness and gymnastics, where effective modelling was found to improve performance over other methods of instruction/development. Of course, as McCullagh, Weiss and Ross note, there are many other factors to consider when modelling skills, e.g. the age and stage of learners, but if we think about the principles of cognitive load theory, clear, chunked explanations and a combination of coherent visual and auditory information (dual coding) are proven techniques for supporting knowledge acquisition. When I think back to my fitness instructor course in the early 00s, effective modelling and instruction were inherent. The main strategy adopted when supporting gym users with new exercises/equipment was NAMSET:

  • N= Name of the Exercise – the name of the skill is outlined by the teacher
  • A= Area of the body worked – the teacher identifies the area of the body that is being worked
  • M= Muscles used – the teacher uses the correct anatomical terminology for muscles used
  • S= Silent demonstration – the teacher demonstrates the new skill in silence
  • E= Explanation of the exercise – the teacher explains the skill in small steps, with key points of consideration.
  • T = Teach the exercise – the teacher supports the learner as they complete the skill

Whilst I didn’t always follow this to the letter, I used the principle to instruct clients and found that they often managed to grasp techniques quickly. Incidentally, I hadn’t heard about cognitive load theory until around 18 months ago, but I had been implementing its key principles in my instruction. As with any new information, one needs to manage cognitive load, and the NAMSET steps allow for this. I’ve picked out below the sections that are perhaps most relevant to teaching any new skill:

  1. Name the skill/task. What will you be showing and why? Giving reason and purpose to any new skill is likely to improve focus.
  2. Where possible, demonstrate how to do it in silence. This allows the learner the opportunity to observe and self-talk. I’d like to explore this a little further, if I’m honest; I’m not sure whether this should come before or after the explanation. Thoughts?
  3. Explain whilst demonstrating. This uses both the visual and auditory pathways to working memory (dual coding), provided the explanations are clear and concise. Using complex terminology and excessive information risks losing the focus of learners and/or overloading their working memory. What are the key points for consideration? How can you explain the process clearly and concisely?
  4. Allow learners to complete the skill independently, but guide as required. This is an opportunity for learners to apply their new knowledge and carry out the procedure themselves. As they do, the teacher should guide, reinforce key points and question the learners to ensure accuracy.

It is in this early stage of skill development that the learner is likely to make quick gains in their performance of the task (as outlined by Fitts and Posner above), so this is arguably the most important stage for a teacher to consider when introducing new and complex practical skills.

In summary, this post has focussed on the early stages of learning a new motor skill. The discussion is supported by Kirschner, Sweller and Clark, whose work with novice learners found that minimal guidance during instruction is less effective and less efficient than explicit instruction. Here we can see that this stage of learning a new skill requires a lot of teacher input, but this needs to be done with accurate modelling and clear explanations. My next blog post will focus on stages 2 and 3 of Fitts and Posner’s model, where the teacher begins to move towards the role of a coach, supporting learners in developing fluency/automaticity with their skills.

 

 

 

Remove your headphones!

It’s revision season. Exams are nearly upon us and learners up and down the country are locked away in their rooms revising (I hope they took on board my advice on the dos and don’ts of revision).


When I was revising for my GCSEs back in the late 90s, we only had one television in the house and I didn’t have a mobile phone, so I’d be in my room testing myself against the OCR revision guides for each subject. This didn’t prove very fruitful in all honesty, but I would dread to be revising in the modern world – Facebook, Twitter, Snapchat, Instagram, WhatsApp, phones, TVs, laptops, iPads and iPods. You name it; there are so many distractions facing young people today.

 

What’s the problem?

The problems associated with working memory, combined with the distractions students face, can limit the cognitive resources that can be allocated during the learning process. Salame and Baddeley found that the auditory pathway (phonological loop) is susceptible to the negative effects of speech and other sounds. In other words, when there are noises in the room – beeps from the phone, the TV on in the background, music and so on – cognitive load increases, thus impeding the ability of working memory. What’s worse, when we are reading, we aren’t using the visual pathway (visuospatial sketchpad); we are actually using our auditory pathway as a result of ‘self-talk’. This is largely corroborated by the work of Alley and Greene, who also found that individuals are pretty rubbish at judging just how much their working memory is impaired by irrelevant sounds. So when learners tell you that having their headphones in is helping them to concentrate, they’re likely to be wrong.

 

What does this mean for teachers?

There is a real need for teachers to promote effective study strategies to learners and this starts in the classroom.

  • Learners should be encouraged to work in silence during independent practice – this includes removing phones, tablets, or anything else that makes a sound… even peers.
  • I recommend strongly that learners are not allowed to use headphones when working independently – even if they think it helps them.
  • Encourage learners to follow the ‘dos’ on my revision guide, and of course, ignore the ‘don’ts’.
  • When at home, learners should be encouraged to revise in a ‘distraction free zone’. TV off, phone in another room.

 

 

Why I do PowerPoint

There’s been a bit of a hoo-hah on Twitter today about PowerPoint (PPt). I think it began following this post from Jo Facer, which makes some fair comments. This led to a share of a previously written, more balanced argument by Robert Peal. I certainly agree with points in both, but not all. Here’s why I think we shouldn’t be so hasty in dismissing the use of PPt:


1. It provides a structure for lessons – note the term lessons. I often have a PPt that spans more than one lesson and is based on the content that needs to be taught. I don’t see a problem with planning via PPt, so long as the time is spent thinking about the order/structure of content. Taking the time to think about the structure helps to organise my thoughts and enables me to move information around to suit the needs of the class. It’s as if I am putting my schema to paper (figuratively speaking). I could use other means to do this, but the PPt serves as a prompt during the session and means that the risk of learners missing out on crucial information is minimised.

2. The ‘visual’ argument – there’s no denying the vast body of research supporting Paivio’s dual coding theory. I used to be guilty of putting reams of text on slides, which I proceeded to read to my learners, and I wondered why they never remembered anything. The issue was that whilst I read aloud and learners read the text (self-talking), all of the information was entering working memory via the verbal pathway. Having developed a (basic) understanding of the theory, I began to change my approach, ensuring that more visuals were used to support explanations rather than text. Where visual information can’t be used, I keep text to a minimum, emphasising key points only. Having the visual means that both pathways to working memory are being used, thus less of a burden for the learners (as shown below). PPt is a platform that enables me to quickly create or add visuals, meaning that all I have to concentrate on is explaining them clearly.

[Images: the verbal and visual pathways into working memory]

3. Animations – I’m not talking about the swirling and whirling of individual letters that take ages to form sentences. No, I’m talking about animations that grab learners’ attention and direct them to important components of visuals as they are being discussed. I have blogged about this here, but the Clark and Lyons research is a much more comprehensive read on this. Whilst there are many other ways to direct attention, PPt can be used really effectively to do so.


4. Everything in one place – another benefit of PPt is that I can place my quiz, my content, links to reading, learner task instructions etc. all in one place. I can upload this to the Virtual Learning Environment and, if learners wish to access anything, it’s all there for them. The fact that everything is in one place also helps keep my OCD in check.


5. Aesthetics – I must admit, I am guilty of putting too much time into the aesthetics of my PPts. I have got better at making the information less of a burden on working memory; gone are the GIFs, the tenuously linked images and the text-heavy slides. In spite of this, I still like to have clear, crisp, well-designed slides. The effort I put into making my resources look nice probably won’t get me any thanks from anyone, but because of the care I take, I know that the spellings will be correct, the animations will support the learners at the right time and (I’m going to throw this out there) it’ll probably engage the learners a little more (by engage, I mean grab their attention). Whilst this probably makes no odds to the learning, it’s far better than my handwriting on a whiteboard.


To summarise, bad PPts are bad. Similarly, bad teachers are bad; as are bad pens, bad textbooks and bad technology. There is another way and I strive to be at the opposite end of the continuum.

Questioning questioning 

Since Geoff Petty shared his ‘which questioning’ strategy with me around 6 years ago, I have been on a mission to hone my questioning. It is a great little activity that really gets you thinking about making effective use of questions. To this day, I use an adapted version of the activity with my own trainees. Indeed, I often focus observation feedback on the development of questioning as an essential formative assessment approach.


It’s easy to see why this is the focus of many teachers up and down the country. Hattie’s synthesis of classroom experiments (2015) found questioning to have a modest but positive effect size of 0.48, and the resulting classroom discussion a huge 0.82.

The thing is, I’ve found more and more that trainees are focusing too much on questioning individuals (they do it well), and spending less time on instructing or allowing learners to practise. It seems that ‘the question’ has taken precedence over ‘the answer’.

I observed a session recently where the teacher insisted on working their way around the class with questions, yet many of the learners didn’t have sufficient prior knowledge to allow them to explore understanding through discussion. The opportunity cost of such a strategy appeared greater than one might have thought. Because questioning is a strategy held in high regard, I can understand why they persisted, but it just didn’t help the learners. Instead, the group lost interest rather quickly and low-level disruption ensued.

Had the teacher used questioning more efficiently (the second time I’ve used this term in as many posts), through a selection of multiple-choice questions that all learners could answer in a short time, they may have realised that the learners required some input/guidance to increase knowledge and enable greater participation in discussions.

Arguably a good starting point for thinking about questioning in the classroom is to ask yourself what the purpose is. Is it to assess learner knowledge/understanding, or is it to teach learners something through discussion? Perhaps it is both, but the main reason should influence the type of questions used. Personally, I use questioning as an assessment tool and the quicker I am able to assess ALL learners the better, so that I can identify gaps in knowledge that need filling. I’m not dismissing questioning as a means to generate good class discussion, but appreciate that time is of the essence with our learners and we should aim to maximise every last drop of it.