I’ve been writing a lot about cognitive architecture and instructional design of late (here, here and here). Arguably, the goal of instruction is to help students be able to explain information that they’ve retained and to transfer this to solve problems. Here I want to discuss a method of instructional design which may be highly effective in supporting learners with not only their retention of information, but also transfer.
‘a common instructional practice is to provide a lengthy verbal explanation, such as a textbook passage or a classroom lecture…[in doing this]… instructors may believe…[that this]… fulfills their responsibility to provide information to the learner…[however]… this practice is not very efficient for many students.’
In their work, Mayer and colleagues conducted three experiments with a group of college students to explore the effectiveness of different instructional approaches for sharing a scientific explanation:
Experiment 1: Students read a summary that contained a sequence of short captions with simple illustrations depicting the main steps in the process of lightning. Students recalled these steps and solved transfer problems as well as or better than students who received the full text along with the summary or the full text alone.
Experiment 2: Taking away the illustrations or the captions from the summary reduced its effectiveness.
Experiment 3: Adding additional text to the summary reduced its effectiveness.
From the results of these experiments, it was concluded that multimedia learning that is concise, coherent, and coordinated aids explanation recall and problem-solving transfer. The suggested reason is simply that summaries reduce the load on the cognitive system, enabling learners to carry out the cognitive processes necessary for meaningful learning, in line with dual coding theory.
In layman’s terms, an effective method of instruction is to provide learners with a storyboard of a process that contains both visual and text information, being mindful of the three ‘C’s:
Conciseness: only use a few images and sentences in the storyboard.
Coherence: images and sentences should be presented in a cause-and-effect sequence.
Coordination: each image should be presented next to its corresponding sentence.
So, I asked a colleague of mine, Mike Tyler, to trial this approach and see how he found it, and here’s the result…
Of course, it’s easy for me to sit here and say that sound instructional design should look a certain way, but there are implications in doing this. We know that resources like the one above offer a real opportunity, but at what cost? I asked Mike a few questions following his work to untangle this a little.
1. How long did it take you to produce?
Mike’s response: ‘It took a couple of hours… I used PowerPoint to make the pictures then grouped them, saved each one as a png and imported each into Word. Finally I saved it all as a pdf. Basically, it was a full-on mission, as they say! I’d do it by hand next time and scan / upload.’
2. In producing it, how did you know what to put where and what information to omit/include?
Mike’s response: ‘I included only what was minimally necessary to make sense of the process (i.e. osteoblasts, osteoclasts, etc.). I had able Level 3 learners in mind.’
3. Do you think you will use this when teaching the process in future? Why?
Mike’s response: ‘I will probably give this a go in the future. I might try it for Anatomy & Physiology this coming term, maybe in a lesson on the energy systems.’
I’m going to be following up with Mike once he’s used this with his learners and will keep you posted by updating this post, but in the meantime, why not have a go yourself?
I’ve written previously about the differences between expert and novice learners. You may want to read this if you haven’t already, as it provides a good base for this post.
For this post, I’d like to start with a question. We have three learners; what instruction is likely to be most beneficial to them?
An A’ level physics learner with ten A-A* grades, including GCSE Physics
A BTEC Level 2 Sport learner with a range of GCSEs at grade D or below
A Level 1 Automotive adult learner with no formal qualifications, but many years’ experience of working on family and friends’ cars
Of course, the answer isn’t quite as clear-cut as one would like to think. I want to highlight that there is no ‘best way’ to teach in FE; all of these learners are very different and will require a different approach. Let’s explore each learner with some suggestions as to what might be more appropriate:
It can be assumed that the A’ level Physics learner will have a sound understanding of the foundations of Physics. They are moving towards developing ‘expertise’, so will have well structured schema in this subject. So how best will they learn?
Kalyuga et al. propose a phenomenon known as ‘the expertise reversal effect’, which I have attempted to depict below. In essence, when we have a solid foundation of knowledge in a subject, we need less guided instruction. Studies exploring the effect of guided instruction on experts have shown a negative impact, with some theorising that this is due to a greater extraneous cognitive load (basically, too much non-relevant information) interfering with existing schema. With this in mind, providing such learners with less guidance and more opportunity to work independently of the teacher on problem-solving and inquiry-based tasks may be more effective.
With the BTEC Sport learner, on the other hand, we can assume from their GCSE profile that their schema is less organised than the A’ Level learner’s. So how will they best learn?
They probably have little knowledge of sport studies in general and therefore will require more guided instruction. As I mentioned in my previous post, without sufficient prior knowledge, minimally guided instruction is largely ineffective. To enhance the guided instruction, one should use approaches that reduce the burden on working memory. This might include taking advantage of ‘dual coding’, by providing learners with visual and auditory information, and ‘chunking’ the learning coherently.
The adult learner poses a more complex issue. They are likely to have some structure to their schema around automotive topics through their experience of working with cars. There are, however, likely to be some misconceptions and potentially ‘bad habits’ as a result of this experience. How best to approach these learners, then?
Through any instruction, this learner is likely to experience what Waxer and Morton call ‘cognitive conflict’: essentially, the uncertainty we feel when faced with new information that contradicts what we already believe (our current knowledge and experience). In terms of instructional methods, Bell and colleagues found positive results in using a constructivist approach to teaching called ‘diagnostic teaching’. This type of teaching involves:
‘lessons typically begin with a problem that exposes the variety of students’ existing thinking. Students are then confronted with the cognitive conflicts that emerge from these different ways of thinking. New insights are constructed through reflective discussion, leading to deeper understanding. This approach is challenging for teachers but research shows that it develops connected, long-term learning in their students.’
This blog by Nick Rose finds conflicting evidence in the research on teaching through cognitive conflict, showing benefits of both minimally guided instruction (diagnostic teaching) and guided instruction. In light of this, I would argue that the aforementioned learner will require a combination of both types. In the first instance, explicit guided instruction is recommended to correct misconceptions and clarify understanding; the learner is not an expert and therefore needs strong foundations built and reinforced. Following this, they can work through cognitive conflict via a constructivist approach such as diagnostic teaching.
Key to planning instructional design is knowing your learners. This isn’t a case of their learning style or any other nonsense about how they like to learn, nor is it about trying to make the learning activities more ‘learner-centred’ because it is in-vogue. Rather it is about finding out as much as possible about what the learners already know and how secure they are with this. If we can do this, it will assist us in designing effective instruction to maximise future learning – whether this be fully, or minimally guided.
A blog is an identity, a way to express yourself, a way to think freely, to reflect and to share this with a large population. Your voice isn’t lost in the myriad levels within an individual institution; instead, one is considered an equal amongst colleagues from across the sector. I have learnt a vast amount through writing, reading and taking part in dialogue around my own and others’ blog posts, so I am an advocate for celebrating those that currently blog and for encouraging others to start blogging in order to broaden the knowledge pool we can access.
After receiving several requests lately asking for FE-specific bloggers, I have taken it upon myself to collect information about all blogs by FE practitioners (well, those that replied to my request anyway). I have ordered them alphabetically for ease of reference.
Link to blog:
Summary of blog content (What is it about? Who is it aimed at? How often do you post?):
An eclectic mix, but generally SpLD/SEND, SEND Law & parental perspectives. Aimed at those not wishing to tick generic boxes & those wishing to understand SEND legal duties. I post erratically, but fairly frequently.
Aimed at teachers of English for Academic Purposes, Study Skills, teachers interested in CPD. I provide free worksheets & lesson materials, as well as discussion posts on key issues in EAP and teaching study skills. I blog every couple of weeks, sadly not as regularly as I would like!
The blog has a teaching, learning and assessment focus, and, using a wealth of research, aims to question and challenge widely held beliefs primarily in FE, but also in education generally. I post twice weekly: one written piece and one interview with an educator.
Teaching and education support for teachers of art and textiles. I post every term (not as often as I’d like to!). There are student-friendly resources on the site and a directory of artists to support project research. I often get emails asking for advice on designing and writing curriculum plans and projects for exam groups. More than happy to help 😊
A new sharing blog to create a platform for FE staff to share the good stuff, show and tell the positives and provide links to the things that have worked so that others can benefit. We post regularly throughout term time.
This blog has posts on my CPD work in FE but there are also posts on teaching & learning, leadership and technology. It’s aimed primarily at a wide FE audience, although posts also have relevance outside of this sector. I post when I have something to say or share.
Reflections on lifelong learning, FE contexts, technologies and mobile learning, literacies, culture, meandering nomadic posts. Posts are irregular these days – once a month, more or less. The aim is more personal: to help assimilate things I learn – a reflective process, rather than try to prescribe (I hope I’m not cantankerous. It was created for my MA digital cultures course and a spur of the moment name!).
Thoughts, views and news on using technology to support, enhance and enrich teaching, learning and assessment. Aimed at staff in FE as well as HE; mainly of benefit to teachers and managers. Posts roughly once a week.
Further education and lifelong learning-related posts; aimed at educators from all sectors and trainee teachers. Specific interest in posthuman approaches to curriculum and identity/diversity issues. Posts every month or so.
A clarion call to dancing princesses, anti-heroic leaders and all who are interested in education for a social purpose. Also stuff about thinking environments and other pro-social approaches to critical pedagogy. Can be ranty but will always give you the feels.
Education, education, education – here I curate my articles and blogs about all aspects of education (pedagogy and leadership; primary, secondary, FE, and HE). I post sporadically but usually several times a month.
Using problem-based, experiential, and inquiry-based teaching is likely to be ineffective with novices.
My last post explored the difference between novices and experts, demonstrating that they think and act very differently due to a contrast in their knowledge and experience in a subject area.
In Further Education, specifically in vocational areas, learners arrive with little to no knowledge/experience in their subject. Take Engineering, Automotive, Hair and Beauty, and Construction, for example: subjects likely never to have been studied previously. Then there are subjects where there may be prior knowledge/experience but many misconceptions, for example English and maths. Therefore, learners are arguably still novices when they join us…
The thing is, there seems to be an obsession in FE with teaching learners as if they are experts. CPD sessions across the country are riddled with the promotion of minimally guided instruction methods such as discovery-based, problem-based, experiential and inquiry-based learning. I get it, I really do. We are trying to reach an audience that is getting harder to reach, so if we can make the learning interesting and give learners more autonomy, then we might just crack the problem…
‘You will assume the role of a Wella colour expert and figure out what is wrong with Deirdre’s highlights’
The problem is, we are not doing them any favours by doing this. Once learners have a solid foundation and begin to develop expertise, then these approaches to learning may be very effective, as they can draw upon prior knowledge/experiences to assist them with their learning. Novices on the other hand don’t have this knowledge and experience to draw upon. In fact, it is likely that they will have misconceptions about the subject that, when applied to a problem based activity, may result in further confusion.
Future posts will examine effective methods of guided instruction, but for now I introduce you to a paper by Kirschner and colleagues that explains in greater depth why minimally guided instruction is not an effective instructional design for those with limited knowledge. I have attempted to summarise this visually below:
So, to end my post: many of our learners are novices and need guided instruction. Once they are experts, we can reduce the guidance we give.
*Also, I’d like to add that I’m not completely averse to this type of instruction on occasion, when I feel that learners have sufficient knowledge.
This week I stumbled across a fantastic article online written by a self-taught card counter (Steve Pavlina) who, reflecting on his time at the blackjack table, was able to draw out some lessons for life. I read the article and it immediately resonated from an educational perspective too.
Steve begins the article with his fascination with the game, then explains how he became an expert at beating the casino:
‘I bought a book on blackjack, learned the rules of the game, memorized the basic strategy, and then studied a simple +/- card counting system. It took a heck of a lot of practice and was tedious to learn, but I eventually felt comfortable with it…Between Vegas trips I studied blackjack and card counting ever more deeply. I read 10-12 books on the subject and mastered different counting systems (Thorpe, Uston, Revere, etc.). I practised advanced counting systems that keep a side-count of aces. I drilled myself until I could count down a deck of cards in under 14 seconds. I learned to vary the play of hands according to the count, memorized optimal strategies for different rule sets, and learned the subtleties of the game that would increase my edge even the slightest degree. We’re talking a total edge of maybe 1%.’
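The ‘simple +/- card counting system’ Steve mentions can be sketched in a few lines. Below is a minimal Python illustration using the standard Hi-Lo values (low cards 2–6 count +1, tens and aces count −1); the exact values are an assumption based on common Hi-Lo practice, not quoted from Steve’s article.

```python
def hilo_value(card: str) -> int:
    """Return the Hi-Lo count value of a single card rank (assumed values)."""
    if card in {"2", "3", "4", "5", "6"}:
        return 1   # low cards favour the player once removed from the deck
    if card in {"10", "J", "Q", "K", "A"}:
        return -1  # high cards favour the player while still in the deck
    return 0       # 7, 8 and 9 are neutral

def running_count(cards) -> int:
    """Keep a running count over a sequence of dealt cards."""
    total = 0
    for card in cards:
        total += hilo_value(card)
    return total

# A full deck always counts down to exactly zero, which is the classic
# accuracy drill Steve describes timing himself on.
deck = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"] * 4
print(running_count(deck))  # → 0
```

A balanced count like this sums to zero over a full deck, so ‘counting down a deck’ is a self-checking practice exercise: any non-zero result means a mistake was made somewhere.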
Steve made some observations whilst playing. Below I have attempted to make sense of these through an education lens.
1. Novices will make correct decisions most of the time – It was observed that most of the time (80-90%), novices would make the same decisions as an expert, but the 10-20% of decisions they get wrong cumulatively have a big impact on their losses.
In education, we may assume that learners are learning well if, in most cases, they answer questions correctly or produce a lot of work. Aside from these being generally poor proxies for learning (Coe, 2014), learners themselves may also believe that they’re doing well, judging their ability to be greater than it is (the Dunning-Kruger effect). This is dangerous because it’s the bits they may be getting wrong that cumulatively have a considerable impact on future learning (the 10-20%). Taking even the smallest misconception forward could make future learning less clear and more difficult.
For example, many students arrive in my Biology class believing that all arteries carry oxygenated blood. Whilst in the vast majority of instances this is correct, it is a misconception that could cause confusion when later learning about pulmonary circulation, where the pulmonary artery in fact carries deoxygenated blood. The misconception should be corrected to ‘arteries carry blood away from the heart’, thus removing the confusion about oxygenated/deoxygenated blood. So what I’m getting at here is that we as teachers support our learners’ development from novices to experts by not making assumptions about learning (as a result of insufficient assessment) and by not allowing misconceptions to leave our classrooms.
2. Novices miss golden opportunities – It was observed that novices lost more money at the blackjack table due to a lack of understanding about when to gamble more and when to risk going bust; instead they tended to play it safe. Experts, on the other hand, would go bust more often and gamble high when the time was right. They used their knowledge of the odds to their advantage.
Daley found in her research on novice and expert learning that novices are ‘scared to death [and] terrified of making mistakes’, and that they want to be told what it is they need to know. They are risk-averse and as such don’t like to put themselves in positions where they may make a mistake. Experts, on the other hand, adopted a more constructivist approach to their learning, assimilating new information with old through experience; because of a solid base of prior knowledge, they were more inclined to know when to take calculated risks (or golden opportunities). This is why sufficient hand-holding and teacher-led instruction are essential to ensure that the learner is provided with the key knowledge they need in order to develop into an expert. Effective scaffolding should be slowly removed over a series of weeks/months to make learners less dependent on the teacher and support their transition towards expertise.
3. Novices don’t put in the time to fully understand the game – Novices don’t take the time to master the basics, whereas experts put in hours of practice and understand the basics and the more nuanced elements of the game.
Deliberate practice is crucial to becoming an expert according to Ericsson et al., who state that ‘many characteristics once believed to reflect innate talent are actually the result of intense practice’. Many novices (myself included) may be subject to the Dunning-Kruger effect, so are misinformed and feel that they may not need the practice to master something. Our duty as teachers is not only to provide time to practise, but also to encourage learners to understand the benefits of doing so (more on this below).
4. Experts are more disciplined – Experts tend to be more consistent in making decisions and taking action. Experts understand that you can make the correct decision and still lose, but they focus on making correct decisions, not on trying to force a particular outcome.
In his book, David Didau (2015) informs us that ‘we are predisposed to examine the surface structure of a problem rather than recognising that its underlying deep structure is the same as something we already know’. In essence, when faced with a new problem, unless we are experts we are less likely to make links with existing knowledge and prior experiences to solve it. Novices simply don’t have sufficient information to draw upon and so can’t make informed decisions, focusing instead on the detail, whereas experts are more likely to focus on the structure of a problem and take a more consistent approach. For example, given a maths problem to solve, the expert may think of similar problems they’ve faced and compare the structures to help them make sense of the information, whereas a novice may just try to tackle the problem without an idea of what they’re trying to find or what the outcome might be. With this in mind, teachers need to model explicitly how to approach problems using prior knowledge, before scaffolding problems for learners with support mechanisms that can be removed once experience is acquired.
5. Private victory precedes public victory – Experts spend a lot more time practising, which takes tremendous patience. Their real victories aren’t at the blackjack table, but in their homes practising.
As mentioned above, expert performance only arises through dedicated and deliberate practice. This, according to Ericsson et al., requires motivation and perseverance, which is itself problematic, particularly if we want learners to engage in deliberate, directed practice outside of the classroom.
‘Deliberate practice is not inherently enjoyable and that individuals are motivated to engage in it by its instrumental value in improving performance. Hence, interested individuals need to be engaging in the activity and motivated to improve performance before they begin deliberate practice.’
So our role as educators is to establish an environment where learners focus on long term improvement through having a high self-efficacy for learning. To avoid learned helplessness and to encourage a high self-efficacy we should guide students towards success through modelling, scaffolding and giving sound feedback to help move them forward.
In summary, to support our learners from novice to expert we need to treat them initially as novices, not as experts. If we try to teach our novice learners to be scientists by giving them inquiry-based science projects to complete, or treat them as hair stylists by placing them straight into a hair salon, they will act as novices (Kirschner et al.). I believe, based upon what I have written (here, here and here), that the following approaches should be taken to support our learners to become experts:
We are experts in the subject matter ourselves
We plan the learning to maximise long term retention (distributed and interleaved practice)
We model correct practice and chunk the learning to reduce cognitive load
We scaffold difficult concepts to enable learners to more easily understand, before slowly removing the support mechanisms to allow greater independence
We provide regular opportunities for retrieval practice
We provide learners with sufficient time and space to practise, hone their skills and take necessary risks
We support the transfer of knowledge and skills within the subject through well planned and scaffolded activities.
We conduct regular checks on all learners’ understanding that go beyond superficial questioning/observation
We provide task-oriented, rather than ego-oriented feedback in a timely and specific manner to move learning forward
We involve learners in their own assessment and one another’s against clear success criteria
We actively encourage learners to practise beyond the classroom through challenging homework that feeds into future lessons
Special thanks go to Oliver Caviglioli for his brilliant visuals to support the text.
“The lesson objectives could have been written a bit more measurable.”
The questions I want to answer in this blog post are the following:
What does that statement mean?
Do learning objectives need to be written so that they’re measurable?
How should we write lesson intentions to maximise learning?
Or should I set myself an objective? By the end of this post I will:
Be able to identify what a measurable lesson objective is.
Be able to analyse the impact of measurable objectives.
Be able to identify different methods of writing lesson intentions.
The comment in the title was made during a recent joint observation. Whilst on the surface it appeared to make sense, upon reflection, I’m not convinced by it and would like to explore it further.
Current educational ideologies (particularly in vocational education) lead to a primarily product-based curriculum, whereby meeting behavioural objectives forms the basis of our teaching, with teachers accountable for and judged on their ability to produce results, as opposed to the more developmental, process-based curriculum. I don’t necessarily have a problem with this; in fact, I’m broadly in favour of this type of curriculum. Whatever approach is chosen, though, I do have a problem with being told that lesson objectives/outcomes/intentions/anything else you want to call them should look a certain way. Many in the FE and Skills sector (perhaps education more generally) see learning as a linear, singular process of moving learners from A to B in a lesson; once learners have achieved a desired outcome, it is assumed that they have learnt. This view is wrong. Learning is liminal (particularly in post-16 education), with learners in a continual state of development towards a longer-term outcome over a series of lessons. I’ve said before that if ever a definition of learning could be agreed, it would certainly involve something about knowledge acquisition, and probably something about long-term memory and being able to retrieve information. Therefore, learning does not happen in isolated lessons.
As David Didau (2015, p.279) notes:
‘all too often our learning intentions are lesson menus: here is what you should know or be able to do by the end of today’s lesson. Unless we have very low aspirations for our students, they are unlikely to do more than merely mimic the understanding or expertise we want them to master.’ David goes on to say that ‘if we were to share our intention for students to learn threshold concepts, then we could tell them that it might take them weeks to wrap their heads around such troublesome knowledge’.
‘learning outcomes cannot be defined with the kind of precision that has been supposed, that they stand in need of interpretation within a context… The idea, currently popular—that first year degree students must describe, second year students must explain and evaluation should characterise their work in the third year—must be replaced with the idea that these activities are visited and revisited as the students’ progress and in accordance with the requirements of the subject matter.’
Indeed Hattie (2012) cites a very low (0.12) effect-size for behavioural objectives. With the aforementioned in mind, the whole notion of measurable objectives in a single lesson is beginning to look absurd.
According to Hattie (2012), however, there are five essential components of learning intentions and success criteria that support effective learning: challenge, commitment, confidence, high expectations, and conceptual understanding. I have said before that I’m not convinced about success criteria here, but in Wiliam and Thompson’s (2007) work, they hold the work of Wiggins and McTighe (2000) in high regard. This work advocates a two-stage approach to creating and sharing learning intentions with learners: clarifying the learning goals (what is worthy and requiring of understanding?), and establishing success criteria (what would count as evidence of understanding?). Perhaps it is in the success criteria that things become measurable?
I think about my own practice as a teacher trainer. If I were to teach say, formative assessment and set my objectives as:
‘To understand the 5 key strategies of formative assessment’
This is certainly worthy of understanding. Then if I were to make the success criteria measurable:
‘learners will be able to identify 5 key strategies for formative assessment’
‘learners will be able to explain the 5 key strategies of formative assessment’.
Whilst this is measurable, and learners may be able to do both of these by the end of the lesson, this would be merely performance and not learning. Moreover, there may be learners who can critically analyse three of the strategies (beyond the success criteria) and know very little about the other two. Does this mean that the lesson has failed? Of course not. It all seems rather short-sighted and restrictive.
‘teachers may prefer short-term goal measurement because it is easier to understand and it guides instruction more directly by providing information about when to progress from one skill to another [however] short-term goal measurement may be misleading: While students master a series of instructional objectives, progress on more global indices of achievement may be limited, failing to reflect this gain’.
For this reason, I ask myself whether learners would benefit more from a question, or from testing a hypothesis, over a series of lessons:
What makes formative assessment effective? (question)
Formative assessment is only effective when feedback is provided (hypothesis)
With something like the above, the lesson intention is broad in the sense that it allows for a range of outcomes in the lesson and over a series of lessons, but tight enough to focus learners on the content and be clear about what they’re learning. Clarity is key. I believe that learners should know what they are doing in the lesson and why they are doing it. I mean, you wouldn’t bake a cake without knowing the kind of thing you’re after, and you wouldn’t go on a journey without knowing the destination, but does the way you write this on your lesson plan or whiteboard really benefit anyone? It becomes a tick-box approach – something we need to move away from in education.
Oh by the way, did we all meet the lesson objectives?
Didau, D. (2015). What if everything you knew about education was wrong? Carmarthen, Wales: Crown House Publishing Limited.
Hattie, J. (2012). Visible Learning for Teachers: Maximizing Impact on Learning. Oxon, UK: Routledge.
When presenting new information to students in lessons, how can we best support the retention of this information?
This post features a note taking strategy that may be new to many, the Cornell method. In essence, the strategy involves learners dividing their paper into two columns with a row across the bottom (as shown):
The learners should then take the following steps in order to take notes on the lesson:
Step 1 (right-hand column): Take clear and concise notes on the lesson content. Sentences should be no longer than 5-10 words, and notes should not be copied verbatim from the presentation unless they capture a crucial point.
Step 2 (left-hand column): Here learners are given time at key intervals, or at the end of the session, to create questions or cues that clarify meaning or reveal information about the notes. For example:
Step 3 (bottom row): Summarising the notes in the bottom row helps to consolidate understanding of them. This is best done after the class. Encouraging learners to elaborate on the points made in the notes column will help to reinforce meaning, or identify gaps where further study is required.
Step 4: Learners should regularly revisit their notes, covering the right-hand column and testing themselves on the questions in the left-hand column.
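For readers who like a concrete model, the four steps above can be sketched as a small data structure with a self-test step. This is purely an illustrative sketch – the `CornellPage` class and its field names are my own invention, not part of the Cornell method itself:

```python
from dataclasses import dataclass, field

@dataclass
class CornellPage:
    """One page of Cornell notes: cues on the left, notes on the
    right, and a summary row along the bottom."""
    notes: list = field(default_factory=list)     # Step 1: concise notes
    cues: list = field(default_factory=list)      # Step 2: (question, answer) pairs
    summary: str = ""                             # Step 3: post-lesson summary

    def add_note(self, note: str) -> None:
        # Step 1: keep notes short rather than copying verbatim
        if len(note.split()) > 10:
            raise ValueError("Keep notes to roughly 5-10 words")
        self.notes.append(note)

    def self_test(self) -> list:
        # Step 4: cover the notes column and answer only the cue questions
        return [question for question, _answer in self.cues]

page = CornellPage()
page.add_note("Lightning: charge separation inside storm clouds")
page.cues.append(("What causes lightning?", "Charge separation in storm clouds"))
page.summary = "Lightning results from charge separation within storm clouds."
print(page.self_test())  # the questions to quiz yourself with
```

The word-count guard in `add_note` mirrors the 5-10 word rule from step 1, and `self_test` mirrors covering the right-hand column in step 4.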
Why might this method be effective?
In his book ‘Classroom Instruction that Works’, Robert Marzano shares the features of effective classroom instruction. With an average effect size of 1.0, note-taking methods are well worth exploring further in our classrooms. According to Marzano, effective note taking involves the learner summarising the information being shared, deleting and substituting parts of it in order to create their own meaning. The Cornell method certainly allows for this, both in the notes section (step 1) and in the post-lesson summary (step 3). Beecher (1988) examined the research on note taking and found mixed results; one issue is that learners tend to copy verbatim, and in doing so they do not engage in synthesising the information being presented. If used properly, however, the Cornell method overcomes this by restricting how much learners write and requiring them to summarise well.
Despite the mixed results, Beecher’s subsequent findings on reviewing notes make for interesting reading:
‘The research findings on whether note-taking promotes encoding have been mixed. Hult et al. (1984), for example, found that note-taking does involve semantic encoding; but Henk and Stahl (1985) found that the process of taking notes in itself does little to enhance recall. They found, however, that reviewing notes clearly results in superior recall.’
As we know from the work of Dunlosky et al. (2013), self-testing is a highly effective study method. So if we make self-testing the review element of note taking, we may be on to something. The Cornell method does exactly this: not only do learners benefit from creating the questions, but they can then use them to self-test – a win-win!
I’m certainly going to be trialling this approach to note taking with my learners in the coming months. Why not do the same with yours? Let’s make note taking more meaningful and use the methods that are most likely to be effective for learning.
Avid readers of my blog will know that I’ve developed a real liking for cognitive science. I’ve summarised key reading previously, here. Another key reading for teachers, I feel, is ‘Principles of Instruction‘ (Rosenshine, 2012). Not only does it draw upon cognitive science, but it also takes the best from classroom research and research on cognitive supports. I believe that any research that draws upon and finds themes across a range of evidence is worth a read to support teaching practice. In this blog post I aim to summarise each of the principles identified in the research, with some practical application.
A year ago I restarted my blog as a way to find my voice again. Blogging really has helped me to develop as a practitioner. Not only does it allow me to reflect on my practice, but it also means I have to read A LOT! I’m not complaining, but boy have I read. Reading more has given me more knowledge about education related topics. More knowledge has allowed me to be more critical of these topics and this has supported me to improve as a ‘research informed practitioner’. Moreover, I’ve been able to support others with accessing the research by writing about it in layman’s terms.
Upon starting my blog again, my intention was to write a post per week, but I fell short of my target with a mere 49 (including this one). I’ve had nearly 12,000 views (a drop in the ocean compared to some, but I’m happy with it for the first year). This coming year will see me meet my goal of a weekly post, in addition to a new feature (coming soon). Furthermore, I am branching out to other social media platforms to increase the views. This isn’t so much about the numbers as about creating more dialogue around my posts and supporting others to access crucial information about learning.
In celebration of the 1st year, I’ve chosen a selection of key posts that I feel have had the biggest impact on both myself and others. Some have been popular and well read, others not so, but all very valuable:
My first blog post ‘Less haste, less speed‘ – this was the start of my new blog and developed upon a theme that I had written about in a previous blog. The post questions why we are always in a rush to teach learners information and for them to make quick progress with it. This set the scene for the blog and it has been viewed 234 times.
A lot of effort went into ‘Schemes that make a difference‘ – in this post my aim was to support teachers with their planning by drawing upon a sound research base. I wrote this alongside a training session that I was planning. The post and the training session has been cited by many as being really useful to them. This post has been viewed 672 times.
My most popular post ‘Formative assessment – is it a silver bullet‘ – I think I almost broke the internet in the first 4 hours of it being published, with over 400 views. It drew upon research to provide a critical analysis of Wiliam’s 5 key strategies to formative assessment. Whilst my views have changed slightly, the post is one of my favourites and an easy read for understanding how to do formative assessment well. It has been viewed 1023 times.
My famous post ‘Observations – is the boot on the wrong foot‘ – This was published and within a couple of days, the TES FE editor contacted me to feature it in the paper for the following week. It is an alternative view of observation based upon my experiences with my daughter. Viewed 464 times.
The post that generated the most dialogue, ‘Action research: A recipe for disaster?‘ – This post generated a lot of discussion on Twitter and WordPress (well, for me anyway). I don’t necessarily agree with everything I say in it, but I intended it to be thought-provoking and a little contentious – which it was. It has 479 views.
Most useful to trainee teachers post ‘Applied and simplified – Top 20 principles‘ – In my role as a teacher trainer in FE, I work with individuals that have fantastic subject knowledge, but lack pedagogical knowledge AND the time to separate the wheat from the chaff. Therefore, it is my job to support them in accessing key information, and what better than summarising a key piece of research from cognitive psychology. I have actually got my trainees doing a similar task (summarising key research) and feel that this is essential to their development. Viewed 174 times.
So stay tuned to my blog because it is only going to get better – here’s to another year!
My previous two posts here and here have explored the first 15 principles from the Coalition for Psychology in Schools and Education’s Top 20 Principles from Psychology for Teaching and Learning. Here I move on to the final five principles, once again trying to provide a simple application of each.
Principles 16-17 focus on how the classroom can best be managed.
Principles 18-20 focus on how to assess student progress.