Building interleaving and spaced practice into our pedagogy

This post builds on the conclusions I reached in the post Forgetting is necessary for learning, desirable difficulties and the need to dissociate learning from performance. Make sure you’re up to speed with that post before reading on.

In my department we have historically used blocked practice from year 7 to mid-year 11, then switched to interleaving from Christmas of year 11 onwards in the form of weekly past-paper homeworks. According to Bjork's and Rohrer's research, we should see more sustainable learning outcomes from the interleaved part of our practice. I dug into our own assessment tracking data to see whether it agrees with their claims: it does!

The following graph shows the mean assessment level (in terms of progress to go to meet Minimum Expected Progress) of our 2013-14 cohort as they progressed from starting GCSE maths in year 9 to their final GCSE grade:

KS4 whole cohort mean progress tracker

It would appear that the story in our assessment data agrees well with Bjork’s and Rohrer’s findings on interleaving and spacing. This cohort went on to achieve excellent progress and attainment when compared with national averages (80% A*-C), but it was only when they started the interleaved practice sets of questions that the retention and transferability of their learning began to build substantially. Interleaving is effective because it gives students experience in selecting a strategy to solve a problem as well as executing the strategy. In blocked practice we give students the strategy and they don’t gain experience in selecting it.

I plan to bring interleaving and spacing forward in the department's practice, right back to day one of year 7, in our new SOWs. There will still be blocked practice during the early stages of learning, but students will get interleaved questions at regular intervals throughout the five years. Currently, I'm thinking of this being in the format of open-book end-of-unit activities that recap the content of the whole topic, followed by another that recaps topics taught cumulatively to that point. Bjork says low-stakes/high-frequency is important for these to be successful, hence their being open-book activities. Spacing is inherent in interleaved practice, but I am also planning for starters to be used as retrieval events for previous material.

There still remains the issue, as discussed in the previous post, that if we are doing this correctly, the in-lesson performance of students will be lower. As Rohrer showed, an interleaved unit produced significantly less in-lesson progress, but ultimately three times the long-term retention and transfer of student learning. Managing staff and student expectations isn't going to be easy.

Food for thought.

Detail behind the data:

Students sat full GCSE exam papers during the assessments. The grade boundaries on all assessments up to the final exams were set higher than the actual GCSE boundaries turned out to be (35 marks for a C), by approximately 0.3 levels. This has the effect of reducing the interleaving gains to 1.0 levels over the two-thirds of a year: still considerable, and a step change from the rate of progress during blocked practice.

Work with us at Wyvern College? Full-time maths teacher vacancy. Deadline 5th Feb 2015

We have a vacancy for a full-time maths teacher in our dept at Wyvern College, Fair Oak, Hampshire. We are a hard-working, friendly, supportive, high-achieving department and our cohorts regularly achieve 80%+ A*-C in GCSE Mathematics. We combine the best of traditional pedagogy with critical, measured adoption of research-supported new ideas. We have a very strong local reputation for excellence in both pedagogy and student outcomes.

We are looking for a highly-driven, reflective practitioner to join our team from Sep 2015 as we evolve to meet the requirements of the curriculum reforms, by providing greater maths contact time for our students.

Please visit this link for the job information pack and application form.

Helen Lo, one of our full-time Maths teachers, has provided the following information to give you an overview of life at Wyvern:

What inspired you to teach Maths?

“The fact that you’re sharing an essential and relevant life tool that you know students are going to need and use in everyday life means that teaching Maths is exceptionally satisfying. You know that your students can achieve more in life if they fulfil their potential in Maths and being able to help them do this holds massive personal rewards”.

What makes Wyvern a great place to work?

“Wyvern is a very high-achieving school where there is a constant desire and drive to improve. At the same time, there is an incredibly supportive and understanding atmosphere that stretches throughout all departments. It is, in short, a friendly, supportive and enjoyable place to work and to achieve.”

Why work in Wyvern’s Maths department?

“You know that you are working at the forefront of Maths teaching, not only in the area but also in the country. Innovation and new ideas are embraced while successful and proven teaching strategies and techniques are continued and enhanced. We are a very supportive and cohesive team where hard work is accompanied by great fun every day. Why wouldn’t you want to work in one of the region’s top achieving Maths departments?”

If you share our passion, energy and commitment, then you may be just the person we are looking for to join our inspirational team and to make a difference to our students.

Deadline for applications is 5th Feb 2015.

Forgetting is necessary for learning, desirable difficulties and the need to dissociate learning and performance

How many questions should I give students to work on after my instruction? Should I group all the questions together or space them over time? Should I ‘block’ questions on the same topic together or should I mix them with questions from other topic areas? Is ‘over-learning’ an efficient strategy for boosting student outcomes? Should I always be using high-frequency formative assessment techniques to guide my instruction? Is it possible, and if so, how do you measure the learning that has happened in a lesson? What does best practice look like in an assessing-without-levels world?  Within a progress context, are rapid and sustained mutually inclusive or exclusive?

We live in times of enforced, but relatively unguided change where schools are asking themselves questions about the fundamentals of pedagogy, learning and assessment. As I work on the evolution of my own department’s schemes of work (and the pedagogy I want these to promote) the above questions and more have been at the forefront of my thinking.

The beauty and intellectual intrigue of trying to understand learning stem from many sources: the difficulty of defining it; the complexity and often non-intuitive strategies involved in creating conditions that nurture it; and the impossible, yet relentless, focus on trying to evaluate and optimise it quantitatively. Every teacher has their view, and I've often found these differ more broadly among experienced colleagues than among those new to the profession. This is not a criticism, quite the opposite: it is the result of reflective thought, after sufficient time and experience, realising that 'the fundamentals' they learned during their training rest on boggy ground. In my own training, AFL was the non-negotiable silver bullet for effective learning in the classroom. Now that I've had a few years working with AFL, I've experienced how it can be a double-edged sword if its subtleties are not appreciated. I've seen many teachers (myself included in the early years) and government initiatives mistake students' instantaneous performance for learning, through misunderstanding AFL's limitations. Debate is healthy and useful, but the plural of anecdote is not data. Many middle and senior leaders are currently searching, in part because government policy has challenged them to do so, for self-evident teaching truths on which to rebuild their systems and pedagogy.


The profession has duly looked to the academic education research community for inspiration and authority with which to distil effective practice from the vast, turbulent and cyclical ocean of fashionable ideas and possibilities. The emergence of Tom Bennett's researchED community is a natural consequence of numerous teachers simultaneously dipping their toes into educational research findings and wanting to collaborate. I am one of those teachers, and I write to share some significant CPD I have undertaken over the last year to try to gain insight into potential answers to the questions at the start of this article.

I became interested in the role of memory in maths education following a visit to King Solomon Academy, where I met Kris Boulton. I learned how this school, and its maths department under the leadership of Bruno Reddy, had designed a maths curriculum that subsequently saw their first cohort achieve 95% A*-C. The pedagogy they developed within the department was based on numerous academic sources relating to cognitive science. One such source was the work of Robert Bjork, Distinguished Professor of Psychology at the University of California, Los Angeles. I have read his work over the last year and am at the point where I am beginning to be able to apply it in my own, and my department's, practice.

Bjork is known for The New Theory of Disuse, a framework that conceptualises learning within the context of memorisation. The work builds on research by Thorndike in the early twentieth century and on the observation that learned information fades away over time. Bjork supersedes Thorndike's Theory of Disuse with his new theory because research showed that memories do not disappear completely; they instead become inaccessible over time. The New Theory of Disuse puts forward the notion that anything learned (a memory representation) can be thought of as having strength on two indices: storage strength and retrieval strength.

Storage strength- reflects how inter-associated a given representation is with other memory representations. It is the depth of learning. Once accumulated, storage strength is never lost- the information remains in memory as evidenced by recognition, priming and, especially, relearning.

Retrieval strength- current ease of access. It is how primed or active an item's representation is as a consequence of recency or current cues. Information in memory, no matter how over-learned, becomes inaccessible after a long enough period of disuse. Retrieval strength falls over time; recall of the memory representation rebuilds it.

Put simply, when we learn something, the depth of understanding to which we have learned it will never recede. Deep learning stays deep. However, unless we regularly recall it, the learning will become more inaccessible as time goes on.

You could therefore plot any memory representation on a two-dimensional plane of storage vs retrieval strength. Examples of typical representations in the four quadrants of the plane would be:

Low-storage/ high-retrieval- What you had for lunch yesterday. You remember it because it is recent, but you’ll soon forget it because you haven’t linked it to other memory representations- it wasn’t that important to you.

Low-storage/ low-retrieval- What you had for lunch this day eight months ago. Same as above but now with low retrieval strength because time has elapsed. The memory has become almost inaccessible.

High-storage/ high-retrieval- The birthday dates of your children. They mean a lot to you (are connected to many other memories) and you recall them regularly.

High-storage/ low-retrieval- The names of people in your Year 1 primary school class. You have forgotten them because you haven’t recalled them recently, but shown a list you could pick them out (storage strength hasn’t been lost and retrieval strength can be quickly rebuilt with recall).

We obviously want students' memory representations of their learned material in the high-storage/ high-retrieval quadrant. The argument for a mastery curriculum, rather than a KS3/4 spiral-based one, fits within this framework because of the time it creates for deep learning. Teach-once-deep (but regularly recall) rather than teach-twice-shallow (with no time for much recall) makes sense if storage strength is only ever cumulative. Single five-year curricula are becoming more popular, and Bruno Reddy was amongst the first well-known adopters of the format in the UK.

So far, The New Theory of Disuse has been discussed within a memory representation context; the next step is to consider its implications for understanding what real 'learning and progress' look like in a maths classroom.

Corresponding to storage vs retrieval strength, there is a time-honoured distinction in academic learning research stretching back to the early twentieth century between learning vs performance.

Performance (retrieval strength)- what we can see, observe and measure at the current time in the maths classroom. It’s what I can see in students’ books or on their mini-whiteboards when I’m asking them a question similar to what I’ve just taught them to answer.

Learning (storage strength)- what I have to try to infer rather than what I can measure. The question of whether learning has happened is: “have those relatively permanent changes happened that will support my performance in the long-term.” Learning judgements are focussed on both retention and transfer (to different applications and contexts).

There is a severe danger that current retrieval strength (performance) can be interpreted as storage strength (learning). Bjork’s and others’ work shows that current performance is often a very poor indicator of whether learning has happened. The dissociation between the two can result from things such as predictability or current cues that are there now but won’t be later. These can prop up performance and give the impression rapid learning has happened when it hasn’t. In a 2014 talk at Harvard University, Bjork cites relatively old research that has shown there can be considerable learning in the absence of performance. In more recent research they have shown the converse to be true- you can have considerable increases in performance with virtually no resulting learning.

It is my belief that many experienced teachers understand the difference between performance and learning and its importance. I would, however, like to raise the question of whether some contemporary systems and common practices fail to dissociate the two. These include: lesson observation, work sampling, AFL (if its limitations are not understood) and, potentially, assessment without levels. Before I elaborate further on this point, it is important to understand in more detail the interaction between retrieval and storage strength (they are dependent variables), and also research that has observed the interaction between the two in real-world classrooms.

Ebbinghaus was the first to publish on forgetting, and on how frequent recall slows it by building retrieval strength (and storage strength, although Ebbinghaus didn't distinguish between the two). Subsequent research has shown, more sophisticatedly, that storage strength and retrieval strength are interrelated. As you recall a memory representation, both its retrieval and storage strengths increase, and the degree to which each increases depends on their relative strengths at the time of recall. Increments in retrieval strength are a decreasing function of the item's current retrieval strength, but an increasing function of its current storage strength: the deeper you have learned something previously, the faster you 'relearn' it. Conversely, the higher the current retrieval strength of a memory representation, the smaller the increments in storage strength (i.e. learning). Forgetting becomes necessary to reach a new level of learning. Something that is completely accessible (maximal retrieval strength) is effectively unlearnable (cannot raise storage strength) in the sense of reaching a level of learning above that already attained. In other words, if material is memorised by rote through repetition that is too high-frequency, the 'learning' gains (storage strength increments) rapidly shrink.

Therefore, conditions that reduce retrieval strength while building storage strength can enhance learning. Bjork refers to these as desirable difficulties because they prevent retrieval strength growing too quickly, which would reduce learning gains. In summary: because forgetting enables learning, conditions of instruction that appear to create difficulties for the learner, slowing the rate of apparent learning, often optimise long-term retention and transfer, whereas conditions of instruction that make performance improve rapidly often fail to support long-term retention and transfer.
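Bjork describes these interactions qualitatively rather than as equations, but the qualitative behaviour can be illustrated with a toy simulation. To be clear, the functional forms and constants below are my own illustrative assumptions, not Bjork's; the sketch only aims to show why the same number of recall events, spaced out, ends with more storage strength than when massed together:

```python
# Toy simulation of the New Theory of Disuse dynamics described above.
# All functional forms and constants are illustrative assumptions,
# not taken from Bjork's papers.

def decay(retrieval, storage):
    """One day of disuse: retrieval strength falls; deeper storage slows the fall."""
    return retrieval * (0.5 + 0.4 * storage)

def recall(retrieval, storage):
    """One successful recall event.
    Retrieval gain is larger when current retrieval is LOW (forgetting helps)
    and when storage is high (relearning is fast).
    Storage gain is larger when current retrieval is LOW."""
    r_gain = (1 - retrieval) * (0.3 + 0.5 * storage)
    s_gain = (1 - storage) * 0.4 * (1 - retrieval)
    return min(1.0, retrieval + r_gain), min(1.0, storage + s_gain)

def practise(schedule, days=60):
    """Apply daily decay; recall on the days listed in `schedule`."""
    r, s = 0.9, 0.1  # just taught: high retrieval, shallow storage
    for day in range(1, days + 1):
        r = decay(r, s)
        if day in schedule:
            r, s = recall(r, s)
    return r, s

massed = practise({1, 2, 3, 4})      # four recalls on consecutive days
spaced = practise({1, 10, 20, 30})   # the same four recalls, spaced out
print(f"massed: retrieval={massed[0]:.2f} storage={massed[1]:.2f}")
print(f"spaced: retrieval={spaced[0]:.2f} storage={spaced[1]:.2f}")
```

Running this, the massed schedule shows higher retrieval (performance) immediately after practice, but the spaced schedule ends with markedly higher storage strength 60 days out, mirroring the dissociation the research describes.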

Hermann Ebbinghaus (Photo credit: Wikipedia)

One such desirable difficulty is The Spacing Effect. Given a constant number of questions that you ask students to complete, they will have better long-term recall if you space the practice with time intervals (in the order of days) in between rather than mass practice where they would do all of the questions during one session. Mass practice is advantageous if you’re measuring performance (short-term) rather than long-term learning. If you do mass practice trials your students will appear to be learning rapidly in comparison with spaced practice, where performance (retrieval strength) will be lower. However, if you do spaced practice, the storage strength (learning) grows faster than if you do mass practice. Bjork cites research by Professor of Psychology at the University of South Florida, Douglas Rohrer that has demonstrated the long-term learning benefits of spacing over massing specifically within the context of maths education.

In a 2007 paper, The shuffling of mathematics problems improves learning, Rohrer & Taylor published results of two experiments, one of which looked at the performance of undergraduate (non-maths specialist) students who were subjected to lessons on a maths topic unfamiliar to them. Half of the students (spacers) did spaced practice (over two weeks) whilst the others (massers) did mass practice of the same number of questions as the spacers, but in a single session. The students all sat an assessment one week after their last practice session. The spacers outperformed the massers scoring an accuracy of 74% vs 49% on the assessment.

Rohrer goes further, attempting to establish whether there is an optimum time interval between spaced practice sessions to maximise long-term recall. In a 2008 paper, Spacing effects in learning: a temporal ridgeline of optimal retention, Cepeda, Rohrer and colleagues published results from experiments synthesised into mathematical functions that give recall success in terms of both the study gap and the test delay. The variables are dependent, i.e. there is no single study gap that produces optimal recall; it varies according to the test delay. However, certain generalisations can be made. Spacing over single-day intervals is too short: retrieval strength gets boosted too high, too fast, and storage strength growth is quickly limited. If the test is a reasonable length of time in the future, say 200-300 days, the optimum spacing is approximately 20 days; over shorter test delays, say 70 days, it is approximately 10 days. Fortnightly recall seems a practical conclusion for practitioners operating in the real world.
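The quoted figures can be turned into a crude planning rule of thumb. The interpolation below is my own, not a formula from the paper; it simply joins the two data points mentioned above (a ~70-day test delay suggesting a ~10-day gap, and a ~250-day delay suggesting a ~20-day gap):

```python
# Crude rule of thumb interpolated from the figures quoted above.
# This linear interpolation is my own, not a formula from the 2008 paper.

def suggested_gap(test_delay_days: float) -> float:
    """Suggest a spacing gap (in days) for a test `test_delay_days` away."""
    if test_delay_days <= 70:
        # scale down proportionally below the shortest quoted delay
        return max(1.0, 10.0 * test_delay_days / 70.0)
    if test_delay_days >= 250:
        return 20.0
    # linear interpolation between (70 days, 10-day gap) and (250 days, 20-day gap)
    return 10.0 + (test_delay_days - 70) * 10.0 / 180.0

print(suggested_gap(70))   # 10.0
print(suggested_gap(250))  # 20.0
print(suggested_gap(160))  # 15.0
```

For a typical school context, with high-stakes exams many months away, this lands close to the fortnightly recall suggested above.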

[Figure: Rohrer & Taylor (2007)]

Rohrer also tested the strategy of over-learning to see whether it benefits long-term retention and transfer. Over-learning here means giving some students significantly more practice questions to complete than others. In short, whilst there were short-term performance gains, long-term learning was no better: if students do 5 questions correctly, they retain their learning just as well as if they do 30 similar questions. Over-learning is an ineffective use of time.

The implications for lesson design and curriculum planning are clear, but I will hold back from elaborating on them until I have discussed other desirable difficulties in due course.

However, before we go any further, it is appropriate to discuss the omnipresent mantra of rapid and sustained progress within the context of the aforementioned research findings. Quite simply, rapid and sustained progress is an oxymoron! If you raise performance (retrieval strength) rapidly, you sacrifice possible learning (storage strength) and sustainability. If you perform highly on a topic too quickly, the automaticity you attain limits the possible long-term learning gains: the storage strength increment is a negative function of the current retrieval strength. Lessons that show the most progress (higher performance) result in sub-optimal retention and transferability (learning). To maximise long-term learning we need to limit retrieval strength gains in order to optimise storage strength gains. Lessons need to be of a high-challenge nature that prevents automaticity forming too early. This builds storage strength, which then ensures subsequent recall events see large gains in retrieval strength.

Do grade 1 practitioners, whose classes show the most progress in lessons, always get the best student outcomes when it comes to exam time? Do you know 'solid grade 2' practitioners whose classes do just as well, if not better, in exams? I've been aware of this generically for a while now, but the realisation that retrieval and storage gains are negatively correlated does provide more clarity: the 'grade 2' teachers who don't strive for rapid in-lesson progress have students with more sustained learning. There are no quick fixes to long-term learning, surprise, surprise. In fact, it's worse than that: the findings of this research show that shortcuts are actively destructive to learning. Buy rapid performance today, pay with sustainability in 12 months' time! The implications for lesson observations are considerable.

Ofsted no longer grade individual lessons. There is a movement within many schools at present to stop grading lesson observations. This seems logical given Bjork’s message that performance needs to be dissociated from learning. Using one to infer the other has been shown to be unreliable. By observing the performance gains of students in lessons, can we infer reliably the size of the learning gains? Can the observer accurately predict what the students are going to retain and be able to apply to a different context six months after the lesson? Having considered this for months, I cannot think of any way in which learning in any observed lesson can be reliably and accurately measured if we, as the research advises us, dissociate it from performance.

Ending the grading of lessons would have potential advantages: non-judgemental debrief conversations could centre on strategies for maximising learning (long-term retention and transfer) rather than on ways to rapidly, and to the detriment of learning, raise performance over short time intervals, the latter being common in a good-to-outstanding-led lesson grading culture. "What are you going to do going forwards to ensure that what was covered in today's lesson becomes learned, i.e. there is long-term retention and transfer of the material?" At the very least, if progress is going to be graded, a 'progress over time' judgement is much more desirable than a 'progress in a lesson' grading.

Back to Bjork's desirable difficulties. Using high-frequency, low-stakes testing specifically as a learning event (even without feedback) has been shown to have significant positive effects on storage strength gains. The low-stakes part is critical to the effectiveness of this desirable difficulty. Study-test-test-test is more effective than study-study-study-test.

Another highly effective desirable difficulty that increases storage strength gains is contextual interference, one example of which is interleaving. Most practice sets of questions that we get students to work on during study are blocked into topics. We teach them a strategy and they answer a series of questions, all of which require the same strategy. Interleaving is when, instead of blocking, you give students a mixture of questions on different topics (that include and precede today’s lesson). Research findings into the effects of interleaving in maths education say that if my students are learning lots of things, I will maximise long-term retention and transfer (learning) if I arrange the instruction to maximise the possible interference between them, i.e. don’t do blocking. This is counter-intuitive but well-researched. Rohrer believes that one reason interleaving is effective is that it gives students experience in selecting a strategy for answering questions. In blocked sets of questions, the same strategy is repeatedly used and so students only gain experience in executing the already modelled strategy. Transfer in learning requires students to select strategies, before executing them. If all they ever do is blocked activities they never experience the need to select strategies until assessment time.
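As a concrete sketch of the difference, the snippet below takes blocked question sets and reorders the same questions into an interleaved sequence, so students must select a strategy before executing it. The topic names and questions are invented for illustration; this is not taken from Rohrer's materials:

```python
import random

# Illustrative sketch: reorder blocked textbook exercise sets into an
# interleaved practice set. Topics and questions below are made up.

def interleave(blocked_sets, seed=None):
    """Shuffle the same questions across topics so consecutive questions
    rarely share a topic, forcing strategy selection as well as execution."""
    rng = random.Random(seed)
    questions = [(topic, q) for topic, qs in blocked_sets.items() for q in qs]
    rng.shuffle(questions)
    # Greedy pass: if two neighbours share a topic, swap one further along.
    for i in range(1, len(questions)):
        if questions[i][0] == questions[i - 1][0]:
            for j in range(i + 1, len(questions)):
                if questions[j][0] != questions[i - 1][0]:
                    questions[i], questions[j] = questions[j], questions[i]
                    break
    return questions

blocked = {
    "fractions": ["2/3 + 1/4", "5/6 - 1/2"],
    "percentages": ["15% of 80", "Increase 60 by 20%"],
    "ratio": ["Share 35 in the ratio 2:3", "Simplify 12:18"],
}
for topic, q in interleave(blocked, seed=1):
    print(f"[{topic}] {q}")
```

The point Bjork makes holds here: the questions themselves are unchanged; only their ordering differs between the blocked and interleaved versions.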

In a 2007 paper, Rohrer & Taylor had students learning to use and apply different maths formulae to solve problems. Some students did blocked practice, some did interleaved questions. When measured at the end of the lesson(s) students who did blocked practice outperformed the interleaved practice students with an accuracy of 89% vs 60%. However, and very importantly, when tested after a one week time delay the percentages were 20% vs 63% respectively! The interleaved practice students outperformed the blocked practice students 3:1 on a delayed assessment! The lessons that would have been judged to show the most rapid progress resulted in significantly less sustained learning. The lessons in which performance was lower resulted in triple the learning gains. In numerous other studies, Rohrer and others have replicated these findings and have shown why it is imperative we separate learning from performance and not use one to infer the other. If we are talking about the difference between good and outstanding lessons, we must take a long-term perspective, evaluating and discussing strategies and pedagogy that are learning-enhancing rather than performance-enhancing focussed.

[Figure: Rohrer & Taylor (2007)]

Desirable difficulties are desirable because responding to them successfully engages processes that support storage strength gains. They limit gains in retrieval strength and prevent learning becoming automatic too early, which would limit potential further increases in storage strength. However, they become undesirable difficulties if the learner is not equipped to respond to them successfully. For example, a student with low working memory will struggle if questions early in learning a new topic are interleaved and they are trying to select between numerous potential strategies simultaneously. If interleaving places them into cognitive overload, and they don't have the required self-control and resilience, it will have a negative effect. Optimal storage strength gains require sub-optimal performance during lessons, and students (and teachers) need to be comfortable with this and remain motivated to reap the benefit. This is a considerable challenge. I've always found showing students their progress to be a good way to boost their intrinsic motivation. Making lessons more challenging through the introduction of desirable difficulties, in the knowledge that students' performance will be lower if we are maximising learning gains, is a hard sell to students who are motivated by seeing rapid performance gains.

Within the community of teachers currently reading educational research, learning styles have become a bit of an in-joke. They are repeatedly cited as an example of a seemingly intuitive idea that isn't supported by research evidence. Bjork explains that they are based on the meshing assumption: if instruction is aligned to a learner's preferred format, the material is easier to acquire and so more of it will be accumulated. The meshing assumption, that easier learning results in more learning, is false for reasons already discussed: limiting retrieval strength gains with desirable difficulties maximises storage strength (learning) gains. Learning styles are the opposite of a desirable difficulty. They are a quick win for performance and, consequently, a loss for learning.

If Bjork’s work is accepted as contextually relevant and applicable for secondary maths education, I believe the following implications seem logical deductions:

  • Maximal learning (storage strength gains) requires the limitation of retrieval strength gains, particularly early on in instruction, through the use of desirable difficulties such as: interleaving, spacing and high-frequency/ low-stakes testing. Nearly all maths textbooks have massed, blocked question sets. This needs to change. As Bjork points out, the same questions could be used, it is just the ordering that needs to change. Students need to be experiencing mixed-topic question sets regularly, not just during assessments. This gives them the necessary experience in selecting strategies in addition to executing strategies.
  • Rapid and sustained progress is an oxymoron- the two are inversely proportional. We should concern ourselves with understanding how to generate sustained progress and this should be the focus of pedagogical practice, discussions, interventions and performance management systems.
  • We should dissociate learning and performance. This means not using performance measures to infer learning gains. Learning cannot be measured within the time-frame of a lesson. Work samples of students' books do not tell you what the students have learned, only their in-lesson performance that day. Using AFL to assess concepts currently being taught only measures performance, not learning, so don't use success on an exit ticket as proof of long-term retention and transfer (learning) of that lesson's material. Using AFL to guide instruction within lessons is the right thing to do, but don't use it to infer learning that has occurred within the current lesson. Assessment of learning, rather than performance, should feature a time delay from when the material was last covered and/or be contextually different, thus including a measure of transferability. In an assessing-without-levels world we must ensure we assess in a time-delayed, transfer-focussed way if we are to avoid previous mistakes where formative assessment learning judgements were based on performance rather than learning (APP etc.).

Finally, it should be reiterated that there is a natural tension between the motivation boost of students seeing high performance gains and the reality that slower ones lead to better learning. As Bjork puts it:

“If someone gave me a new course and said ‘do everything you know how to do to make students’ long-term memorisation of key concepts the best’, I could give that a big try; or if they said, ‘do everything you know how to do to get the highest course ratings’, I know something about that too; but what’s awful is that they would not be the same course. They would be quite different courses”.

There is clearly a need for students to understand the idea of desirable difficulties, at least to some degree, if we want them to be comfortable with lower performance in harder lessons today, when the benefits won’t pay off for months or years. Many cultural and systemic expectations about what effective lessons look like may need reconsidering too. Teaching that facilitates outstanding student learning is different to teaching that facilitates outstanding student performance.

For more info on Robert Bjork’s work see: Go Cognitive

For Doug Rohrer’s publications see here.

CGP Maths Buster- a superb new learning and revision resource for GCSE maths

CGP Maths Buster

CGP are well-known for their excellent GCSE revision guides. Now they’ve taken their offering to a whole new level with GCSE Maths Buster. The £6 DVD ROM for PC & Mac is a comprehensive, interactive revision tool for GCSE maths students featuring:

Levelled practice- Work your way through the entire maths curriculum, ‘levelling up’ to unlock new content to study.

Timed tests- Take an on-screen timed test to identify your strengths and weaknesses. The software then suggests a revision plan tailored to your needs.

Practice questions- 55,000 exam-style practice questions. For each one view a worked example or see the notes from the CGP revision guide relevant to the topic.

Video tutorials- Watch video lessons for each topic.

Keep track of your progress- The software records your progress in each topic and provides an overview at any time.

Challenges- As you work through Maths Buster you unlock challenges such as ‘sudden fail’, ‘time trial’ and ‘against the clock’.

Practice papers- Print off practice papers. Complete answers and step-by-step video solutions are provided.

I’ve had a play with the software myself and am impressed. There are lots of revision offerings out there these days, but what I particularly like about Maths Buster is that it is comprehensive- it provides for all stages in the learning journey, from video lessons, through practice questions and on to exam preparation. The ‘smart’ side of the software, which records individual students’ progress and then tailors its recommended lessons to their needs, is very useful too.

For more info including a video introduction to the product and an online interactive demo visit https://www.cgpbooks.co.uk/School/mathsBuster

Bar modelling- a powerful visual approach for introducing number topics

Building on my recent post about a taxonomy for deep learning in maths, I have been trying to think a bit deeper myself about what each type of ‘deep learning link’ might look like. In particular, I have been researching and putting a lot of thought into what effective ‘visual models’ look like for the ‘key nodes’ I have previously identified as the most important foundation maths knowledge for students to master before starting their GCSE maths course. These are principally number topics.

Last year I became aware of the Singapore maths bar modelling approach and have recently found the time to research it further. I bought some Singapore textbooks and read about the work of Dr Yeap Ban Har. This video, featuring Dr Ban Har, shows an exemplification of the approach for a typical functional maths problem:

Maths No Problem

In short, I really like the approach and am convinced it could enhance my own practice significantly by giving students powerful, but simple, visual models they can draw upon and use to solve problems. I have been experimenting with some of the models in my lessons this year and have seen the positive effect they have had on students’ understanding of topics. What these visual models give you is an entry point to a topic that all students seem able to grasp. They present the concept in its rawest, simplest form without the distraction of lots of words or mathematical notation. The diagrams don’t replace the eventual algorithmic methods, but they provide an entry point where students seem to understand what it is they are trying to solve; something that often gets clouded when algorithms are presented too early on.

In primary education in Singapore, maths teachers follow a Concrete-Pictorial-Abstract (CPA) sequence when teaching maths topics. They start with real world, tangible representations, move onto showing the problem using a pictorial diagram before then introducing the abstract algorithms and notation.

The particular power of the bar modelling pictorial approach is that it is applicable across a large number of topics. Once students have the basics of the approach secured, they can easily extend it across many topics.

I have spent some time putting together some pictures showing how the approach can be used for different topics. They are not teaching slides, but rather ‘notes for teachers’ to demonstrate how this single model can be adapted to be the diagrammatic entry point for many topics.

To start with students are given blank (bar) rectangles (on plain paper) and then get used to dividing the bars into halves, thirds, quarters etc:

Slide01

They can then calculate a fraction of a quantity by first drawing the fraction in the bar, showing the length of the bar to be the quantity and then calculating the length of the shaded part:

Slide02

Again, you’ll end up at ‘divide by denominator, multiply by numerator’ eventually, but this does show the concept of what’s going on very nicely and is a good route into showing where the algorithm comes from.
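For anyone who wants to sanity-check where the algorithm lands, the bar reasoning translates directly into a few lines of Python (the helper name is mine, purely for illustration):

```python
from fractions import Fraction

def fraction_of(numerator, denominator, quantity):
    # Divide by the denominator: the length of one 'part' of the bar...
    one_part = Fraction(quantity, denominator)
    # ...then multiply by the numerator: shade that many parts.
    return one_part * numerator

# 3/5 of 40: each fifth of the bar is 8, so three fifths are 24
print(fraction_of(3, 5, 40))  # -> 24
```

Using Fraction keeps the arithmetic exact, mirroring the bar diagram where each part has a whole-number length.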

Next up, equivalent fractions:

Slide03

Then simplifying fractions:

Slide04

A ‘fraction wall’ (long familiar to many teachers in England) can be used for ordering fractions:

Slide05

Adding fractions with the same denominators:

Slide06

Adding fractions with different denominators:

bar modelling concept sketches

Slide08

Multiplying fractions:

Slide09

Dividing by fractions (works ok so long as you have integer answers). That’s enough to get across what is going on… Then you can lead into the method…

Slide10 Slide11

Slide12

Converting mixed numbers and improper fractions:

Slide13

Next up, understanding place value in decimal numbers. This approach lets you deal with lots of misconceptions, such as the belief that 0.62 is larger than 0.7. Importantly, this is now taking the bar model and putting a decimal number line onto it. This forms the basis for many of the topic models that follow:

Slide14

bar modelling concept sketches

Now they have an understanding of how the decimal number line works, and they can draw bar models for fractions, they can combine the two on one diagram to convert between fractions and decimals:

Slide16 Slide17

Next they can learn that percentages are hundredths, and in doing so can put a percentage number line under the bar model:

Slide18

They can now combine the fractional bar model with both the decimal and percentage number lines directly underneath it to convert between fractions, decimals and percentages. They draw the fraction bar first, then put on the decimal increments (by dividing 1 by the denominator) and finally put on the percentages (by dividing 100 by the denominator):

Slide19
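The two labelling steps above- divide 1 by the denominator for the decimal line, divide 100 by the denominator for the percentage line- can be sketched in Python (the function name is my own, for illustration only):

```python
def number_line_increments(denominator):
    # One part of the bar on the decimal number line...
    decimal_step = 1 / denominator
    # ...and the same part on the percentage line underneath it.
    percentage_step = 100 / denominator
    return decimal_step, percentage_step

# A bar split into quarters: ticks every 0.25 and every 25%
print(number_line_increments(4))  # -> (0.25, 25.0)
```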

Similarly to fraction of an amount earlier, you can use this approach to introduce percentage of an amount. Starting by showing how to find 10% of a number (and thus why you divide by 10), it serves as a nice way into multiplicative reasoning approaches that ‘build off the 10%’ to find other percentages:

Slide20
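The ‘build off the 10%’ reasoning amounts to two steps, which can be checked with a quick Python sketch (the helper name is hypothetical, and the numbers are illustrative rather than from the slides):

```python
def build_off_ten_percent(amount, percent):
    # Find 10% first by dividing by 10...
    ten_percent = amount / 10
    # ...then scale that 10% block up to the percentage wanted.
    return ten_percent * (percent / 10)

# 35% of 80: 10% is 8, and 35% is 3.5 of those blocks, so 28
print(build_off_ten_percent(80, 35))  # -> 28.0
```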

Other percentage topics then follow, such as percentage increase and decrease:

Slide21

By putting both the decimal and percentage number lines on the bar model for this, you can clearly show where percentage multipliers come from, including the ones less than unity for percentage decreases:

bar modelling concept sketches
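Where the multipliers come from can be expressed in one line of arithmetic; a hedged Python sketch (my own illustrative function, not from the slides):

```python
def percentage_multiplier(percent_change):
    # A change of +15% means 100% + 15% = 115% = 1.15 of the original;
    # a change of -20% gives 0.8, less than unity for a decrease.
    return 1 + percent_change / 100

print(percentage_multiplier(15))   # increase of 15%
print(percentage_multiplier(-20))  # decrease of 20%
```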

You can introduce calculating a percentage change using the bar model approach:

Slide23

It’s a particularly nice diagrammatic way into teaching reverse percentages:

Slide24
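The calculation the reverse-percentage diagram leads to is simply dividing by the multiplier; a short Python sketch (names and prices are illustrative, not from the slides):

```python
def original_amount(final_amount, percent_change):
    # Undo the percentage change by dividing by its multiplier
    return final_amount / (1 + percent_change / 100)

# A price of £60 after a 20% increase: divide by 1.2 to recover the original £50
print(original_amount(60, 20))
```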

Once the above techniques have been mastered, the model can be used to show how compound interest works. Particularly nice is that if you turn the bars so they are vertical, they show the exponentially increasing relationship:

Slide25
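The repeated-multiplier structure behind those vertical bars can be sketched in Python (the figures are made up for illustration, not taken from the slides):

```python
def compound_balances(principal, annual_rate_percent, years):
    # Each year's bar is the previous bar times the same multiplier,
    # which is what produces the exponential growth in the bar heights.
    multiplier = 1 + annual_rate_percent / 100
    bars = [principal]
    for _ in range(years):
        bars.append(bars[-1] * multiplier)
    return bars

# £100 at 10% per year for 3 years
print([round(b, 2) for b in compound_balances(100, 10, 3)])  # -> [100, 110.0, 121.0, 133.1]
```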

It works for ratio too:

bar modelling concept sketches


Slide27 Slide28

I’ve used fraction walls for years, but it is the inclusion of decimal and percentage number lines built onto the fraction bars that is new for me. It opens up diagrammatic routes into so many topics in such a coherent, simple way. The universality of the approach is what particularly impresses me; from the humble beginnings of shading rectangles, the same model leads all the way up to reverse percentages and compound interest. Done in the correct order, there is a beautiful journey of progression all using one simple model. Each topic builds logically off the last as the model is manipulated in different ways.

The visual models won’t ever replace the algorithmic approaches, but I hope they will give my students a better understanding of ‘what is going on’ in the early stages of learning a topic. I hope their conceptual understanding is improved and that this in turn enhances their procedural understanding by giving it a purpose and something visual to hook onto. If they can ‘see the bar picture’ for problems with simple numbers, my hope is that the algorithmic approaches that follow will stick better and enable them to solve problems with more challenging numbers. If they can represent a problem by drawing one of these models, they may have a better understanding of what the problem wants them to do.

I plan to develop resources to support teacher explanations and student activities in these topics in the coming months with the support of keen beans in my department and will share them with you when they have been tried and tested. There are no silver bullets in education, but this does represent a decent step forward for my teaching. Much to learn still!

Functional area questions

I put together the following as an activity for a low-attaining year 8 class getting to grips with area. They did area of rectangles and triangles last lesson, so this lesson I’m linking it to real-world contexts.

The idea is they cut out the shape for which they need to find the area, then choose a covering for it from the second sheet and cut that out too. They stick both next to each other in their book, calculate the area and then work out the total cost of the covering required. I’ve kept it all per sq m to begin with. With the cans of paint they’ll need to calculate how many cans to buy given the coverage of each can. The final slabs choice requires more thought as the slabs are a bulk-buy and not in sq m units. It’s also a compound shape.
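The ‘how many cans’ step is a round-up division; a quick Python sketch with made-up numbers (not the worksheet’s actual figures):

```python
import math

def cans_to_buy(area_sq_m, coverage_per_can_sq_m):
    # You can't buy a fraction of a can, so always round up
    return math.ceil(area_sq_m / coverage_per_can_sq_m)

# A 4 m x 3 m wall with paint that covers 5 sq m per can
print(cans_to_buy(4 * 3, 5))  # -> 3
```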

Functional area questions floor wall coverings patios

Functional_area_questions_floor_wall_coverings_patios__page_2_of_2_

Download here

“Table of coordinates” rather than “table of values”?

A very simple idea this one…

I walked into a colleague’s class the other day to see her teaching a first lesson on straight line graphs. She’d called the table of values a ‘table of coordinates’ instead. A simple idea, but one which seemed to make more sense to the students, many of whom are currently low-attainers, than calling it a table of values. It’s a subtle difference, but all these little things matter when you’re introducing a new topic!

Craig Barton’s DiagnosticQuestions.com website levels up

home-sample-quiz

TES Maths Adviser Craig Barton founded a website last year called DiagnosticQuestions.com. I have written previously about it being an excellent AFL tool for teachers to assess students’ prior learning and misconceptions about a topic. It is also useful for assessing learning at the end of a lesson.

DiagnosticQuestions.com just went live with a whole host of new features including:

  1. Import students and classes
  2. Assign quizzes to your students which they can answer on their phones, tablets or computers
  3. Instantly get data back from your students, including their explanations for their answers, giving you real insight into their thinking and misconceptions
  4. Compare your students’ performance with other students from your school, or from students all over the world

More information, along with a suite of useful videos explaining how to use the new features is available on Craig’s blog, here.

Student response to teacher feedback

Students responding to the feedback we give them is an important part of the ‘feedback loop’ and something I’ve often found challenging to get going well with classes. This year I’m trying to be much more specific about my expectations by giving students the following place mat explaining the tasks I expect them to complete in responding to my feedback:

Responding to feedback mats- GMTI

Click here to download a pdf version of the mat.

The idea is that a class set of these is printed onto A3 and laminated. Students get them out a few times over a term, particularly after you have marked their books.

Feel free to try them out with your classes. Let me know how you get on in the comments section below. If you have a really good example of student work where they’ve been used well send a pic by email to my address in the ‘contact’ section and I’ll upload it to this post.

If you like this post, you might also be interested in my self/peer assessment writing frames.

Going SOLO on the journey towards deep learning

The Structure of Observed Learning Outcomes (SOLO) Taxonomy is a model that describes the stages of learning students go through to reach a real depth of understanding of a topic. It outlines the journey from surface to deep learning. SOLO is John Hattie’s taxonomy of choice and is currently being studied in depth at his Visible Learning Labs (Osiris Educational Outstanding Teaching Conference, 2014). It is seen by Hattie and other academics as having many advantages over other taxonomies, in particular Benjamin Bloom’s. Quoted advantages over Bloom’s Taxonomy include:

  1. The SOLO Taxonomy emerged from in-classroom research whereas Bloom’s Taxonomy was theorised from a proposal by a committee of educators
  2. SOLO is a taxonomy about teaching and learning, whereas Bloom’s is about knowledge
  3. SOLO is based on progressively more challenging levels of cognitive complexity. It is argued this is less clear within Bloom’s Taxonomy.
  4. It is claimed that educators and students agree more consistently which level a piece of student work has reached on the SOLO Taxonomy than on Bloom’s Taxonomy
  5. SOLO is simpler to understand and apply than Bloom’s, making it more accessible for students to grasp, even in the primary phase.

Whilst interesting from a mostly academic perspective, these advantages are unlikely to grab busy, coal-face teachers and convince them to go SOLO in their planning. I had the same thought originally, until I understood how incredibly simple SOLO is and that it seemed to ‘work’ for a maths classroom far better than Bloom’s Taxonomy does. It makes sense to me as a good summary of what experience has taught me about the way students learn in maths.

Image via pamhook.com. Check out her fab website with lots more info on SOLO


SOLO describes students progressing from surface learning to deep learning. On a particular topic they can be at any of the following levels:

Pre-structural- The student has no understanding of the task. They completely miss the point.

Uni-structural- The student has ‘learned’ one aspect of the topic.

Multi-structural- The student has ‘learned’ more aspects of the topic. However, they see each of the aspects as independent and unrelated.

Relational- The student understands the links and relationships between the different aspects of their previous learning within the topic.

Extended abstract- The relational learning is so well understood students can now start using this to conceptualise further learning outside of the topic domain.

Describing progression within a maths topic such as angle facts, this could look like:

Pre-structural- The student has heard other people talk about angles.

Uni-structural- The student can estimate the size of an angle.

Multi-structural- The student can estimate the size of an angle, measure the size of angles and has learned the angle facts.

Relational- The student understands how estimation can be used as a way to check that they’ve read off the correct scale when measuring an angle. They understand how to use the angles in parallel lines rules to prove that the angles in a triangle add up to 180 degrees. From this they can then derive the sum of the interior angles in an octagon.

Extended abstract- Students can apply their angle fact knowledge to solve geometrical problems where the angles are algebraic expressions and the solution requires the formation and solution of equations.
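The relational step of deriving the octagon’s interior angle sum from the triangle fact rests on splitting a polygon into triangles; a minimal Python sketch of that reasoning (the function name is mine, for illustration):

```python
def interior_angle_sum(sides):
    # An n-sided polygon splits into (n - 2) triangles from one vertex,
    # and each triangle's angles sum to 180 degrees
    return (sides - 2) * 180

print(interior_angle_sum(3))  # triangle -> 180
print(interior_angle_sum(8))  # octagon -> 1080
```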

The pre-structural, uni-structural and multi-structural levels are considered ‘shallow’ learning; relational and extended abstract are ‘deep’ learning. One of the most important things to understand about SOLO is that it describes a journey. You have to progress through the levels and cannot jump straight to deep learning. As Hattie put it recently, “you can’t do the deep stuff until you know the shallow stuff.” You can’t link things together until you have things to link together. This is one of the reasons ‘problem-based learning’ strategies score so poorly in his Visible Learning rankings (an effect size of 0.15, where 0.4 is the average). In Visible Learning, he wrote, “…this is a topic where it is important to separate the effects on surface and deep knowledge and understanding. For surface knowledge, problem-based learning can have limited and even negative effects…” You can’t solve problems until you are fluent in the skills required to solve them. It’s like trying to solve a jigsaw without having the pieces.

A consequence should be that teachers know it is OK to have some ‘shallow learning’ lessons where students are simply trying to acquire, and become more fluent and accurate in, the skills. I sometimes wonder, particularly with newer teachers, whether expectations in some schools for formal observations to feature open, higher-order thinking questions and tasks lead teachers to think this is where every lesson should be pitched. Progress should be viewed as successfully moving through or between any levels in the taxonomy. ‘You need the range of lessons’ should be the message to new teachers; dedicated skill-and-drill practice has its place, as do higher-order thinking lessons. They complement each other.

However, for the remainder of this post I want to focus on the deep learning stages: relational and extended abstract. Rather than getting bogged down with edu-jargon, the message is simple- deep learning is about links. Firstly, links to other things within the same topic, then to things outside the topic. 

I’ve been trying to think of all the different types of links that we can use to design deep learning resources for a topic. The intention is to produce a prompt-sheet or checklist of things I (or any GMTI readers!) could refer to when planning a deep learning lesson. The following list will not be comprehensive, but is the result of this idea bubbling away in my mind for the last six months. Please feel free to add, build or challenge my thoughts in the comments section. Here goes, first draft!

I think ways we can build ‘deep learning’ into our lesson planning include:

  1. Link to a concrete representation- How can the problem be represented using physical equipment? This could also be a kinaesthetic representation using other senses e.g. hearing or touch.
  2. Link to a visual diagram- How can the problem be represented using a diagram?
  3. Link to metaphors- What different metaphors can be used to describe the concept? What are each metaphor’s limitations?
  4. Link that helps you understand another part of maths within the same topic better- How can one idea help in understanding another within the same topic?
  5. Understand the limitations- Does it work for all types of numbers? What range of values in the answers would we expect? Where does the maths still theoretically work, but in real life it becomes impossible?
  6. Understanding ‘dynamic variation’- What role does each part play? If I double this, what happens to the answer? If I halve that, what happens to the answer? Which part has the largest effect?
  7. Reverse engineering a question- Can students create a challenging question that has a specific answer and also meets additional criteria you set them?
  8. Comparing different solution methods in terms of their efficiency- What different solution methods are there and when would each one be more efficient than the others?
  9. Historical links and significance- Where did this maths come from? Who discovered it? Was it discovered out of necessity to solve a particular problem or just as a curiosity?
  10. Link to a real life context including other subject areas- What real life examples and contexts can we ask questions about? What other subjects within school can we link this to?
  11. Link that helps you understand another area of maths outside of the topic better- How can one idea help in understanding another within another topic domain in maths?

In the coming weeks I intend to build example resources demonstrating these ideas. I hope to create both the prompt sheet and example resources for particular topics in the hope of sparking debate. For now, however, I think this is enough to open the discussion. Please do contribute in the comments section below!

Have I missed any ways of forming links? I am working on the assumption that a worksheet resource based on building 11 different types of links will develop deeper learning than 11 questions on the same type of link. Is that reasonable? The 11 suggested types of links are in a particular order because I think this reflects the movement from relational to extended abstract. Do you think the order is right?

With thanks to:

http://pamhook.com You should certainly check out her fab website for more detailed info on SOLO.

Famous numbers sequences card sort

Here’s a card sort I made to introduce my new year 7s to famous number sequences.

Famous number sequences card sort

The idea is students cut out and group the cards into six famous number sequences:

  1. Square numbers
  2. Cube numbers
  3. Triangle numbers
  4. Fibonacci Sequence
  5. Even numbers
  6. Odd numbers

Each group should contain:

  1. Name
  2. Pictorial representation
  3. The numbers
  4. A fact about the sequence
  5. An explanation of how the sequence is produced

Click this link to download the resource as a pdf: Famous number sequences card sort

*Thanks to Wikipedia and the BBC for the images.

Vocab Blitz- improving student literacy

Literacy is every teacher’s fight, no matter what subject we teach, even maths. Textbooks often have keywords and, sometimes, definitions. However, rarely will students learn vocabulary from just these, in my experience; they need to work with the vocabulary.

Two activities I tried last year were “keyword bingo” and “taboo words”. In the former you read out definitions whilst the students circle the matching keywords on a bingo grid. In the latter students have to describe a keyword without using the other “taboo words” written on their card.

Students enjoyed these activities but I’m not sure the tasks were that effective at increasing the students’ fluency with the vocabulary. The reason is that the activities were not leading the students to do what we ultimately want them to do- use the vocabulary within self-constructed sentences with automaticity.

An INSET we received a couple of years ago emphasised the importance of developing students’ spoken fluency before then working on getting it down on paper. “Talk like a book” was the mantra. Since building this into my practice I have been trying to find an activity which can bridge between “talk like a book” and getting students to write with vocab fluency. I think I’ve found it and I’m going to call it “Vocab Blitz”…!

The idea is you give students a paragraph that contains waffle instead of the concise vocabulary, for example:

The shape I’m describing has four straight sides. Two of the sides point in the same direction and wouldn’t cross if they carried on forever. You can’t fold the shape in half so each side perfectly matches up with the other. If you turn the shape around it won’t look the same as it did at the start until you’ve gone all the way around.

The idea of Vocab Blitz is that the students need to rewrite the paragraph with as few words as possible by substituting the waffle words with mathematical vocabulary, for example:

A trapezium is a quadrilateral. It has one pair of parallel sides, no lines of symmetry and no rotational symmetry.

The winner is the student who retains all the information from the original paragraph but rephrases it in correct English with as few words as possible!

I like this activity because it mirrors what students do when “talking like a book.” They often waffle at first then improve their sentence by inserting mathematical vocabulary in subsequent efforts. Vocab Blitz is essentially copying this process but in written form. It encourages students to improve on written drafts. Once they’ve done a few Vocab Blitz activities they could transition to writing the first drafts themselves then improving them…

Experiments with Visible Learning showing big promise…

This post builds on my last one about a model for assessing without levels using the principles of John Hattie’s Visible Learning philosophy. Make sure you’ve read my last post before this one.

The first thing I’d like to share is that my worries about time constraints preventing me from using this model were unfounded. After some terrific Twitter feedback I realised the pre- and post-learning assessments could be set as homeworks. The students see the point of them and so approach them with more focus than they do a traditional homework.

The pre-learning assessment results have significantly impacted my teaching. Below are the results from a pre-learning assessment I did with a year 9 class at the start of a topic on area and perimeter; they changed how I approached the teaching of the lessons on this topic. Note the colour coding. There were three lessons I didn’t need to teach at all, as the students remembered the content from KS3. The lessons I’ve coloured blue were taught using collaborative learning strategies (Kagan etc.). Half the students knew how to do them, so I used those students to teach the others (reciprocal teaching is a big yes from Hattie). The yellow lessons were taught using direct instruction methods (not dirty words in Hattie’s world). High-quality formative feedback genuinely leading my planning.

Microsoft_Excel_-_13H_Area_and_Perimeter_Asessement_Results

Then came the post-learning assessment after the lessons:

Area and perimeter post-learning assessment results

The first thing that struck me was how poorly students had retained their knowledge from the sectors lessons. Know thy impact is Hattie’s mantra. This truly showed me, brutally, how poor my impact had been in those lessons. Bad for the ego, but great for an objective reflection on how well you’ve taught things. This also confirmed my long-held belief that using work in students’ exercise books to make APP-style judgements about learning progress is totally unreliable. The students’ books were full of beautiful sectors calculations, all well presented and accurate. By looking at their books you would conclude their ‘learning’ of sectors was as good as their ‘learning’ of all the other topics. It wasn’t. They didn’t learn the sectors work well at all, despite what their books would tell you.

This naturally raised the question ‘why was their learning from the sectors lessons so poorly retained?’ I considered various options such as:

  • was it the quality of my explanation?
  • was it the lesson structure?
  • were my activities incorrectly pitched?
  • had I overloaded their working memory during my explanation?
  • had I rushed this topic and not used AFL correctly to tell me when to move on?

There could, of course, be endless other explanations, but I concluded with confidence that the way I built up the explanation was at fault. I went into the lesson with the hook ‘how many ghosts can Pacman eat?’ to engage pupils with the idea of thinking about what proportions of circles comprised the compound shapes of Pacman and the ghosts. From that I went into the sector formula etc. In hindsight, the step here was too quick and it is likely the students didn’t see the link, so they were confused when working with the formula. I think the use of an early context ‘fogged’ their focus on learning how to use the formula and where it came from.

I then retaught the lesson taking a different approach (talking about fractions of circles without angles first, before developing how you can use angles to describe them) and a summative check a couple of days later showed progress was clearly much better. The alternative explanation had worked.

The yellow objectives in the table were hit regularly during starters over the following week to ‘iron out’ the final few issues a small number of students were still having with them.

It was naturally disappointing to see that students hadn’t retained as much as in-class AFL told me they had. It hurts the pride somewhat, but once you get over that and accept this is the impact you’ve had, you feel enlightened and know exactly where to focus your intervention efforts. I feel closer to the ‘learning pulse’ than ever before and can ‘visibly see’ (Visible Learning!) what I’ve taught well or not in a way I never have before. I felt empowered to target my next few lessons to plug the gaps, knowing exactly where they were.

The biggest revelation I’ve had from this so far is that in-class AFL (using mini-whiteboards etc.), even if done using summative-style testing, does not tell you what has gone into students’ long-term memory. It tells you what has gone into either working memory or long-term memory, but not which one. It doesn’t tell you what they’re going to retain. The only way to find out what went into long-term memory is to summatively assess a few days later. In-class AFL is essential for checking understanding that day and directing lessons at key hinge points, but it is no substitute for summative assessment some time after the lesson to find out what was ‘learned’ rather than what was just ‘understood and copied’. You need both. Making progress judgements by looking in exercise books is unreliable for the same reasons as in-class AFL.

My pride was bruised by the post-learning assessment above, but the more I live ‘Visible Learning’, the more I understand what Hattie means by high-impact practice. I have so much more clarity on what students have learned than ever before. I know when lessons have gone well. I know when they’ve gone poorly despite my ego telling me otherwise. I feel like I’ve been teaching blind until now! I’m ‘plugged in’ to the ‘real learning’ more than ever and make fewer assumptions. I teach based on what they know rather than what I assume they know. It’s made me a more reflective and objective practitioner.

There’s certainly something in this Visible Learning!

Next step is to trial this with all my KS4 classes next year to see if the model will work across the attainment spectrum in a sustainable way.

Fun times.

 

How do we make John Hattie’s “Visible Learning” work in maths?

Visible Learning is John Hattie’s mantra. I’ve written previously about being a big fan of Hattie’s work about what affects achievement. He’s collected just about every piece of academic research and collated a ranked taxonomy of factors that affect achievement. Visible Learning is his suggested approach to teaching in a way that incorporates many of the significant drivers of achievement.

He sums up what Visible Learning looks like using the analogy of teaching someone to abseil. The main features of the learning being:

  1. The success criteria are explicit. The learners not only know what they’re trying to achieve, but also what success looks like. Giving learning objectives isn’t enough; students need to see a model of success.
  2. The tasks are challenging. Trusting the rope to take your weight is genuinely difficult, but when learners complete the abseil for the first time they get a real sense of satisfaction and the hunger to do it again.
  3. Student expectations. Ask students to state how they think they will do before they start a task. Human nature is to play safe in the prediction. When they exceed their forecast, their belief in themselves as learners increases. This ratchets up over time and their expectations of themselves rise. Student expectation has by far the highest-ranking effect size in all of Hattie’s findings.

Know thy impact is another of his mantras. He argues that formative assessment is vital in quality teaching and that teachers should constantly be using evidence to reflect on the impact their practice had on their learners. Hattie says assessments are more for teachers than students; they are for you to find out what you taught well and to whom. This reflective, evidence-based mindset, he argues, captures the essence of what educational research concludes has a high impact on achievement. Teaching is to D.I.E for, says Hattie: Diagnose what they do/don’t know, Intervene, Evaluate your impact. Repeat.

I’ve been experimenting with an approach to teaching maths that meets the Visible Learning criteria and allows me to systematically know my impact by using the D.I.E philosophy. This has been the biggest step forward in my practice this year and the learners really like it. The problem is, I can’t see a way of embedding it permanently in my practice. Let me explain…

At the beginning of a topic students are given this sheet that gives the learning objectives.

[Image: topic tracker sheet listing the learning objectives]

The students then make a prediction of how many questions they will get right on the pre-learning assessment and enter this number into the table.

The students then sit the pre-learning assessment:

[Image: the pre-learning assessment]

This is class-marked, and the students then highlight the appropriate smileys on the tracker sheet (after the pre-learning assessment, not before it) to record their success against each learning objective. They update the score in the table with what they actually got. Finally, a grid of students’ names against the learning objectives is passed round; students tick and cross it so I have a record of how each student did on each question.

I plan my lessons based on this pre-learning assessment feedback. I structure my lessons based on who knows what. I know exactly where to pick up from to avoid teaching things they really already know.
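The tick-and-cross grid can be thought of as data that points directly at what to reteach. A minimal sketch (the student and objective names here are hypothetical; the post only describes the grid itself):

```python
# Tick/cross grid from the pre-learning assessment:
# True = got the question right, False = got it wrong.
results = {
    "Amy":  {"LO1": True,  "LO2": False, "LO3": True},
    "Ben":  {"LO1": True,  "LO2": False, "LO3": False},
    "Cara": {"LO1": False, "LO2": False, "LO3": True},
}

def objectives_to_teach(results, threshold=0.5):
    """Return objectives where the class success rate is below threshold,
    i.e. the ones worth full lessons rather than little-and-often starters."""
    objectives = next(iter(results.values())).keys()
    weak = []
    for lo in objectives:
        rate = sum(results[s][lo] for s in results) / len(results)
        if rate < threshold:
            weak.append(lo)
    return weak

print(objectives_to_teach(results))  # → ['LO2']
```

Here two-thirds of the class already have LO1 and LO3, so only LO2 needs a full teaching sequence; the rest can be consolidated via starters.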

After the series of lessons, students predict how they will do on the post-learning assessment, sit it, record their actual score, update the smileys and set targets about anything they still haven’t mastered.

The students love it. Comments include: “It really helps me understand what I’ve got to learn”, “It makes me realise that I am actually making progress in maths even when my grade on the exams isn’t going up every time”, “It makes the things I’ve got to learn a lot clearer” and “the assessments help you figure out what you do and don’t know. Stuff I thought I knew, I found out I didn’t and the other way round too. It’s been really useful”.

There are other things I really like:

  1. Using formative feedback from assessments has allowed me to be much more diagnostic and has really impacted my lesson planning. Rather than making assumptions about where to pick up from, I now know without any doubt. Things I would have spent whole lessons on before are now little-and-often starters. The pitch of my activities has changed. When I can see they have visited things before, I set deeper learning activities much earlier than I would have previously. What was a chatty class is suddenly much more focussed and on task more regularly. I’m pitching work better than ever because of the quality of the information I now have on what students already know.
  2. I’ve never been a fan of giving students a list of learning objectives and asking them to RAG (red/amber/green) how good they think they are at them. RAG-ing doesn’t account for student ignorance! The conscientious, high-attaining students underestimate what they can do and the overly-confident, lazy ones overestimate. By doing the RAG-ing based on what they can do on an actual assessment, the results are much more accurate. This is confirmed by the gap between the score students predict before the pre-learning assessment and the score they actually get; the difference is significant in many cases.
  3. I think the approach captures the spirit of the forthcoming assessing without levels reforms. The pre and post learning assessments document progress in a formative portfolio. Targets can be set based on accurate diagnostic assessment of their weaknesses. Progress is very clear to external observers, me the teacher and to the students.
  4. Supplying the learning objectives in conjunction with the pre-learning assessment really gives students focus from the start. They see the learning journey and buy into it. “I can’t wait until we get to the trapezium lesson”, was one student’s comment after the pre-learning assessment. I’ve never had students consistently looking forward along their learning journey like this before.

All sorted then, the world is fixed! No. I just can’t see a way of making it work all the time. I’m not trying to be defeatist. It’s not the preparation involved, it’s simply the time required to run it. There are 30 modules on the scheme of work, 352 learning objectives and it is already a struggle to cover the course content within the allotted time. Two extra lessons per module for the pre- and post-learning assessments mean 60 more lessons needed over the KS4 course. It’s simply not possible to sacrifice 60 lessons to this and still cover the course content. Do you agree?
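To make the scale of the time problem concrete, here is the back-of-the-envelope arithmetic. The 30 modules and 2 assessment lessons per module come from the scheme of work above; the total KS4 lesson count is a hypothetical assumption for illustration only:

```python
# Cost of running pre- and post-learning assessments for every module.
modules = 30                        # from the scheme of work
assessment_lessons_per_module = 2   # one pre- and one post-learning assessment

extra_lessons = modules * assessment_lessons_per_module
print(extra_lessons)  # → 60

# Hypothetical: ~350 maths lessons across the whole KS4 course.
assumed_total_lessons = 350
share = extra_lessons / assumed_total_lessons
print(f"{share:.0%}")  # → 17%
```

Under that assumption, roughly a sixth of all available teaching time would go to the assessments themselves, which is why the savings from not reteaching known topics don’t close the gap on their own.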

This has been an issue causing me great frustration recently. The learning is so good with this system. It encapsulates so much of what Hattie says really boosts achievement, and I have seen first-hand why. Visible Learning is really, really good learning.

I want this to be something that is practical. I can’t see where the 60 lessons are going to come from. Schools just can’t do it. In theory, because you don’t need to teach everything (some topics students already know are identified on the pre-learning assessment) there are some savings here, but it’s not enough. Perhaps set the pre and post learning assessments as homeworks? That would have pitfalls! Where’s the time going to come from?!

I’m out of ideas currently. Do you have any? How could I make this work? Please make any suggestions you have in the comments section.

John Hattie: Why are so many of our teachers and schools so successful?

[Image: Visible Learning infographic]

I’m a big fan of John Hattie‘s work. Many people have big ideas to improve achievement in education. What makes John’s work unique is that he tells you by how much things improve achievement. In a time-pressured world, what makes his work so useful is that he tells you what to prioritise and focus on, and importantly, what to ignore.

I was fortunate to see him in the flesh a couple of weeks ago at the brilliant Osiris Educational Outstanding Teaching Conference 2014. He’s a guy who certainly knows his stuff, is worth listening to, and cuts through the fads and fashions to the real evidence on what works. His conclusions are heart-warming.

For a summary of the main messages of his work, here are some great videos:

Visible Learning Part 1- Disasters and below average methods

Visible Learning Part 2- Effective Methods

Why are so many of our teachers and schools so successful?
