Famous numbers sequences card sort

Here’s a card sort I made to introduce my new year 7s to famous number sequences. The idea is students cut out and group the cards into 6 famous number sequences:

  1. Square numbers
  2. Cube numbers
  3. Triangle numbers
  4. Fibonacci Sequence
  5. Even numbers
  6. Odd numbers

Each group should contain:

  1. Name
  2. Pictorial representation
  3. The numbers
  4. A fact about the sequence
  5. An explanation of how the sequence is produced
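As a quick self-check (or to regenerate the numbers on the cards), the six sequences can be produced with a short Python sketch; the six-term cut-off is my own choice for illustration:

```python
# Generators for the six famous sequences in the card sort.

def squares(n):
    return [k * k for k in range(1, n + 1)]

def cubes(n):
    return [k ** 3 for k in range(1, n + 1)]

def triangles(n):
    # the nth triangle number is 1 + 2 + ... + n, i.e. n(n + 1)/2
    return [k * (k + 1) // 2 for k in range(1, n + 1)]

def fibonacci(n):
    # each term is the sum of the previous two, starting 1, 1
    terms = [1, 1]
    while len(terms) < n:
        terms.append(terms[-1] + terms[-2])
    return terms[:n]

def evens(n):
    return [2 * k for k in range(1, n + 1)]

def odds(n):
    return [2 * k - 1 for k in range(1, n + 1)]

for name, seq in [("Square", squares), ("Cube", cubes),
                  ("Triangle", triangles), ("Fibonacci", fibonacci),
                  ("Even", evens), ("Odd", odds)]:
    print(f"{name}: {seq(6)}")
```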

Click this link to download the resource as a pdf: Famous number sequences card sort

*Thanks to Wikipedia and the BBC for the images.

Vocab Blitz- improving student literacy

Literacy is every teacher’s fight, no matter what subject we teach, even maths. Textbooks often have keywords and, sometimes, definitions. However, rarely will students learn vocabulary from just these, in my experience; they need to work with the vocabulary.

Two activities I tried last year were “keyword bingo” and “taboo words”. In the former you read out definitions whilst the students circle the matching keywords on a bingo grid. In the latter students have to describe a keyword without using the other “taboo words” written on their card.

Students enjoyed these activities but I’m not sure the tasks were that effective at increasing the students’ fluency with the vocabulary. The reason is that the activities were not leading the students to do what we ultimately want them to do: use the vocabulary in self-constructed sentences with automaticity.

An INSET we received a couple of years ago emphasised the importance of developing students’ spoken fluency before then working on getting it down on paper. “Talk like a book” was the mantra. Since building this into my practice I have been trying to find an activity which can bridge between “talk like a book” and getting students to write with vocab fluency. I think I’ve found it and I’m going to call it “Vocab Blitz”…!

The idea is you give students a paragraph that contains waffle instead of the concise vocabulary, for example:

The shape I’m describing has four straight sides. Two of the sides point in the same direction and wouldn’t cross if they carried on forever. You can’t fold the shape in half so each side perfectly matches up with the other. If you turn the shape around it won’t look the same as it did at the start until you’ve gone all the way around.

The idea of Vocab Blitz is that the students need to rewrite the paragraph with as few words as possible by substituting the waffle words with mathematical vocabulary, for example:

A trapezium is a quadrilateral. It has one pair of parallel sides, no lines of symmetry and no rotational symmetry.

The winner is the student who retains all the information from the original paragraph but rephrases it in correct English with as few words as possible!
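If you wanted an objective referee for the word counts, a toy scorer is trivial to write; the strings below are shortened versions of the example paragraphs above, purely for illustration:

```python
# A toy word-counter for judging Vocab Blitz entries (fewest words wins).
# The example strings are shortened versions of the paragraphs above.

def word_count(paragraph):
    """Score an entry by its number of whitespace-separated words."""
    return len(paragraph.split())

waffle = ("The shape I'm describing has four straight sides. Two of the sides "
          "point in the same direction and wouldn't cross if they carried on forever.")
concise = "A trapezium is a quadrilateral. It has one pair of parallel sides."

print(word_count(waffle))   # the waffle version
print(word_count(concise))  # the concise rewrite
```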

I like this activity because it mirrors what students do when “talking like a book.” They often waffle at first then improve their sentence by inserting mathematical vocabulary in subsequent efforts. Vocab Blitz is essentially copying this process but in written form. It encourages students to improve on written drafts. Once they’ve done a few Vocab Blitz activities they could transition to writing the first drafts themselves then improving them…

Experiments with Visible Learning showing big promise…

This post builds on my last one about a model for assessing without levels using the principles of John Hattie’s Visible Learning philosophy. Make sure you’ve read my last post before this one.

The first thing I’d like to share is that my worries about time constraints preventing me from using this model were unfounded. After some terrific Twitter feedback I realised the pre and post learning assessments could be done for homeworks. The students see the point in them and so approach them with more focus than they do a traditional homework.

The pre-learning assessment results have significantly impacted my teaching. Below are the results from a pre-learning assessment I did with a year 9 class at the start of a topic on area and perimeter, and they changed how I approached the teaching of the whole topic. Note the colour coding. There were three lessons I didn’t need to teach at all as the students remembered those from KS3. The lessons I’ve coloured blue were taught using collaborative learning strategies (Kagan structures etc.). Half the students knew how to do them, so I used those students to teach the others (reciprocal teaching is a big yes from Hattie). The yellow lessons were taught using direct instruction methods (not dirty words in Hattie’s world). This is high-quality formative feedback genuinely leading my planning.

[Screenshot: year 9 area and perimeter pre-learning assessment results spreadsheet]
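The way I turned the pre-learning results into a lesson plan could be sketched like this; the thresholds and objective names are illustrative assumptions of mine, not the actual figures from the spreadsheet:

```python
# Hypothetical sketch: triaging lessons from pre-learning assessment marks.
# The 80% / 40% thresholds are illustrative assumptions, not the real ones.

def triage(results, skip_at=0.8, peer_at=0.4):
    """results maps each learning objective to the fraction of the
    class who answered its question correctly."""
    plan = {}
    for objective, correct in results.items():
        if correct >= skip_at:
            plan[objective] = "skip"        # most of the class already know it
        elif correct >= peer_at:
            plan[objective] = "peer-teach"  # reciprocal teaching in mixed groups
        else:
            plan[objective] = "direct"      # needs direct instruction
    return plan

results = {
    "area of a rectangle": 0.9,
    "area of a trapezium": 0.5,
    "arc length and sectors": 0.1,
}
print(triage(results))
```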

Then came the post-learning assessment after the lessons:

[Screenshot: area and perimeter post-learning assessment results spreadsheet]

The first thing that struck me was how poorly students had retained their knowledge from the sectors lessons. Know thy impact is Hattie’s mantra. This truly showed me, brutally, how poor my impact had been in those lessons. Bad for the ego, but great for objectively reflecting on how well you’ve taught things. It also confirmed my long-held belief that using work in students’ exercise books to make APP-style decisions about learning progress is totally unreliable. The students’ books were full of beautiful sectors calculations, all well presented and accurate. By looking at their books you would conclude their ‘learning’ of sectors was as good as their ‘learning’ of all the other topics. It wasn’t. They didn’t learn the sectors work well at all, despite what their books would tell you.

This naturally forced me to ask ‘why was their learning from the sectors lessons so poorly retained?’ I considered various options such as:

  • was it the quality of my explanation?
  • was it the lesson structure?
  • were my activities incorrectly pitched?
  • had I overloaded their working memory during my explanation?
  • had I rushed this topic and not used AFL correctly to tell me when to move on?

There could, of course, be endless other explanations, but I concluded with confidence that the way I built up the explanation was at fault. I went into the lesson with the hook ‘how many ghosts can Pacman eat?’ to engage pupils with the idea of thinking about what proportions of circles comprised the compound shapes of Pacman and the ghosts. From that I went into the sector formula etc. I realised in hindsight that this step was too quick and it is likely the students didn’t see the link, and so were confused when working with the formula. I think the use of an early context ‘fogged’ their focus on learning how to use the formula and where it came from.

I then retaught the lesson taking a different approach (talking about fractions of circles without angles first before then developing how you can use angles to describe these) and progress was clearly much better after a summative check a couple of days later. The alternative explanation had worked.

The yellow objectives in the table were hit regularly during starters over the following week to ‘iron out’ the final few issues a small number of students were still having with them.

It was naturally disappointing to see that students hadn’t retained as much as in-class AFL told me they had. It hurts the pride somewhat, but once you get over that and accept this is the impact you’ve had, you feel enlightened and know exactly where to focus your intervention efforts. I feel closer to the ‘learning pulse’ than ever before and can ‘visibly see’ (Visible Learning!) what I’ve taught well or not in a way I never have before. I felt empowered to target my next few lessons to plug the gaps, knowing exactly where they were.

The biggest revelation I’ve had from this so far is that in-class AFL (using mini-whiteboards etc.), even when done with summative-style questions, does not tell you what has gone into students’ long-term memory. It tells you what has gone into either working memory or long-term memory, but not which one. It doesn’t tell you what they’re going to retain. The only way to find out what went into long-term memory is to summatively assess a few days later. In-class AFL is essential for checking understanding that day and directing lessons at key hinge points, but it is no substitute for summative assessment some time after the lesson to find out what was ‘learned’ rather than what was just ‘understood and copied’. You need both. Making progress judgements by looking in exercise books is unreliable for the same reasons as in-class AFL.

My pride was bruised by the post-learning assessment above, but the more I live ‘Visible Learning’, the more I understand what Hattie’s talking about as high-impact practice.  I have so much more clarity on what students have learned than ever before. I know when lessons have gone well. I know when they’ve gone poorly despite my ego telling me otherwise. I feel like I’ve been teaching blind until now! I’m ‘plugged in’ to the ‘real learning’ more than ever and make fewer assumptions. I teach based on what they know rather than what I assume they know. It’s made me a more reflective practitioner and more objective.

There’s certainly something in this Visible Learning!

Next step is to trial this with all my KS4 classes next year to see if the model will work across the attainment spectrum in a sustainable way.

Fun times.

 

How do we make John Hattie’s “Visible Learning” work in maths?

Visible Learning is John Hattie’s mantra. I’ve written previously about being a big fan of Hattie’s work on what affects achievement. He’s collected just about every piece of academic research and collated a ranked taxonomy of factors that affect achievement. Visible Learning is his suggested approach to teaching in a way that incorporates many of the significant drivers of achievement.

He sums up what Visible Learning looks like using the analogy of teaching someone to abseil. The main features of the learning are:

  1. The success criteria are explicit. The learners not only know what they’re trying to achieve, but also what success looks like. Giving learning objectives isn’t enough; they need to see what success looks like.
  2. The tasks are challenging. Trusting the rope to support your weight is genuinely hard. Learners find it extremely challenging, but when they complete it the first time they get a real sense of satisfaction and the hunger to repeat it.
  3. Student expectations. Ask students to state how they think they will do before they start a task. Human nature is for them to play safe in their prediction. When they exceed their forecast, their belief in themselves as a learner increases. This ratchets up over time and their expectations of themselves rise. This has by far the highest ranking effect size in all of Hattie’s findings.

Know thy impact is another of his mantras. He argues that formative assessment is vital in quality teaching and that teachers should constantly be using evidence to reflect on the impact their practice has had on their learners. Hattie says assessments are more for teachers than students; they are for you to find out what you taught well and to whom. This reflective, evidence-based mindset, he argues, captures the essence of what educational research concludes has a high impact on achievement. Teaching is to D.I.E. for, says Hattie: Diagnose what they do/don’t know, Intervene, Evaluate your impact. Repeat.

I’ve been experimenting with an approach to teaching maths that meets the Visible Learning criteria and allows me to systematically know my impact by using the D.I.E philosophy. This has been the biggest step forward in my practice this year and the learners really like it. The problem is, I can’t see a way of embedding it permanently in my practice. Let me explain…

At the beginning of a topic, students are given a sheet listing the learning objectives.

[Image: learning objective tracker sheet]

The students then make a prediction of how many questions they will get right on the pre-learning assessment and enter this number into the table.

The students then sit the pre-learning assessment:

[Screenshot: pre-learning assessment]

This is class-marked and the students then highlight the appropriate smileys on the tracker sheet (after the pre-learning assessment, not before it) to record their success against each learning objective on the pre-learning assessment. They update the score in the table with what they actually got. Finally a grid is passed round with the students’ names vs the learning objectives. They tick and cross the grid so I then have a record of how each student did on each question.

I plan my lessons based on this pre-learning assessment feedback. I structure my lessons based on who knows what. I know exactly where to pick up from to avoid teaching things they really already know.

After the series of lessons, students predict how they will do on the post-learning assessment, sit it, record their actual score, update the smileys and set targets for anything they still haven’t mastered.

The students love it. Comments include: “It really helps me understand what I’ve got to learn”, “It makes me realise that I am actually making progress in maths even when my grade on the exams isn’t going up every time”, “It makes the things I’ve got to learn a lot clearer” and “the assessments help you figure out what you do and don’t know. Stuff I thought I knew, I found out I didn’t and the other way round too. It’s been really useful”.

There are other things I really like:

  1. Using formative feedback from assessments has allowed me to be much more diagnostic and has really impacted my lesson planning. Rather than making assumptions about where to pick up from, I now know without any doubt. Things I would have spent whole lessons on before are now little-and-often starters. The pitch of my activities has changed. When I can see they have visited things before, I set deeper learning activities much earlier than I would have previously. What was a chatty class is suddenly much more focused and on task more regularly. I’m pitching work better than ever because of the better information I have on what students already know.
  2. I’ve never been a fan of giving students a list of learning objectives and asking them to RAG (red/amber/green) how good they think they are at them. RAG-ing doesn’t account for student ignorance! The conscientious, high-attaining students underestimate what they can do and the overly-confident, lazy ones overestimate. By doing the RAG-ing based on what they can do on an actual assessment, the results are much more accurate. This is confirmed by the difference between the score students predict before the pre-learning assessment and the score they actually get; the gap is significant in many cases.
  3. I think the approach captures the spirit of the forthcoming assessing without levels reforms. The pre and post learning assessments document progress in a formative portfolio. Targets can be set based on accurate diagnostic assessment of their weaknesses. Progress is very clear to external observers, me the teacher and to the students.
  4. Supplying the learning objectives in conjunction with the pre-learning assessment really gives students focus from the start. They see the learning journey and buy into it. “I can’t wait until we get to the trapezium lesson”, was one student’s comment after the pre-learning assessment. I’ve never had students consistently looking forward along their learning journey like this before.

All sorted then, the world is fixed! No. I just can’t see a way of making it work all the time. I’m not trying to be defeatist. It’s not the preparation involved; it’s simply the time required to run it. There are 30 modules on the scheme of work and 352 learning objectives, and it is already a struggle to cover the course content within the allotted time. Two extra lessons per module for the pre- and post-learning assessments mean 60 more lessons needed over the KS4 course. It’s simply not possible to sacrifice 60 lessons to this and still cover the course content. Do you agree?

This has been an issue causing me great frustration recently. The learning is so good with this system. It encapsulates so much of what Hattie says really boosts achievement and I have seen at first-hand why. Visible Learning is really, really good learning.

I want this to be something that is practical. I can’t see where the 60 lessons are going to come from. Schools just can’t do it. In theory, because you don’t need to teach everything (some topics students already know are identified on the pre-learning assessment) there are some savings here, but it’s not enough. Perhaps set the pre and post learning assessments as homeworks? That would have pitfalls! Where’s the time going to come from?!

I’m out of ideas currently. Do you have any? How could I make this work? Please make any suggestions you have in the comments section.

John Hattie- Why are so many of our teachers and schools so successful?

visible-learning-infographic-john-hattie-studie-460x400

I’m a big fan of John Hattie‘s work. Many people have big ideas to improve achievement in education. What makes John’s work unique is that he tells you by how much things improve achievement. In a time-pressured world, what makes his work so useful is that he tells you what to prioritise and focus on, and importantly, what to ignore.

I was fortunate to see him in the flesh a couple of weeks ago at the brilliant Osiris Educational Outstanding Teaching Conference 2014. He’s a guy who certainly knows his stuff, is worth listening to, and cuts through the fads and fashions to the real evidence on what works. His conclusions are heart-warming.

For a summary of the main messages of his work, here are some great videos:

Visible Learning Part 1- Disasters and below average methods

Visible Learning Part 2- Effective Methods

Why are so many of our teachers and schools so successful?

Enhanced by Zemanta

Brainfeed- fantastic educational videos app for kids

Brainfeed_Splash

I highly recommend you check out the Brainfeed app in the iOS App Store. It is essentially a collection of high-quality educational videos about a wide variety of topics. For example: How does the brain work? Does space go on forever? How can you outrun a cheetah? How big is the ocean? There are mathematically themed videos too.

The strength of the app is the quality of the videos and the assurance that they will be appropriate for your young learners. Rather than letting your children search on YouTube and potentially come across inappropriate material, you know the Brainfeed videos will be entertaining, engaging, educational and appropriate.

They remind me a lot of the TED-Ed project, whose videos I have regularly featured on this blog. Like those, the Brainfeed videos are top class!

Here are some sample screenshots:

[Screenshots: Brainfeed app]

Robocompass.com

There’s a new geometry tool in town and it goes by the name of Robocompass.com. It shows geometrical constructions in a 3D environment, rather than the 2D plan view used by Geogebra, Geometer’s Sketchpad etc. In addition to a wealth of common construction examples, you can program Robocompass to make your own constructions using an easy-to-learn language.

Here’s the video tour:


Don Steward’s MEDIAN blog- fantastic isometric/ plans and elevations/ nets resources

Don Steward keeps on churning out his amazing resources! I can’t recommend his blog, MEDIAN, highly enough. If this guy wrote the textbooks/ worksheets/ exams our curriculum would be so much more challenging (in a good way) and mathematically thought-provoking. He’s inspired my own practice a lot. Below are links to three brilliant sets of resources he’s produced recently. Click the links to visit his site and download the resources from there.

Isometric Pictures


Plans and Elevations


Net Tasks


 


Benoit Mandelbrot talking about fractals in the real world

Ever heard of the Mandelbrot Set? It’s a famous fractal discovered by Benoit Mandelbrot, the father of fractal geometry. In this fascinating TED talk he explains his Theory of Roughness and how fractals can be found all around us: in cauliflowers, the stock market, mountainous landscapes and much more…
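For the curious, membership of the Mandelbrot Set is simple to test numerically: a complex number c is in the set if repeatedly applying z → z² + c, starting from z = 0, never escapes to infinity. A minimal Python sketch (the iteration cap of 100 is an arbitrary choice):

```python
# Test whether a complex number c belongs to the Mandelbrot set by
# iterating z -> z^2 + c from z = 0 and watching for escape.

def in_mandelbrot(c, max_iter=100):
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:   # once |z| > 2 the orbit is guaranteed to escape
            return False
    return True

print(in_mandelbrot(0))                  # 0 stays at 0 forever: in the set
print(in_mandelbrot(complex(0.5, 0.5)))  # escapes within a few iterations
```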

Fractals and the Art of Roughness



The Difference Between Games-Based Learning and Gamification

James Paul Gee (Photo credit: Wikipedia)

A thought-provoking read about the difference between games-based learning and gamification from the TeachThought blog:

The Difference Between Games-Based Learning and Gamification

I have experimented with gamification principles myself in recent years and am a big fan of the ideas. James Paul Gee is an authority on the subject and his blog is always worth a read: http://www.jamespaulgee.com/

This is a talk he gave about what education can learn from video games:
