It’s not often I can claim to be ahead of the trend. Pretty much never, to be honest. But this time I think I’ve managed it, and so I’m going to make sure all my readers, at least, know about it.

Recently the TES “exclusively reported” – which means other sites paraphrased their story and mentioned their name, but didn’t link – that Cambridge Assessment was considering ‘crowd-sourcing’ exam questions. This would involve teachers sending in possible questions which would then be reviewed and potentially used in external exams. Surplus questions would make up a large ‘question bank’.

I suggested this. This is, in fact, pretty much entirely my idea. I blogged ‘A New Exam Board’ in early 2012 suggesting teachers contribute questions which could then provide a range of sample papers as well as external exams. So it is not, despite what Tim Oates claims, a “very new idea.” Despite the similarity to my original post I do, however, have some concerns.

Backwards Backwards Design

So instead of teachers basing their classroom activities on giving kids the skills and knowledge they need to attempt exam questions, we’re doing it the other way around? As I’ve written before, it’s not necessarily a bad thing to ‘teach to the test’ – if the test is a good one. Writing exam questions and playing examiner is a valuable exercise, both for teachers and students, but the questions that result aren’t always helpful in themselves. As my OT-trained partner would remind me: “It’s the process, not the product.”


Being an examiner is something that looks good on a CV. It shows you take qualifications seriously and have useful experience. How can teachers verify the work they put into this? How can employers distinguish between teachers who sent in one dodgy question and those who shared a complete list, meticulously checked and cross-referenced? What happens when two or more teachers send in functionally identical questions?


A related but not identical point. How is the time teachers spend on this going to be recognized financially? And should it be the teacher, or the school? Unless they are paid, teachers are effectively volunteering their time and professional expertise, while Cambridge Assessment will continue to pay their permanent and contract staff. (I wonder how they feel about their work being outsourced to volunteers…)


It’s hardly surprising at this early stage that the details aren’t clear. One thing I’m interested in is whether the submissions shared as part of the ‘question bank’ will go through the same quality control process as those used in the exams. If so, it will involve time and therefore money for Cambridge Assessment. If not, it risks giving false impressions to students who use the bank. And there’s nothing in the articles so far to say whether the bank of questions will be free to access or part of a paid product.

Student Advantage

Unless there are far fewer ‘donated’ questions than I’d expect, I don’t think we will really see a huge advantage held by students whose teachers contributed a question. But students are remarkably sensitive to teachers’ claims that “there’s always a question on x” or “it wasn’t on last year’s paper, so expect y topic to come up”. So it will be interesting to see how they respond to their teachers contributing to the exam they’ll be sitting.

You’re Welcome

I look forward to hearing from Cambridge Assessment, thanking me for the idea in the first place…


It turns out that I’m really bad at following up conference presentations.

Back in early June, I offered a session on teachers engaging – or otherwise – with educational research. It all grew out of an argument I had on Twitter with @adchempages, who has since blocked me after I asked if the AP Chem scores he’s so proud of count as data. He believes, it seems, that you cannot ever collect any data from educational settings, and that he has never improved his classroom practice by using any form of educational research.

But during the discussions I got the chance to think through my arguments more clearly. There are now three related versions of my opinion, quite possibly contradictory, and I wanted to link to all three.

Version the first: Learning From Mistakes, blogged by me in January.

Streamlined version written for the BERA blog: Learning From Experience. I wrote this a while back but it wasn’t published by them until last week.

Presentation version embedded below (and available to download if you’re interested).

I’d be interested in any and all comments, as ever. Please let me know if I’ve missed any particular comments from the time – this is the problem with being inefficient. (Or, to be honest, really busy.) The last two slides include all the links in my version of a proper references section.

Thoughts from the presentation

Slide 8: it’s ironic that science teachers, who know all about using models which are useful even though they are by necessity simplified, struggle with the idea that educational research uses large numbers of participants to see overall patterns. No, humans aren’t electrons – but we can still observe general trends using data.

Slide 13: it’s been pointed out to me that several of the organisations mentioned offer cheaper memberships/access. These are, however, mainly institutional memberships (eg £50/yr for the IOP) which raises all kinds of arguments about who pays and why.

Slide 14: a member of the audience argued with this point, saying that even if articles weren’t open-access any author would be happy to share electronic copies with interested teachers. I’m sure he was sincere, and probably right. But as I tried to explain, this assumes that (1) the teacher knows what to ask for, which means they’ll miss all kinds of interesting stuff they never heard about, and that (2) the author is happy to respond to potentially dozens of individual requests. Anyone other than the author or journal hosting or sharing a PDF is technically breaking the rules.

Slide 16: Ironically, the same week as I gave the presentation there was an article in SSR on electricity analogies which barely mentioned the rope model. Which was awkward as it’s one of the best around, explored and endorsed by the IOP among many others.

Slide 20: Building evidence-based approaches into textbooks isn’t a new idea (for example, I went to Andy’s great session on the philosophy behind the Activate KS3 scheme) but several tweeters and colleagues liked the possibility of explicit links being available for interested teachers.

Just think… in a few weeks, you’ll have a new crop of brand-new Year 7 students. Shiny faces, uniforms without holes and a complete pencil case. For about a day.

So it’s nearly time to teach graphs.

You may have already seen the resources produced by the ASE on the Language of Maths in Science (LoMiS). If not, go download them for free and have a look. It’s worth it, really. For a quick taste, Richard Needham did a piece for the Royal Society of Chemistry a while back which is a great introduction to the aims of the project.

And here’s an approach I’ve come up with which you may find a useful beginning. It’s based on what I’ve done in lessons in the past with a final addition I’ve been discussing recently with delegates and colleagues at the SPN Oxford Summer School.

1 coordinates


1 Number Lines

Putting numbers in sequence on a line is something students start to do at a young age, long before secondary school. To be honest, if kids can’t put whole numbers in the right order then graphs are going to be a distant dream. I agree that decimals make this harder at times, but I’m working on something about that too. Next week, maybe.

So give students a list of values and ask them to put them on a number line in order. Add challenge by having them convert values between units first, or have different numbers of significant figures. Top half of image:

Number lines
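If you want to generate or check a mixed-unit value set quickly before laminating anything, a few lines of code will do it. This is just a sketch with made-up length readings; swap in whatever quantity and units suit your class.

```python
# Convert a mixed set of length readings to metres, then sort them
# into number-line order. Values and units here are invented examples.
UNIT_TO_M = {"mm": 0.001, "cm": 0.01, "m": 1.0, "km": 1000.0}

def to_metres(value, unit):
    """Convert a (value, unit) reading to metres."""
    return value * UNIT_TO_M[unit]

readings = [(35, "cm"), (0.4, "m"), (320, "mm"), (0.0005, "km")]
ordered = sorted(readings, key=lambda r: to_metres(*r))
print(ordered)  # smallest first: 320 mm, 35 cm, 0.4 m, 0.0005 km
```

Handy for making answer cards, and for convincing yourself the ‘obvious’ order isn’t obvious once units are mixed.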

2 Number Lines to Scale

They might do this automatically. If not, it shouldn’t be too hard to have them do so (image above). Once they have a scale sorted out for the line, placing laminated cards for your supplied values along it should be straightforward.

3 Number Line to Scale = Axis

If you now have your students put the two number lines (one from each set of values) at right angles, they should be able to see that they’ve defined each point.

4 Mathematical Axes Of Doom

Two wooden dowels from B&Q (other DIY stores are available), with insulation tape wrapped round at regular intervals. I deliberately chose different intervals. Next time, I’d probably use lengths of wood with a rectangular cross-section, simply so they don’t roll. You could use metresticks but I wanted to avoid any numbers. The tape is all you need, really.

2 axes

Put them at right angles and you have a set of axes, with the intervals clearly marked. Add the coordinate cards – because students have used the idea of a coordinate system for a lot longer than they’ve used graphs to tell a story – in the right places. They’re easy to adjust, so there’s less stress. (Low stakes, yes?) And if they look from above, any pattern is clear and anomalies can be considered. They can even see the best-fit line.

3 plotted

Extension ideas: use larger or smaller cards to convey the idea of precision in the readings. There is a link here to the idea of error bars, something we don’t usually cover but may find useful.
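If you want a digital follow-up once the cards have been placed, the same idea – two scaled number lines defining points, then a ruler laid over them for the best-fit line – takes only a few lines of code. The readings below are invented (a spring-stretching flavour), purely for illustration.

```python
import numpy as np

# Example (made-up) readings: extension of a spring at different loads.
load_n = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # x-axis number line
extension_cm = np.array([0.1, 2.1, 3.9, 6.0, 8.1])  # y-axis number line

# Degree-1 least-squares fit: the digital equivalent of laying a ruler
# over the cards and judging the best-fit line by eye.
gradient, intercept = np.polyfit(load_n, extension_cm, 1)
print(f"extension ≈ {gradient:.2f} × load + {intercept:.2f}")
```

Students can then argue about whether the computer’s line matches the one they chose by eye, which is a nice conversation about anomalies in itself.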

Thoughts, ideas, suggestions? Please let me know in the usual ways.

NB: you get funny looks if you carry the sticks on to a train.

I’m sure many other bloggers have posted about this already, but in case it’s passed you by: the new GCSE specification officially starts from September. Many schools, of course, started teaching from the draft simply because, if you’re delivering GCSE Science over three years, there was no choice. For various reasons I’ve been looking at the AQA version in quite a lot of detail (as my previous post explained) and I wanted to share a summary I put together of the new content. The new material comes from both directions, KS3 and A-level. It’s probably worth me explaining this.

Until now, some material was taught at KS3 (assuming you followed the national curriculum matched to the much-lamented SATs) and then assumed for GCSE. Some of this is now explicitly examined as part of the exam at 16. You could, of course, claim that once it’s been taught at age 13 it wouldn’t need to be revisited. Which, in my opinion, is daft. Other material has been taught as part of A-level for years but hasn’t featured in the KS4 specification for a long time – certainly not in my teaching memory of a decade or so. This will be a particular issue for schools which don’t deliver A-level, because they won’t have the equipment or experience.

Energy: less emphasis on heat transfer and no mention of U-values. Note the use of the ‘new’ energy language (stores and pathways/processes) plus extra equations.

Electricity: a few bits of new vocabulary and slightly developed maths – eg it now explicitly includes P = I²R. Static electricity now includes electric fields, so you might want to try out the oil and semolina demonstration, which is a nice parallel to iron filings around a magnet.
Particles: quite a lot of added material. This includes the idea of latent heat and the associated equation, which I don’t think has been taught to this age group since the days of O-level. There’s also lots on pressure in fluids (including gases) and the relationship between P and V aka Boyle’s Law.
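For the pressure–volume relationship, a quick numerical check can reassure non-specialists: at constant temperature pV stays the same, so halving the volume doubles the pressure. The values below are invented, roughly a syringe-sized sample of air at atmospheric pressure.

```python
# Boyle's law: p1 * V1 = p2 * V2 at constant temperature.
p1 = 100_000   # initial pressure in Pa (example value, ~atmospheric)
v1 = 0.0006    # initial volume in m^3
v2 = 0.0003    # volume halved by pushing the plunger in

p2 = p1 * v1 / v2
print(p2)  # 200000.0 Pa: pressure doubles when volume halves
```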
Radioactivity: now includes neutrons as nuclear radiation, which personally I think is quite helpful. The vocabulary used distinguishes between irradiation and contamination (you may find this explanation helpful), but there’s less detail on industrial uses.
Forces: lots added to this topic. Scalars and vectors are now explicit and students must be able to resolve forces at right angles using scale drawings. The levers content has been extended to gears. Pressure includes both the equation for a column of liquid (this PhET simulation might help) and atmospheric pressure. The suvat equations are introduced with v² = u² + 2as. Students need to be able to find tangents on d/t graphs. There’s new vocabulary to do with inertial mass. Not just the relationships but the identities of Newton’s Laws are needed, as well as a surprising amount of recall of ‘typical values’ such as reaction times, walking and running speeds and so on.
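For that suvat equation, here’s the kind of worked example students will need, done as a short script. The numbers are mine, not from any exam board: an object falling from rest through 20 m.

```python
import math

# v^2 = u^2 + 2as: final speed of an object accelerating from rest.
u = 0.0    # initial speed in m/s
a = 9.8    # acceleration in m/s^2 (free fall, example value)
s = 20.0   # distance travelled in m

v = math.sqrt(u**2 + 2 * a * s)
print(round(v, 1))  # 19.8 m/s
```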
Waves: the sound and light content, previously at KS3, is now examined including mixing of colours and transmission or absorption by filters. Sound includes ultrasound uses. ‘Perfect’ black body definitions and uses are expected. This cheap wave driver might be useful for the required practical.
Magnetism: this includes all KS3 content but extends electromagnetism to an equation previously saved for A-level, F = BIl. The ideas of induced potential and the generator effect are also covered. On a personal note, I’d consider teaching the transformers material twice, once as part of electricity and once here.
Space: many teachers are disappointed that this topic is reduced – and completely missing if students do ‘double’ aka Trilogy rather than separate sciences. I’ve always found it a topic which engaged weaker learners due to the big ideas and lack of scary maths, and now they won’t get to see it.
Hope these links are helpful – please comment or email if you have better suggestions or any other thoughts.


As I mentioned in my previous post, I’ve recently been doing some freelance work in a local school. The role is short-term and has an interesting mix of aims, but one part is to work with Year 11 students on data analysis questions. Now, obviously I’ve taught these skills before, but I’ve not used the OCR B specification, which features a final data question worth ten marks. I know this is running out soon but thought it might be worth sharing what I’ve created.

Firstly, a plea to all exam boards. When you release Examiners’ Reports – which are really useful, please keep doing it – can you combine them with the markscheme for easy reference? It’s something I’ve done for a while but it would make much more sense for you to do it.




Predictably, the specimen paper isn’t a great example to use. I’ve not included the 2015 paper because many schools will be using it for preparation in controlled conditions. The links above are to my own copies in case OCR rearranges their site with the new specifications, and I’ve added the Section D page details to the filenames to make life easier for colleagues.

It seems a good time to remind you all that in the past I produced quite a few resources for looking at past exam papers, mostly AQA. The tags on the right should make it fairly easy to find them.

When we used these in class, one of the outcomes was that students put together a list of “things to try if you’re stuck”. Now, for many pupils this will have been built in to their teaching, but we all know that kids don’t always absorb what we’re hoping they will. I think the real value of this is to generate a list with your own students, but for your interest:

  1. Highlight or underline numbers in the question
  2. Draw lines from the axes at specified values so you can find the corresponding value
  3. If the question is about differences, you’ll need to add or subtract
  4. If the question is about rates or uses the word ‘per’, you’ll need to divide or multiply and you might need to think about gradient or slope
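Tip 4 is the one students find hardest, so here’s the calculation spelled out as a short script. The readings are invented, as if taken from two points on a volume–time graph.

```python
# Rate of change = gradient = change in y / change in x,
# using two points read from a graph (example values only).
t1, vol1 = 10.0, 4.0    # time in s, volume in cm^3
t2, vol2 = 30.0, 14.0

rate = (vol2 - vol1) / (t2 - t1)
print(rate)  # 0.5 cm^3 per second
```

The same two-point recipe works for any ‘per’ question, which is the point of putting it on the list.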

Comments and suggestions welcome, as always.



I’ve been pretty quiet recently – at least it feels like I’ve not been offering much to the conversation. There are several reasons, but a big part of it is that with paid freelance work I’ve really not been able to justify the time to do things for free. I’m not going to apologize for this because I’m sure you’ll all understand that without this work my family and I can’t go on holiday.
But I’ve missed you all, even if you’ve not been missing me.
This will be a quick post, hopefully to be followed up over the next week with another. I’ve been working in a school a couple of days a week, mixing teacher coaching with some intervention classes. It’s been interesting – and enjoyable, at least after the kids stopped swearing at me – so I thought it might be worth sharing a few things I’ve done.
I’m currently reading Mentoring Mathematics Teachers, effectively a collection of research papers published as a book. Now, I don’t teach maths – except in the process of getting the physics right – but I’ve found it really interesting. It’s mainly aimed at in-school mentors for pre-service teachers (PGCE, School Direct or similar) and NQTs. I’ve got a strong interest in how we can support teachers for a longer period than just a year, and in my day job we mentor ‘Early Career Teachers’ to the end of their second year post-qualification. I’m working through about a chapter a week, making notes in the margins, and really need to blog some of the ideas. So it was perfect timing to come to Chapter 9 by Lofthouse and Wright, about encouraging reflection by using a pro forma for observations. I’ve adapted it slightly with a fair bit of success and wish I’d been using it for longer.
As a physics teacher, I feel I should now make the point that teaching is a quantum process which is changed simply by the act of being observed. If you laughed at that, congratulations and please pick up your Physics Education Geek badge on the way out.
observation pro forma

Click for PDF version

There are four stages:

  1. The ‘observee’ defines one or two aspects they want to focus on, choosing a couple of questions for the observer to bear in mind.
  2. The observer makes notes of specific features in the lesson relating to these questions – no judgment, just facts.
  3. The observer poses questions based on these features to prompt reflection and discussion.
  4. Together, the colleagues plan future actions based on the outcome of these prompts, leading to questions for the next observed lesson.
The aim of this structure is to encourage reflective practice rather than “I saw X and you should try Y instead.” In this way both teachers gain from it as there isn’t necessarily a hierarchy in place. It would work just as well when an experienced teacher is observed by a novice, with the questions directing them towards interesting features of the lesson. I can also see it being useful for peer observation – and like all such activities, it would work best when well-separated from any kind of performance management process.
I should emphasize that this is my take on the process rather than a paraphrased version of the original. And, of course, I’m still tweaking it! Currently I’m following up soon after the lesson, but I wonder if leaving the sheet with the observed teacher, so they can think about the prompts more deeply, might be worthwhile. I’m numbering the pieces of evidence I see and then grouping them in the ‘Reflection Prompts’ section where appropriate – this helps me gather my thoughts and gives more than one relevant example.
EDIT: I recommend reading a great post by @bennewmark, Finding a Voice, for the issues that can arise when an observee tries to replan a lesson based on well-meaning comments from a colleague.
Please help yourself to the printable version, try it out and let me know what you think. Maybe everyone else has something better already – it’s two years since I had a lesson observed! But I’d appreciate, as ever, any feedback or suggestions.

I’m really starting to get annoyed with this, and I’m not even in the classroom full-time. I know that many colleagues – @A_Weatherall and @hrogerson on Staffrm for example – are also irritated. But I needed to vent anyway. It’ll make me feel better.

EDIT: after discussion on Twitter – with Chemistry teachers, FWIW – I’ve decided it might help to emphasise that my statements below are based on looking at the Physics specification. I’d be really interested in viewpoints from those who focus on teaching Biology and Chemistry, as well as those with opinions on whether I’ve accurately summed up the situation with Physics content or overreacted.

The current GCSE Science specifications are due to expire soon, to be replaced by a new version. To fit in with decisions by the Department for Education, there are certain changes to what we’ve been used to. Many others have debated these changes, and in my opinion they’re not necessarily negative when viewed objectively. Rather than get into that argument, I’ll just sum them up:

  1. Terminal exams at the end of year 11
  2. A different form of indirect practical skills assessment (note that ISAs and similar didn’t directly assess practical skills either)
  3. More content (100+ pages compared to the previous 70ish for AQA)
  4. Grades 9-1 rather than A*-G, with more discrimination planned for the top end (and, although not publicised, less discrimination between weaker students)

Now, like many other subjects, the accreditation process seems to be taking longer than is reasonable. It also feels, from the classroom end, that there’s not a great deal of information about the process, including dates. The examples I’m going to use are for AQA, as that’s the specification I’m familiar with. At least partly that’s because I’m doing some freelance resource work and it’s matched to the AQA spec.

Many schools now teach GCSE Science over more than two years. More content is one of several reasons why that’s appealing; the lack of an external KS3 assessment removes the pressure for an artificial split in content. Even if the ‘official’ teaching of GCSE starts in Year 10, the content will obviously inform year 9 provision, especially with things like language used, maths familiarity and so on.

Many schools have been teaching students from the first draft specification since last September. The exam boards are now working on version three.

The lack of exemplar material, in particular questions, means it is very hard for schools to gauge likely tiers and content demand for ‘borderline’ students. Traditionally, this was the C-D threshold, and I’m one of many who recognized the pressure this placed on schools with league tables, with teachers being pushed much harder to help kids move from a D to a C grade than from a C to a B. The comparison is (deliberately) not direct: as I understand it, an ‘old’ middle grade C is now likely to be a grade 4, below the ‘good pass’ of a grade 5.

Most schools start to set for GCSE groups long before the end of Year 9. Uncertainties about the grade implications will only make this harder.

The increased content has three major consequences for schools. The first is the teaching time needed, as mentioned above. The second is CPD; non-specialists in particular are understandably nervous about teaching content at GCSE which until now was limited to A-level. This is my day-job and it’s frustrating not to be able to give good guidance about exams, even if I’m confident about the pedagogy. (For Physics: latent heat, the equation for energy stored in a stretched spring, electric fields, pressure relationships in gases, scale drawings for resultant forces, v² = u² + 2as, magnetic flux density.) The last is the need for extra equipment, especially for those schools which don’t teach A-level Physics, with the extra worry about required practicals.

Even if teachers won’t be delivering the new specification until September, they need to familiarize themselves with it now. Departments need to order equipment at a time of shrinking budgets.

I’m not going to suggest that a new textbook can solve everything, but they can be useful. Many schools have hung on in the last few years as they knew the change in specification was coming – and they’ve been buying A-level textbooks for that change! New textbooks can’t be written quickly. Proofreading, publishing, printing, delivery all take time. This is particularly challenging when new styles of question are involved, or a big change such as the new language for energy changes. Books are expensive and so schools want to be able to make a good choice. Matching textbooks to existing resources, online and paper-based, isn’t necessarily fast.

Schools need time to co-ordinate existing teaching resources, samples of new textbooks and online packages to ensure they meet student needs and cost limitations.

Finally, many teachers feel they are being kept in the dark. The first specification wasn’t accredited, so exam boards worked on a second. For AQA, this was submitted to Ofqual in December (I think) but not made available on the website. Earlier this month, Ofqual chose not to accredit this version, but gave no public explanation of why – information which would have given teachers an idea of what was safe to rely on and what was likely to change. Teachers are left to rely on individual advisers, hearsay and Twitter gossip. It took several weeks for the new submission dates to appear on the website – now mid-March – and according to Ofqual it can take eight weeks from submission to accreditation.

If these time estimates are correct, the new AQA specification may not be accredited until mid-May, and as yet there is nothing on record about what was wrong with previous versions. Teachers feel they are being left in the dark, yet will be blamed when they don’t have time to prepare for students in September.

I think that says it all.


