Exam Paper Debriefs (Summer 2012)

I’m combining two resources into one post here, but hopefully they should still show up by searching. (He types, hurriedly adding some tags.) I’ve made two powerpoints, each matched to what I think are the easy marks available on the summer 2012 P1 and P2 exams from AQA. They’re useful as practice or as full mocks; I often have students go through them focusing on what they should all aim for, before checking through in more detail. Having students divide their missed marks (using this exam paper debrief pdf) into recall failures and method mistakes can be helpful.

If students are able, they could also be pointed towards the examiners’ reports, which are only available if you go through the subject link at AQA rather than the direct Past Papers route. If not, then this is our job anyway – perhaps something to consider as part of a backwards design approach?

P1 june2012 easy as ppt, for the P1 summer 2012 exam – see also my P1 summary activity.

P2 may2012 easy as ppt, for the P2 summer 2012 exam – see also my P2 summary activity.

And yes, before you ask – I am working on equivalent resources for more recent exams, hopefully to be done before we all need them for mocks. Although the summer 2013 papers haven’t shown up yet – is that because, without January 2014 papers to use, AQA are expecting those to be used as mocks too? Must check e-AQA… (adds to ever-growing to-do list)

Finally: yes, I’ve been fairly quiet and quite down of late; lots going on, I’ll be fine, send chocolate and coffee if feeling helpful. As that’s pretty much all I’ve been eating for a while, supplies are running low!



Generating Electricity (the YorkScience way)

If you’re a science teacher and not yet aware of the approach being trialled and developed as part of YorkScience, you should probably go have a look. My understanding is that it revolves around two linked ideas:

  • If we start by considering how we will assess competence, we can make sure we use appropriate activities to develop that competence.
  • Students should have access to everything we can provide about how answers are judged, so they can focus on what separates good and better answers.

To a novice, outsider or government minister, this may look like ‘teaching to the test’. The problem is that using that phrase shows the speaker thinks the test is no good. There’s no reason why we shouldn’t teach students to perform well in a test, if the test shows us something useful about the students’ knowledge and skill.

Anyway, political point made, let’s get on with the show.

I’ve produced (or more accurately, mostly finished producing) a resource which teachers can use to assess and develop understanding of generating electricity. There are two pdfs. The first is intended to make a booklet (4 sides of A4) which students write on over the course of a lesson and homework. The lesson plan would be as follows:

Starter: “Wind or Nuclear Power?” on the board, with cartoons or images if preferred.


  1. Students attempt multiple choice question in exam conditions. Allow time to fill in the ‘Why’ section too – perhaps ten minutes overall?
  2. Model the process of using the confidence grid using the first part of the exam question, ideally projecting the pdf and, by discussion, ticking appropriate boxes.
  3. Students work together in pairs or small groups to finish the grids.
  4. The second resource, generating electricity diagnostic incomplete ms, leads students through different (but overlapping) activities depending on which answers they chose. This is intended to mimic the teacher judgement which means you explain things in different ways depending on how and why a student made a mistake. So far only the first part (of four) is completed.
  5. Discuss and compare the notes students have made to support them in each answer.
  6. Start (and probably set for homework) the notes summary on the last page of the booklet. This includes key words for prompts and gives some suggestions of format.


Plenary:

  • Which would you choose, Wind or Nuclear Power? Students must back up their opinion with a (revisable) fact.
  • What exam skills have we practised today?

I’m hoping to post the full version of the markscheme pages, as soon as they’re done. This may be completed as an extension activity by my triple group. 🙂 Comments and suggestions welcome, please let me know what you think.

Doing an ISA – Pre-Practical

There will be a second post in a few days, if I can fit it in between coughing, marking and spending time with my family. Please excuse the brevity, but it seems highly unlikely that my broadband connection – thank you Talk Talk – will last long enough for my usual wittering.

This is intended for those of us who teach GCSE Science with AQA, to help with the joy of an ISA. Of course we’ve no idea what format this will take once Gove’s messed around with it, but I can be fairly confident that even he couldn’t make it any worse. I’ve blogged before about the weaknesses I see with the current model, and what I’ve done to address them. Here are the resources I’m currently using to try and help my classes. They should work, with tweaking of course, for any variant of the AQA Science courses. Click on the image for the presentation:

ISA preprac

I found that my students, despite having been shown the sample exam papers while they researched, struggled to include all relevant information on their Research Notes sheets. My solution was to produce an extra sheet with more detailed prompts, similar to those in the presentation above, which they could fill in. I had them keep the exam paper and markscheme open in an extra tab, and annotate their sheet with the linked question numbers for each fact. They then transferred their messy information to the official sheets, which of course acts as another rehearsal before the exam.

ISA preprac as .pdf

Please let me know what you think, good and bad. The ‘post-prac’ equivalents should be up by the end of half-term, subject to the usual caveats.

6 Mark Questions

This is one approach to teaching the dreaded 6 mark AQA questions. I’d be interested in comments or suggestions, as ever. The powerpoint that goes along with it was set up for B1, but is obviously easily changed. 6 Mark Questions as ppt.


  • Recap key facts
  • Improve structure of answers to 6 mark questions
  • (Appreciate that it’s hard to write good 6 mark questions and markschemes)


Question on board, set timer running: “You have 6 minutes.”

I do it, We do it together

Ask what they think the aim of the lesson is.

6 mark questions may require explanations, examples to illustrate a specified concept, judgements of advantages and disadvantages, a description of a process or an experimental method. Marks are awarded for scientific content and the quality of the writing; key ideas must be clear and the explanation must make sense, with the points in a logical order. Most students lose marks because their answers lack sufficient detail (e.g. scientific vocabulary) or because their answer is rambling or confused. Markschemes will usually include graded answers (low = 1–2 marks, then 3–4 and 5–6) and examiners will decide which description fits best, then award the higher or lower score depending on the quality of writing. Aim for between 4 and 6 scientific points or steps in a process; if opposing viewpoints are needed, include points for and against, or examples of plants and animals etc.

Introduce method:

  • Bullet point ideas
  • Number the points to give a logical sequence, adding or removing points.
  • Use this order to write coherent sentences.

Model with a new question, ask students to consider how they would structure their answer, show numbers, ask them to discuss possible sentences based on these points. Compare with each other, pick up on details needed by examiner.

You do it together

Give them more questions, have them discuss one in pairs while they attempt it. Collaboration should be about making suggestions and producing two different answers which can be compared, not one identical answer. You could give a choice or set it by rows. Go through example bullet points, discuss gaps, additions and exclusions. Elicit possible/useful connectives.

You do it alone

Attempt a question in exam conditions, following method. Compare to markscheme (ideally this one should be a past or sample question with specified allowed answers) and make specific improvements. Return to the original Starter question and annotate their answer, explaining why they would change various parts.


  • Have students write their own questions and markschemes for specific points in the syllabus. Linking this to higher order tasks via Bloom’s or SOLO may be useful.
  • Use the questions to play consequences, where one student writes a question, one writes bullet points, one sequences them and the last writes full sentences. This will end up with four complete answers which can then be discussed.
  • Give sample answers and have students mark them, first with and then without a markscheme. What do they forget? What level of detail is required?


UPDATE: A useful approach from @gregtheseal via twitpic, and I like the ‘CUSTARD’ mnemonic shared by @IanMcDaid. Thank you!

Ofqual’s Absolute Error

In science lessons we teach students about the two main categories of error when taking readings. (And yes, I know that it’s a little more complicated than that.) We teach about random and systematic error.

Random errors are the ones due to inherently changing and unpredictable variables. They give readings which may be above or below the so-called ‘true value’. We can make allowances for them by repeating the reading, keeping all control variables the same, then finding a mean value. The larger the range, the bigger the potential random error – this is now described as the precision of the reading. I sometimes have my students plot this range as an error bar.

A systematic error is an artefact of the measuring system. It will be consistent in direction and size (perhaps in proportion to the reading, rather than absolute). A common type is a ‘zero error’, where the measuring device does not start at zero so all readings are offset from the true value. We sometimes calibrate our readings to account for this.

You can consider spelling errors due to sloppy typing as being random, while persistently misspelling a particular word is systematic.
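As a minimal sketch of the two corrections described above (the readings and the zero offset here are invented for illustration):

```python
# Hypothetical repeat readings of the same quantity, showing random error:
# values scatter above and below the 'true value'.
readings = [4.8, 5.2, 5.0, 4.9, 5.1]

mean = sum(readings) / len(readings)    # best estimate of the true value
spread = max(readings) - min(readings)  # range: larger range = poorer precision

# Hypothetical systematic (zero) error: the balance reads 0.3 with nothing
# on it, so every reading is offset by +0.3 and can be calibrated out.
zero_error = 0.3
calibrated = [r - zero_error for r in readings]
```

Repeating and averaging reduces the effect of the random scatter, but no amount of averaging removes the zero error – only calibration does.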

So what does this have to do with Ofqual?

The recent issues with the scoring of GCSE English coursework – discussed on twitter with the hashtag #gcsefiasco – are a good example of errors causing problems. But if we use the scientific approach to errors, it is much harder to blame teachers as Stacey has done.

Coursework is marked by teachers according to a markscheme, provided by the exam board. (It’s worth remembering that apart from multiple choice papers all external exams are marked in this way too.) An issue with controlled assessments is that teachers are unavoidably familiar with the marking guidelines, so can ensure students gain skills that should help them demonstrate their knowledge. This is after all the point of the classroom, to learn how it’s done. To complain that we ‘teach to the test’ is like criticising driving instructors for teaching teenagers how to drive on British roads.

Once the work of all students in a cohort has been marked, the department will spend some time on ‘internal moderation’. This means checking a random sample, making sure everyone has marked in the same way, and to the standard specified by the markscheme. Once the school has committed to the accuracy of the marks, they are sent to the exam board, which will specify a new random sample to be remarked externally. If the new scores match those awarded by the school, within a narrow tolerance, then all the scores are accepted. If not, then all will be adjusted, up or down, to correct for a systematic error by the department. There will still be a few random errors – deviations from the ‘correct’ score on specific essays – but these will be fairly rare.
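The moderation logic above can be sketched as a toy calculation; the marks and the tolerance value here are invented, not the boards’ actual figures:

```python
# Hypothetical moderation check: compare a school's marks with external
# re-marks on a sample; a consistent difference is a systematic error.
school = [32, 27, 40, 35, 29]
remarked = [30, 25, 38, 33, 27]  # external moderator consistently 2 lower

diffs = [s - r for s, r in zip(school, remarked)]
mean_diff = sum(diffs) / len(diffs)  # the systematic component

TOLERANCE = 1  # invented value; real tolerances are set by the board
adjusted = school
if abs(mean_diff) > TOLERANCE:
    # shift the whole cohort's marks to correct the systematic error
    adjusted = [m - mean_diff for m in school]
```

Note the correction is applied to every mark in the cohort, not just the sampled ones – which is exactly how a zero error is calibrated out of a set of readings.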

The exam board then converts the coursework score, using a top secret table, into a percentage of the available marks. You may not need to get everything perfect to get an ‘effective’ 100% on the coursework element of the course. And dropping 2 of 50 on the raw score, as marked by the teachers, may mean more than a 4% decrease after conversion. This table will be different for different papers because some exams are harder than others, but changes should be minimal if we want to be able to compare successive years.
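The effect of a non-uniform conversion table can be shown with invented numbers (the real tables are secret, so everything below is hypothetical):

```python
# Invented raw-score -> percentage table for illustration only.
# Note the top band: more than one raw score maps to 100%, and the
# steps below it are not uniform.
conversion = {50: 100, 49: 100, 48: 93, 47: 90, 46: 87, 45: 84}

# Not everything perfect, yet an 'effective' 100%:
near_perfect = conversion[49]

# Dropping 2 of 50 raw marks (4% of the raw total) costs 7 percentage
# points here, because of where the steps in the table fall.
drop_for_two = conversion[50] - conversion[48]
```

With a table like this, two candidates a couple of raw marks apart can end up much further apart after conversion – which is why quiet changes to the table between sittings matter so much.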

So what happened last summer?

Students who had gained the same raw score on the same coursework task, which had been marked to the same standard as confirmed by the exam boards during external moderation, were awarded different percentages by the exam boards depending on when the work was sent in. This was after sustained pressure from Ofqual, possibly because using the same boundaries in June as they had in January would have resulted in ‘too many’ higher grades. This was not about a small number of random errors in marking. This was not about a systematic error by some or all schools, because the boards had procedures to identify that. This was about a failure by the exam boards and Ofqual to discreetly fix the results the way they intended to.

It is a basic principle in science that you cannot adjust your results based on what you want or expect them to be. You might be surprised, you might recheck your working, but you can’t change the numbers because of wishful thinking. If there was an error, it was by the exam boards and Ofqual, who showed that they could not specify what work was equivalent to a C grade.

The procedures were followed in schools. The exam boards agreed that the controlled assessments were marked to their own standards. And yet Ofqual still claim that it is the fault of us teachers, who prepared our students so well for the controlled assessment that we are being called cheats.

I’ve blogged before about the weaknesses built in to the science ISAs. The exam board and Ofqual are either too busy to read what one teacher has to say – perfectly reasonable – or don’t have an answer. I don’t understand how it is our fault when their system approved what teachers did and how they marked.

So maybe we shouldn’t be marking controlled assessments at all.

PS (This is the cue for the unions to step in. And they won’t. This is why we need one national professional body representing teachers, using evidence rather than political rhetoric.)

Doing an ISA with AQA

I’ve managed not to blog about GCSE ‘reform’ – despite great temptation. If you’ve not seen them, then I suggest comparing three very different viewpoints (in style as well as opinion) from LKMCo, Tom Bennett and NAHT. When I have time I might update my previous post, from the last time Gove announced a major policy by leaking the details to the Daily Mail.

For now, a quick ‘ideas’ post about using ISAs for good science teaching, and hopefully enabling kids to achieve. This is partly in response to questions from @NQT_diary, as it’s spurred me to turn the draft into an actual readable item.

Teachers’ Notes

  • the ISA involves lots of paper – maybe your department will be organised, but double check
  • make sure you practise the actual experiment, if for no other reason than to generate the ‘sample data’ needed
  • remember that the markscheme is now ‘best fit’; compare with colleagues if needed to make sure you are consistent as a centre, as this is arguably the most important aspect come moderation day
  • you can share more than you think with the students


Perhaps somewhat idealistically, I try to use ISA teaching as a way to bring together lots of ‘bits’ of investigative science. Ideally, of course, you will have used all of the skills and language in regular lessons; that after all is the point. Make sure that KS3 pupils are familiar with at least some of the terminology. The practicals are straightforward (sometimes insultingly so) which means students can focus on their explanations and analysis. Make sure you are using the updated language; I have sometimes had pupils create their own version of this using a range of examples.

My Structure

  1. Introduction
  2. Research 1
  3. Research 2
  4. Preparation for planning exam (Section 1)
  5. Section 1 exam inc table
  6. Practical 1
  7. Practical 2 inc graph/chart
  8. Preparation for analysis exam (Section 2)
  9. Section 2 exam

There are lots of issues with the ISA, as I blogged a little while back. It is possible to use it effectively, but in some ways I feel the exam works against good teaching; this wouldn’t be a problem if it didn’t take so long!

Students will need to complete the ‘research notes’ pro forma to take into their Section 1 exam; I had them do a ‘rough’ version which meant they had lots of material to annotate while revising/preparing. How much you direct them to particular sites is frustratingly vague, but in my setting we provided a range of sources, some deliberately not well-suited, to make sure they had to think critically. Once the table is marked you can provide a replacement if that suits the practical better, without penalty. This means they aren’t penalised if a poor table would stop them collecting useful data. After the practical, the data and graph/chart must be collected, and returned for the Section 2 exam, along with a set of ‘sample data’ (which you produce), the ‘Case Studies’ (supplied by AQA) and their Research Notes. They need a big table.

While teaching I used GRR principles (skills development from literacy, more info coming soon), which focus on productive collaborative work. This adds an explicit stage to the teaching of skills (rather than content):

  1. I do, thinking out loud
  2. We do together
  3. You do collaboratively
  4. You do individually

The same structure can be used for the preparation lessons for both exams, and this brings us to the most surprising part of the ISA. We can share the specimen papers with students, and the exams are very defined in style so that in many cases they are effectively identical to the specimen. So they can attempt the specimen questions, go through the markscheme with teacher support, then sit what they know will be a very similar exam about their own research and experiment.

This still seems weird to me.

The preparation for the planning and analysis exams can be done in similar ways:

  • Talk through the specimen context and model a possible question for them, linking to key definitions (5min)
  • Have them predict and write down 2/3 questions that could be asked about experiment or data (5/10min)
  • In small groups, give them part of the specimen paper and have them discuss main points (10min)
  • Write their answers individually to improve accountability (10min)
  • Go through markscheme, comparing good/intermediate answers, having them mark/annotate their answers (15min). If there’s time, they could compare answers from students who had time to discuss with those who answered ‘cold’.

This gives them the practice they need, as well as building the skills. Of course ideally we would use all these bits individually in other lessons! I’d love to hear from anyone with thoughts or comments about what I’ve suggested.

Enemies of Promise

This will be a short post, partly because I’ve got lots of other things on the go and partly because I’m too angry about what appears to have happened. I say appears because I truly hope things aren’t as they seem, for the sake of our students.

In January – and at points since then – Michael Gove has labelled teachers and others who criticise his plans as ‘enemies of promise’. This has been used despite the criticisms often being valid and fair, based on data rather than ideology, and often from those who clearly know far more about educational theory and practice than him.

It appears, from lots of conversation on twitter and in the press, that this year’s GCSE results show some unexpected features. Overall, they seem to be a little lower than in previous years, and one exam in particular seems to have affected English results. Students who completed the foundation controlled assessment papers in January needed a lower score to achieve a C than those who sat the equivalent exam in June. (This issue is one we have seen many times with the AQA Science equivalent, ISAs.) The difference is significant and means that many students nationally have failed to reach a Grade C despite being on track for it up until this point.

There are two issues here, one of which is immediately significant. Students who have failed to achieve a grade C in English will find that their next steps – college or sixth form courses, apprenticeships and so on – are now barred to them. This matters now. Many of them will have been expecting to confirm their education and training places in the next week or so. There is little time to address this problem, if things are really as unfair as they seem.

And things are unfair. Most teachers, most people, accept that more challenging courses are worthwhile. Students may not be happy with the idea, but the difficulty of achieving particular grades is effectively an arbitrary choice. Changing it from year to year, or between exam boards, obviously makes comparisons and target setting much harder, but it is not unfair. Changing the grade boundaries, between the students sitting an exam and being given their grade – for students doing one particular course – is clearly very different. The press today have suggested it is like moving the goalposts not just during a football game, but after a penalty has been taken and before the ball crosses the line.

A cynic would suggest that the government sees moving goalposts after the numbers are known as a standard political tactic.

The other issue – and today of all days, this must be seen as secondary to the plight of affected students – is that schools are judged on their results. Gove and the Department for Education can take greater steps to control what happens in a school if GCSE results drop below certain levels. Significant indicators are the number of 5 A*–C grades, including English and Maths, and the EBacc. Both of these will drop in schools which have had students marked down from a C to a D grade in English due to these eleventh-hour changes.

I’m trying very hard not to be cynical. I don’t teach English, except in the sense that many teachers share favourite books, correct spelling or help with grammar. But like many others, I struggle to see the fairness in changing how students are graded, after they have studied and sat the exam. Their lower results will now make it easier for unpopular, non-evidence-based and rushed changes to be pushed through, including forced academisation. This means it is even more important to find out who ordered these grade boundary alterations.

Who are the enemies of promise now, Mr Gove?