Equation recall test

This was supposed to be a really quick job. For something I’m working on, I was looking at the equations students need to recall for the GCSE Physics exam (specifically AQA). And it annoyed me that they weren’t in a useful order, or a useful format for testing. So I’ve made a testing sheet, with pages for Energy, ‘mostly Electricity’ and Forces.

There are four columns, which are blank in the first three pages (for students) but completed in the answer sheet version. Because I’m good to you.

Download eqn testing sheets as PDF

Equation for…

I’ve given the word, not the symbol – thoughts? (Could/should that be another column?) I’ve removed a couple of what I see as duplications, and missed out momentum because I was thinking of this as being for everybody. Plus it would have meant adding another row and I was sick of messing with formatting.

Which variables are involved?

For students to write in the variables in words, as a starting point. The idea would be that you can give partial credit for them getting part way there, because we should recognise the early stages of recall. You may of course have them skip this bit later on.

What are the symbols?

If they know the variables, can they write down what they will look like in the equation? This is also where they can show they know how the ‘equation for…’ variable features in symbol form.

Equation

Formatted as best I can, in a hurry in Publisher. I’ve used the letters as listed on the formula sheet, p95 of the specification. Even when I disagree.
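If it helps anyone rebuild the sheet outside Publisher, here’s a rough sketch of the four columns as data – the two rows are my own illustration using standard AQA equations, not an export of the actual file:

```python
# Hypothetical sketch of the sheet's structure as data; the example
# rows are standard AQA equations, not taken from the real file.
rows = [
    {
        "equation_for": "kinetic energy",
        "variables": ["kinetic energy", "mass", "speed"],
        "symbols": ["Ek", "m", "v"],
        "equation": "Ek = 0.5 * m * v**2",
    },
    {
        "equation_for": "weight",
        "variables": ["weight", "mass", "gravitational field strength"],
        "symbols": ["W", "m", "g"],
        "equation": "W = m * g",
    },
]

# Student version: print only the first column, leaving the rest blank.
for row in rows:
    print(f"{row['equation_for']:<20} | {'':<35} | {'':<12} | {'':<20}")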

As ever, please let me know if/when you spot mistakes. Because it’s in Publisher I can’t upload the editable version here, but drop me a line in the comments if useful and I’ll send it your way.


Core Physics revision sites handout

This second post in a day will be even briefer than the last. After complaints from my Year 10 students that they couldn’t possibly be expected to find good websites by themselves – yes, I know – I produced a quick handout listing a few URLs and comments for them. I was going to put it on the VLE, but realised it would be much more likely to be used if they had instant access, so added QR codes and gave them printed copies. Of course they were very appreciative that I gave up my break this morning to make this for them.

Stop laughing.

Anyway, here it is as a pdf. It’s got two identical pages because that was the fastest way to print off A5 versions, although it does mean there’s a bit of wasted space.

revision sites pic
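For what it’s worth, I made mine by hand, but the QR step can be scripted; here’s a minimal sketch using the third-party Python qrcode package, with placeholder URLs rather than the ones on my handout:

```python
# Batch-generate QR codes with the third-party 'qrcode' package
# (pip install "qrcode[pil]"). The URLs here are placeholders only.
import qrcode

urls = {
    "bbc_bitesize": "https://www.bbc.co.uk/bitesize",
    "revision_site": "https://example.com/physics",
}

for name, url in urls.items():
    img = qrcode.make(url)   # returns a PIL image of the QR code
    img.save(f"{name}.png")  # ready to drop into the handout
```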

Now, as this has quite possibly saved you a few minutes, I have a request to make. Use two of those minutes to add to my portfolio. Simply follow this link and tick a few boxes, no names necessary, so I can show how what I do helps people outside my immediate school. Many thanks.

Doing an ISA – Pre-Practical

There will be a second post in a few days, if I can fit it in between coughing, marking and spending time with my family. Please excuse the brevity, but it seems highly unlikely that my broadband connection – thank you Talk Talk – will last long enough for my usual wittering.

This is intended for those of us who teach GCSE Science with AQA, to help with the joy of an ISA. Of course we’ve no idea what format this will take once Gove’s messed around with it, but I can be fairly confident that even he couldn’t make it any worse. I’ve blogged before about the weaknesses I see with the current model, and what I’ve done to address them. Here are the resources I’m currently using to try to help my classes. They should work, with tweaking of course, for any variant of the AQA Science courses. Click on the image for the presentation:

ISA preprac

I found that my students, despite having been shown the sample exam papers while they researched, struggled to include all relevant information on their Research Notes sheets. My solution was to produce an extra sheet with more detailed prompts, similar to those in the presentation above, which they could fill in. I had them keep the exam paper and markscheme open in an extra tab, and annotate their sheet with the linked question numbers for each fact. They then transferred their messy information to the official sheets, which of course acts as another rehearsal before the exam.

ISA preprac as .pdf

Please let me know what you think, good and bad. The ‘post-prac’ equivalents should be up by the end of half-term, subject to the usual caveats.

Ofqual’s Absolute Error

In science lessons we teach students about the two main categories of error when taking readings. (And yes, I know that it’s a little more complicated than that.) We teach about random and systematic error.

Random errors are the ones due to inherently changing and unpredictable variables. They give readings which may be above or below the so-called ‘true value’. We can make allowances for them by repeating the reading, keeping all control variables the same, then finding a mean value. The larger the range, the bigger the potential random error – this is now described as the precision of the reading. I sometimes have my students plot this range as an error bar.
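A quick worked example of that arithmetic, with invented readings:

```python
# Invented repeat readings of the same quantity; the scatter between
# them is the random error.
readings = [12.1, 11.8, 12.4, 12.0, 11.9]  # e.g. time in seconds

mean = sum(readings) / len(readings)
spread = max(readings) - min(readings)  # the range

print(f"mean = {mean:.2f} s")
print(f"range = {spread:.2f} s; half the range is a common error-bar size")
```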

A systematic error is an artefact of the measuring system. It will be consistent, in direction and size (perhaps in proportion to the reading, rather than absolute). A common type is a ‘zero error’, where the measuring device does not start at zero so all readings are offset from the true value. We sometimes calibrate our readings to account for this.
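And the equivalent sketch for correcting a zero error, again with made-up numbers:

```python
# A balance reads 0.03 kg with nothing on it (a made-up zero error),
# so every reading is offset by the same amount and can be corrected.
zero_error = 0.03
raw_readings = [1.53, 2.08, 0.78]  # kg, straight off the display

calibrated = [round(r - zero_error, 2) for r in raw_readings]
print(calibrated)  # [1.5, 2.05, 0.75]
```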

You can consider spelling errors due to sloppy typing as being random, while persistently misspelling a particular word is systematic.

So what does this have to do with Ofqual?

The recent issues with the scoring of GCSE English coursework – discussed on twitter with the hashtag #gcsefiasco – are a good example of errors causing problems. But if we use the scientific approach to errors, it is much harder to blame teachers as Stacey has done.

Coursework is marked by teachers according to a markscheme, provided by the exam board. (It’s worth remembering that apart from multiple choice papers all external exams are marked in this way too.) An issue with controlled assessments is that teachers are unavoidably familiar with the marking guidelines, so can ensure students gain skills that should help them demonstrate their knowledge. This is after all the point of the classroom, to learn how it’s done. To complain that we ‘teach to the test’ is like criticising driving instructors for teaching teenagers how to drive on British roads.

Once the work of all students in a cohort has been marked, the department will spend some time on ‘internal moderation’. This means checking a random sample to make sure everyone has marked in the same way, and to the standard specified by the markscheme. Once the school has committed to the accuracy of the marks, they are sent to the exam board, which will specify a new random sample to be remarked externally. If the new scores match those awarded by the school, within a narrow tolerance, then all the scores are accepted. If not, then all will be adjusted, up or down, to correct for a systematic error by the department. There will still be a few random errors – deviations from the ‘correct’ score on specific essays – but these will be fairly rare.
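The logic of that external check looks something like this – the tolerance and marks below are invented, as the boards don’t publish them in this form:

```python
# Invented sample: the tolerance and marks are illustrative only; the
# real thresholds belong to the exam boards.
school_marks    = [34, 41, 27, 38]  # sample as marked by the school
moderator_marks = [33, 40, 26, 37]  # same scripts, remarked externally

tolerance = 2  # hypothetical allowed mean difference

diffs = [m - s for s, m in zip(school_marks, moderator_marks)]
mean_diff = sum(diffs) / len(diffs)

if abs(mean_diff) <= tolerance:
    print("Sample agrees: all the school's marks stand")
else:
    # A consistent offset is a systematic error: adjust the whole cohort.
    print(f"Adjust every mark by {mean_diff:+.1f}")
```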

The exam board then converts the coursework score, using a top secret table, into a percentage of the available marks. You may not need to get everything perfect to get an ‘effective’ 100% on the coursework element of the course. And dropping 2 of 50 on the raw score, as marked by the teachers, may mean more than a 4% decrease after conversion. This table will be different for different papers because some exams are harder than others, but changes should be minimal if we want to be able to compare successive years.
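A toy version of such a table – the numbers are invented, since the real ones are secret – shows how that happens:

```python
# Invented raw-to-percentage table illustrating how a 2-mark raw drop
# can cost more than 4% after conversion.
conversion = {50: 100, 49: 97, 48: 93, 47: 90}  # raw mark -> percentage

print(conversion[50] - conversion[48])  # 7, i.e. a 7% drop for 2 raw marks
```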

So what happened last summer?

Students who had gained the same raw score on the same coursework task, which had been marked to the same standard as confirmed by the exam boards during external moderation, were awarded different percentages by the exam boards depending on when the work was sent in. This was after sustained pressure from Ofqual, possibly because using the same boundaries in June as they had in January would have resulted in ‘too many’ higher grades. This was not about a small number of random errors in marking. This was not about a systematic error by some or all schools, because the boards had procedures to identify that. This was about a failure by the exam boards and Ofqual to discreetly fix the results the way they intended to.

It is a basic principle in science that you cannot adjust your results based on what you want or expect them to be. You might be surprised, you might recheck your working, but you can’t change the numbers because of wishful thinking. If there was an error, it was by the exam boards and Ofqual, who showed that they could not specify what work was equivalent to a C grade.

The procedures were followed in schools. The exam boards agreed that the controlled assessments were marked to their own standards. And yet Ofqual still claim it is the fault of us teachers: we prepared our students so well for the controlled assessment that we are now being called cheats.

I’ve blogged before about the weaknesses built in to the science ISAs. The exam board and Ofqual are either too busy to read what one teacher has to say – perfectly reasonable – or don’t have an answer. I don’t understand how it is our fault when their system approved what teachers did and how they marked.

So maybe we shouldn’t be marking controlled assessments at all.

PS (This is the cue for the unions to step in. And they won’t. This is why we need one national professional body representing teachers, using evidence rather than political rhetoric.)

Enemies of Promise

This will be a short post, partly because I’ve got lots of other things on the go and partly because I’m too angry about what appears to have happened. I say appears because I truly hope things aren’t as they seem, for the sake of our students.

In January – and at points since then – Michael Gove has labelled teachers and others who criticise his plans as ‘enemies of promise’. This has been used despite the criticisms often being valid and fair, based on data rather than ideology, and often from those who clearly know far more about educational theory and practice than him.

It appears, from lots of conversation on twitter and in the press, that this year’s GCSE results show some unexpected features. Overall, they seem to be a little lower than in previous years, and one exam in particular seems to have affected English results. Students who completed the foundation controlled assessment papers in January needed a lower score to achieve a C than those who sat the equivalent exam in June. (This issue is one we have seen many times with the AQA Science equivalent, ISAs.) The difference is significant and means that many students nationally have failed to reach a Grade C despite being on track for it up until this point.
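To illustrate with invented boundaries – the real figures vary by paper – the same raw mark can fall either side of a C depending on the sitting:

```python
# Invented grade boundaries: the same raw mark on the same paper earns
# a C in January but a D in June.
boundary_for_C = {"January": 43, "June": 48}
raw_mark = 45

for sitting, boundary in boundary_for_C.items():
    grade = "C" if raw_mark >= boundary else "D"
    print(f"{sitting}: {raw_mark} marks -> grade {grade}")
```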

There are two issues here, one of which is immediately significant. Students who have failed to achieve a grade C in English will find that their next steps – college or sixth form courses, apprenticeships and so on – are now barred to them. This matters now. Many of them will have been expecting to confirm their education and training places in the next week or so. There is little time to address this problem, if things are really as unfair as they seem.

And things are unfair. Most teachers, most people, accept that more challenging courses are worthwhile. Students may not be happy with the idea, but the difficulty of achieving particular grades is effectively an arbitrary choice. Changing it from year to year, or between exam boards, obviously makes comparisons and target setting much harder, but it is not unfair. Changing the grade boundaries, between the students sitting an exam and being given their grade – for students doing one particular course – is clearly very different. The press today have suggested it is like moving the goalposts not just during a football game, but after a penalty has been taken and before the ball crosses the line.

A cynic would suggest that the government sees moving goalposts after the numbers are known as a standard political tactic.

The other issue – and today of all days, this must be seen as secondary to the plight of affected students – is that schools are judged on their results. Gove and the Department for Education can take greater steps to control what happens in a school if GCSE results drop below certain levels. Significant indicators are the number of 5 A*-C grades, including English and Maths, and the EBacc. Both of these will drop in schools which have had students marked down from a C to a D grade in English due to these eleventh-hour changes.

I’m trying very hard not to be cynical. I don’t teach English, except in the sense that many teachers share favourite books, correct spelling or help with grammar. But like many others, I struggle to see the fairness in changing how students are graded, after they have studied and sat the exam. Their lower results will now make it easier for unpopular, non-evidence-based and rushed changes to be pushed through, including forced academisation. This means it is even more important to find out who ordered these grade boundary alterations.

Who are the enemies of promise now, Mr Gove?

Gove’s Resit

I was already planning to type up a few more developments in the #govelevels saga. Reading that Gove is blaming pretty much everyone except politicians for the difficulties in the exam system just means I’m finding it a little hard to be balanced. I’ll do my best, because I partly wonder if his intent is to push teachers to react angrily rather than rationally to his proposals. The more we respond with rhetoric and ad hominem attacks – as tempting as it is – the harder it is to seem professional.

Basically, this is going to be a short post with links that I’ve already shared on twitter. I’d like to flag them up again for anyone who missed them the first time, and to take the chance to comment in a little more than 140 characters. If there’s time, I’ll also address some of the comments from my previous post, which got a lot more attention than I expected.

First of all, I’d like to direct you to @miss_mcinerney‘s blog, where she explains why Gove is wrong on the ‘bottom 25%‘. The calculation goes some way to address my concerns in terms of ‘borderline’ students. It looks as if the 25% figure was plucked out of the air, perhaps to appeal to the very Daily Mail readers the story was leaked to. Laura’s calculations suggest an absolute maximum of 10% of students would be best suited to not doing O-levels, unless Gove is planning to make them even more challenging than he has suggested. (Of course in Science we’ve already seen Ofqual decrease student grades, demoralising students and making targets fairly useless: information here and here.)

This smaller proportion will potentially stigmatise the students even more, as well as making the cost per student of implementing them – in terms of teacher time and money – even greater. Of course, maybe Gove just can’t tell the difference between 25% and 10%, in which case a resit is needed. (Oops – they’re not allowed any more!) I’ve already linked to her original post but if you haven’t yet read it, I’d like to recommend it once more.

If I’d read this post by @dukkhaboy about why O-levels aren’t the issue before I’d written up my piece, it might have saved me a lot of time. In particular it mentions something I passed over: each change in the specification means teachers can spend less time being innovative because they have to sort out the teaching scheme. Politicians seem oblivious to the fact that this kind of work isn’t done as paid overtime.

There are lots of interesting, reasonable responses – at least some of which are from people who know what they’re talking about – at the Guardian.

Warwick Mansell (@warwickmansell) has written a scathing critique of the National Curriculum review – it appears some of the same issues are present as with the ‘proposals’ for 14-16 exam changes. In particular, it seems ministers are ignoring the advice of professionals, the demands on teachers for writing local schemes, and the difficulties of implementing the changes in a short time. It’s as if the politicians haven’t a clue about the real world of education. The contrast between the evidence found and quoted in this article, and the very vague attempts at justification by Gove, Gibb et al, is striking.

Thoughts, comments, ideas? Is Gove, as some have suggested, leaking such dramatic changes so that more reasonable ‘official’ ones will be accepted more easily? I suppose we’ll find out in time whether he has some evidence-based suggestions or if this has just been a way for him to bolster political support for a future leadership bid. I’ll leave you with that scary thought: that instead of being about children’s qualifications, this could all have been for political advancement. That’s the real weakness of having a Secretary of State who is a politician not an educator.