I’m sure many other bloggers have posted about this already, but in case it’s passed you by: the new GCSE specification officially starts from September. Many schools, of course, started teaching from the draft simply because, if you’re delivering GCSE Science over three years, there was no choice. For various reasons I’ve been looking at the AQA version in quite a lot of detail (as my previous post explained) and I wanted to share a summary I put together for the new content. The new material comes from both directions, KS3 and A-level. It’s probably worth me explaining this.
Until now, some material was taught at KS3 (assuming you followed the national curriculum matched to the much-lamented SATs) and then assumed for GCSE. Some of this is now explicitly examined as part of the exam at 16. You could, of course, claim that once it’s been taught at age 13 it wouldn’t need to be revisited. Which, in my opinion, is daft. Other material has been taught as part of A-level for years, but hasn’t been part of the KS4 specification – certainly not in my teaching memory of a decade or so. This will be a particular issue for schools which don’t deliver A-level, because they won’t have the equipment or experience.
Energy: less emphasis on heat transfer and no mention of U-values. Note the use of the ‘new’ energy language (stores and pathways/processes) plus extra equations.
Filed under: AQA, physics, planning, practicals, teaching | 9 Comments
As I mentioned in my previous post, I’ve recently been doing some freelance work in a local school. The role is short-term and has an interesting mix of aims, but one part is to work with Year 11 students on data analysis questions. Now, obviously I’ve taught these skills before, but I’ve not previously used the OCR B specification, which features a final data question worth ten marks. I know this is running out soon but thought it might be worth sharing what I’ve created.
Firstly, a plea to all exam boards. When you release Examiners’ Reports – which are really useful, so please keep doing it – can you combine them with the mark scheme for easy reference? It’s something I’ve done for a while but it would make much more sense for you to do it.
- exam paper 2013 June H p30-32
- combined markscheme and examiner’s report 2013JuneOCRBGatewaySectionD
- Stopping distance card sort
Predictably, the specimen paper isn’t a great example to use. I’ve not included the 2015 paper because many schools will be using it for preparation in controlled conditions. The links above are to my own copies in case OCR rearranges their site with the new specifications, and I’ve added the Section D page details to the filenames to make life easier for colleagues.
It seems a good time to remind you all that in the past I produced quite a few resources for looking at past exam papers, mostly AQA. The tags on the right should make it fairly easy to find them.
When we used these in class, one of the outcomes was that students put together a list of “things to try if you’re stuck”. Now, for many pupils this will have been built into their teaching, but we all know that kids don’t always absorb what we’re hoping they will. I think the real value of this is to generate a list with your own students, but for your interest:
- Highlight or underline numbers in the question
- Draw lines from the axes at specified values so you can find the corresponding value
- If the question is about differences, you’ll need to add or subtract
- If the question is about rates or uses the word ‘per’, you’ll need to divide or multiply and you might need to think about gradient or slope
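To make the ‘gradient’ tip concrete: on a distance–time graph, speed is the change in distance divided by the change in time. A quick sketch with invented numbers (the points below are purely illustrative, not from any exam paper):

```python
# Two points read off a distance-time graph (invented values):
t1, d1 = 2.0, 10.0   # time in s, distance in m
t2, d2 = 6.0, 30.0

# Gradient = change in y / change in x = speed in m/s
speed = (d2 - d1) / (t2 - t1)
print(speed)  # 5.0
```

The same “change in y over change in x” routine works for any rate question, whatever the axes happen to be.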
Comments and suggestions welcome, as always.
Filed under: exams, ocr, physics, printables, resource, revision, teaching | Leave a Comment
Tags: exams, ocr, printables
As a physics teacher, I feel I should now make the point that teaching is a quantum process which is changed simply by the act of being observed. If you laughed at that, congratulations and please pick up your Physics Education Geek badge on the way out.
There are four stages:
- The ‘observee’ defines one or two aspects they want to focus on, choosing a couple of questions for the observer to bear in mind.
- The observer makes notes of specific features in the lesson relating to these questions – no judgment, just facts.
- The observer poses questions based on these features to prompt reflection and discussion.
- Together, the colleagues plan future actions based on the outcome of these prompts, leading to questions for the next observed lesson.
- Wright D, Lofthouse R. Developing the mathematics teacher as mentor through team-based approaches to mentoring. In: Hyde, R and Edwards JA, ed. Mentoring mathematics teachers: supporting and inspiring pre-service and newly qualified teachers. Abingdon: Routledge, 2013, pp.141-153.
Filed under: CPD, ed-research, reflection, resource, teaching | 3 Comments
Tags: observation, printable, reflection, research, resource
I’m really starting to get annoyed with this, and I’m not even in the classroom full-time. I know that many colleagues – @A_Weatherall and @hrogerson on Staffrm for example – are also irritated. But I needed to vent anyway. It’ll make me feel better.
EDIT: after discussion on Twitter – with Chemistry teachers, FWIW – I’ve decided it might help to emphasise that my statements below are based on looking at the Physics specification. I’d be really interested in viewpoints from those who focus on teaching Biology and Chemistry, as well as those with opinions on whether I’ve accurately summed up the situation with Physics content or overreacted.
The current GCSE Science specifications are due to expire soon, to be replaced by a new version. To fit in with decisions by the Department for Education, there are certain changes to what we’ve been used to. Many others have debated these changes, and in my opinion they’re not necessarily negative when viewed objectively. Rather than get into that argument, I’ll just sum them up:
- Terminal exams at the end of year 11
- A different form of indirect practical skills assessment (note that ISAs and similar didn’t directly assess practical skills either)
- More content (100+ pages compared to the previous 70ish for AQA)
- Grades 9-1 rather than A*-G, with more discrimination planned for the top end (and, although not publicised, less discrimination between weaker students)
Now, like many other subjects, the accreditation process seems to be taking longer than is reasonable. It also feels, from the classroom end, that there’s not a great deal of information about the process, including dates. The examples I’m going to use are for AQA, as that’s the specification I’m familiar with. At least partly that’s because I’m doing some freelance resource work and it’s matched to the AQA spec.
Many schools now teach GCSE Science over more than two years. More content is one of several reasons why that’s appealing; the lack of an external KS3 assessment removes the pressure for an artificial split in content. Even if the ‘official’ teaching of GCSE starts in Year 10, the content will obviously inform year 9 provision, especially with things like language used, maths familiarity and so on.
Many schools have been teaching students from the first draft specification since last September. The exam boards are now working on version three.
The lack of exemplar material, in particular questions, means it is very hard for schools to gauge likely tiers and content demand for ‘borderline’ students. Traditionally, this was the C–D threshold and I’m one of many who recognized the pressure this placed on schools with league tables, with teachers being pushed much harder to help kids move from a D to a C grade than from a C to a B. The comparison is (deliberately) not direct. As I understand it, an ‘old’ middle grade C is now likely to be a level 4, below the ‘good pass’ of a level 5.
Most schools start to set for GCSE groups long before the end of Year 9. Uncertainties about the grade implications will only make this harder.
The increased content has three major consequences for schools. The first is the teaching time needed as mentioned above. The second is CPD; non-specialists in particular are understandably nervous about teaching content at GCSE which until now was limited to A-level. This is my day-job and it’s frustrating not to be able to give good guidance about exams, even if I’m confident about the pedagogy. (For Physics: latent heat, equation for energy stored in a stretched spring, electric fields, pressure relationships in gases, scale drawings for resultant forces, v² = u² − 2as, magnetic flux density.) The last is the need for extra equipment, especially for those schools which don’t teach A-level Physics, with the extra worry about required practicals.
Even if teachers won’t be delivering the new specification until September, they need to familiarize themselves with it now. Departments need to order equipment at a time of shrinking budgets.
I’m not going to suggest that a new textbook can solve everything, but they can be useful. Many schools have hung on in the last few years as they knew the change in specification was coming – and they’ve been buying A-level textbooks for that change! New textbooks can’t be written quickly. Proofreading, publishing, printing, delivery all take time. This is particularly challenging when new styles of question are involved, or a big change such as the new language for energy changes. Books are expensive and so schools want to be able to make a good choice. Matching textbooks to existing resources, online and paper-based, isn’t necessarily fast.
Schools need time to co-ordinate existing teaching resources, samples of new textbooks and online packages to ensure they meet student needs and cost limitations.
Finally, many teachers feel they are being kept in the dark. The first specification wasn’t accredited, so exam boards worked on a second. For AQA, this was submitted to Ofqual in December (I think) but not made available on the website. Earlier this month, Ofqual chose not to accredit this version, but gave no public explanation of why. Teachers are left to rely on individual advisers, hearsay and twitter gossip. This information would have given teachers an idea of what was safe to rely on and what was likely to change. It took several weeks for the new submission dates to appear on the website – now mid-March – and according to Ofqual it can take eight weeks from submission to accreditation.
If these time estimates are correct, the new AQA specification may not be accredited until mid-May and as yet there is nothing on record about what was wrong with previous versions. Teachers feel they are being left in the dark yet will be blamed when they don’t have time to prepare for students in September.
I think that says it all.
Filed under: AQA, assessment, CPD, exams, planning, political, ranting | 3 Comments
All secondary teachers look forward to the summer term. Not just because we might actually get to see daylight before and after work, but for that possibly mythical creature, ‘gained time’. Assuming you don’t end up teaching RE to stroppy teenagers after a colleague collapses in tears trying to reconcile ‘Trinity’ and ‘monotheism’, you might get a classroom to yourself. Without kids. A chance to have a cuppa and finally clear out the bottom drawer of detention forms and credits.
Until you get handed 100 pages of new syllabus and are asked to write a scheme of work for September, that is.
Science teachers across the land are currently going quietly mad about the new GCSE specifications. We’ve lost count of which draft version the boards are on, although rumours abound that they’re going to be properly published any minute now. Even if you’re planning to start in September for a two-year GCSE this is cutting it fine for buying/creating resources, let alone ordering kit for the required practicals and any new content. And if you teach the content over three years, you’ve been having to use a draft specification for real kids. Which is more than a little frightening.
I’ve blogged before about the difficulties of finding resources to use without trawling through dozens of sites, each with their own login and categories. Even great sites like the eLibrary (its URL has changed but your login should be the same) can’t have everything. And every time the specifications change, we have to move everything around. If schools can share the planning then the workload can be reduced.
A school in Hampshire is holding a free “Science Curriculum in a day” event in March. Basically loads of teachers building a scheme of work as best they can. It’s organised by @MartynReah who tweeted about it, and I wondered if I could help. I can’t make it down there (although I will be trying to contribute via twitter: #teacher5adayScience ) and I suspect that’s true for many of my readers too. So how about crowdsourcing a resource list instead?
I’ve created a GoogleForm. It should take just a couple of minutes to complete for each online resource you’d like to share. Copy and paste the URL, tick a few boxes so they can be sorted by subject/topic and type of resource, and you’re done. The resulting spreadsheet will be freely available (although it’s currently pretty empty) and be used by those who can attend the day as a starting point.
EDIT: I’ve sorted a couple of bugs so specifying Chemistry topics doesn’t lead you to the Physics list (completely accidental I promise!) and you can now describe something as ‘All Subjects’. No need to repeat submissions but please add to the seven so far!
(I’ve suggested to Martyn that a Dropbox folder would allow colleagues to donate their own offline resources too, and will update this post if relevant.)
I have, according to WordPress, 132 followers. If each one of those can contribute a couple of links between now and the event, that’s over 250 teacher-recommended resources for a new Scheme of Work. The more people who get involved, the better the spreadsheet will be for us all, on the day or not. Heads of Department, why not ask your teams to add a favourite resource? NQTs, this would allow you to tick the ‘sharing good practice’ box on your paperwork. Fancy helping out?
I’ve even created short links so you can stick them up on noticeboards or in staff meetings. Please share widely. I intend to be tweeting this regularly with a running total of shared resources, so please help get the numbers up.
Filed under: CPD, planning, web | 1 Comment
Tags: planning, resources, SOWs, teachmeet
I’m a science teacher. When talking about the characteristics of sound in my lessons, I encourage students to give detail. It’s not enough to say that a change causes ‘more vibrations’. If the sound is a higher pitch, the vibrations of the ear drum will be faster, or more frequent. If the sound is louder, the displacement of the ear drum is bigger; we say the vibrations have greater amplitude or more energy. So it’s not that the ‘more vibrations’ answer is wrong – just incomplete. If we don’t give a full answer it can be misunderstood.
So I was catching up with news and read an article on the BBC about the continued arguments about institutionalized discrimination and hate speech in the Anglican church. Now, this isn’t about Welby being sorry for the discrimination – just not sorry enough to stand against it – or the hypocrisy of them sending out advice to schools on homophobic bullying. Instead, it’s simply about a number in the report.
I teach my students to do a ‘common sense check’ as part of any calculation and I was bemused that the BBC didn’t appear to have thought this through. Since when was a third of the UK Anglican? Now, I understand that calculating exactly how many (Anglican) Christians there are in the UK might be tricky, but 26 million seemed too far off to be reasonable. So I did some digging myself, and asked the organisation behind the ‘World Christian Database’ for the source of this number. It’s important to note that on Twitter they were very definite it was an aggregate figure and they used many sources of data.
— Global Christianity (@CSGC) January 15, 2016
So how should we find out how many (Anglican) Christians there are in the UK?
Simple, isn’t it? Pop into your local church on Sunday morning and count heads. But which Sunday? What about parishioners who are too ill to make it in, or are shift-workers? Would a Christmas or Easter service be more meaningful? And surely some believers prefer to worship in other ways. So church attendance figures, although useful, can probably be considered a lower limit. The Statistics for Mission 2014 (pdf) figures are just under a million for average Sunday attendance during October, with significantly higher numbers for Easter and Christmas services.
Church Attendance: 0.98m (980000)
Christmas Services: 2.4m
The question (‘What is your religion?’) asks about religious affiliation, that is how we connect or identify with a religion, irrespective of actual practice or belief.
According to the last Census figures, England and Wales has 33m Christians, but this isn’t broken down into denominations. Most data I’ve found suggests around half of UK Christians consider themselves Anglican, so we can get a reasonable estimate.
Census Anglicans: 17m (approx)
Might it be reasonable, I wonder, to suggest that claiming 26m Anglicans in the UK is bearing false witness?
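As a worked version of that ‘common sense check’, here’s the arithmetic in a few lines of Python. The UK population of roughly 65 million is my own added assumption; the other figures are those quoted above:

```python
# Back-of-envelope check of the claimed 26 million UK Anglicans.
uk_population = 65_000_000        # rough UK population (my assumption)
claimed_anglicans = 26_000_000    # World Christian Database figure
sunday_attendance = 980_000       # average Sunday attendance, October 2014
christmas_attendance = 2_400_000  # Christmas services
census_anglicans = 17_000_000     # ~half of the 33m census Christians

# The claim would make two in five UK residents Anglican...
claimed_fraction = claimed_anglicans / uk_population
print(f"Claimed share of UK population: {claimed_fraction:.0%}")  # 40%

# ...which exceeds even the most generous self-identification estimate,
# let alone measured attendance.
print(claimed_anglicans > census_anglicans)      # True
print(claimed_anglicans > christmas_attendance)  # True
```

Whichever measure you prefer, the claimed figure sits well above the upper bound, which is exactly what a quick sanity check is for.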
Filed under: non-teaching, ranting, web | Leave a Comment
“So I was arguing on Twitter…”
That’s how all the best blog posts start, just like the best fairy tales start with “Once upon a time…” In this case, it wasn’t a new argument – in fact it was a disagreement I’ve had before, with the same person. But it’s also something which has been discussed in staffrooms all over the country, probably all over the world. A version of it plays out any time two people with the same job compare notes.
How can we be the best professionals possible without making all the mistakes personally?
It’s true that people learn from mistakes. Sometimes. When we recognize them. When we can change our behaviour based on that insight. When we’re not too hungry, angry, lonely or tired. When we have the chance to reflect on our actions and plan for ‘the next time’. When we can successfully generalize our specific experience.
I was having this conversation, for the hundredth time, with my eldest this week. In particular, we were talking about how the only thing better than learning from your own mistakes is to learn from somebody else’s. It’s generally less painful, expensive and embarrassing. We talked about how, perhaps, it’s the pain of our own mistakes which means they stick better.
Teaching from Mistakes
In education, we learn a lot from screwing up ourselves. From not labeling the beakers, to letting Year 7 use power packs with 1A bulbs, to mixing up the two Rebeccas in your class during parents’ evening. We also, especially early in our career, learn a lot from watching our colleagues, deliberately or in passing.
(Brief digression: we should do more of this. Short observations, team-teaching, co-planning, watching a practical, seeing how they manage a demonstration, the ‘spiel’ for radioactive samples… all great chances to learn from a colleague and give them the ‘view from the back’. Go into an A-level English Lit lesson and talk for ten minutes about the ‘science’ of Frankenstein’s Creature, or invite a music teacher colleague into your Sound lesson to demonstrate high and low pitch. The important thing is to make a solemn promise that this will never show up on performance management.)
The argument I had seemed to come down to one principle. I think that we as teachers can – and should – learn from the successes and mistakes of other teachers as summed up in research. My counterpart feels that if someone isn’t a good teacher, they never will be, and that there’s nothing he can learn about teaching outside of a classroom. He sees educational research as a waste of his time.
But there’s a lot of research out there, which means a lot of student experiences added up into suggestions. Test results that might make patterns, implying how one approach on average works better than another. Don’t get me wrong – there’s a lot of crap, too. There are a lot of context-free claims, a lot of ‘studies’ carried out without a control group, action research subject to the Hawthorne Effect and so on. But the argument I had – in this case and before – wasn’t about the bad ‘research’ that’s out there. It was about the very idea that educational research should or could guide our practice at all. And to me, that just seems weird.
@teachingofsci So the children and teachers were both cloned?????
— Adrian Dingle (@adchempages) January 11, 2016
During the conversation, @adchempages also used #peoplearenotelectrons. Which is true. But isn’t the whole point of science to use models, simpler than reality, to give us an indication of how reality works? We can model people as particles making up a fluid when we design corridors and stairwells. And that gives us useful information. Nobody suggests that those people travelling on the Underground are actually faceless, indistinguishable drones. (I’m saving the sarcastic comment as it would undermine my point.) But with enough data, and enough people, we can make good predictions about what will usually happen most of the time. There are caveats:
- Averages using large numbers aren’t specific to a small subset, even if homogeneous
- There are lots of confounding variables, some of which are unknown
- Kids are all different and there’s a fine line between describing and defining them
- Many anecdotes are not the same as data
- We tend to find/remember the results which confirm our expectations
I feel like I’ve been here before. In fact, I have – I wrote a similar post back in 2013 about how I might design a trial, and there’s also my post from when the Evidence-Based Bandwagon was taking off. But it’s worth revisiting as long as we are critical about research. We need to be able to ask good questions about the sample sizes, about the methodology, about sources of potential bias. But then we need to take on board the advice and try applying it to our own classes. Let’s imagine a way to test someone’s willingness to use research in their own practice.
- Recruit lots of teachers, teaching same subject to same age group.
- Match ‘equivalent classes’ or ideally randomize.
- Choose two interventions (or simply the same activities in a different order, eg theory then practical or the reverse.)
- Compare results of the kids in the same test.
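The steps above can be sketched in a few lines – a purely illustrative toy, with invented class names and scores, not a real trial design:

```python
import random
from statistics import mean

random.seed(42)  # reproducible allocation

# Steps 1-2: recruit classes (hypothetical names) and randomize
# them between the two interventions.
classes = [f"class_{i}" for i in range(20)]
random.shuffle(classes)
group_a, group_b = classes[:10], classes[10:]

# Step 3: each group gets one intervention, e.g. theory-then-practical
# versus practical-then-theory.

# Step 4: compare results of the kids in the same test
# (scores here are invented for illustration).
scores_a = [random.gauss(60, 10) for _ in group_a]
scores_b = [random.gauss(63, 10) for _ in group_b]
print(f"Mean difference: {mean(scores_b) - mean(scores_a):.1f} marks")
```

A real trial would of course need a proper significance test and a much larger sample before drawing any conclusion; the point is only that allocation is random rather than chosen.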
Filed under: CPD, ed-research, reflection, teaching | 8 Comments