Morning all. I was at the Northern #ASEConf at the weekend, had a good time and had lots to think about. I’m going to try really hard to blog it this week, but I’m buried under a ton of stuff and pretty much every person in my immediate family is either ill, recovering or about to go into hospital. And Trump apparently won, which makes me think it’s time to dig a fallout shelter and start teaching my kids how to trap rabbits for food.
One of the recurring discussions between science teachers is about the new required practicals for the GCSE specs. I’m trying to put some resources together for the physics ones as part of my day job, on TalkPhysics (free to join, please get involved) and thought I’d share a few ideas here too.
The exam boards don’t need lab books. There is no requirement for moderation or scrutiny. There is no set or preferred format. And, realistically, until we’ve seen something better than the specimen papers there’s no point trying to second-guess what the students will be expected to do in the summer of 2018.
So apart from doing the practicals, as part of our normal teaching, in the normal way, why should we do anything different? Why should we worry the kids about them? Why should we worry about them? There’s time for that in the lead up to the exams, in a year’s time, when we’d revise major points anyway. For now, let’s just focus on good, useful practical work. I’ve blogged about this before, and most of it comes down to more thinking, less doing.
What we can do is make sure kids are familiar with the language – but this shouldn’t be just about the required practicals. So I put together some generic questions, vaguely inspired by old ISAs (and checking my recall with the AQA Science Vocab reference) and ready to print. My thinking is that each laminated card is handed to a different group while they work. They talk about it while doing the practical, write their answers on it, then they get added to a wall in the lab. This offers a quick review and a chance for teachers to see how kids are getting on with the vocab. The important thing – in my view, at least – is that it has to be for every practical. This is about improving fluency by use of frequent testing. And it ticks the literacy box too.
EDITED: more cards added, thanks to suggestion from @tonicha128 on Twitter.
So here you go: prac-q-cards-v2 as PDF.
Please let me know what you think, whether I’ve made any mistakes, and how it works if you want to try it out. It would be easy to produce a mini-test with a selection of these questions, or better ones, for kids to do after each practical. Let’s get them to the stage of being so good with these words that they’re bored by being asked the questions.
Filed under: exams, literacy, planning, practicals, resource, teaching | 1 Comment
Tags: exams, practicals, printable
A perpetual classroom problem is that students translate what we say into what they want to do. How many times have you come back from time off to see that students answered questions 1 and 10, not 1 to 10? Sometimes this is deliberate awkwardness. Sometimes it’s an actual lack of understanding, either of what the task was or why we’re asking them to do it in what seems ‘the hard way’. I’ve long been a fan of the template approach, giving students a framework so they’ve got a place to get started. And I produced a bunch of resources, some of which may be useful for you. I’ve shared these before, here and there, but figured a fresh post was worthwhile. This was mainly prompted by a tweet from a colleague:
You know when you set ‘revise for the test’ as the homework and the class hear ‘no homework this week’? That. 😭#markingtests
— Helen Rogerson (@hrogerson) October 3, 2016
So here’s a quick reminder of some printable resources. I’m not going to go through and remove the QR code, but it now goes to a dead link. Feel free to mess around with them as you see fit.
- lesson ticklist
- exam paper debrief
- learningtoolkit (20 pages for displays)
- eca boosting grade (Biology example)
- Quarters Revision
- Concepts Cues Consequences
Some of these can be downloaded as Office files, mainly docx and pub (links to a GDrive folder). There may also be jpg versions available for adding to Powerpoints or websites. If there’s no editable version of an example above that you’re after, add a comment here and I’ll dig it up.
If you’ve not already seen it (not sure how, but it’s possible), can I strongly recommend the excellent posters and resources available from the team at @acethattest, AKA The Learning Scientists. On my long and growing jobs list is producing some Physics-specific versions to show how they could be applied within a subject.
Filed under: ed-research, exams, L2L, planning, printables, resource, revision, teaching | Leave a Comment
It’s not often I can claim to be ahead of the trend. Pretty much never, to be honest. But this time I think I’ve managed it, and so I’m going to make sure all my readers, at least, know about it.
Recently the TES “exclusively reported” – which means other sites paraphrased their story and mentioned their name, but didn’t link – that Cambridge Assessment was considering ‘crowd-sourcing’ exam questions. This would involve teachers sending in possible questions which would then be reviewed and potentially used in external exams. Surplus questions would make up a large ‘question bank’.
I suggested this. This is, in fact, pretty much entirely my idea. I blogged ‘A New Exam Board’ in early 2012 suggesting teachers contribute questions which could then provide a range of sample papers as well as external exams. So it is not, despite what Tim Oates claims, a “very new idea.” Despite the similarity to my original post I do, however, have some concerns.
Backwards Backwards Design
So instead of teachers basing their classroom activities on giving kids the skills and knowledge they need to attempt exam questions, we’re doing it the other way around? As I’ve written before, it’s not necessarily a bad thing to ‘teach to the test’ – if the test is a good one. Writing exam questions and playing examiner is a valuable exercise, both for teachers and students, but the questions that result aren’t always helpful in themselves. As my OT-trained partner would remind me: “It’s the process, not the product.”
Being an examiner is something that looks good on a CV. It shows you take qualifications seriously and have useful experience. How can teachers verify the work they put into this? How can employers distinguish between teachers who sent in one dodgy question and those who shared a complete list, meticulously checked and cross-referenced? What happens when two or more teachers send in functionally identical questions?
A related but not identical point. How is the time teachers spend on this going to be recognised financially? And should it be the teacher, or the school? Unless they are paid, teachers are effectively volunteering their time and professional expertise, while Cambridge Assessment will continue to pay their permanent and contract staff. (I wonder how they feel about their work being outsourced to volunteers…)
It’s hardly surprising at this early stage that the details aren’t clear. One thing I’m interested in is whether the submissions shared as part of the ‘question bank’ will go through the same quality control process as those used in the exams. If so, it will involve time and therefore money for Cambridge Assessment. If not, it risks giving false impressions to students who use the bank. And there’s nothing in the articles so far to say whether the bank of questions will be free to access or part of a paid product.
Unless there are far fewer ‘donated’ questions than I’d expect, I don’t think we will really see a huge advantage held by students whose teachers contributed a question. But students are remarkably sensitive to the claims made by teachers about “there’s always a question on x” or “it wasn’t on last year’s paper, so expect y topic to come up”. So it will be interesting to see how they respond to their teachers contributing to the exam they’ll be sitting.
I look forward to hearing from Cambridge Assessment, thanking me for the idea in the first place…
Filed under: assessment, ranting, web | Leave a Comment
Tags: cambridge assessment, exams
It turns out that I’m really bad at following up conference presentations.
Back in early June, I offered a session on teachers engaging – or otherwise – with educational research. It all grew out of an argument I had on Twitter with @adchempages, who has since blocked me after I asked if the AP Chem scores he’s so proud of count as data. He believes, it seems, that you cannot ever collect any data from educational settings, and that he has never improved his classroom practice by using any form of educational research.
But during the discussions I got the chance to think through my arguments more clearly. There are now three related versions of my opinion, quite possibly contradictory, and I wanted to link to all three.
Version the first: Learning From Mistakes, blogged by me in January.
Streamlined version written for the BERA blog: Learning From Experience. I wrote this a while back but it wasn’t published by them until last week.
Presentation version embedded below (and available from http://tinyurl.com/ian-redmatsci if you’re interested).
I’d be interested in any and all comments, as ever. Please let me know if I’ve missed any particular comments from the time – this is the problem with being inefficient. (Or, to be honest, really busy.) The last two slides include all the links in my version of a proper references section.
Thoughts from the presentation
Slide 8: it’s ironic that science teachers, who know all about using models which are useful even though they are by necessity simplified, struggle with the idea that educational research uses large numbers of participants to see overall patterns. No, humans aren’t electrons – but we can still observe general trends using data.
Slide 13: it’s been pointed out to me that several of the organisations mentioned offer cheaper memberships/access. These are, however, mainly institutional memberships (eg £50/yr for the IOP) which raises all kinds of arguments about who pays and why.
Slide 14: a member of the audience argued with this point, saying that even if articles weren’t open-access any author would be happy to share electronic copies with interested teachers. I’m sure he was sincere, and probably right. But as I tried to explain, this assumes that (1) the teacher knows what to ask for, which means they’ll miss all kinds of interesting stuff they never heard about and that (2) the author is happy to respond to potentially dozens of individual requests. Anyone other than the author or journal hosting or sharing a PDF is technically breaking the rules.
Slide 16: Ironically, the same week as I gave the presentation there was an article in SSR on electricity analogies which barely mentioned the rope model. Which was awkward as it’s one of the best around, explored and endorsed by the IOP among many others.
Slide 20: Building evidence-based approaches into textbooks isn’t a new idea (for example, I went to Andy’s great session on the philosophy behind the Activate KS3 scheme) but several tweeters and colleagues liked the possibility of explicit links being available for interested teachers.
Filed under: CPD, ed-research, reflection, teaching, web | Leave a Comment
Tags: ed-research, ResearchED, teaching
Just think… in a few weeks, you’ll have a new crop of brand-new Year 7 students. Shiny faces, uniforms without holes and a complete pencil case. For about a day.
So it’s nearly time to teach graphs.
You may have already seen the resources produced by the ASE on the Language of Maths in Science (LoMiS). If not, go download them for free and have a look. It’s worth it, really. For a quick taste, Richard Needham did a piece for the Royal Society of Chemistry a while back which is a great introduction to the aims of the project.
And here’s an approach I’ve come up with which you may find a useful beginning. It’s based on what I’ve done in lessons in the past with a final addition I’ve been discussing recently with delegates and colleagues at the SPN Oxford Summer School.
1 Number Lines
Putting numbers in sequence on a line is something students start to do at a young age, long before secondary school. To be honest, if kids can’t put whole numbers in the right order then graphs are going to be a distant dream. I agree that decimals make this harder at times, but I’m working on something about that too. Next week, maybe.
So give students a list of values and ask them to put them on a number line in order. Add challenge by having them convert values between units first, or have different numbers of significant figures. Top half of image:
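For anyone who wants to check their card sets (or generate answer keys), the convert-then-order task is easy to mirror in a few lines. A minimal sketch in Python – the values and units here are my own examples, not taken from any card set:

```python
# A sketch (values are my own, not from the card sets) of the convert-then-order
# task: mixed units, converted to metres, then placed in order.

values = [(250, "cm"), (0.8, "m"), (1200, "mm"), (3, "m"), (45, "cm")]

to_metres = {"mm": 0.001, "cm": 0.01, "m": 1.0}  # conversion factors

in_metres = sorted(n * to_metres[u] for n, u in values)
print(in_metres)  # smallest first, ready for the number line
```

Swapping the dictionary for, say, seconds/minutes/hours gives a quick way to vary the challenge between groups.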
2 Number Lines to Scale
They might do this automatically. If not, it shouldn’t be too hard to have them do so (image above). Once they have a scale sorted out for the line, placing laminated cards for your supplied values along it should be straightforward.
3 Number Line to Scale = Axis
If you now have your students put the two number lines (one from each set of values) at right angles, they should be able to see that they’ve defined each point.
4 Mathematical Axes Of Doom
Two wooden dowels from B&Q (other DIY stores are available), with insulation tape wrapped round at regular intervals. I deliberately chose different intervals. Next time, I’d probably use wooden dowels with rectangular cross-section, simply so they don’t roll. You could use metresticks but I wanted to avoid any numbers. The tape is all you need, really.
Put them at right angles and you have a set of axes, with the intervals clearly marked. Add the coordinate cards – because students have used the idea of a coordinate system for a lot longer than they’ve used graphs to tell a story – in the right places. They’re easy to adjust, so there’s less stress. (Low stakes, yes?) And if they look from above, any pattern is clear and anomalies can be considered. They can even see the best-fit line.
Extension ideas; use larger or smaller cards to get over the idea of precision in the readings. There is a link here to the idea of error bars, something we don’t usually cover but may find useful.
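The ‘look from above’ step, spotting the pattern, the best-fit line and any anomalies, can also be mirrored numerically if you want a teacher-side check. A sketch in plain Python with made-up readings (none of these numbers come from the activity, and the threshold is arbitrary):

```python
# A sketch, with made-up readings, of the 'look from above' step: fit a
# straight line through the coordinate cards and flag any card that sits
# far from it. Pure Python, no libraries.

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.0, 12.0, 10.1]  # the fourth reading is the odd one out

# Least-squares best-fit line y = m*x + c
n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n
m = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) \
    / sum((xi - mean_x) ** 2 for xi in x)
c = mean_y - m * mean_x

# Flag anything more than 2 units from the line (an arbitrary threshold)
anomalies = [i for i, (xi, yi) in enumerate(zip(x, y))
             if abs(yi - (m * xi + c)) > 2.0]
print(anomalies)  # indices of the anomalous cards
```

The point, as with the dowels, is that the anomaly is obvious once you have the line; the maths just formalises what the eye does from above.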
Thoughts, ideas, suggestions? Please let me know in the usual ways.
NB: you get funny looks if you carry the sticks on to a train.
Filed under: maths, planning, resource, teaching | 3 Comments
I’m sure many other bloggers have posted about this already, but in case it’s passed you by; the new GCSE specification is officially starting from September. Many schools, of course, started teaching from the draft simply because, if you’re delivering GCSE Science over three years, there was no choice. For various reasons I’ve been looking at the AQA version in quite a lot of detail (as my previous post explained) and I wanted to share a summary I put together for the new content. The new material comes from both directions, KS3 and A-level. It’s probably worth me explaining this.
Until now, some material was taught at KS3 (assuming you followed the national curriculum matched to the much-lamented SATs) and then assumed for GCSE. Some of this is now explicitly examined as part of the exam at 16. You could, of course, claim that once it’s been taught at age 13 it wouldn’t need to be revisited. Which, in my opinion, is daft. Other material has been taught as part of A-level for years, but hasn’t been in the KS4 specification – certainly not in my teaching memory of a decade or so. This will be a particular issue for schools which don’t deliver A-level, because they won’t have the equipment or experience.
Energy: less emphasis on heat transfer and no mention of U-values. Note the use of the ‘new’ energy language (stores and pathways/processes) plus extra equations.
Filed under: AQA, physics, planning, practicals, teaching | 10 Comments
As I mentioned in my previous post, I’ve recently been doing some freelance work in a local school. The role is short-term and has an interesting mix of aims, but one part is to work with Year 11 students on data analysis questions. Now, obviously I’ve taught these skills before. But I hadn’t used the OCR B specification, which features a final data question worth ten marks. I know this is running out soon but thought it might be worth sharing what I’ve created.
Firstly, a plea to all exam boards. When you release Examiners’ Reports – which are really useful, please keep doing it – can you combine them with the markscheme for easy reference? It’s something I’ve done for a while but it would make much more sense for you to do it.
- exam paper 2013 June H p30-32
- combined markscheme and examiner’s report 2013JuneOCRBGatewaySectionD
- Stopping distance card sort
Predictably, the specimen paper isn’t a great example to use. I’ve not included the 2015 paper because many schools will be using it for preparation in controlled conditions. The links above are to my own copies in case OCR rearranges their site with the new specifications, and I’ve added the Section D page details to the filenames to make life easier for colleagues.
It seems a good time to remind you all that in the past I produced quite a few resources for looking at past exam papers, mostly AQA. The tags on the right should make it fairly easy to find them.
When we used these in class, one of the outcomes was that students put together a list of “things to try if you’re stuck”. Now, for many pupils this will have been built in to their teaching, but we all know that kids don’t always absorb what we’re hoping they will. I think the real value of this is to generate a list with your own students, but for your interest:
- Highlight or underline numbers in the question
- Draw lines from the axes at specified values so you can find the corresponding value
- If the question is about differences, you’ll need to add or subtract
- If the question is about rates or uses the word ‘per’, you’ll need to divide or multiply and you might need to think about gradient or slope
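The last two tips boil down to a couple of lines of arithmetic, which can be worth making explicit. A sketch with invented numbers (these aren’t from any real paper):

```python
# A sketch of the last two tips with made-up numbers: a 'difference' is a
# subtraction of two read-off values; a 'rate' is a gradient, i.e. the
# change in y divided by the change in x between two points on the line.

# Hypothetical values read off a distance-time graph (time in s, distance in m)
t1, d1 = 2.0, 8.0
t2, d2 = 6.0, 24.0

difference = d2 - d1              # "how much further?" -> subtract
gradient = (d2 - d1) / (t2 - t1)  # rate in metres per second -> divide

print(difference, gradient)  # 16.0 m travelled, at 4.0 m/s
```

Kids who can recite “change in y over change in x” often still can’t pick the two points, which is exactly what the second tip (drawing lines from the axes) is for.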
Comments and suggestions welcome, as always.
Filed under: exams, ocr, physics, printables, resource, revision, teaching | Leave a Comment
Tags: exams, ocr, printables