I’m going to keep this brief in the hope it actually gets (a) finished and (b) published, because I have several drafts that I’ve just not found the time or motivation to finish off. For context: I have a small child, a shortage of caffeine and a grumpy temperament. The grumpiness may be because not one new blogger built on my #aseconf session and contributed a post. Humph.
Recently, the skills vs knowledge debate has kicked off again. Not that it ever really went away! Like many teachers, I stay away from both extremes. Of course kids need to know (i.e. recall with fluency) some facts; the question is where you draw the line. Do I expect my GCSE students to remember that carbon has a proton number of 6? Of course I do. Do I expect them to memorise the entire periodic table, with or without the song? Of course I don’t. The same applies to the reactivity series, the equations of motion, geological eras or pretty much any other part of science: knowing some is vital, knowing them all is unnecessary. But discussion online, perhaps especially on Twitter, tends towards the argumentative.
So arguments about what should and shouldn’t be in the national curriculum, exam specifications or whatever are doomed to end unresolved. And, let’s face it – as teachers we don’t often get a say in it. We just have to make the best of what we get.
Instead, I was kicking some ideas around with colleagues and ended up with the bastard offspring of APP for younger kids and logbooks as suggested for AS, via ‘loyalty cards’ which I blogged after stealing the idea from @ange01. Hold on, it makes sense. Kind of.
Why not, I reasoned, put together lists for students to use to record their various competencies? (I did something like this for teacher standards, although I’ve stopped keeping track of it. When I get around to it I’ll create a version for the RSci and CSciTeach recording categories and wave it at @theASE via Twitter.) This fits in well with the new approach to practical work at post-16, something else which has divided teachers and politicians alike. I made several deliberate decisions for the sample below, but I was very much thinking this would be better put together collaboratively, exam-board agnostic and perhaps led by expert/subject associations. (It would be interesting to have input from universities as well, although that’s another post brewing, on university involvement in curriculum design…)
- These are solely hands-on skills for the school lab – no analysis, no maths. There is no content. (Although it might be interesting to produce a paired list, with knowledge on the left and skills on the right. Hmm. Notes for later.)
- I ignored exam specifications and instead flicked through the relevant pages on PracticalPhysics. I’ve probably missed something, suggestions welcome.
- Instead of a ticklist, my idea was for students to add a date each time they demonstrated that skill. I suspect teachers will have varying ideas of how many times are needed; the only thing everyone will agree on is that once is not enough.
- This is for students to use themselves for tracking, not teachers to use for assessment. I hope HoDs are paying attention to this point.
It would be easy to use this approach for GCSE and AS/A2, one checklist per topic area. (I’m sure many colleagues and departments already do.) But why not spend a little time putting together a good list, based on agreed best practice? I do something similar for content revision, but this is the first time I’ve done it for specific hands-on skills. I’m going to have a play around with a ‘minds-on, thinking scientifically’ version too.
I’d happily run a project producing high quality versions, based on wider consultation, for all subject areas. It would need more of my time and the time of colleagues. That means money, so let me know if you know where I could submit a proposal for funding…