Measurable Outcomes

Following a conversation on Twitter about the phonics screening check administered in primary school, I have a few thoughts about how it’s relevant to secondary science. First, a little context – especially for colleagues who have only the vaguest idea of what I’m talking about. I should point out that all I know about synthetic phonics comes from glancing at materials online and helping my own kids with reading.

Synthetic Phonics and the Screening Check

This is an approach to teaching reading which relies on breaking words down into parts. These parts, and how they are pronounced, follow rules – admittedly English is probably less regular than many other languages! But the rules are useful enough to be a good stepping stone. So far, so good – that’s true of so many models I’m familiar with from the secondary science classroom.

The phonics screen is intended, on the face of it, to check if individual students are able to correctly follow these rules with a sequence of words. To ensure they are relying on the process, not their recall of familiar words, nonsense words are included. There are arguments that some students may try to ‘correct’ those to approximate something they recognise – the same way as I automatically read ‘int eh’ as ‘in the’ because I know it’s one of my characteristic typing mistakes. I’m staying away from those discussions – out of my area of competence! I’m more interested in the results.

Unusual Results

We’d expect most attributes to follow a predictable pattern over a population. Think about height in humans, or hair colour. There are many possibilities but some are more common than others. If the distribution isn’t smooth – and I’m sure there are more scientific ways to describe it, but I’m using student language because of familiarity – then any thresholds are interesting by definition. They tell us that something interesting is happening here.

The most exciting phrase to hear in science, the one that heralds new discoveries, is not “Eureka!” but “That’s funny …”

Possibly Isaac Asimov. Or possibly not.

It turns out that with the phonics screen, there is indeed a threshold. And that threshold just so happens to be at the nominal ‘pass mark’. Funny coincidence, huh?

The esteemed Dorothy Bishop, better known to me and many others as @deevybee, has written about this several times. A very useful post from 2012 sums up the issue. I recommend you read that properly – and the follow-up in 2013, which showed the issue continued to be of concern – but I’ve summarised my own opinion below.

[Figure: phonics plot 2013 – D Bishop, used with permission.]
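To see why a spike at the pass mark is suspicious, here is a toy simulation – all the numbers are invented for illustration, not Bishop’s data. It builds a smooth distribution of scores, then applies a hypothetical nudge that lifts near-misses up to the pass mark of 32:

```python
# Toy simulation: a smooth score distribution vs. one where near-misses
# are nudged up to the pass mark. All numbers here are invented.
import random

random.seed(1)
PASS_MARK = 32

# A smooth-ish underlying distribution of scores out of 40
raw = [min(40, max(0, round(random.gauss(30, 5)))) for _ in range(10_000)]

# Hypothetical adjustment: scores one or two below the threshold become 32
adjusted = [PASS_MARK if PASS_MARK - 2 <= s < PASS_MARK else s for s in raw]

# Crude text histogram around the threshold
counts = {s: adjusted.count(s) for s in range(28, 36)}
for score, n in counts.items():
    print(f"{score:2d} {'#' * (n // 50)}")
```

The histogram shows exactly the signature in question: a trough just below the threshold and a spike exactly on it, where a smooth distribution would show neither.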

More kids were being given a score of 32 – just passing – than should have been. We can speculate on the reasons for this, but a few leading candidates are fairly obvious:

  • teachers don’t want pupils who they ‘know’ are generally good with phonics to fail by one mark on a bad day.
  • teachers ‘pre-test’ students and give extra support to those pupils who are just below the threshold – like C/D revision clubs at GCSE.
  • teachers know that the class results may have an impact on them or the school.

This last one is the issue I want to focus on. If the class or school results are used in any kind of judgment or comparison, inside or outside the school, then it is only sensible to take human nature into account. And the pass rate is important. It might be a factor when it comes time for internal roles. It might be relevant to performance management discussions and/or pay progression. (All 1% of it.)

“The teaching of phonics (letters and the sounds they make) has improved since the last inspection and, as a result, pupils’ achievement in the end of Year 1 phonics screening check has gradually risen.”

From an Ofsted report

Would the inspector in that case have been confident that the teaching of phonics had improved if the scores had not risen?

Assessment vs Accountability

The conclusion here is obvious, I think. Most of the assessment we do in school is intended to be used in two ways: formatively or summatively. We want to know what kids know so we can provide the right support for them to take the next step. And we want to know where each kid is, compared to some external standard or their peers.

Both of those have their place, of course. Effectively, we can think of these as tools for diagnosis. In some cases, literally that: I had a student whose written work varied greatly depending on where he sat. His writing was good, but words were spelt phonetically (or fonetically) if he sat anywhere other than the first two rows. It turned out he was short-sighted and needed glasses. The phonics screen is, or was, intended to flag up those students who might need extra support; further testing would then, I assume, identify the reason for their difficulty and suggest routes for improvement.

If the scores are also being used as an accountability measure, then there is pressure on teachers to minimise failure among their students. (This is not just seen in teaching; an example I’m familiar with is ambulance response times, which I first read about in Dilnot and Blastland’s The Tiger That Isn’t, but issues have continued – e.g. this from the Independent.) Ideally, this would mean ensuring a high level of teaching and so high scores. But if a child has an unrecognised problem, it might not matter how well we teach them; they’re still going to struggle. It is only by the results telling us that – and in some cases, telling the parents reluctant to believe it – that we can help them find individual tactics which help.

And so teachers, reacting in a human way, sabotage the diagnosis of their students so as not to risk problems with accountability. Every time a HoD puts on revision classes, every time students are put in for resits because they’re below a boundary, every time an ISA graph is handed back to a student with a post-it suggesting a ‘change’, every time a PSA mysteriously changes from an okay 4 to a full-marks 6, we do this. We may also want the best for ‘our’ kids, even if they don’t believe it! But think back to when the league tables changed so BTecs weren’t accepted any more. Did the kids keep doing them, or did it all change overnight?

And was that change for the kids?

Any testing which is high-stakes invites participants to try to influence results. It’s worth remembering that GCSE results are not just high-stakes for the students; they make a big difference to us as teachers, too! We are not neutral in this. We sometimes need to remember that.


With thanks to @oldandrewuk, @deevybee and @tom_hartley for the Twitter discussion which informed and inspired this post. All arguments are mine, not theirs.

CSciTeach Evidence

It’s odd, in some ways; for a profession which is all about leading and tracking progress for our students, we’re remarkably bad at agreeing on any kind of consistent way to record what we do.

Years back I put together a Google Form for me to record what I was doing. The idea then was to match different activities to the Teacher Standards. To be honest, I didn’t use it for very long, although the process was useful in itself. Since then I’ve thought several times that a better way to track what I do is in the context of professional accreditation. For science teachers, who I work with in my day job, there are several things to consider for CPD tracking.

  1. Performance management forms are very specific to institutions, but in most cases having a record of what’s been done in between school-based INSET would help.
  2. There are several ways for a science specialist to become accredited; this is about recognising current knowledge and skills, not jumping through new hoops. CSciTeach is the route I chose, through the ASE (now also available via RSC and RSB). You may also wish to consider the new STEM Educator pathway. I have just completed the Chartered Physicist accreditation, which is available to physics teachers and teacher-trainers with appropriate experience. (I should point out I’m involved with making this better known to teachers/teacher-trainers and more information, exemplars etc will be out this autumn.)
  3. Having this information to hand can only be a good thing when it comes time to apply for new roles. I personally think it’s bizarre that there isn’t a single national application form, universal* with perhaps a single page ‘local detail’ for stuff a school feels just has to be asked. Otherwise colleagues have to waste time with many tiny variations of badly formatted Word forms, rather than their cover letters.

The thing is, who writes down every time they read/watch/observe something which ends up in a lesson? And if you do make a note of it, mental or otherwise, what are the chances of it being recorded in one central place? We end up with a formal record which has a few courses on it, and all the other ideas are along the lines of:

I think I got it at a teachmeet – was it last year? Might have been the one before. I’m pretty sure there was an article, I’ll have a look for it in a minute…

My Proposed Solution

What I’ve produced didn’t take long, and it’s only the first version – I’d really welcome ideas and suggestions for how to improve it. The idea is to gather information, reflect on impact and be able to refer back to it as evidence of professional practice.

If you want to try out the form, then feel free – this link takes you to my trial version and is not linked to the downloadable version below. You can also look at (but not edit) the resulting spreadsheet; note that the ASE guidance is reproduced on the ‘Notes’ tab. Thanks to Richard Needham aka @viciascience for some suggestions.

I’ve used the CSciTeach standards, but obviously (1) you need to do more than this form to be accredited and (2) other accreditation schemes are available.

Want to play around with your own version, editable and everything? You’re in luck:

1 Set-up

You’ll need a Google account. Go to the responses sheet (starting here means the formatting of the final spreadsheet is preserved). Select ‘File’, then ‘Make a Copy’. Choose ‘Form’, then ‘Go to live form’; save the form URL as a bookmark on each of your devices. The spreadsheet URL will probably be most useful on something with a keyboard, but YMMV.

2 Capture

The form is set up to capture a few brief details fast, and then gives the option to skip to ticking relevant CSciTeach standards. If preferred, you can add the details of your reflection and impact in your setting at the same time. This completes the entry, but often you’ll want to come back when you’ve had a chance to think, or to try something out with students.

3 Reflect

Assuming you skip the in-depth reflection during step 2, you’ll want to return to the spreadsheet the form generates. I’ve included a few formatting points to make it work better which should be preserved when you copy it.

  • Column headings are bold
  • Columns are sized so it should print neatly on landscape A4
  • Text is justified ‘left, top’ and wrapped to make the columns readable
  • If empty, the columns for further reflection and impact are shaded red to prompt you to fill them in
  • The standards cells are shaded if at least one in that category has been ticked.

The point of CSciTeach, or any other accreditation, is to recognise that ‘doing CPD’ is not a one-off event or course. Instead, it is a process, and one which should have reflection and consideration of measurable impact at its heart. This impact may be on students, teachers or both, depending very much on your role.

4 Share

You may prefer to keep the spreadsheet for your own reference only, using it to fill in other forms or complete applications. Sharing a Google spreadsheet is easy enough, of course; that’s the point! Just be aware that if you give ‘edit’ access, whoever it’s shared with can change your details. If you want their input – for example a professional mentor or coach – it might be better to give them permission to ‘view and comment’.

Alternatively, you might wish to search for particular examples and copy the results to a fresh document, depending on context. It would be easy to modify the form so that the Stimulus question was multiple choice, allowing you to categorise different kinds of formal and informal CPD. If colleagues think this would be more useful, I’ll create an alternate version centrally.

If, as a HoD or similar, you want to try something like this collectively, then it would be easy to adapt. Give the form URL to all team members and ask them to contribute. Whether you wish to add a question where they identify themselves is, of course, a more sensitive issue!


What Next?

Firstly: tell me what might be worth changing using the comments below. If I agree, then there’s a fair chance a version 1.1 will be shared soon. If you’d rather play around with it yourself, feel free. I’d appreciate a link back if you share it.

Secondly, there are a couple of features which would be great to add. Being able to upload a photo or screenshot would be much better than copying and pasting a link, but I can’t see how to do this with a GForm. Relatedly, if you think this could be developed into a mobile app, then I’m sure the ASE would love to hear from you.

Lastly, yes, the SNAFU above* was on purpose. Those readers who understood can feel smug for exactly five seconds.

#ASEslowchat Tuesday: Practicals


I can’t comment on what is happening in my classroom, or my department. Because I don’t have a classroom; instead I work with teachers in their classrooms, supporting their departments. So most of what I’ll be sharing will be at one step removed, but it is based on what ‘real’ teachers have told me is happening in their schools. I’ve played around with the stimulus questions a little.

Which required practicals have you completed with your classes; have you only completed these, or gone beyond them? Why?

I posted a little while back about how I felt the required practicals should fit into a balanced science curriculum. (This was a different post to one from even earlier, based on a draft of the AQA required pracs.) Nothing I’ve seen has caused me to change my mind. The summary is that whether a practical is required or not, it should be used in the same way: to support teaching of science content and skills. It might, of course, be worth returning to the required practicals as part of an organised review/revision schedule, because they’re effectively content. Until then, ask the same questions, and practise the same skills, as you would for any practical. (And, of course, don’t neglect these aspects if a practical is ‘unrequired’!)

Has the GCSE impacted on the work of the technicians in the department? Have you had any issues with equipment?

Not being in a school full-time, I’m not sure about the workload side of this. I don’t think it’s been a huge issue – certainly compared to lots of ISAs to worry about! (I hope school technicians are being encouraged to contribute to this topic, by the way.) But I have been doing a fair bit of work on the physics practicals with teachers, in school and by email, so I have a few resources to point to.

There is a dedicated TalkPhysics group for the GCSE required practicals – obviously just the physics ones. It’s fairly quiet at the moment, but I/we would love to see more teachers on there swapping ideas and answers, for example about specific components for I/V graphs or precise methods for using a ripple tank. If you’re not already a member, you can get a free login in a day or so, and the group is open to all. Technicians and all teachers of physics – not just physics specialists – are welcome. Please join in.

Most equipment issues I’ve heard about have been predictable:

  • Getting a class around a ripple tank is hard. Much of the work can be done in pairs by putting a piece of laminated squared paper in a Gratnells tray – other trays are available – adding a centimetre of water with a couple of drops of ink, then making and timing ripples. Very fast, very cheap, and lots of data to criticise.
  • Dataloggers for a=F/m. As you might expect, manufacturers are keen to sell complete systems which will work brilliantly for a week and then be a pain to set up and calibrate. If you can use phones in school, kids can probably use slow-motion cameras to collect some useful data. Alternatively, I’m a huge fan (no commission, sadly) of the Bee Spi V lightgate. It displays either the speed or the acceleration of an object passing through it. It doesn’t log it, which to my mind is an advantage, as it means kids have to do the table/points/line bit themselves. They’re £20 each, run on batteries and don’t need to be plugged into any device.
  • The specific heat capacity practical – assuming you have the kit – has always produced data with, shall we say, lots to comment on. An improved method is available from PracticalPhysics, and it’s easier if you can (a) use a joulemeter and (b) record the maximum temperature, not the temperature at the end of the heating time.
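For reference, the calculation behind that last practical is c = E/(mΔT). A minimal sketch, using invented example numbers rather than real class data:

```python
# Specific heat capacity from a joulemeter reading: c = E / (m * ΔT).
# The example values below are invented for illustration.

def specific_heat_capacity(energy_j: float, mass_kg: float, delta_t_k: float) -> float:
    """Return c in J/(kg·K) given energy supplied, mass and temperature rise."""
    return energy_j / (mass_kg * delta_t_k)

# e.g. a 1 kg aluminium block, 18 000 J supplied, maximum rise of 20 K
print(specific_heat_capacity(18_000, 1.0, 20.0))  # 900.0 J/(kg·K)
```

Using the maximum temperature matters because the block keeps warming through after the supply is switched off; stopping the reading at the end of the heating time underestimates ΔT and so overestimates c.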

How are you developing knowledge of practical work and investigations in your teaching ready for the examinations? 

‘Required Practicals’ is one of the sessions I run in schools as part of my day job with the IOP. So allow me to invite you to a virtual session, which will require you to imagine all the hands-on sections. There are presenter notes with even more links than in the slides themselves. PNCs will often run their own versions of these, and we do a lot at days and events open to all teachers. Please consider this an invitation.

If in doubt, checking out the work of Ian Abrahams is always worthwhile. He has a fairly recent book out with Michael Reiss, Enhancing Learning with Effective Practical Science 11-16, which I will buy as soon as my next freelance cheque arrives. Unless anyone would like me to review it, hint hint. He writes regularly in SSR, so you’ve probably experienced a flavour of his work already.

A few years ago, Demo: The Movie was unleashed on an unsuspecting world by @alomshaha and co. It should be required watching for all science teachers and departments, and provides some great ideas about how to make demonstrations much better for learning. He’s got loads of films, some of which aren’t directly relevant but the techniques discussed are great. I reflected on some of the material in a blog post too.

Other resources I’d recommend (there will undoubtedly be some overlap) are collated at STEM Learning (the eLibrary that was, once upon a time). And I always like to put in a word for the SchoolPhysics materials by Keith Gibbs, author of The Resourceful Physics Teacher.

Something I’ve chatted about in workshops, on Twitter and elsewhere: you may find it useful to break down the POE approach in a slightly more specific way, which I call PRODMEE:

  • Predict: what do you think will happen? (encourage specific changes to specific variables)
  • Reason: why do you think that? (from other science content, other subjects, life experience)
  • Observe: what actually happens? (we may need to ensure they’re looking the right way)
  • Describe: in words, what happened? (qualitative results)
  • Measure: in numbers, what happened? (quantitative results, devices, accuracy/precision, units)
  • Explain: what’s the pattern and does it match the prediction? (digging into the mechanism)
  • Extend: why does this matter? (other contexts, consequences for everyday life)

What resources or advice can you share with other teachers about approaching a specific required practical? What issues and opportunities have you come across when going about teaching the required practicals to your classes?

A few suggestions I’ve made in workshops, often based on conversations with teachers; this is obviously an incomplete list!

  • Density is boring; why not provide a few blobs of Blu-Tack and have kids plot mass against volume on a graph? Make it more challenging by hiding a ball bearing inside one to provide an anomaly to the line of best fit. Or can students separate LEGO, Mega Bloks etc. based on density?
  • Hooke’s Law: as the kids have already seen it, why not try using strawberry laces? Alternatively, there’s a simple set-up using copper wire from PracticalPhysics. And you can always use it to hammer home the idea of science-specific vocab, because ‘elastic’ bands aren’t elastic.
  • Acceleration: I mentioned Bee Spi V for measuring earlier. My only other suggestion is to always teach it as F/m=a so you start with the cause (force), shared out because of the conditions (mass) which leads to a consequence (acceleration).
  • Ripples: discussed above, but you can also use a speaker as a vibration generator for some interesting results.
  • Heat capacity: an old experiment uses lead shot which falls a known distance and heats up. Like striking a metal lump with a hammer, this is a nice example of the idea that the energy in a thermal store can increase without ‘heating’ as we might normally consider it.
  • I/V characteristics are a lot more interesting if students must compare results from a mystery component to standard graphs. This is included in the presentation of my workshop, linked above.
  • Resistance, series and parallel: instead of just reusing the old ISA hardware, why not try taking measurements of different versions of squishy circuits dough?
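On the ‘mystery component’ point above: the comparison students make by eye – is V/I constant, or does it change as the current rises? – can be sketched as a quick check. The (V, I) readings below are invented for illustration, not real measurements.

```python
# Sketch: classify a mystery component as ohmic or non-ohmic by
# checking whether V/I stays roughly constant across the readings.

def is_ohmic(readings, tolerance=0.1):
    """readings: list of (volts, amps) pairs.
    True if resistance varies by less than `tolerance` as a fraction."""
    resistances = [v / i for v, i in readings if i != 0]
    lo, hi = min(resistances), max(resistances)
    return (hi - lo) / hi <= tolerance

resistor_like = [(1.0, 0.10), (2.0, 0.20), (3.0, 0.30)]  # constant R = 10 ohms
lamp_like = [(1.0, 0.10), (2.0, 0.15), (3.0, 0.18)]      # R rises as the filament heats

print(is_ohmic(resistor_like))  # True
print(is_ohmic(lamp_like))      # False
```

The same idea works for students on paper: calculate R for each row of the results table and ask whether the values are ‘the same within experimental error’ – which also opens up a nice discussion of what counts as ‘the same’.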