PBODME Resources

My last post got rather more responses than I expected, which is great. Some of them challenged how I think about using this framework with students, which is even better. I still like it, and I’ll still use it, but it was pointed out that I didn’t make it clear that this was only one of the tools that help students with practicals. I’ve blogged about the different aims of practical work before, and probably will again, but check out articles by @alomshaha for far more eloquent words than mine.

Possible Aims of Practical Work

  • To enthuse – explosions and the wow factor
  • To model and practise technical skills
  • To collect data
  • To boost appreciation of difficulties with data such as random errors and so improve experimental design
  • To illustrate a scientific idea or principle clearly by removing distractions

As I’ve commented in the past, these are all useful aims as long as we are clear in our own minds why we are doing the practical. This might not be shared with students beforehand, but should be afterwards. (NB: I was marked down in a 30-minute observation because students failed to make ‘good progress’ during a practical. The observer had not appreciated that the point was for the kids to struggle and then, in later discussion, to share tactics and appreciate why the concept was hard to observe in school lab conditions.) Of course, we should also vary the kinds of practical work we do!

Responses to the post

Read the comments; my readers put it better than I could. For which many thanks; in a week when it feels like the only things I’ve achieved involve feeding the cats and a pile of marking as tall as my five-year-old, the feedback really helped. The only addition I’ll make is to quote @fnoschese:

I particularly like the second flowchart (IF/AND/THEN/THEREFORE), something I’ll be adapting over the weekend between decorating and getting another year closer to forty. Unfortunately I can’t copy it as an image so you’ll just have to follow the link.

My PBODME resources

This was originally going to be the only section of this post, but never mind. For your use and interest, hopefully:

  • pbodme as ppt (print slides for a quick display)
  • pbodme flowchart/student capability checklist as pdf

As ever, I’d value comments. Can I ask that, if you have a useful link, you add a comment as well as tweeting me? I always worry I’m going to miss something, and that way it’s a proper conversation for everyone.

Also, a general appeal; if you use my materials, for general displays, CPD or with your own students, can you let me know? Always nice to point to wider impact of what I do, quite apart from giving me a nice warm glow. Feedback is the only thing us bloggers ask, after all…


Moving Beyond Predict/Observe/Explain

I don’t remember when I first used the idea of breaking down a demonstration for students by having them follow the POE format:

  • Predict what will happen
  • Observe what actually happens
  • Explain it in context

I think a lot of science teachers used this before – or even without – referencing the ideas of Michael Bowen, who explains the approach in this video. He wasn’t the first, but I tracked down the link via the site of the National Science Teachers Association in the US. There are several papers available there, for example this from a decade ago about hypothesis-based learning, which makes explicit the difference between a hypothesis and a prediction. It’s easy to see how these steps link nicely with a 5/7Es planning method. But I think it’s worth adding some steps, and it’s interesting to see how it might have developed over time. How students cope with these stages is an easy way to approach formative assessment of their skills in thinking about practicals, rather than simply doing them.

Please note – I’m sure that I’m missing important references, names and details, but without academic access I simply can’t track original papers or authors. My apologies and please let me know what I’m missing in this summarised family tree!

PEOE: I think this because

To stop students making wild speculations, we need to involve them in a conversation justifying their predictions. I suppose this is a first step in teaching them about research: to reference their thoughts. I find this needs guidance, as many students mix up the two uses of ‘explain’: the derivation of their prediction and the link to accepted theory.

PODME: Recording what we observe

I got this from Katy Bloom (at York SLC, aka @bloom_growhow), I think after chatting at a TweetUp. I’m paraphrasing her point: in science it’s not enough simply to observe; we must also share that observation. This can take two forms, Describing in words and Measuring in numbers. The explanation then becomes about the pattern rather than a single fact or observation. Bonus points to students who correctly suggest the words qualitative and quantitative for the observations here!

PBODME: My current approach

I’ve tweaked this slightly by making the first explanation phase explicit. The display is on the wall and students can apply this (with varying degrees of success) from year 7 practicals with burning candles to year 13 physics investigations into how gamma intensity is affected by the thickness of lead shielding.

  • Prediction of outcome
  • Because of hypothesis based on life experience, context or research
  • Observation using senses, measuring devices
  • Description in words of what typically happens (sometimes as commentary during practical)
  • Measurement using appropriate units, with derived results and means where needed
  • Explanation of results, patterns, anomalies and confidence

Is it getting ungainly? Having this structure means students can see the next step in what they are doing, and are hopefully able to ask themselves questions about how to develop a practical further. I suppose you could argue that the original POE approach is the foundation, and these stages allow us to extend students (or, ideally, allow them to extend themselves).

PBODMEC: Why does it matter?

In many ways, the natural next step would be about Context – why should we care about the results and what difference do they make to what we know, what we can do or what we can make?

I plan to follow up this post with the printable resources (wall display and a student capability checklist) but they’ll have to wait until I’m home. In the meantime, I’d welcome any thoughts or comments – especially any with links to other formats and their uses in the school science lab.

Power Stations

“Okay, class… everybody… I’m not going to teach you about power stations. You need to know all the features but you’re going to be teaching each other. In groups of three you’re going to be putting together a presentation on one of the energy resources…”

Hands up if this sounds familiar? I’ve used variations on this theme for years, partly because I’m lazy but mainly because it works. I’ve fine-tuned it, of course; I now start off with two example presentations, one reasonable and one awful, and have the students tell me what they need to avoid.

If you can’t be a good example then you’ll just have to be a horrible warning.

Catherine Aird

But it doesn’t always work very well, even if you give them a blank energy resources table to complete as they listen. This year I’ve ended up trying out some different approaches and thought it might be worth sharing them.

Small changes

For chatty groups, how about having the presentations put together in the same way, but then presented as part of a circus or marketplace activity? Students only need to speak to a handful of classmates at a time, and they get to rehearse it too. They can complete the same blank template as they work and ask questions they might not raise in a larger group. The downside is that you can’t listen in to correct misconceptions; I had students email their presentations first, then gave feedback before they shared with each other. Afterwards, of course, the powerpoints can be added to a shared drive through school. If you’ve the resources, kids could be videoed presenting for long-term storage.


In small groups, students could identify viewpoints for and against different power stations. This risks being more about emotion than explanations, but doesn’t have to take a long time in the classroom. Choose good roles and after each discussion they can add + and − points to a whiteboard; this can be photographed for later recall. Offer bonus points for students able to identify bigger patterns such as ‘fossil fuels all contribute to climate change’ or ‘renewable resources are often unreliable’.

Top Trumps

Some groups love the idea of choosing four or five categories then scoring each power station from 10 (fantastic) to 1 (awful). Some kids struggle with the arbitrary nature of the scores, while others get bogged down in irrelevant squabbles. I found that using the category definitions as a starter got them more or less focussed. Dissuading them from spending the majority of the time drawing pictures was an issue! This led me to a slightly different approach, which I tweeted.

Effectively I gave the students a power station scorecard listing the main ways in which two power stations could be compared. In pairs they had to choose one each, then discuss which ‘won’ each round. Finally they had to choose an overall winner. To make life more complicated, simply give the class a new location every five minutes. More able students will recognise that these factors do not have equal weighting – you could discuss with them that a long-term view might award double points for ‘winning’ some of the rounds.
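If you fancy tinkering with the weighting idea, the round-by-round scoring is easy to sketch in a few lines of Python – the stations, categories, scores and weightings below are entirely invented for illustration, not taken from any real scorecard:

```python
# Illustrative only: station names, categories, scores and weightings
# are made up for the sake of the example.
def compare(station_a, station_b, scores, weights):
    """Play each category as a 'round'; the higher score wins the
    round and collects that round's points (its weighting)."""
    totals = {station_a: 0, station_b: 0}
    for category, points in weights.items():
        a = scores[station_a][category]
        b = scores[station_b][category]
        if a > b:
            totals[station_a] += points
        elif b > a:
            totals[station_b] += points
        # a drawn round scores nothing for either side
    return totals

# Scores out of 10, as a pair of students might assign them
scores = {
    "gas":  {"cost": 8, "reliability": 9, "emissions": 3, "start-up time": 9},
    "wind": {"cost": 6, "reliability": 3, "emissions": 10, "start-up time": 7},
}
# A 'long-term view' might double the points for the emissions round
weights = {"cost": 1, "reliability": 1, "emissions": 2, "start-up time": 1}

print(compare("gas", "wind", scores, weights))  # → {'gas': 3, 'wind': 2}
```

Doubling the weight on emissions is exactly the ‘double points for winning some rounds’ discussion: wind still loses here, but change the weighting and the overall winner flips, which is the point worth drawing out with more able students.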



The card ideas above are both good for reviewing content – you could also allow more time but provide resources like textbooks or laptops (or BYOD). For a quicker review, it’s easy to produce a simple card sort which students can arrange into renewable/non-renewable, thermal/kinetic, carbon contributors/neutral and so on.

Hope some of these ideas are useful – please let me know if so!

Exam Paper Debriefs (Summer 2012)

I’m combining two resources into one post here, but hopefully they should still show up by searching. (He types, hurriedly adding some tags.) I’ve made two powerpoints, each matched to what I think are the easy marks available on the summer 2012 P1 and P2 exams from AQA. They’re useful as practice or as full mocks; I often have students go through them focusing on what they should all aim for, before checking through in more detail. Having students divide their missed marks (using this exam paper debrief pdf) into recall failures and method mistakes can be helpful.

If students are able, they could also be pointed towards the examiners’ reports, which are only available if you go through the subject link at AQA rather than the direct Past Papers route. If not, then this is our job anyway – perhaps something to consider as part of a backwards design approach?

P1 june2012 easy as ppt, for the P1 summer 2012 exam – see also my P1 summary activity.

P2 may2012 easy as ppt, for the P2 summer 2012 exam – see also my P2 summary activity.

And yes, before you ask – I am working on equivalent resources for more recent exams, hopefully to be done before we all need them for mocks. Although the summer 2013 papers haven’t shown up yet – is that because, without January 2014 papers to use, AQA are expecting those to be used as mocks too? Must check e-AQA… (adds to ever-growing to-do list)

Finally: yes, I’ve been fairly quiet and quite down of late; lots going on, I’ll be fine, send chocolate and coffee if feeling helpful. As that’s pretty much all I’ve been eating for a while, supplies are running low!



Divided and Conquered?

So I was on Twitter.

@TeacherROAR – who I follow – retweeted an item from @NUTSouthWest – who I don’t – which in turn quoted figures from an article in the Independent.

I followed the conversation and was struck by this tweet to another tweeting teacher.

followed by:

I responded in turn and a not particularly pleasant slanging match ensued. I had two main issues, one about Twitter and the other about teacher solidarity. Maybe I didn’t express myself well in 140 characters – but more on this limitation in a moment. EDIT: And this is without even considering the actual figures involved, of which more added at the end.

Firstly, I don’t think anyone assumes that a retweet means total support of the original message. In fact, sometimes it’s intended as mockery! But if you quote figures, and someone asks you about them, it’s reasonable to justify or explain. I think. If it turns out they’re wrong, I’d see it as only fair to tweet a follow-up. Accountability, yes? Online we only have our reputation as currency. Challenging figures or opinions isn’t the same thing as an attempt to censor opinion, and for what it’s worth, I agree that if we only have exaggerated figures to use as propaganda we’ve got no chance. As I tweeted to @sidchip64, a ‘roar’ without anything to back it up is just bluster.

Secondly, I can just imagine Gove or his minions rubbing their hands together and laughing, watching those who teach fighting with each other instead of him. Dismissing a challenge from another teacher is rude. I expect my students to question what I say – often I demand it. But I expect better of any professional who works in a classroom. Solidarity means we work together to get it right, and that includes good statistics. It doesn’t mean we unquestioningly back a colleague who’s wrong.

Maybe it’s about a limited medium. I often find this on Twitter – great for tips, bad for clear ideas. Soundbites, not critical debate. So I suggested to @TeacherROAR that it wouldn’t be hard to clarify what they meant – and justify it – in a blog. For some reason this was seen as a demand and so I decided to do it myself. Half an hour later, here we are. I feel better for it, anyway.

So what I didn’t include last night – and, believe it or not, woke up thinking about at half-five this morning – is a point of view on the numbers. They got attention, obviously. That was the point. But I think it was poor of the Independent to quote from a report by the Sixth Form Colleges Association – a report I haven’t yet found, but that may be due to lack of caffeine – which makes a direct comparison between the annual funding for their students and that spent on setting up free schools this year.

Now, it would be fair to say that I’m very dubious about free schools, in particular the application and set-up process. Laura McInerney explains these concerns much more eloquently and expertly than I could. But that doesn’t mean we should misuse data in this way. Making last year’s nine free schools (some or all of the total?) and their current 1557 students liable for the entire cost of setting them up – when the assumption is that these costs would actually be spread over the foreseeable life of the schools – is wrong. If I can be forgiven a physics example, it’s like working out the kWh cost of electricity from a nuclear power station using all the commissioning and decommissioning costs but only a single year of electrical output.

Picking numbers out of the air, if each of those nine free schools costs £3m to run this year (which would make the set up costs £35m) then the cost per student comes to a little over £17000. If their costs are £2m annually, then the figure is £11500 or so. Now, these figures are still too high – but they’re more realistic, unless each of those schools is to shut down after a single year being open.
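For anyone who wants to check those back-of-the-envelope sums, here they are in a few lines of Python – remembering that the £3m and £2m annual running costs are, as I say, numbers picked out of the air; only the nine schools and 1557 students come from the report:

```python
students = 1557  # current free school students, per the quoted report
schools = 9      # free schools opened last year

# Annual running cost per student, leaving the set-up costs out entirely
for annual_cost in (3_000_000, 2_000_000):
    per_student = schools * annual_cost / students
    print(f"£{annual_cost:,} per school -> £{per_student:,.0f} per student")
```

Run it and you get a little over £17,000 per student on the £3m assumption and roughly £11,500 on the £2m one – high, but nothing like the headline figure you get by loading the whole set-up bill onto a single year’s intake.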

Yes, I agree that free schools haven’t always been set up where they’re actually needed, so you could argue the costs are wasted. Yes, I know that this year a lot has been spent, potentially to the detriment of sixth form colleges. But I’d be prepared to bet that back when the colleges were set up, some people claimed they were a waste of money. And I’m sure they were justified by looking at the benefits over time, not just costs in the first year. If we want to be taken seriously – and this goes back to my first point – then we must justify the numbers we use, or we are building our argument on very weak foundations.

A final quote, this time from much longer ago.

If we do not hang together, we shall surely hang separately.

Benjamin Franklin