From the Classroom Up

So we had a Journal Club.

Getting on for 200 tweets from a small (but dedicated) group of Science teachers, with some tentative conclusions as Storified elsewhere. Although participants commented on the weak results from the case study – unavoidable with small groups on a single site – it certainly seemed interesting.

Could we show improved understanding, and hence achievement, by moving away from HSW skills integrated with content, and instead start KS3 by teaching these skills discretely? Enquiring minds want to know. If only there was a way to expand an interesting case study to get more reliable and/or generally applicable results. If only there was a general move towards gathering more evidence at a classroom level that could be widely shared in the profession…

“Hang on, fellas. I’ve got an idea.”


Where We Are

An interesting case study has found a benefit from one approach (discrete teaching of Sc1 skills at the start of KS3) over another (gradually introduced over the year). A small sample was involved at one school.

What We Could Do Next

As several people pointed out, we need more data before proceeding to a full trial. The next step would be collecting information about schools which use these two approaches and how well they work. How do schools assess students’ understanding of the language and methods? A Google Form or similar would be an easy way to acquire the data without high cost at this stage.

Trial Design

I should possibly leave this to the experts, but the whole point of this teacher-led approach is to get us involved. (Alternatively, the DfE could press-release a huge study but not tell us what they’re actually investigating.) As I understand it, we’d need to:

  1. Get an education researcher to co-ordinate design/timetables/data analysis.
  2. Produce standard resources to be used either all together (discrete unit) or spread through the year (integrated learning) – this could be based on CASE or similar approaches.
  3. Design an outcome measure, ideally something cheap and non-intrusive.
  4. Recruit participant schools.
  5. Visit schools during trial (in both arms) to observe delivery, consider deviation from ‘ideal script’, and also raise profile of organisation/idea.
  6. These visits would also provide good ‘teacher/researcher’ links and could perhaps be used as a way to observe CSciTeach candidates, or at least to accredit ‘teacher-researchers’.
  7. Collect data on outcomes for both groups. Tests need to be blinded, ideally marked externally or by computer. Workload!
  8. Data analysis – which approach gives the best results? Is this correlated with some characteristic of the schools?
  9. Share results widely, provide materials and best practice guidance based on evidence.
  10. Plan the next RCT, perhaps looking at the materials used.
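For step 8, the core comparison is simple enough that it doesn’t need specialist software. Here’s a minimal sketch, using entirely made-up test scores for the two arms, of how the difference in outcomes could be checked with a permutation test (just the Python standard library, no stats package required):

```python
import random
import statistics

# Hypothetical outcome scores (end-of-unit test marks) for each arm.
# These numbers are invented for illustration only.
discrete = [62, 58, 71, 66, 60, 69, 64, 73]      # discrete Sc1 unit
integrated = [57, 61, 55, 63, 59, 54, 62, 58]    # skills spread through year

observed = statistics.mean(discrete) - statistics.mean(integrated)

# Permutation test: repeatedly shuffle all scores between the two arms
# and count how often a mean difference at least this large arises by
# chance. The proportion is an approximate p-value.
pooled = discrete + integrated
n = len(discrete)
random.seed(0)  # fixed seed so the sketch is reproducible
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
    if abs(diff) >= abs(observed):
        extreme += 1
p_value = extreme / trials

print(f"mean difference = {observed:.2f}, p ≈ {p_value:.4f}")
```

A real analysis would need to account for clustering (pupils within schools) and school characteristics, which is exactly why step 1 puts an education researcher in charge, but the basic question – does the difference exceed what chance alone would produce? – is this simple.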

Funding and Support

I’ve a few ideas, but they’re probably way off. I don’t know how much it would cost, either in terms of money or time. The EEF is focused on attainment of particular groups, so I don’t know how relevant it would be to their aims. (But their funding round closes in October.) The ASE, I suspect, would have the organisational skills but not the money. Might the Science Learning Centres have a part to play, if we consider this from the point of view of teachers developing themselves professionally while conducting research? It would also nicely complement some of the aims of YorkScience. And we shouldn’t forget the original author, Andrew Grime, although I don’t think he’s on Twitter. (We probably should have tried harder to get in touch with him before the Journal Club session, come to think of it…)

I’m sure there are many other questions that could be answered in UK Science classrooms. But the question should be: which one shall we try to answer first? Instead of complaining from the sidelines, teachers should get involved, ideally through coordinated projects and their professional associations. This seems like an ideal chance to make the most of the Evidence-Based Teaching Bandwagon, and could perhaps be launched or discussed at ResearchED2013. If we want to make something of it.

Do we?

 

An apologetic postscript: sorry to followers of the blog who got spammy emails about a post which wasn’t there. This was because I hadn’t read the fine print on Storify about not being able to embed the material on a WordPress.com blog.  It’s the same Storify I link to above, now happily live at the SciTeachJC site.


Exit Questionnaire: Useful?

Last year, as part of the Action Research in Physics Project run through the Science Learning Centres, I collected data in my school about those who didn’t do Physics at AS. If this seems odd, think for a moment. If we ask those who did choose our subject, we’re only getting the success stories. Surely what we want to know is what put off everyone else. I was particularly interested in the high number who had achieved well at GCSE (getting A* in the separate Physics course) but had not chosen it as part of their AS timetable.

At my workplace, students are selected for triple science GCSE rather than choosing it themselves, which might account for some of them – they were bright students who achieved well in all or many of their subjects. And we have a lot of students doing Physics at AS; it’s not as if we’re in danger of losing classes. However, we do lag behind Biology and Chemistry. Boo. Hiss. I’m obviously not the first person to consider this, and I noticed some of the issues raised in, for example, the IOP Girls in Physics report. Numbers seem to be rising (32,860 finished A2 last year, according to this Telegraph story which credits Brian Cox, or see this IOP press release for more detailed numbers).

Scientists always like more data, and one school is hardly representative. So, I thought, why not collect more? If only there was some way to make this kind of quick survey available to colleagues in other schools, so that we could get a bigger sample. If only there was some way to automate and easily share the results, so that we could all learn from it…

At the risk of sounding like a Year 8 stuck on their homework, the answer is Google. A Google form, to be precise.

Obviously the results will be skewed, as I expect only students who have continued to their school 6th form will be pointed towards this, but the more data we can collect the better. The results will be open to all participants and I will also be blogging about them – it’s also possible that they will inform an article somewhere, perhaps SSR.
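One nice feature of a Google Form is that the responses export as a simple spreadsheet, so summarising them takes only a few lines. A minimal sketch, using invented column names and responses rather than the real form’s (which isn’t finalised yet):

```python
import csv
import io
from collections import Counter

# Stand-in for the CSV a Google Form exports. The headings and answers
# below are purely illustrative, not the actual questionnaire.
sample = io.StringIO(
    "Timestamp,Physics GCSE grade,Chosen at AS?,Main reason if not\n"
    "2012-09-01,A*,No,Found it too hard\n"
    "2012-09-01,A,No,Preferred other subjects\n"
    "2012-09-02,A*,No,Found it too hard\n"
)

# Tally the stated reasons for not continuing with Physics.
reasons = Counter(
    row["Main reason if not"] for row in csv.DictReader(sample)
)
for reason, count in reasons.most_common():
    print(f"{reason}: {count}")
```

With real data you’d point `csv.DictReader` at the downloaded responses file instead of the `StringIO` stand-in, but the tallying is identical.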

What I need to know is whether this is worth taking forward. I’ve put a draft Google form together, based on the paper version I used at my school last year. I have some questions to use, although obviously I’d be interested in any extra suggestions. I want to make this a fast questionnaire, not something students or teachers have to spend a lot of time on. My plan is to finalize the form in a week’s time, so the more feedback and suggestions I get in that time the better. I plan to post and tweet the link to the improved version on September 1st, and hope that as many colleagues as possible will get kids to fill it in. I’d also appreciate suggestions about how to get the word out to as many teachers as practical in a short space of time.

Anyone interested?