#rEDrugby 2/2

Following up yesterday’s reflective post, here are my typed-up bullet points from the afternoon sessions. As before, my thanks to the organisers and presenters, and a promise that I’ll update these posts with links to the actual presentations in a week or so.

Do They Really Get It session by Niki Kaiser (@chemDrK)

  • Session was a development of a post on Niki’s blog.
  • Students gave correct answers by imitation, not based on deep understanding, as shown by discussions of ions in a solution vs electrons in a wire; I wonder if the demo showing movement of coloured ions during ‘slow’ electrolysis would help?
  • Threshold concepts guide the teacher when choosing what to highlight, what to emphasize in lessons. There should be no going back from the lightbulb moment. If that’s true, why do we need to constantly return to misconceptions where students rely on folk physics despite explicit refutation work with us?
  • It is worth making explicit to students that these are challenging (and often abstract) concepts, and so time to understand them is both normal and expected. In Physics we make this clear with quantum work but perhaps it should be a broader principle.
  • Teachers will do a lot of this already, but we need to be more deliberate in our practice, both for our students and for our own reflection. This is how we improve, and is particularly important for us as experts to put ourselves in the position of novices. This is part of what we refer to as PCK.
  • “Retrace the journey back to innocence…” a quote from Glynis Cousin in a 2006 paper (this one?) which is about better understanding where our students are coming from. I would use the word ‘ignorance’, but like ‘naive’ there are many value judgments associated with it!
  • It’s not properly learned unless students can still do it when they weren’t expecting to need to.

Singapore Bar-Model session by Ben Rogers (@benrogersedu), blogged at Reading for Learning.

  • Developing ideas from previous posts on his blog.
  • The bar-model is an algebraic way of thinking about a situation, without using algebra explicitly. This means it is compatible with better/quicker approaches, rather than being a way around them like the formula triangle.
  • Uses principles from CLT; less working memory is needed for the maths so more is available for the physics.
  • Suggests (emphasizes this is speculative) that visual rather than verbal information is a way to expand working memory. This is also an example of dual coding and presumably one of its strengths.
  • Compare approaches by using different methods with two halves of a class. Easiest way is to rank them using data, then ‘odd number positions’ use one approach to contrast with ‘even number positions’ for the other. Even if the value of the measurement used for the ranking is debatable, this should give two groups each with a good spread of ability/achievement.
  • Useful approach for accumulated-change and conservation questions; there could be difficulties with questions where the maths makes it look like a specific relationship, such as V = E/Q, as this reinforces a unit approach rather than a ratio one.
  • A Sankey diagram, although a pain to draw, effectively uses the bar method. The width of each arrow is the length of the bar, and they are conserved.
  • Some questions are harder than others and the links may not be obvious to students, even if they are to us. Be explicit about modelling new ‘types’ (and discussing similarity to established methods). This sounds like a use, deliberate or otherwise, of the GRR model from Fisher and Frey.
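The odd/even ranking split from the comparison bullet above can be sketched in a few lines; the pupil names and scores here are invented for illustration:

```python
def split_by_rank(students, scores):
    """Rank students by some prior measure, then alternate ranks between
    two groups so each has a similar spread of attainment."""
    ranked = sorted(students, key=lambda s: scores[s], reverse=True)
    group_a = ranked[0::2]  # odd positions: 1st, 3rd, 5th...
    group_b = ranked[1::2]  # even positions: 2nd, 4th, 6th...
    return group_a, group_b

# Invented example data
scores = {"Ava": 72, "Ben": 65, "Cal": 80, "Dia": 58, "Eli": 90, "Fay": 61}
a, b = split_by_rank(list(scores), scores)
```

Even if the ranking measure itself is debatable, interleaving positions like this gives each group a similar mix of higher and lower scorers, which is the point of the method.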

Memory session by Oliver Caviglioli (@olivercaviglioli)

  • Reconstructing meaning is how we build understanding. Although this process is by necessity individual, it can be more or less efficient.
  • The old idea of remembering seven things at once is looking shaky; four is a much better guideline. If one of those things or ‘elements’ is a group, however, it represents a larger number of things. Think of this as nested information, which is available if relevant.
  • We need to design our lessons and materials to reduce unproductive use of the limited capacity of the brain.
  • Two approaches are the Prototype (Rosch) and Sets (Aristotle). Suspicion that different disciplines lean more towards different ends of this spectrum. Type specimens in science are an interesting example. My standard example is of different Makaton signs for ‘bird’ and ‘duck’ and the confusion that follows. Links to discussion on Twitter recently with @chemDrK about how we need to encourage students to see the difference between descriptions and definitions (tags and categories) when, for example, talking about particles.
  • Facts can be arranged in different ways including random (disorganised), list, network (connections) and hierarchical. By providing at least some of this structure, from an expert POV, we save students time and effort so recall (and fluency) is much more efficient. Statistic of 20% vs 70% recall quoted. Need to find the source of this and look into creating a demonstration using science vocab for workshops.
  • The periodic table is organised data, and so the structure is meaningful as well as the elements themselves. Alphabetical order, or the infamous song, are much less useful.
  • _Learning as a Generative Activity_ (2015) is recommended, but expensive at ~£70.
  • Boundary conditions are a really important idea; not what works in education, but what works better, for which students, in which subjects, under X conditions. This should be a natural fit for science teachers who are (or should be) used to explaining the limitations of a particular model. This is where evidence from larger scale studies can inform teacher judgment about the ‘best’ approach in their setting and context.
  • Bottom-up and top-down approaches then become two ends of a spectrum, with the appropriate technique chosen to suit a particular situation and subject. To helpfully use the good features of a constructionist approach we must set clear boundaries and outcomes; my thought is that for a=F/m we give students the method and then ask them to collect and analyse data, which is very different to expecting them to discover Newton’s Laws unassisted. It might, of course, not feel different to them – they have the motivation of curiosity, which can be harnessed, but it would be irresponsible to give them free rein. From a climber’s perspective, we are spotting and belaying, not hoisting them up the cliff.

Missed Opportunities And My Jobs List

As you might expect, there were several sessions I would have loved to attend; in my fairly limited experience this is a problem with most conferences. In particular I was very disappointed not to have the chance to hear the SLOP talk from @rosalindphys, but the queue was out of the door. The presentation is already online but I haven’t read it yet, because I knew that if I did I’d never get my own debrief done. This applies to several other sessions too, but it was only sensible to aim for sessions which could affect my own practice, which these days is as a teacher-educator/supporter rather than a ‘real’ teacher.

After some tweeted comments, I’m reproducing my jobs list. This has already been extracted from my session notes and added to my diary for the coming weeks, but apparently it may be of interest. Whether it is or not, here is my customary appeal for feedback: please let me know what, if anything, was useful for you, and how it compares with your own take-away ideas from the sessions. And if I didn’t catch up with you during the day, hopefully that will happen another time.

  • Talk to Dom about CPhys course accreditation
  • use references list to audit blended learning prototype module
  • add KS3 circuits example showing intrinsic/germane/extraneous load to workshop
  • review SOLO approach and make notes on links to facts/structured facts part of CLT
  • check with Pritesh if subject associations have been (or could be) involved with booklet development
  • read Kristy’s piece for RSC about doing your first piece of ed research
  • check references for advice on coding conversations/feedback for MRes project
  • search literature for similar approach (difficulty points scores) for physics equation solving
  • share idea re reports: a gap in comments may itself be an implicit comment
  • check an alert is set with EEF for science-specific results
  • use Robin’s presentation links to review roles for a research-informed school – might be faster to use Niki’s Research Lead presentation
  • build retrieval practice exercise for a physics topic that is staged, and gives bonus points for recall of ‘earlier’ concepts
  • TILE livestream from Dundee Uni; sign-up form?
  • follow Damian Benny
  • share ionic movement prac with Niki
  • add Cousin, 2006 to reading list
  • write examples of Singapore bar-model approach for physics contexts – forces?
  • pre-order Understanding How We Learn
  • use Oliver’s links as a way to describe periodic table organisation – blog post?
  • find correct reference from Oliver’s talk, AGHE et al. 1969, about self-generated vs imposed schema changing recall percentages

You’ll have to check in with me in a month to see how many of these have actually been done…


#rEDRugby 1/2

Going to a conference isn’t good CPD unless you reflect on the new information and apply it to your own practice. (This isn’t an original thought, of course; @informed_edu probably put it best a while back.) So although I found the day in Rugby really interesting – and all due congratulations to @judehunton and the team for a great day – if I want to make it worthwhile I need to think about it a little more. Just as feedback should be more work for the student than the teacher, reflection should be more intense for the participant than the speaking was for the colleague leading a CPD workshop or talk.

photo of a notebook page from ResearchED Rugby

The notes I take during a talk are quite straightforward; I use a modified Cornell notes structure, adding key terms on the left before I leave to sum up, and tasks at the bottom I can tick off when completed. The bullet points for each session are from my notes, with italics marking out my thoughts and responses. Many of the speakers will be blogging or sharing their presentations, but I’ll update this in a week rather than waiting.

It’s not listed below, but one of the most valuable things for me about the day was talking to colleagues about their responses to the talks, how they planned to use the ideas and how I might get them involved in my projects. I was particularly touched by several colleagues, who I’ve ‘known’ through Twitter but not met before, who made a point of saying how they appreciated particular things I’ve done over the past few years. Always nice to be appreciated!

Cognitive Load session by Dom Shibli (@ShibliDom)

  • Emphasized that CLT (from John Sweller) is a really useful model but is disputed by some.
  • Load = intrinsic (which will vary depending on the student and their starting point) + germane (which builds capacity) + extraneous (distractions or ambiguities which we as experts know to ignore but students worry about)
  • Being concise with instructions reduces extraneous load so they can focus on what is intrinsic/germane. This might involve training them for routines early on.
  • Curiosity drives attention so ration it through the lesson!
  • Explicitly providing subject-specific structures to pupils means they organise knowledge into an effective schema. The process of making those links itself adds to the cognitive load, which is something to be aware of but not avoid.
  • This feels a bit SOLO to me; meaningful connections themselves are a form of knowledge, but one which is harder to test.

Curriculum Design session by Pritesh Raichura (@mr_raichura), blogged by him at Bunsen Blue.

  • Acknowledged that his setting (Michaela) get a lot of attention from media/twitter and tends to polarise debate.
  • Spending time as a team on building a shared curriculum means more efficient use of that time; this is supported by school routines eg shared detentions.
  • Starting with the big ideas, break down content to a very small scale and then sequence. Bear in mind the nature of each facet; procedural vs declarative, threshold concepts, cultural capital, exam spec. One of my thoughts was that this must include knowledge about the subject, such as the issues described by @angeladsaini in her book _Inferior_.
  • Sequencing is a challenge when the logical order from the big ideas is contradicted by the exam spec order, which is supported by resources from the exam boards.
  • Booklets used which are effectively chapters of a self-written textbook. Really interesting approach, I’d love to see how students use these (write-on? annotate?) and the sources of explanations, links to learned societies etc.
  • Feedback to students may consist simply of the correct answers. I disagree with this, because which wrong answer they choose may be diagnostic and sharing the process with them may be useful to help them recognise their own ‘wrong tracks’. Also consider @chemDrK‘s post on students giving the right answer by rote, not understanding.
  • Some really interesting ideas, but my concern is that this is only possible if the whole school follows a very clear line. This is much harder to ensure in an existing school than with a new approach built from scratch, so it may not be scalable.

Researcher/Teacher role session by Kristy Turner (@doc_kristy)

  • 0.6 Uni lecturer, 0.4 school teacher (plus freelance)
  • Teachers in school were slow to adopt evidence informed practice, so an attempt made to do some research looking at historical data (therefore no ethical issues)
  • Coding phrases from reports was a challenge. Codes were based on ideas from the A-Level Mindset book. I need to adapt this approach to analyse the reflective comments on workshops etc that will form the basis of my own MRes project.
  • Results showed that it was Physics teachers (along with Music and Russian), rather than science as a whole, who were the outliers in how often innate characteristics were praised.
  • Lots of the comments were vague, and this will itself inform report-writing. Many could be interpreted in different ways, and this is worth remembering for parents. My immediate thought is that some parents will be able to decode the comments much better than others (social issue?), and we as teachers may recognise that an absence of a comment may itself reflect a judgment eg if no comment about working hard, they may be lazy.
  • An ongoing study is looking at student answers to ten chemical equation Qs, scored for difficulty by teachers based on values of coefficients, number of elements etc, comparing them before and after summer break. Some evidence that older students do better (‘year 9 into 10’ vs ‘year 8 into 9’) even without explicit balancing equations work in that year – is this because of increasing maturity, drip-feeding chemical equations over the year or something else?
  • I need to look for an equivalent test (or write one) for physics equations, with the equations assessed for difficulty in the same way.
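As a starting point for that, here is a toy difficulty score in the spirit of the coefficients-and-elements idea; the weighting is entirely my invention, not taken from the study:

```python
def difficulty(coefficients, n_elements):
    """Toy difficulty score for a balancing question: larger coefficients
    and more elements make it harder. Weights are invented for illustration."""
    return sum(coefficients) + 2 * n_elements

# e.g. CH4 + 2 O2 -> CO2 + 2 H2O : coefficients (1, 2, 1, 2), elements C, H, O
score = difficulty([1, 2, 1, 2], 3)  # -> 12
```

A physics version might weight the number of rearrangement steps and unit conversions instead, but the principle of a teacher-agreed numeric score would carry over.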

Research-Informed Schools by Robin Macpherson (@robin_macp)

  • We need to start with a model of teacher competency which is reflective, not deficit-based. Research-informed practice is often time-effective, but the ‘informed’ matters because it is always adjusted/filtered by our own approach and setting. Professional judgment is key!
  • The gap between research and practice is where weird ideas get in, and these are what cause us problems. I remember comments, years back, that some knowledge about ed-research is a vaccine against BrainGym and similar.
  • Building in ideas from, for example, Dunlosky can be as simple as making sure there are bonus points on tests for questions relating to earlier topics. We’re making explicit that we appreciate and reward recall going back further than last week.
  • Not all ideas turn out to be useful. Differences in mindset seem to be real, but there’s growing evidence that these differences are slowly accumulated and not something we can change by displays or interventions.
  • A Research Lead will have many jobs to do, including but not limited to curation, distillation, signposting and distribution. (These words are my paraphrasing.) Making a school research-informed is a slow process, 5-10 years, not an instant fix. One link shared was TILE for good practice examples.
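The bonus-points idea from Robin’s session could be sketched as a simple scoring rule; the weights and the one-month cut-off are invented for illustration:

```python
def weighted_score(answers):
    """Score a test where recall of older material earns a bonus.

    answers: list of (correct, topic_age_in_weeks) tuples.
    Weights are invented for illustration, not from the session.
    """
    total = 0
    for correct, age_weeks in answers:
        if correct:
            total += 1
            if age_weeks > 4:   # older than last month: reward retention
                total += 1      # bonus point
    return total

score = weighted_score([(True, 1), (True, 8), (False, 12)])  # -> 3
```

The exact weighting matters less than making the reward for long-range recall visible to students.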

 

I’m flagging from lack of coffee and so will post the afternoon’s sessions tomorrow. Or maybe the day after!

 

Responding to “Secret Origins”

This post is a duplicate of the comment I’ve just left on a post at Vince Ulam’s blog; it’s here because otherwise the time I spent on formatting and adding hotlinks was wasted.

“These useful idiots, grateful for the imagined recognition and eager to seem important in the eyes of their peers, promote the aims and ideas of their recruiters across social media and via ticketed salons.”

It must be really nice to see yourself as immune to all this, too smart to fall for the conspiracy that everyone else has been duped by. Because, whether you intended it or not, that’s how much of the original post comes across. I think this is what put my back up, to be honest. I’ve attended two ResearchED events, one of which I spoke at. I’d like to think I earned that, rather than being recruited as a useful idiot. But then, in your viewpoint, it’s only natural I’d fall for it: I’m not as clever as you. The contrary argument might be that you’re resentful of not having the opportunity or platform for your views, but I’ve no idea if you’ve applied to present at ResearchED or anything similar. So how about we look at the facts, rather than the inferences and assigned motives you write about?

ResearchED in Context

From a local teachmeet up to national events, the idea of ‘grassroots’ activism in teaching is a powerful one. As bloggers, we both believe that practitioners can influence the ideas and work of others. And yes, I agree that appearing practitioner- or public-led, but actually being influenced by specific political parties or organisations, would be appealing to those organisations. It would lend legitimacy to very specific ideas. You only have to look at the funding of patient organisations by pharmaceutical companies, or VoteLeave and allied groups, to see the issues. But there is surely a sliding scale of influence here.

How we assess the independence of such a grassroots organisation could be done in several ways. Do we look at where the money comes from? Do we examine the people involved in organising or leading it? Do we look at the decisions they make, and how they are aligned with other groups? Do we look at who chooses to be involved, and who is encouraged/dissuaded, subtly or otherwise?

In reality we should do all of those. I think my issue with your post is that you seem to be putting ResearchEd in the same category as the New Schools Network among other groups, and (on Twitter) to be adding in the Parents and Teachers for Excellence Campaign too. I see them as very separate cases, and I’m much less hesitant about ResearchEd – partly because the focus is teacher practice and engagement, not campaigning. And you raise Teach First, which I have my own concerns about and am leaving to one side now as it’s not relevant.

The New Schools Network is (mostly) funded by government, and many have written about the rather tangled set of circumstances which led to the funding and positions expressed being so closely tied to a policy from one political party. I must admit, I find myself very dubious about anything that Dominic Cummings has had a hand in! Their advocacy and support for free schools, with so far limited evidence that they provide good value for money, frustrates me.

The PTE Campaign is slightly different. I’ve not spent time searching for funding information but remember from previous news items – this from Schools Week for example – that it lacks transparency, to say the least. I think the name is misleading and their claim to be about moving power away from ‘the elites in Westminster and Whitehall’ disingenuous.

And let’s not even start with Policy Exchange.

From where I sit, if you want to group ResearchED with other education organisations, a much better match would seem to be Northern Rocks. The focus is improving and sharing classroom pedagogy, rather than campaigning. They’re both run on a shoestring. Classroom teachers are keen on attending and praise what they get out of the sessions. I can’t find anything on your blog about Northern Rocks, but that could be simple geography. (The bitter part of me suggests it’s not the first time anything happening past Watford gets ignored…)

Back to ResearchED: Funding and Speakers

“We have to hand it to Tom Bennett for his truly amazing accomplishment of keeping his international ‘grassroots’ enterprise going for four years without producing any apparent profits.”

Maybe it’s me seeing something which isn’t there, but your post seems to imply that there must be some big funding secret that explains why ResearchED is still going. What do you think costs so much money? The speakers are volunteers, as are the conference helpers. I don’t know if Tom gets a salary, but considering how much time it must be taking it would seem reasonable for at least a few people to do so. The catering costs, including staffing, are covered by the ticket price. The venues I remember are schools, so that’s not expensive.

As you’ve raised on Twitter during our discussions, the question of transport for UK-based speakers to overseas venues is an interesting one. I know that when I presented at Oxford (the Maths/Science one), my employer covered my travel costs; I assume that was the same for all speakers, or they were self-funding. If you have other specific funding concerns, I’ve not seen you describe them; you can hardly blame me for focusing on this one if you’d rather make suggestive comments than ask proper questions. I would also like to know if speakers can access funding support and if so, how that is decided. I can’t find that information on the website, and I think it should be there. I disagree with lots of what you say – or I wouldn’t have written all this – but that loses legitimacy if I don’t say where we have common ground.

I was surprised to find out how many ResearchED conferences there had been; I was vaguely thinking of seven or eight, which is why I was surprised by your suggestion that David Didau had presented at least six times. I stand corrected, on both counts. Having looked at the site, I’m also surprised that there’s no clear record of all the events in one place. A bigger ask – and one I have addressed to one of the volunteers who I know relatively well – would be for a searchable spreadsheet of speaker info covering all the conferences.

That would be fascinating, wouldn’t it? It would let us see how many repeat speakers there are, and how concentrated the group is. My gut feeling is that most speakers, like me, have presented only once or twice. Researchers would probably have more to say. I’d love to see the gender balance, which subject specialisms are better represented, primary vs secondary numbers, the contrast between state and independent sector teachers, researcher vs teacher ratios…

I’m such a geek sometimes.

You tweeted a suggestion I should ignore my personal experience to focus on the points in your post. The thing is that my personal experience of – admittedly only two – ResearchED conferences is that any political discussion tends to happen over coffee and sandwiches, and there’s relatively little of that. Maybe there’s more at the ‘strategic’ sessions aimed at HTs and policy-makers, rather than the classroom and department methods that interest me. If there’s animosity, it’s more likely to be between practitioners and politicians, rather than along party lines. I suspect I have more in common, to be honest, with a teacher who votes Tory than a left-leaning MP without chalkface experience. It’s my personal experience that contradicts the suggestions in your post about ResearchED being part of a shadowy conspiracy to influence education policy debate.

To return to Ben Goldacre, featured in your post as a victim of the puppet-masters who wanted a good brand to hide their dastardly plans behind: his own words suggest that in the interests of improving the evidence-base of policy, he’s content to work with politicians. Many strong views have been expressed at ResearchED. With such a wide variety of speakers, with different political and pedagogical viewpoints, I’m sure you can find some presentations and quotes that politicians would jump on with glee. And I’m equally sure that there are plenty they ignore, politely or otherwise. But I don’t believe the speakers are pre-screened for a particular message – beyond “looking at evidence in some way is useful for better education.” To be honest, I’m in favour of that – aren’t you? If there’s other bias in speaker selection, it was too subtle for me to notice.

But then, I’m not as clever as you.

Variations on a Theme

It turns out that I’m really bad at following up conference presentations.

Back in early June, I offered a session on teachers engaging – or otherwise – with educational research. It all grew out of an argument I had on Twitter with @adchempages, who has since blocked me after I asked if the AP Chem scores he’s so proud of count as data. He believes, it seems, that you cannot ever collect any data from educational settings, and that he has never improved his classroom practice by using any form of educational research.

But during the discussions I got the chance to think through my arguments more clearly. There are now three related versions of my opinion, quite possibly contradictory, and I wanted to link to all three.

Version the first: Learning From Mistakes, blogged by me in January.

Streamlined version written for the BERA blog: Learning From Experience. I wrote this a while back but it wasn’t published by them until last week.

Presentation version embedded below (and available from http://tinyurl.com/ian-redmatsci if you’re interested).

I’d be interested in any and all comments, as ever. Please let me know if I’ve missed any particular comments from the time – this is the problem with being inefficient. (Or, to be honest, really busy.) The last two slides include all the links in my version of a proper references section.

Thoughts from the presentation

Slide 8: it’s ironic that science teachers, who know all about using models which are useful even though they are by necessity simplified, struggle with the idea that educational research uses large numbers of participants to see overall patterns. No, humans aren’t electrons – but we can still observe general trends using data.

Slide 13: it’s been pointed out to me that several of the organisations mentioned offer cheaper memberships/access. These are, however, mainly institutional memberships (eg £50/yr for the IOP) which raises all kinds of arguments about who pays and why.

Slide 14: a member of the audience argued with this point, saying that even if articles weren’t open-access any author would be happy to share electronic copies with interested teachers. I’m sure he was sincere, and probably right. But as I tried to explain, this assumes that (1) the teacher knows what to ask for, which means they’ll miss all kinds of interesting stuff they never heard about, and that (2) the author is happy to respond to potentially dozens of individual requests. Anyone other than the author or journal hosting or sharing a PDF is technically breaking the rules.

Slide 16: Ironically, the same week as I gave the presentation there was an article in SSR on electricity analogies which barely mentioned the rope model. Which was awkward as it’s one of the best around, explored and endorsed by the IOP among many others.

Slide 20: Building evidence-based approaches into textbooks isn’t a new idea (for example, I went to Andy’s great session on the philosophy behind the Activate KS3 scheme) but several tweeters and colleagues liked the possibility of explicit links being available for interested teachers.

Northern Rocks

I had to get up early, on a Saturday.

It rained.

And I missed my train, so it was a really long day.

So in all, I had a fantastic day in Leeds. The speakers were great. The organisation was excellent. The food was good, even though I hadn’t booked anything. The company was funny, enthusiastic and friendly. The site was welcoming, although distinctly damp. The WiFi was highly reliable.

I even got a pen.

badges

This is not going to be comprehensive, obviously. Every attendee will have been to a different conference, with different speakers, picking and mixing to suit themselves. As I did. So all I can do is give a flavour of the day, share links to my rough notes and write about how the day will change what happens for my pupils. In the end, as several speakers pointed out, this is the whole point of what we do.

(Comments in my notes and on here are paraphrases and summaries, in my words not theirs. Please let me know if you feel I have misrepresented the views expressed or points made during the sessions.)

Opening Panel

The speakers were interesting, and in many ways seemed to be in broad agreement.

  • Ofsted is a real problem, getting worse because it is being viewed as more and more political.
  • We need less politics in running schools and less interference in specifications.
  • Teachers work damn hard and we need to make sure it’s time well spent, on things that matter.

Differences became clearer when questions probed:

  • how we could ensure high standards without some form of central organisation – I found Dominic Cummings’ answers about a market-led approach seemed to miss the point, and his insistence that Gove etc had tried to move away from centralization unconvincing when we consider phonics as a fundamental part of teacher standards, and all authority for a school leading to the DfE. But maybe that’s just me.
  • what we should do about the difficulties with Ofsted; most felt that we still need accountability but that, perhaps, a pass/fail approach would be more constructive. Dot Lepkowska was one who agreed that we need to completely remove political access and involvement with Ofsted, to avoid perception that it is being used for political motives.

Click here for my rough notes.

 

David Weston aka @informed_edu on Teacher Development

Chair of the Teacher Development Trust (see also: National Teacher Enquiry Network, The Good CPD Guide)

One of the main things I took away from David’s talk is how ineffective most CPD is – and for reasons that we can only change if schools are prepared to adjust their approach. He gave the example of watching bad TV, learning/confirming that we should eat more healthily – but nothing changes. A longer-term approach is needed, fewer ‘bits’ of CPD on topics that have nothing to do with student progress. 

David’s slides / My rough notes

My action points:

  1. Every CPD session should be explicitly focused on the effect it will have on student outcomes. Reflect and ask!
  2. Use the idea of 3 colleagues at different career stages in the same CPD session. These are my ‘case colleagues’, and I should consider how each of them will take away different ideas; makes the concepts more ‘stract’ (my word, not David’s!)
  3. Spend more time on (teaching) diagnosis skills, rather than just interventions.
  4. Review characteristics of effective CPD and blog about how to build them into small group sessions about science teaching

 

Tim Taylor aka @imagineinquiry on the Mantle of the Expert

In many ways I wasn't the right audience for this session, as the techniques have been much more widely explored in primary. I like the idea of a pervasive imaginary world that students can step in and out of; as a parent I'm very familiar with this! (I've a very clear memory of my eldest, then aged 7, telling his 3-year-old brother earnestly: "Quick, we need to escape from the Chickens of Doom!") And the idea that humans are wired to respond well to narrative approaches resonates with me, partly from reading about the concept of us being Pan narrans, the story-telling ape, in The Science of Discworld series.

Students taking the role of experts who are commissioned to complete particular tasks, involving cross-curricular learning, is fascinating. In secondary it will inevitably be less engaging, as it can only take up a relatively small part of the curriculum unless the timetable and teachers can make it work. It is something I have used with Year 7 via the upd8 WIKID scheme, which can be great but has some very confusing sections. It's a step up from role play as it links imagination and skills development more closely.

Tim’s presentation / My rough notes

My main thoughts:

  • Limited use across the secondary timetable without major structural changes and enthusiasm from management.
  • It would be interesting to examine whether these ideas were deliberately used for WIKID.
  • Develop role play for guest lessons, making clear the need for the teacher to take a subordinate role to encourage students into a more assertive one.
  • Review/rewrite current roleplays using the immersive principles described – Teaching as Story Telling, recommended by Tim, would be interesting to read if money/time permit.

 

Dr Jo Pearson aka @jopearson3 on Research Considerations in School

I’ve done a little formal action research and I think most teachers have at some point asked themselves, “What will happen if I change this?” This was the only session in which I was asked to do something, looking at the questions that previous students had wanted to use on a Masters unit. The discussion of ethics was interesting, as Jo made the point that we should perhaps consider this kind of formalised, evidence-driven reflection as a normal and necessary part of our jobs (she still encouraged us to check the BERA Ethics guidelines though). I found myself strongly agreeing with the idea that failing to share what we learn is an ethical failure all of its own.

My rough notes

My action points:

  • Use a wider definition of data, e.g. pupil work decoded, recorded conversations
  • Try using Cogi app with classes during discussion and planning to assess understanding
  • To improve, research questions need to be much more specific: local rather than global. The teachers I work with need to be encouraged to look at much smaller aspects over a shorter timescale.
  • Buy the book if at all possible: Inquiring in the Classroom

 

Dr Phil Wood aka @geogphil on Lesson Study

This session was fascinating and is something I intend to spend more time on. Phil was very dismissive of the idea of judging a teacher, or a lesson, based on a brief observation and the cycle he described seems like a much more constructive approach. Basically, several colleagues plan together, predicting how different aspects will lead to outcomes for three ‘case students’. One delivers the planned lesson, while another observes the students, and afterwards they reflect together. Ideally this reflection involves student interviews and/or a second (tweaked) delivery to an equivalent class. And so the cycle continues.

[Slide 6 from Phil’s presentation]

I like that this is a much more collaborative approach, and Phil described how more and less experienced staff were all able to contribute. The pressure and judgement are removed and instead different approaches are trialled in a safe setting. “An expert teacher understands wider policy, and the micropolitics of the school, so they can subvert these contexts in the interest of learning.” (my wording)

Phil’s session slides / My rough notes

Action points:

  • Reading required: need to look into this topic and the varied formats of the collaborative planning/delivery/reflection cycle
  • EDIT: really interesting description of using this in science teaching on @headguruteacher’s blog.
  • Blog about the cycle in more detail, seeking comment on how it is used by classroom teachers (especially ASTs/HoDs?)
  • “Once you use a ticklist, you miss what isn’t on the list.” – how can I apply this to mark schemes and my teaching?
  • Put together timescale – perhaps using distance collaboration tools – for ways to use this cycle in coaching.

 

Final Session

Probably the less said the better, although the activities were… interesting… and the music was great. It was a really positive event and it was followed by coffee. Hoorah.

As I hope the points above make clear, the sessions I was able to attend (and there were three times as many I would have liked to see; hopefully I’ll catch up with some of them from the recordings) are the start, not the end. I suspect the ideas will feature in future posts, and hopefully the impact on students is something I’ll be able to see.

In all, Northern Rocks was a great day and I’m sure the other participants thought so too. Huge thanks to Debra and Emma, as well as the presenters and those behind the scenes. Blogs about the day are popping up everywhere, and with 500 attendees I have no intention of trying to link to them all here. Please do comment with any thoughts about these sessions, in particular if you’ve got resources or links to point me towards. Because I’m lazy. 🙂