#rEDRugby 2/2

Following up yesterday’s reflective post, here are my typed-up bullet points from the afternoon sessions. As before, my thanks to the organisers and presenters, and a promise that I’ll update these posts with links to the actual presentations in a week or so.

Do They Really Get It? session by Niki Kaiser (@chemDrK)

  • Session was a development of a post on Niki’s blog.
  • Students gave correct answers by imitation, not based on deep understanding, as shown by discussions of ions in a solution vs electrons in a wire; I wonder if the demo showing movement of coloured ions during ‘slow’ electrolysis would help?
  • Threshold concepts guide the teacher when choosing what to highlight and emphasize in lessons. There should be no going back from the lightbulb moment. If so, why do we need to constantly return to these misconceptions, where students rely on folk physics despite explicit refutation work with us?
  • It is worth making explicit to students that these are challenging (and often abstract) concepts, and so time to understand them is both normal and expected. In Physics we make this clear with quantum work but perhaps it should be a broader principle.
  • Teachers will do a lot of this already, but we need to be more deliberate in our practice, both for our students and for our own reflection. This is how we improve, and is particularly important for us as experts to put ourselves in the position of novices. This is part of what we refer to as PCK.
  • “Retrace the journey back to innocence…” a quote from Glynis Cousin in a 2006 paper (this one?) which is about better understanding where our students are coming from. I would use the word ‘ignorance’, but like ‘naive’ there are many value judgments associated with it!
  • It’s not properly learned unless students can still do it when they weren’t expecting to need to.

Singapore Bar-Model session by Ben Rogers (@benrogersedu), blogged at Reading for Learning.

  • Developing ideas from previous posts on his blog.
  • The bar-model is an algebraic way of thinking about a situation, without using algebra explicitly. This means it is compatible with better/quicker approaches, rather than being a way around them like the formula triangle.
  • Uses principles from CLT; less working memory is needed for the maths so more is available for the physics.
  • Suggests (emphasizing this is speculative) that visual rather than verbal information is a way to expand working memory. This is also an example of dual coding, and presumably one of its strengths.
  • Compare approaches by using different methods with two halves of a class. The easiest way is to rank students using data, then have the ‘odd number positions’ use one approach to contrast with the ‘even number positions’ using the other. Even if the value of the measurement used for the ranking is debatable, this should give two groups each with a good spread of ability/achievement.
  • Useful approach for accumulated-change and conservation questions; there could be difficulties with questions where the maths makes it look like a specific relationship, such as V = E/Q, as this reinforces a unit approach rather than a ratio one.
  • A Sankey diagram, although a pain to draw, effectively uses the bar method. The width of each arrow corresponds to the length of a bar, and total width is conserved.
  • Some questions are harder than others and the links may not be obvious to students, even if they are to us. Be explicit about modelling new ‘types’ (and discussing similarity to established methods). This sounds like a use, deliberate or otherwise, of the GRR model from Fisher and Frey.
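The ranked odd/even split mentioned above is simple enough to sketch in code; the student names and scores here are invented for illustration, and the function is my own sketch rather than anything from the session.

```python
# Hypothetical sketch of a ranked odd/even split for comparing two
# teaching approaches. Names, scores and the function are all invented.

def split_by_rank(students, scores):
    """Rank students by a prior measure, then alternate group assignment
    so both halves get a similar spread of attainment."""
    ranked = sorted(students, key=lambda s: scores[s], reverse=True)
    group_a = ranked[0::2]  # odd-numbered positions (1st, 3rd, 5th, ...)
    group_b = ranked[1::2]  # even-numbered positions (2nd, 4th, 6th, ...)
    return group_a, group_b

scores = {"Asha": 72, "Ben": 65, "Cal": 80, "Dee": 58, "Eli": 91, "Fay": 49}
a, b = split_by_rank(list(scores), scores)
print(a)  # ['Eli', 'Asha', 'Dee']
print(b)  # ['Cal', 'Ben', 'Fay']
```

Because the groups alternate down the ranking, each ends up with roughly the same mix of higher and lower scores, which is the point of ranking first rather than splitting randomly.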

Memory session by Oliver Caviglioli (@olivercaviglioli)

  • Reconstructing meaning is how we build understanding. Although this process is by necessity individual, it can be more or less efficient.
  • The old idea of remembering seven things at once is looking shaky; four is a much better guideline. If one of those things or ‘elements’ is a group, however, it represents a larger number of things. Think of this as nested information, which is available if relevant.
  • We need to design our lessons and materials to reduce unproductive use of the limited capacity of the brain.
  • Two approaches are the Prototype (Rosch) and Sets (Aristotle). Suspicion that different disciplines lean more towards different ends of this spectrum. Type specimens in science are an interesting example. My standard example is of different Makaton signs for ‘bird’ and ‘duck’ and the confusion that follows. Links to discussion on twitter recently with @chemDrK about how we need to encourage students to see the difference between descriptions and definitions (tags and categories) when, for example, talking about particles.
  • Facts can be arranged in different ways including random (disorganised), list, network (connections) and hierarchical. By providing at least some of this structure, from an expert POV, we save students time and effort so recall (and fluency) is much more efficient. Statistic of 20% vs 70% recall quoted. Need to find the source of this and look into creating a demonstration using science vocab for workshops.
  • The periodic table is organised data, and so the structure is meaningful as well as the elements themselves. Alphabetical order, or the infamous song, are much less useful.
  • Learning as a Generative Activity, 2015 is recommended but expensive at ~£70.
  • Boundary conditions are a really important idea; not what works in education, but what works better, for which students, in which subjects, under X conditions. This should be a natural fit for science teachers who are (or should be) used to explaining the limitations of a particular model. This is where evidence from larger scale studies can inform teacher judgment about the ‘best’ approach in their setting and context.
  • Bottom-up and top-down approaches then become two ends of a spectrum, with the appropriate technique chosen to suit a particular situation and subject. To helpfully use the good features of a constructionist approach we must set clear boundaries and outcomes; my thought is that for a=F/m we give students the method and then ask them to collect and analyse data, which is very different to expecting them to discover Newton’s Laws unassisted. It might, of course, not feel different to them – they have the motivation of curiosity, which can be harnessed, but it would be irresponsible to give them free rein. From a climber’s perspective, we are spotting and belaying, not hoisting them up the cliff.

Missed Opportunities And My Jobs List

As you might expect, there were several sessions I would have loved to attend. In my fairly limited experience this is a problem with most conferences. In particular I was very disappointed not to have the chance to hear the SLOP talk from @rosalindphys, but the queue was out of the door. The presentation is already online but I haven’t read it yet, because I knew that if I did I’d never get my own debrief done. This applies to several other sessions too, but it was only sensible to aim for sessions which could affect my own practice, which these days is as a teacher-educator/supporter rather than a ‘real’ teacher.

After some tweeted comments, I’m reproducing my jobs list. This has already been extracted from my session notes and added to my diary for the coming weeks, but apparently it may be of interest. If the list itself isn’t, here instead is my customary appeal for feedback: please let me know what, if anything, of this was useful for you, and how it compares with your own take-away ideas from the sessions. And if I didn’t catch up with you during the day, hopefully that will happen another time.

  • talk to Dom about CPhys course accreditation
  • use references list to audit blended learning prototype module
  • add KS3 circuits example showing intrinsic/germane/extraneous load to workshop
  • review SOLO approach and make notes on links to facts/structured facts part of CLT
  • check with Pritesh if subject associations have been (or could be) involved with booklet development
  • read Kristy’s piece for RSC about doing your first piece of ed research
  • check references for advice on coding conversations/feedback for MRes project
  • search literature for similar approach (difficulty points scores) for physics equation solving
  • share idea re reports: a gap in comments may itself be an implicit comment
  • check an alert is set with EEF for science-specific results
  • use Robin’s presentation links to review roles for a research-informed school – might be faster to use Niki’s Research Lead presentation
  • build retrieval practice exercise for a physics topic that is staged, and gives bonus points for recall of ‘earlier’ concepts
  • TILE livestream from Dundee Uni; sign-up form?
  • follow Damian Benney
  • share ionic movement prac with Niki
  • add Cousin, 2006 to reading list
  • write examples of Singapore bar-model approach for physics contexts – forces?
  • pre-order Understanding How We Learn
  • use Oliver’s links as a way to describe periodic table organisation – blog post?
  • find correct reference from Oliver’s talk, AGHE et al. 1969 about self-generated vs imposed schema changing recall percentages

You’ll have to check in with me in a month to see how many of these have actually been done…

#rEDRugby 1/2

Going to a conference isn’t good CPD unless you reflect on the new information and apply it to your own practice. (This isn’t an original thought, of course; @informed_edu probably put it best a while back.) So although I found the day in Rugby really interesting – and all due congratulations to @judehunton and the team for a great day – if I want to make it worthwhile I need to think about it a little more. Just as feedback should be more work for the student than for the teacher, reflection should be more intense for the participant than the session was for the colleague leading a CPD workshop or talk.

photo of a notebook page from ResearchED Rugby

The notes I take during a talk are quite straightforward; I use a modified Cornell notes structure, adding key terms on the left to sum up before I leave, and tasks at the bottom which I can tick off when completed. The bullet points for each session are from my notes, with italics marking out my thoughts and responses. Many of the speakers will be blogging or sharing their presentations, but I’ll update this in a week rather than waiting.

It’s not listed below, but one of the most valuable things for me about the day was talking to colleagues about their responses to the talks, how they planned to use the ideas and how I might get them involved in my projects. I was particularly touched by several colleagues, who I’ve ‘known’ through Twitter but not met before, who made a point of saying how they appreciated particular things I’ve done over the past few years. Always nice to be appreciated!

Cognitive Load session by Dom Shibli (@ShibliDom)

  • Emphasized that CLT (from John Sweller) is a really useful model but is disputed by some.
  • Load = intrinsic (which will vary depending on the student and their starting point) + germane (which builds capacity) + extraneous (distractions or ambiguities which we as experts know to ignore but students worry about)
  • Being concise with instructions reduces extraneous load so they can focus on what is intrinsic/germane. This might involve training them for routines early on.
  • Curiosity drives attention so ration it through the lesson!
  • Explicitly providing subject-specific structures to pupils means they organise knowledge into an effective schema. The process of making those links itself adds to the cognitive load, which is something to be aware of but not avoid.
  • This feels a bit SOLO to me; meaningful connections themselves are a form of knowledge, but one which is harder to test.

Curriculum Design session by Pritesh Raichura (@mr_raichura), blogged by him at Bunsen Blue.

  • Acknowledged that his setting (Michaela) gets a lot of attention from media/twitter and tends to polarise debate.
  • Spending time as a team on building a shared curriculum means more efficient use of that time; this is supported by school routines eg shared detentions.
  • Starting with the big ideas, break down content to a very small scale and then sequence. Bear in mind the nature of each facet; procedural vs declarative, threshold concepts, cultural capital, exam spec. One of my thoughts was that this must include knowledge about the subject, such as the issues described by @angeladsaini in her book _Inferior_.
  • Sequencing is a challenge when the logical order from the big ideas is contradicted by the exam spec order, which is supported by resources from the exam boards.
  • Booklets used which are effectively chapters of a self-written textbook. Really interesting approach, I’d love to see how students use these (write-on? annotate?) and the sources of explanations, links to learned societies etc.
  • Feedback to students may consist simply of the correct answers. I disagree with this, because which wrong answer they choose may be diagnostic and sharing the process with them may be useful to help them recognise their own ‘wrong tracks’. Also consider @chemDrK‘s post on students giving the right answer by rote, not understanding.
  • Some really interesting ideas, but my concern is that this is only possible if the whole school follows a very clear line. This is much harder to ensure in existing schools than with a new approach from scratch. So it may not be scalable.

Researcher/Teacher Role session by Kristy Turner (@doc_kristy)

  • 0.6 Uni lecturer, 0.4 school teacher (plus freelance)
  • Teachers in school were slow to adopt evidence informed practice, so an attempt made to do some research looking at historical data (therefore no ethical issues)
  • Coding phrases from reports was a challenge. Codes were based on ideas from the A-Level Mindset book. I need to adapt this approach to analyse the reflective comments on workshops etc that will form the basis of my own MRes project.
  • Results showed that Physics teachers specifically, rather than science teachers in general, were the outlier (along with Music and Russian) in how often innate characteristics were praised.
  • Lots of the comments were vague, and this will itself inform report-writing. Many could be interpreted in different ways, and this is worth remembering for parents. My immediate thought is that some parents will be able to decode the comments much better than others (social issue?), and we as teachers may recognise that an absence of a comment may itself reflect a judgment eg if no comment about working hard, they may be lazy.
  • An ongoing study is looking at student answers to ten chemical equation Qs, scored for difficulty by teachers based on values of coefficients, number of elements etc, comparing them before and after summer break. Some evidence that older students do better (‘year 9 into 10’ vs ‘year 8 into 9’) even without explicit balancing equations work in that year – is this because of increasing maturity, drip-feeding chemical equations over the year or something else?
  • I need to look for an equivalent test (or write one) for physics equations, with the equations assessed for difficulty in the same way.
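A physics version of that difficulty scoring might look something like the sketch below; the criteria and weights are entirely my own placeholder guesses for what an equivalent test could score, not Kristy’s actual rubric.

```python
# Invented sketch of a difficulty score for physics equation questions,
# loosely mirroring the chemistry idea of teacher-assigned difficulty
# points. The criteria and weights are arbitrary placeholders.

def difficulty_score(n_quantities, needs_rearranging, awkward_numbers):
    """Crude difficulty points for a 'solve the equation' question."""
    score = n_quantities                      # one point per quantity involved
    score += 2 if needs_rearranging else 0    # rearrangement is the big jump
    score += 1 if awkward_numbers else 0      # non-integer / standard-form values
    return score

# v = f * lambda, given f and lambda, with nice numbers:
print(difficulty_score(3, needs_rearranging=False, awkward_numbers=False))  # 3
# same equation, solving for f with standard-form values:
print(difficulty_score(3, needs_rearranging=True, awkward_numbers=True))    # 6
```

The useful part would be agreeing the weights with other teachers, as in the chemistry study, so the same question bank can be compared before and after the summer break.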

Research-Informed Schools by Robin Macpherson (@robin_macp)

  • We need to start with a model of teacher competency which is reflective, not deficit-based. Research-informed practice is often time-effective, but the ‘informed’ matters because it is always adjusted/filtered by our own approach and setting. Professional judgment is key!
  • The gap between research and practice is where weird ideas get in, and these are what cause us problems. I remember comments, years back, that some knowledge about ed-research is a vaccine against BrainGym and similar.
  • Building in ideas from, for example, Dunlosky can be as simple as making sure there are bonus points on tests for questions relating to earlier topics. We’re making explicit that we appreciate and reward recall going back further than last week.
  • Not all ideas turn out to be useful. Differences in mindset seem to be real, but there’s growing evidence that these differences are slowly accumulated and not something we can change by displays or interventions.
  • A Research Lead will have many jobs to do, including but not limited to curation, distillation, signposting and distribution. (These words are my paraphrasing.) Making a school research-informed is a slow process, 5-10 years, not an instant fix. One link shared was TILE for good practice examples.

I’m flagging with lack of coffee and so will post the afternoon’s sessions tomorrow. Or maybe the day after!

Responding to “Secret Origins”

This post is a duplicate of the comment I’ve just left on a post at Vince Ulam’s blog; it’s here because otherwise the time I spent on formatting and adding hotlinks was wasted.

“These useful idiots, grateful for the imagined recognition and eager to seem important in the eyes of their peers, promote the aims and ideas of their recruiters across social media and via ticketed salons.”

It must be really nice to see yourself as immune to all this, too smart to fall for the conspiracy that everyone else has been duped by. Because, whether you intended it or not, that’s how much of the original post comes across. I think this is what put my back up, to be honest. I’ve attended two ResearchED events, one of which I spoke at. I’d like to think I earned that, rather than being recruited as a useful idiot. But then, in your viewpoint, it’s only natural I’d fall for it: I’m not as clever as you. The contrary argument might be that you’re resentful of not having the opportunity or platform for your views, but I’ve no idea if you’ve applied to present at ResearchED or anything similar. So how about we look at the facts, rather than the inferences and assigned motives you write about?

ResearchED in Context

From a local teachmeet up to national events, the idea of ‘grassroots’ activism in teaching is a powerful one. As bloggers, we both believe that practitioners can influence the ideas and work of others. And yes, I agree that appearing practitioner- or public-led, but actually being influenced by specific political parties or organisations, would be appealing to those organisations. It would lend legitimacy to very specific ideas. You only have to look at the funding of patient organisations by pharmaceutical companies, or VoteLeave and allied groups, to see the issues. But there is surely a sliding scale of influence here.

How we assess the independence of such a grassroots organisation could be done in several ways. Do we look at where the money comes from? Do we examine the people involved in organising or leading it? Do we look at the decisions they make, and how they are aligned with other groups? Do we look at who chooses to be involved, and who is encouraged/dissuaded, subtly or otherwise?

In reality we should do all of those. I think my issue with your post is that you seem to be putting ResearchED in the same category as the New Schools Network among other groups, and (on Twitter) to be adding in the Parents and Teachers for Excellence Campaign too. I see them as very separate cases, and I’m much less hesitant about ResearchED – partly because the focus is teacher practice and engagement, not campaigning. And you raise Teach First, which I have my own concerns about and am leaving to one side now as it’s not relevant.

The New Schools Network is (mostly) funded by government, and many have written about the rather tangled set of circumstances which led to the funding and positions expressed being so closely tied to a policy from one political party. I must admit, I find myself very dubious about anything that Dominic Cummings has had a hand in! Their advocacy and support for free schools, with so far limited evidence that they provide good value for money, frustrates me.

The PTE Campaign is slightly different. I’ve not spent time on searching for funding information but remember from previous news items – this from Schools Week for example – that it lacks transparency, to say the least. I think the name is misleading and their claim to be about moving power away from ‘the elites in Westminster and Whitehall’ to be disingenuous.

And let’s not even start with Policy Exchange.

From where I sit, if you want to group ResearchED with other education organisations, a much better match would seem to be Northern Rocks. The focus is improving and sharing classroom pedagogy, rather than campaigning. They’re both run on a shoestring. Classroom teachers are keen on attending and praise what they get out of the sessions. I can’t find anything on your blog about Northern Rocks, but that could be simple geography. (The bitter part of me suggests it’s not the first time anything happening past Watford gets ignored…)

Back to ResearchED: Funding and Speakers

“We have to hand it to Tom Bennett for his truly amazing accomplishment of keeping his international ‘grassroots’ enterprise going for four years without producing any apparent profits.”

Maybe it’s me seeing something which isn’t there, but your post seems to imply that there must be some big funding secret that explains why ResearchED is still going. What do you think costs so much money? The speakers are volunteers, as are the conference helpers. I don’t know if Tom gets a salary, but considering how much time it must be taking it would seem reasonable for at least a few people to do so. The catering costs, including staffing, are covered by the ticket price. The venues I remember are schools, so that’s not expensive.

As you’ve raised on Twitter during our discussions, the question of transport for UK-based speakers to overseas venues is an interesting one. I know that when I presented at Oxford (the Maths/Science one), my employer covered my travel costs; I assume that was the same for all speakers, or they were self-funding. If you have other specific funding concerns, I’ve not seen you describe them; you can hardly blame me for focusing on this one if you’d rather make suggestive comments than ask proper questions. I would also like to know if speakers can access funding support and if so, how that is decided. I can’t find that information on the website, and I think it should be there. I disagree with lots of what you say – or I wouldn’t have written all this – but that loses legitimacy if I don’t say where we have common ground.

I was surprised to find out how many ResearchED conferences there had been; I was vaguely thinking of seven or eight, which is why I was surprised by your suggestion that David Didau had presented at least six times. I stand corrected, on both counts. Having looked at the site, I’m also surprised that there’s no clear record of all the events in one place. A bigger ask – and one I have addressed to one of the volunteers who I know relatively well – would be for a searchable spreadsheet of speaker info covering all the conferences.

That would be fascinating, wouldn’t it? It would let us see how many repeat speakers there are, and how concentrated the group is. My gut feeling is that most speakers, like me, have presented only once or twice. Researchers would probably have more to say. I’d love to see the gender balance, which subject specialisms are better represented, primary vs secondary numbers, the contrast between state and independent sector teachers, researcher vs teacher ratios…

I’m such a geek sometimes.

You tweeted a suggestion I should ignore my personal experience to focus on the points in your post. The thing is that my personal experience of – admittedly only two – ResearchED conferences is that any political discussion tends to happen over coffee and sandwiches, and there’s relatively little of that. Maybe there’s more at the ‘strategic’ sessions aimed at HTs and policy-makers, rather than the classroom and department methods that interest me. If there’s animosity, it’s more likely to be between practitioners and politicians, rather than along party lines. I suspect I have more in common, to be honest, with a teacher who votes Tory than a left-leaning MP without chalkface experience. It’s my personal experience that contradicts the suggestions in your post about ResearchED being part of a shadowy conspiracy to influence education policy debate.

To return to Ben Goldacre, featured in your post as a victim of the puppet-masters who wanted a good brand to hide their dastardly plans behind: his own words suggest that in the interests of improving the evidence-base of policy, he’s content to work with politicians. Many strong views have been expressed at ResearchED. With such a wide variety of speakers, with different political and pedagogical viewpoints, I’m sure you can find some presentations and quotes that politicians would jump on with glee. And I’m equally sure that there are plenty they ignore, politely or otherwise. But I don’t believe the speakers are pre-screened for a particular message – beyond “looking at evidence in some way is useful for better education.” To be honest, I’m in favour of that – aren’t you? If there’s other bias in speaker selection, it was too subtle for me to notice.

But then, I’m not as clever as you.

Variations on a Theme

It turns out that I’m really bad at following up conference presentations.

Back in early June, I offered a session on teachers engaging – or otherwise – with educational research. It all grew out of an argument I had on Twitter with @adchempages, who has since blocked me after I asked if the AP Chem scores he’s so proud of count as data. He believes, it seems, that you cannot ever collect any data from educational settings, and that he has never improved his classroom practice by using any form of educational research.

But during the discussions I got the chance to think through my arguments more clearly. There are now three related versions of my opinion, quite possibly contradictory, and I wanted to link to all three.

Version the first: Learning From Mistakes, blogged by me in January.

Streamlined version written for the BERA blog: Learning From Experience. I wrote this a while back but it wasn’t published by them until last week.

Presentation version embedded below (and available from http://tinyurl.com/ian-redmatsci if you’re interested).

I’d be interested in any and all comments, as ever. Please let me know if I’ve missed any particular comments from the time – this is the problem with being inefficient. (Or, to be honest, really busy.) The last two slides include all the links in my version of a proper references section.

Thoughts from the presentation

Slide 8: it’s ironic that science teachers, who know all about using models which are useful even though they are by necessity simplified, struggle with the idea that educational research uses large numbers of participants to see overall patterns. No, humans aren’t electrons – but we can still observe general trends using data.

Slide 13: it’s been pointed out to me that several of the organisations mentioned offer cheaper memberships/access. These are, however, mainly institutional memberships (eg £50/yr for the IOP) which raises all kinds of arguments about who pays and why.

Slide 14: a member of the audience argued with this point, saying that even if articles weren’t open-access any author would be happy to share electronic copies with interested teachers. I’m sure he was sincere, and probably right. But as I tried to explain, this assumes that (1) the teacher knows what to ask for, which means they’ll miss all kinds of interesting stuff they never heard about, and that (2) the author is happy to respond to potentially dozens of individual requests. Anyone other than the author or journal hosting or sharing a PDF is technically breaking the rules.

Slide 16: Ironically, the same week as I gave the presentation there was an article in SSR on electricity analogies which barely mentioned the rope model. Which was awkward as it’s one of the best around, explored and endorsed by the IOP among many others.

Slide 20: Building evidence-based approaches into textbooks isn’t a new idea (for example, I went to Andy’s great session on the philosophy behind the Activate KS3 scheme) but several tweeters and colleagues liked the possibility of explicit links being available for interested teachers.