Going to a conference isn’t good CPD unless you reflect on the new information and apply it to your own practice. (This isn’t an original thought, of course; @informed_edu probably put it best a while back.) So although I found the day in Rugby really interesting – and all due congratulations to @judehunton and the team for a great day – if I want to make it worthwhile I need to think about it a little more. Just as feedback should be more work for the student than the teacher, reflection should be more intense for the participant than speaking was for the colleague who led the CPD workshop or talk.
The notes I take during a talk are quite straightforward; I use a modified Cornell notes structure, adding key terms on the left before I leave so I can sum up, and tasks at the bottom that I can tick off when completed. The bullet points for each session are from my notes, with italics marking out my own thoughts and responses. Many of the speakers will be blogging or sharing their presentations, but I’ll update this post in a week rather than waiting.
It’s not listed below, but one of the most valuable things for me about the day was talking to colleagues about their responses to the talks, how they planned to use the ideas and how I might get them involved in my projects. I was particularly touched by several colleagues, who I’ve ‘known’ through Twitter but not met before, who made a point of saying how they appreciated particular things I’ve done over the past few years. Always nice to be appreciated!
Cognitive Load session by Dom Shibli (@ShibliDom)
- Emphasised that CLT (from John Sweller) is a really useful model, though it is disputed by some.
- Load = intrinsic (which will vary depending on the student and their starting point) + germane (which builds capacity) + extraneous (distractions or ambiguities which we as experts know to ignore but students worry about)
- Being concise with instructions reduces extraneous load so students can focus on what is intrinsic/germane. This might involve training them in routines early on.
- Curiosity drives attention so ration it through the lesson!
- Explicitly providing subject-specific structures to pupils means they organise knowledge into an effective schema. The process of making those links itself adds to the cognitive load, which is something to be aware of but not avoid.
- This feels a bit SOLO to me; meaningful connections themselves are a form of knowledge, but one which is harder to test.
Curriculum Design session by Pritesh Raichura (@Mr_Raichura), blogged by him at Bunsen Blue.
- Acknowledged that his setting (Michaela) gets a lot of attention from media/Twitter and tends to polarise debate.
- Spending time as a team on building a shared curriculum means more efficient use of that time; this is supported by school routines eg shared detentions.
- Starting with the big ideas, they break content down to a very small scale and then sequence it. Bear in mind the nature of each facet: procedural vs declarative, threshold concepts, cultural capital, exam spec. One of my thoughts was that this must include knowledge about the subject, such as the issues described by @angeladsaini in her book _Inferior_.
- Sequencing is a challenge when the logical order from the big ideas is contradicted by the exam spec order, which is supported by resources from the exam boards.
Colleagues hearing about big ideas vs fine level sequencing from @Mr_Raichura might find https://t.co/itIUSPcXVB from @sci_challenge interesting. #rEDRugby
— Ian (@teachingofsci) June 9, 2018
- Booklets used which are effectively chapters of a self-written textbook. Really interesting approach, I’d love to see how students use these (write-on? annotate?) and the sources of explanations, links to learned societies etc.
- Feedback to students may consist simply of the correct answers. I disagree with this, because which wrong answer they choose may be diagnostic, and sharing the process with them may be useful to help them recognise their own ‘wrong tracks’. Also consider @chemDrK’s post on students giving the right answer by rote, without understanding.
- Some really interesting ideas, but my concern is that this is only possible if the whole school follows a very clear line. This is much harder to ensure in an existing school than in one built from scratch, so it may not be scalable.
Researcher/Teacher role session by Kristy Turner (@doc_kristy)
- 0.6 Uni lecturer, 0.4 school teacher (plus freelance)
- Teachers in school were slow to adopt evidence-informed practice, so an attempt was made to do some research using historical data (therefore no ethical issues).
- Coding phrases from reports was a challenge. Codes were based on ideas from the A-Level Mindset book. I need to adapt this approach to analyse the reflective comments on workshops etc that will form the basis of my own MRes project.
- Results showed that Physics teachers specifically, rather than science teachers in general, were the outliers (along with Music and Russian) in how often innate characteristics were praised.
- Lots of the comments were vague, and this will itself inform report-writing. Many could be interpreted in different ways, which is worth remembering when writing for parents. My immediate thought is that some parents will be able to decode the comments much better than others (a social issue?), and we as teachers may recognise that the absence of a comment can itself reflect a judgment, eg no comment about working hard may imply the student is lazy.
- An ongoing study is looking at student answers to ten chemical equation Qs, scored for difficulty by teachers based on values of coefficients, number of elements etc, comparing them before and after summer break. Some evidence that older students do better (‘year 9 into 10’ vs ‘year 8 into 9’) even without explicit balancing equations work in that year – is this because of increasing maturity, drip-feeding chemical equations over the year or something else?
- I need to look for an equivalent test (or write one) for physics equations, with the equations assessed for difficulty in the same way.
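As a thought experiment for that follow-up task, here is a purely hypothetical sketch of how difficulty scoring along the lines mentioned in the talk (size of coefficients, number of elements, number of terms) could be automated. To be clear: the study’s actual scores were assigned by teachers, and every weight and criterion below is my own invention, not from the session.

```python
# Hypothetical difficulty score for a balanced-equation question.
# Criteria loosely follow those mentioned in the talk (coefficient
# values, number of elements); the weighting is entirely invented.
import re

def difficulty(equation: str) -> int:
    """Score an equation string like '2H2 + O2 -> 2H2O'."""
    # Split into terms on '+' and the reaction arrow.
    terms = re.split(r"\s*(?:\+|->)\s*", equation)
    # Leading stoichiometric coefficients, defaulting to 1.
    coeffs = [int(m.group(1)) if (m := re.match(r"(\d+)", t)) else 1
              for t in terms]
    # Distinct element symbols: capital letter + optional lowercase.
    elements = set(re.findall(r"[A-Z][a-z]?", equation))
    return max(coeffs) + len(elements) + len(terms)

print(difficulty("2H2 + O2 -> 2H2O"))            # prints 7
print(difficulty("Fe2O3 + 3CO -> 2Fe + 3CO2"))   # prints 10
```

The point is only that a crude heuristic like this ranks the iron-oxide reduction as harder than making water, which matches intuition; whether it tracks the teachers’ judgments in the actual study is an open question.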
Research-Informed Schools by Robin Macpherson (@robin_macp)
- We need to start with a model of teacher competency which is reflective, not deficit-based. Research-informed practice is often time-effective, but the ‘informed’ matters because it is always adjusted/filtered by our own approach and setting. Professional judgment is key!
Paraphrasing @robin_macp: as all contexts vary, asking your own questions might be better than looking for someone else’s answers. #rEDRugby
— Ian (@teachingofsci) June 9, 2018
- The gap between research and practice is where weird ideas get in, and these are what cause us problems. I remember comments, years back, that some knowledge of ed-research is a vaccine against BrainGym and similar.
- Building in ideas from, for example, Dunlosky can be as simple as making sure there are bonus points on tests for questions relating to earlier topics. We’re making explicit that we appreciate and reward recall going back further than last week.
- Not all ideas turn out to be useful. Differences in mindset seem to be real, but there’s growing evidence that these differences are slowly accumulated and not something we can change by displays or interventions.
My take on @robin_macp’s comments on growth mindset: it’s a measurable difference, that helps, but it can’t be provided. Cc pattern of exercise vs pills for blood pressure. #rEdRugby
— Ian (@teachingofsci) June 9, 2018
- A Research Lead will have many jobs to do, including but not limited to curation, distillation, signposting and distribution. (These words are my paraphrasing.) Making a school research-informed is a slow process, 5-10 years, not an instant fix. One link shared was TILE for good practice examples.
I’m flagging from lack of coffee and so will post the afternoon’s sessions tomorrow. Or maybe the day after!