Ripples & Reflections

"Learning is about living, and as such is lifelong." Elkjaer.



Bold Moves for Schools

This is a quick-and-dirty review of a book that ticks all the boxes for a curriculum nerd like me: Bold Moves for Schools, by Heidi Hayes Jacobs & Marie Alcock, from the ASCD (2017, 207 pages).

It’s a practical and comprehensive, yet concise and quotable handbook of where to take curriculum, learning and leadership for modern learners. Educators in international schools will see many familiar themes emerge, from student agency and creativity in the curriculum to effective assessment, learning spaces and teacher development. There is much here that can accelerate a well-implemented IB curriculum (or standards-based learning model), and this book will sing to coordinators as it does to me.

“Innovation requires courage coupled with a realistic sensibility to create new possibilities versus “edu-fantasies”. Moving boldly is not moving impulsively or for the sake of change. Moving boldly involves breaking barriers that need breaking.”

As a “pragmatic idealist” I like how the book connects a future-focused, genuinely student-centred education to the best of what we’re already doing. It avoids falling into the trap of trashing the traditional, instead framing bold moves through the antiquated (what do we cut?), the classical (what do we keep?) and the contemporary (what do we create?). Jacobs & Alcock insist throughout the book that these bold moves are mindful, that we are not “throwing the baby out with the bathwater” and that “meaningful curriculum composition versus meaningless imposition” is the goal.

How can we build a genuinely exciting contemporary educational experience that keeps the joy in the learning, the future in mind and the students in the driving seat? Through a systemic approach that focuses on what works and what could be: one which empowers teachers as self-directed professional learners and curriculum architects. For anyone trying to effect change in an existing (long-established) system, this respectful, well-reasoned handbook is worth a look.

“What is most critical is that the outcome reflect quality.”

I hope that much of what is in this book is not new to most curriculum leaders – particularly in the IB context – but it is great to have a volume that pulls it together in one place, with practical resources. This would make a great book study (guide here) for curriculum leaders and teachers. You will find interesting surprises, resources and provocations littered through the text, worthy of further discussion.

You may even make some bold commitments as a result…

Quick follow-up: I was at a Bold Moves Bootcamp with Marie Alcock recently, and it was great. There is a post about one of my outcomes (a DOK4 filter for transfer) here.

…………o0O0o…………

Check it out

Without being too spoilerific, here are some useful links and resources from the book:

 




Webb’s DOK4 & Transfer

Last weekend I took part in a fabulous Bold Moves Curriculum Mapping Bootcamp, led by Dr. Marie Alcock at ISKL. I was there to think about next steps for curriculum planning at CA, and it was a great opportunity to pick the brains of a true expert (and get lots done). I like the bootcamp model for PD: short, focused and with the opportunity to take immediate action with great feedback from colleagues in similar positions.

During one of the discussions about high-quality assessment, Marie dug into Webb’s Depth of Knowledge (DOK) framework. It’s not a “wheel” of command terms as is often (and incorrectly) presented, but a way of framing how deeply students need to know and use information, skills and concepts. Similarly, DOK is not the same as Bloom’s Taxonomy, and it is not a pyramid or a hierarchy of knowledge that “peaks” at DOK4. DOK4 can be accessed from any of the other three levels, and effectively sits in parallel with them.

For a decent explainer of how DOK levels work, see this by Erik Francis for ASCD Edge – I used his DOK descriptors in the teacher plansheet tool.

In practical terms, as explained by Marie, students should be able to access DOK4 from any one of the other DOK levels. This is important, as DOK4 essentially can act as a filter for transfer.

How else can the student use the knowledge, skills and content at this level? 

This means that in curriculum and task design and differentiation, teachers can set up situations for all students to pull their learning (even if only at a recall/DOK1 level) through to DOK4 by applying it in a new context – as long as it is the same skill/target. For example, this might mean taking a scientific skill and applying it to a new experiment, or applying a writing technique to a new genre. This is knowledge augmentation. MYP teachers will see the immediate connections here to higher-order objectives in the criteria.

 

Teacher Plansheet: A Practical Use

Transfer is a notoriously difficult skill to teach, even though it is included in the ATL framework, and so I sketched up this planning tool (pdf) in the hope that it can visualise how DOK4 can be used as a filter to make transfer explicit. Follow the arrows as you think about putting a target standard or learning outcome to work. What level (DOK1-2-3) is expected of the student? How else (DOK4) could it be used? For some excellent, practical resources on applying DOK in the various disciplines, check out Dr. Karin Hess’s Cognitive Rigor and DOK rubrics and resources.

……….o0O0o……….

Transferring the Transfer: Thinking Collaboratively

How else might this tool be put to use? Here are some quick thoughts on how this might work with the collaboration of the relevant experts or coaches in the school.

  • Technology Integration: using the DOK4 filter as an opportunity to amplify and transform (RAT model) the learning task, while still meeting objectives.
  • Service Learning: In moving from “doing service” to service learning, could this be used to help frame students’ focus on planning, or post-service reflection? As students learn about issues of significance, how can they put it to work through transfer to meaningful action? As they reflect on their learning, can they connect new and existing disciplinary knowledge?
  • Interdisciplinary Learning: How can students take their learning and use it meaningfully in a context that requires transfer between disciplines?

……….o0O0o……….

 



“This will revolutionize education.”

Being in a tech-rich school, we have access to a lot of different platforms, tools and ideas. We have a good group of educational technology ‘experts’ in the school and a pretty strong focus on putting the learning first when choosing and using educational technologies. Nothing is blocked, and teachers are given opportunities to learn more about (and try) different platforms, ideas and strategies using technology. Sometimes the best tech is low-tech; other times systems like Hapara help us to differentiate more subtly, give feedback more readily and think more carefully about task design, scaffolding and criteria.

Derek Muller (the awesome @veritasium) released this video, and I was reminded of how poor some #EdTech implementation can be. The video gives a brief history of #EdTech hyperbole, from the ‘moving picture’ to the laser-disc, each step along the way seeking to automate the art (and science) of teaching. When the focus is on how teachers can be replaced by tech, or on simplistic educational inquiry (is X tech better than Y tech to do Z simple transmission process?), then the resources and energy spent on the technological innovation are wasted.

“What limits learning is what happens inside a student’s head. […] What experiences promote the kind of thinking that is required for learning?” 

It has become a cliche in #EdTech now to say “it’s about the learning, not the tech,” and that’s a great thing. The mantra sticks, and hopefully it forces us into careful thought about how we choose and use #EdTech tools. Learning is social, critical and personal; it requires the guidance of an expert, caring teacher and it needs inspiration, motivation and perseverance.

If we think about it this way then, as Derek says, we can evolve education, if not revolutionize it.

……….o0O0o……….

Now if only the companies that keep sending advertisements to my school and i-Biology inboxes for weak platforms (and worse PD) would watch this…



Growth Mindsets in Differentiation & Feedback

Nothing suggests ‘loner’ quite like a table for one and a book with that cover blurb. 😉 #GrowthMindset

After reading/hearing so much about Carol Dweck’s Mindset over the last couple of years, I was finally able to read the book on the train from London to Bath. I’d become so familiar with the ideas that it felt like 200+ pages of déjà vu, although the main messages are perhaps worth reinforcing.

It’s an easy read, in a style similar to Pink, Goleman, Gladwell and co., though I did find myself skimming over yet another American sports example (Woods, Wooden, Jordan, Wie, Yankees, McEnroe). There were some interesting sections on leadership, parenting and relationships, though I was really looking for more practical advice on how to build growth mindsets in my students.

Some key messages for parents and educators

  • In a fixed mindset, every outcome is seen as a personal success or failure, a (permanent) label of a person’s worth.
  • Fixed mindsets value ability over effort; when effort is put in, it is to affirm one’s status at the top. These students might be seen to ‘learn’ a lot as they perform highly in tests and assessments, but this may be due only to their achievement affirming the fixed mindset.
  • Fixed mindsets see difficulty as a weakness or threat, and so may not put in the effort in case they fail.
  • Growth mindsets embrace the challenge of difficulty and see the value in learning as a journey.
  • Growth mindsets demonstrate resilience in failure and use difficulties to set workable plans for improvement.
  • Growth mindset leaders and teachers embrace their own personal learning and seek to develop learning communities: it is OK to not know… yet.
  • Growth mindset leaders take time to listen, learn and evaluate fairly. They surround themselves with knowledgeable inquirers and weed out the fixed-mindset culture of fear and/or affirming status. They might be lower-key than the high-powered fixed-mindset hero-leaders, but they build a more sustainable and trusting culture.

Feedback and Mindsets

Samudra, determined to ride the space tower thing at LegoLand. Photo (c) Stephen Taylor.

It is clear that our words and actions as parents and teachers reinforce kids’ views of themselves and their behaviour adjusts accordingly. By focusing on personal feedback (praise or criticism), we may affect the mindset of the child, either reinforcing the ego or damaging the student’s motivation to improve. By focusing on tasks and processes, looking at how we can improve, we might help students develop more growth mindsets. A good strategy for effective feedback that builds on the growth mindset might be Hattie’s Three Levels (Task, Process and Self-regulation).

Differentiation and Mindsets

When we focus on ability-related feedback, conversations or behaviours, are we limiting the growth mindset? Dweck suggests that this is compounded when the curriculum is ‘dumbed-down’ and that having high expectations for all students, coupled with valuable feedback, will increase achievement. Sounds obvious, but it may not always play out in class. Avoid the temptation to make the curriculum easier for the ‘less able’ students and instead Differentiate Up from a core. Challenge everyone, support everyone.

Approaches to Learning and Mindsets

We all want our students to do well, but more than that we should want them to love learning and become enthusiastic lifelong learners. Taking steps to weed out fixed-mindset behaviours and language from our classes and our cultures, in order to develop strategies for becoming more growth-oriented, might bring us part of the way. This is where we can start to see the importance of the Affective skills clusters of the IB’s Approaches to Learning, and it will likely be an area that requires significant teacher (and parent) professional development. Coupled with a strong curriculum and high-impact teaching and learning, we might just get there.

“I used to think you were smart.” Calvin and Hobbes strip that neatly sums up fixed vs growth mindsets, used on p40 of Dweck’s Mindset.

………..o0O0o………..

I’ll admit, the idea of Mindset seems a little too neat for me – we are more nuanced and complex than either-or (which she recognises in the book). Personally, for example, I would see myself as very growth-mindset in that I seek development, learn more and reflect on everything; however, I can take perceived failure or criticism very personally, which is a more fixed-mindset trait. I also recognise that the book is aimed at a mass-market audience, and so there is much reference to ‘our research’ without a lot of depth. I would prefer a more academic, education-focused edition of this, with fewer popular-culture, big-CEO or sports stories and more about how this has been investigated.

As a tool for teachers, the language of fixed vs growth mindset will make it easier to have conversations with students and parents, and we can develop or make use of strategies that reinforce the nature of learning as a growth process. I am looking forward to seeing how schools start to put some of these ideas to use in their development of the Approaches to Learning.

I have added this book to the MYP Coordinator’s Bookshelf, but would really recommend any of the other books as good reads before moving on to this one.

………o0O0o……….

This is a total cheese-fest, but anyone who says they don’t like Dolly has a heart of stone. Her recent single, Try, does a pretty neat job of capturing the Growth Mindset and the role of effort in success – and it’s the theme song for her literacy charity, Imagination Library.



The Gradebook I Want

As the MYP sciences roll into the Next Chapter and we mull over the new guides, objectives and assessment criteria, we have the opportunity to reflect on our assessment practices. The IB have provided a very clear articulation between course objectives and performance standards (see image), which should make assessment and moderation a more efficient process.

There is a clear connection between the objectives of the disciplines and their assessment descriptors in all subjects in MYP: The Next Chapter.

Underpinning these objectives, however, are school-designed content and skills standards. These are left to schools to articulate so that the MYP can work in any educational context, which is great, though it does leave the challenge of essentially tracking two (or more) sets of standards in parallel: the MYP objectives and the internal (or state) content-level standards. In a unit about sustainable energy systems, for example, I might have 15-20 content-level, command term-based assessment statements, each of which could be assessed against any (or many) of the multiple strands for each of four MYP objectives.

As I read more about standards-based grading (or, more recently, standards-based learning on Twitter), I become more dissatisfied with the incumbent on-schedule practices that preside over grading and assessment. I want students to be able to demonstrate mastery of both the MYP objectives and the target content/skills, but I am left with questions:

  • If they score well on a task overall but miss the point on a couple of questions/content standards, have they really demonstrated mastery? How can I ensure that they have mastered both content and performance standards?
  • If they learn quickly from their mistakes and need another opportunity to demonstrate their mastery on a single content-level standard (or performance-level standard), do they need to do the whole assignment again? What if time has run out or there is no opportunity to do it again?
  • As we move through the calendar in an effort to cover curriculum and get enough assessment points for a best-fit, are we moving too superficially across the landscape of learning?
  • More importantly, is the single number for the task – their grade – a true representation of what they know and can do? How can I present this more clearly, to really track growth?

My aim with all this is to encourage a classroom of genuine inquiry (defined as critical, reflective thought), in which I know that students have effectively learned a solid foundation of raw materials (the ‘standards’, if you will), upon which they can ask deeper questions, make more authentic connections and evaluate, analyse and synthesise knowledge. 

Luckily we have Rick Wormeli videos for reference. Here he is on retakes, redos, etc., and it is worth watching (and provocative). There is another part as well. If you haven’t seen them yet, go watch them before reading the rest of this post (the videos are better, TBH).

……….o0O0o…………

What do I do already? 

  • Lots of formative assessment: practice, descriptors on worksheets, online quizzes.
    • In each of these, there is a rubric connecting it to MYP Objectives
      • Each question is labeled with the descriptor level and strand (e.g. L1-2a, L5-6b, c).
      • I don’t usually give a grade, though do check the work. Students should be able to cross-reference the questions with the descriptors, carry out their own ‘best fit’ and determine the grade they would get if they so desire. This puts feedback first.
    • Learning tasks usually include target content standards
  • Drafting stages through Hapara/GoogleDocs to keep track of work and give comments as we go
  • An emphasis on self-assessment against performance descriptors and content-level standards (and goals for improvement or revision).
  • I use command terms all the time: every sheet, question, lesson where possible.
  • Set deadlines with students where possible and am flexible where needed.
  • In some cases reschedule assessment or follow up with interview or retake (but not as standard practice). As Wormeli says above (part 2): “at teacher discretion.”
  • Track student learning at the criterion-level (MYP objectives), though with current systems (Powerschool), not in great detail at the objective strands (descriptors) level (e.g. A.i, A.ii, A.iii).
  • I do tests over two lessons, giving out a core section in the first, collecting and checking in-between classes. In the next session, this is supplemented with extra questions that should allow students to take at least one more step up. For example, a student struggling with Level 3-4 questions would get more opportunities to get to that level, whereas another who has shown competency will get the next level(s) up.

What do I want to do? 

  • I want to also be able to effectively track every student’s growth in the content standards and develop deeper skills in inquiry (critical reflective thinking).
  • Develop a system for better tracking learning against the individual strands within each criterion (e.g. A.i, A.ii, A.iii).
  • Better facilitate development of student mastery, allowing us to move further away from scheduled lessons and into more effective differentiation and pacing.

What would help? 

I would really like a standards-based, MYP-aligned, content-customisable gradebook and feedback system that is effective in at least three dimensions:

  • Task-level entry of levels for each task, where each task might produce multiple scores, including:
    • Various target content-level standards
    • MYP objective strands at different levels of achievement
  • It would need to allow for retake/redo opportunities for any and all standards that need to be redone – not necessarily whole assessment tasks. 
  • It would have to focus student learning on descriptors and standards, not on the numbers, in order to help them move forwards effectively. Students would need to be able to access it and make sense of it intuitively so that they could decide their own next steps even before I do.
  • It would be super-duper if the system could produce really meaningful report cards that focus on growth over the terminal nature of semester grading.

What would a three-dimensional gradebook look like?
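To make the wish a little more concrete, here is a minimal, purely illustrative sketch in Python of the kind of data model I have in mind. None of it refers to an existing product or to the Powerschool API; the class names, standard codes and task names are invented examples. The point is that every score attaches to a standard (an MYP objective strand or a content standard) rather than only to a task, so a retake can update a single standard without redoing the whole assignment.

```python
# A rough, illustrative sketch only (not a real gradebook or the Powerschool API):
# each task can produce several scores, each tagged with either an MYP objective
# strand (e.g. "A.ii") or a school content standard, so growth can be tracked per
# standard rather than per task, and a retake can touch just one standard.
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class Score:
    task: str           # e.g. "Energy systems test" (invented example)
    standard: str       # MYP strand ("A.ii") or content standard ("ENE.3")
    level: int          # achievement level awarded for this standard
    attempt: int = 1    # retakes/redos add new attempts for a single standard


@dataclass
class StudentRecord:
    name: str
    scores: list[Score] = field(default_factory=list)

    def record(self, score: Score) -> None:
        self.scores.append(score)

    def best_levels(self) -> dict[str, int]:
        """Best level reached per standard - a growth view, rather than one
        terminal number per task."""
        best: dict[str, int] = defaultdict(int)
        for s in self.scores:
            best[s.standard] = max(best[s.standard], s.level)
        return dict(best)


# Usage: one task produces several scores; a later retake updates only one standard.
student = StudentRecord("Sample Student")
student.record(Score("Energy systems test", "A.ii", 4))
student.record(Score("Energy systems test", "ENE.3", 2))
student.record(Score("Energy retake", "ENE.3", 5, attempt=2))
print(student.best_levels())   # {'A.ii': 4, 'ENE.3': 5}
```

A report generated from something like best_levels() (or from the most recent attempts) would describe where a student is against each standard, rather than averaging everything into a single semester number.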

Here is Rick again, describing another approach to a 3D gradebook:



Making Feedback Visible: Four Levels Experiment

This quick brain-dump is based on ideas from Hattie’s Visible Learning for Teachers, Wiliam’s Embedded Formative Assessment and the pdf of The Power of Feedback (Hattie & Timperley) linked below. 

I spent much of today trying to grade a large project (Describing the Motion of the Rokko Liner, our local train), which was assessed for MYP Sciences criteria D, E, F. Based on some of our Student Learning Goal work on helping students cope with data presentation and interpretation, the lab had been broken into stages (almost all completed in-class), spread across A4 and A3 paper and GoogleDocs in Hapara.

Hattie & Timperley, Four Levels of Feedback. Click for the pdf of ‘The Power of Feedback.’ The image is on page 87 of the paper.

The result: a lot of visible learning, in that I could keep track of each student, see their work in progress and comment where needed. A lot of verbal feedback was given along the way, with some worked examples for students. Breaking the large assignment into stages helped keep it authentic and manageable for students, with some days dedicated to individual strands of the assessment criteria.

The challenge: a Frankenstein’s monster of a grading pile, part paper, part digital and all over the place. After trying to put comments on the various bits of paper and Google Docs I gave up, realising that I would be there for many hours and that potentially very little would be read carefully by students or actioned in the next assignment. I turned to Hattie (and Wiliam). Visible Learning for Teachers has a very useful section on feedback (d=0.73, though formative assessment is d=0.9), and so I spent some time making up the following document, with the aim of getting all the feedback focused and in one place for students.

It is based on the four levels of feedback: task-level, process-level, self-regulation and self. In each of the first three sections I have check-boxed a few key items, based on things I am looking for in particular in this task and the common advice that I will give based on a first read through the pile. A couple of boxes will be checked for each student as specific areas for improvement, with the ‘quality’ statements explained in person. There is space under each for personal comments where needed. I fudged the ‘self’ domain a bit for the purpose of student synthesis of the feedback they are given – really making it a reflective space, geared towards the positive after the preceding three sections of constructive commentary.

Once I got the sheets ready, I chugged through the grading, paying attention most closely to the descriptors in the rubric, the task-specific instructions to students and then the points for action. However, I put very little annotation directly on the student work, instead focusing on this coversheet. It was marginally quicker to grade overall than the same task would have been normally, but the feedback this time is more focused. The double-sided sheet was given to them in class, attached to the paper components of their work, with the feedback facing out and the rubrics with grades hidden behind. This is a deliberate attempt to put feedback first. We spent about 25 minutes explaining and thinking through this in class.

Importantly, students were given time to think carefully about why certain notes had been made and boxes checked on their sheet. I asked them to respond to the feedback in the ‘self’ section, and make additional notes in the three sections of task-level, process-level and self-regulation. In discussion with individual students, we identified which were most pertinent – for some higher-achieving students they can take action in more detail at the task level, whereas others need to focus more on self-regulation. At the end of the lesson, the sheets and work were collected back, so I can read the feedback and use this to inform next teaching of lab skills.

The purpose of all this is to make it explicit where they need to focus their efforts for the next time, without having to wade through pages of notes. It hopefully serves to make the “discrepancy between the current and desired” performance manageable, and a sea of marking on their work will not help with this. I will need to frame this carefully with students – some need work on many elements, but I will not check or note them all, instead focusing on the few that are most important right now. Incidentally, it also allows me to more quickly spot trends and potentially form readiness groupings based on clusters of students needing work on individual elements in the following lab.

At the end of the task I asked students for feedback on the process. They generally found the presentation of feedback in this way easier to manage than sifting through multiple multimedia components, and will keep this document as a reference for next time. A couple of higher-achieving students asked for more detailed feedback by section in their work, which is something I can do on request, rather than by default; I know these students will value and take action on it.

Here’s the doc embedded. If it looks as ugly on your computer as it does mine, click here to open it.
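For anyone who would rather rebuild the coversheet digitally (for example as a form) than print it, here is a very rough sketch of its structure as data. It is only an assumption of how such a version might be organised: the section names come from Hattie & Timperley’s four levels, but the checklist items shown are invented placeholders, not the ones on my sheet.

```python
# Illustrative only: the checklist items below are placeholders, not the real sheet.
from dataclasses import dataclass, field


@dataclass
class FeedbackSection:
    level: str                            # "Task", "Process", "Self-regulation" or "Self"
    checklist: list[str]                  # pre-written points to tick for this task
    checked: list[str] = field(default_factory=list)
    comment: str = ""                     # space for a personal note where needed


def make_coversheet() -> list[FeedbackSection]:
    """One sheet per student: three checkbox sections plus an open 'Self' space."""
    return [
        FeedbackSection("Task", ["Axes labelled with units", "Outliers identified"]),
        FeedbackSection("Process", ["Method linked to the research question"]),
        FeedbackSection("Self-regulation", ["Work checked against the descriptors"]),
        FeedbackSection("Self", []),      # left open for the student's own reflection
    ]
```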

If you’ve used something like this, or can suggest ways to improve it without taking it over one side per section, I’d love to hear your thoughts in the comments or on Twitter. I’ll add to the post once I’ve done the lesson with the students.

UPDATE (2 December): Feedback-first, peer-generated

Having read that adding grades to feedback weakens the effect of the feedback, I’ve been thinking about ways to get students to pay more attention to the feedback first. For this task, a pretty basic spring extension data-processing lab, I checked the labs over the weekend and wrote down the scores on paper. In class I put students in groups of three and asked them to share the GoogleDoc of the lab with their partners. They then completed a feedback circle, using the coversheet below to identify specific areas for improvement and checking them. If they could suggest an improvement (e.g. a better graph title), they could add this as a comment.

This took about 15-20 minutes, after which students completed the process-level and self-regulation sections and returned the form to me, before continuing with the day’s tasks. Before the next class, I’ll add their grades to the form (rubrics are on the reverse of the copy I gave students) and log them in Powerschool. Delaying communication of the grade this way should, I hope, have helped students engage more effectively with the feedback – I learned last week that making changes in Powerschool resulted in automatic emails to students.

I was wary of doing this first thing on a Monday, but the kids were great and enjoyed giving and receiving feedback from peers. Of course some goofed off a little, but they were easy to get back on track. For the high-flyers who enjoyed the method less the first time, this gave them a chance to really pick through each other’s work and give specific feedback for improvement.

Here is the document:

……….o0O0o……….


The Power of Feedback (pdf). John Hattie & Helen Timperley, University of Auckland. DOI: 10.3102/003465430298487



First Unit Reflections: Is It Working?

Today we took the opportunity in the IBBio class to reflect on the unit we have just completed, including the tasks and assessment. As always with CA students, the results were constructive, positive and useful, with a general affirmation of the value of what we are doing as a class. The feedback included our personal GoogleSites project, with most students keen to continue and feeling it helped them learn, and with some interesting alternatives for those for whom it is not working.

This kind of feedback is really useful once the class has settled in. They are open enough to be honest, but it is early enough to change practices where needed. We will make some adjustments, though we are generally on the right track with this group. I’m really looking forward to seeing the process and products of the students who have elected to become science writers instead of GoogleSiters.

Here are the results in a summary presentation.