Ripples & Reflections

"Learning is about living, and as such is lifelong." – Elkjaer


“This will revolutionize education.”

Being in a tech-rich school, we have access to a lot of different platforms, tools and ideas. We have a good group of educational technology ‘experts’ in the school and a pretty strong focus on putting the learning first when choosing and using educational technologies. Nothing is blocked and teachers are given opportunities to learn more about (and try) different platforms, ideas and strategies using technology. Sometimes the best tech is low-tech; other times systems like Hapara help us to differentiate more subtly, give feedback more readily and think more carefully about task design, scaffolding and criteria.

Derek Muller (the awesome @veritasium) released this video and I was reminded of how poor some #EdTech implementation can be. This video gives a brief history of #EdTech hyperbole, from the ‘moving picture’ to the laser-disc, each step along the way seeking to automate the art (and science) of teaching. When the focus is on how teachers can be replaced by tech, or on simplistic educational inquiry (is X tech better than Y tech to do Z simple transmission process?), then the resources and energy spent on the technological innovation are wasted.

“What limits learning is what happens inside a student’s head. […] What experiences promote the kind of thinking that is required for learning?” 

It has become a cliché in #EdTech now to say “it’s about the learning, not the tech,” and that’s a great thing. The mantra sticks, and hopefully it forces us into careful thought about how we choose and use #EdTech tools. Learning is social, critical and personal; it requires the guidance of an expert, caring teacher and it needs inspiration, motivation and perseverance.

If we think about it this way then, as Derek says, we can evolve education, if not revolutionize it.


Now if only the companies that keep sending advertisements to my school and i-Biology inboxes for weak platforms (and worse PD) would watch this…

Leave a comment

Growth Mindsets in Differentiation & Feedback

Nothing suggests ‘loner’ quite like a table for one and a book with that cover blurb. 😉 #GrowthMindset

After reading/hearing so much about Carol Dweck’s Mindset over the last couple of years, I was finally able to read the book on the train from London to Bath. I’d become so familiar with the ideas that it felt like 200+ pages of déjà vu, although the main messages are perhaps worth reinforcing.

It’s an easy read, in a style similar to Pink, Goleman, Gladwell and co., though I did find myself skimming over yet another American sports example (Woods, Wooden, Jordan, Wie, Yankees, McEnroe). There were some interesting sections on leadership, parenting and relationships, though I was really looking for more practical advice on how to build growth mindsets in my students.

Some key messages for parents and educators

  • In a fixed mindset, success or failure is personal: a (permanent) label of a person’s worth.
  • Fixed mindsets value ability over effort; when effort is made, it is to affirm one’s status at the top. These students might be seen to ‘learn’ a lot as they perform highly in tests and assessments, but this may be due only to the effect of their achievement affirming the fixed mindset.
  • Fixed mindsets see difficulty as a weakness or threat and so may not put in the effort in case they fail.
  • Growth mindsets embrace the challenge of difficulty and see the value in learning as a journey.
  • Growth mindsets demonstrate resilience in failure and use difficulties to set workable plans for improvement.
  • Growth mindset leaders and teachers embrace their own personal learning and seek to develop learning communities: it is OK to not know… yet.
  • Growth mindset leaders take time to listen, learn and evaluate fairly. They surround themselves with knowledgeable inquirers and weed out the fixed-mindset culture of fear and/or affirming status. They might be lower-key than the high-powered fixed-mindset hero-leaders, but they build a more sustainable and trusting culture.

Feedback and Mindsets

Samudra, determined to ride the space tower thing at LegoLand. Photo (c) Stephen Taylor.

It is clear that our words and actions as parents and teachers reinforce kids’ views of themselves and their behaviour adjusts accordingly. By focusing on personal feedback (praise or criticism), we may affect the mindset of the child, either reinforcing the ego or damaging the student’s motivation to improve. By focusing on tasks and processes, looking at how we can improve, we might help students develop more growth mindsets. A good strategy for effective feedback that builds on the growth mindset might be Hattie’s Three Levels (Task, Process and Self-regulation).

Differentiation and Mindsets

When we focus on ability-related feedback, conversations or behaviours are we limiting the growth mindset? Dweck suggests that this is compounded when the curriculum is ‘dumbed-down’ and that having high expectations for all students, coupled with valuable feedback, will increase achievement. Sounds obvious, but may not always play out in class. Avoid the temptation to make the curriculum easier for the ‘less able’ students and instead Differentiate Up from a core. Challenge everyone, support everyone.

Approaches to Learning and Mindsets

We all want our students to do well, but more than that we should want them to love learning and become enthusiastic lifelong learners. Taking steps to weed out fixed-mindset behaviours and language from our classes and our cultures, in order to develop strategies towards becoming more growth-oriented, might bring us part of the way. This is where we can start to see the importance of the Affective skills clusters of the IB’s Approaches to Learning, which will likely be an area that requires significant teacher (and parent) professional development. Coupled with a strong curriculum and high-impact teaching and learning, we might just get there.

“I used to think you were smart.” Calvin and Hobbes strip that neatly sums up fixed vs growth mindsets, used on p40 of Dweck’s Mindset.


I’ll admit, the idea of Mindset seems a little too neat for me – we are more nuanced and complex than either-or (which she recognises in the book). Personally, for example, I would see myself as very growth-mindset in that I seek development, learn more and reflect on everything; however, I can take perceived failure or criticism very personally, which is a more fixed-mindset trait. I also recognise that the book is aimed at a mass-market audience, and so there is much reference to ‘our research’ without a lot of depth. I would prefer a more academic, education-focused edition of this, with fewer popular-culture, big-CEO or sports stories and more about how this has been investigated.

As a tool for teachers, the language of fixed vs growth mindset will make it easier to have conversations with students and parents, and we can develop or make use of strategies that reinforce the nature of learning as a growth process. I am looking forward to seeing how schools start to put some of these ideas to use in their development of the Approaches to Learning.

I have added this book to the MYP Coordinator’s Bookshelf, but would really recommend any of the other books as good reads before moving on to this one.


This is a total cheese-fest, but anyone who says they don’t like Dolly has a heart of stone. Her recent single, Try, does a pretty neat job of capturing the Growth Mindset and the role of effort in success – and it’s the theme song for her literacy charity, Imagination Library.

1 Comment

The Gradebook I Want

As the MYP sciences roll into the Next Chapter and we mull over the new guides, objectives and assessment criteria, we have the opportunity to reflect on our assessment practices. The IB have provided a very clear articulation between course objectives and performance standards (see image), which should make assessment and moderation a more efficient process.

There is a clear connection between the objectives of the disciplines and their assessment descriptors in all subjects in MYP: The Next Chapter.

Underpinning these objectives, however, are school-designed content and skills standards. These are left to schools to articulate so that the MYP can work in any educational context, which is great, though it does leave the challenge of tracking two (or more) sets of standards in parallel: the MYP objectives and the internal (or state) content-level standards. In a unit about sustainable energy systems, for example, I might have 15-20 content-level, command term-based assessment statements, each of which could be assessed against any (or many) of the multiple strands for each of four MYP objectives.

As I read more about standards-based grading (or, more recently, standards-based learning on Twitter), I become more dissatisfied with the incumbent on-schedule assessment practices presiding over grading and assessment. I want students to be able to demonstrate mastery of both the MYP objectives and the target content/skills, but I am left with questions:

  • If they score well on a task overall but miss the point on a couple of questions/content standards, have they really demonstrated mastery? How can I ensure that they have mastered both content and performance standards?
  • If they learn quickly from their mistakes and need another opportunity to demonstrate their mastery of a single content-level standard (or performance-level standard), do they need to do the whole assignment again? What if time has run out or there is no opportunity to do it again?
  • As we move through the calendar in an effort to cover curriculum and get enough assessment points for a best-fit, are we moving too superficially across the landscape of learning?
  • More importantly, is the single number – their grade – for the task, a true representation of what they know and can do? How can I present this more clearly, to really track growth?

My aim with all this is to encourage a classroom of genuine inquiry (defined as critical, reflective thought), in which I know that students have effectively learned a solid foundation of raw materials (the ‘standards’, if you will), upon which they can ask deeper questions, make more authentic connections and evaluate, analyse and synthesise knowledge. 

Luckily we have Rick Wormeli videos for reference. Here he is on retakes, redos and the like – it is worth watching (and provocative). There is another part as well. If you haven’t seen them yet, go watch them before reading the rest of this post (the videos are better, TBH).


What do I do already? 

  • Lots of formative assessment: practice, descriptors on worksheets, online quizzes.
    • In each of these, there is a rubric connecting it to MYP Objectives
      • Each question is labeled with the descriptor level and strand (e.g. L1-2a, L5-6b, c).
      • I don’t usually give a grade, though do check the work. Students should be able to cross-reference the questions with the descriptors, carry out their own ‘best fit’ and determine the grade they would get if they so desire. This puts feedback first.
    • Learning tasks usually include target content standards
  • Drafting stages through Hapara/GoogleDocs to keep track of work and give comments as we go
  • An emphasis on self-assessment against performance descriptors and content-level standards (and goals for improvement or revision).
  • I use command terms all the time: every sheet, question, lesson where possible.
  • Set deadlines with students where possible and am flexible where needed.
  • In some cases reschedule assessment or follow up with interview or retake (but not as standard practice). As Wormeli says above (part 2): “at teacher discretion.”
  • Track student learning at the criterion-level (MYP objectives), though with current systems (Powerschool), not in great detail at the objective strands (descriptors) level (e.g. A.i, A.ii, A.iii).
  • I do tests over two lessons, giving out a core section in the first, collecting and checking in-between classes. In the next session, this is supplemented with extra questions that should allow students to take at least one more step up. For example, a student struggling with Level 3-4 questions would get more opportunities to get to that level, whereas another who has shown competency will get the next level(s) up.

What do I want to do? 

  • I want to also be able to effectively track every student’s growth in the content standards and develop deeper skills in inquiry (critical reflective thinking).
  • Develop a system for better tracking learning against the individual strands within each criterion (e.g. A.i, A.ii, A.iii).
  • Better facilitate development of student mastery, allowing us to move further away from scheduled lessons and into more effective differentiation and pacing.

What would help? 

I would really like a standards-based, MYP-aligned, content-customisable gradebook and feedback system that is effective in at least three dimensions:

  • Task-level entry, in which each task might produce multiple scores, including:
    • Various target content-level standards
    • MYP objective strands at different levels of achievement
  • It would need to allow for retake/redo opportunities for any and all standards that need to be redone – not necessarily whole assessment tasks. 
  • It would have to focus student learning on descriptors and standards, not on the numbers, in order to help them move forwards effectively. Students would need to be able to access it and make sense of it intuitively so that they could decide their own next steps even before I do.
  • It would be super-duper if the system could produce really meaningful report cards that focus on growth rather than the terminal nature of semester grading.
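To make the wish list concrete, here is a minimal sketch (Python, purely hypothetical – the class, the standard codes like ‘ENE.3’ and the mastery rule are my own inventions, not an existing product) of a gradebook in which one task produces multiple scores against individual standards, and a retake targets a single standard rather than the whole task:

```python
from collections import defaultdict

class Gradebook:
    """Toy three-dimensional gradebook: student x standard x attempts.

    Each score is tied to a task AND an individual standard (a content
    standard like "ENE.3" or an MYP strand like "A.ii"), so a redo can
    target one standard without repeating the whole assessment task.
    """

    def __init__(self):
        # student -> standard -> ordered list of (task, level) attempts
        self._scores = defaultdict(lambda: defaultdict(list))

    def record(self, student, task, standard, level):
        self._scores[student][standard].append((task, level))

    def latest(self, student, standard):
        """Most recent evidence counts, so retakes supersede old scores."""
        attempts = self._scores[student][standard]
        return attempts[-1][1] if attempts else None

    def needs_retake(self, student, mastery_level):
        """Standards where the latest evidence sits below the mastery bar."""
        return [s for s in self._scores[student]
                if self.latest(student, s) < mastery_level]

# One task ("Energy unit test") produces several scores at once:
gb = Gradebook()
gb.record("Aiko", "Energy unit test", "ENE.3", 5)   # content standard
gb.record("Aiko", "Energy unit test", "A.ii", 3)    # MYP objective strand
gb.record("Aiko", "Retake: A.ii only", "A.ii", 6)   # targeted retake

print(gb.latest("Aiko", "A.ii"))      # 6 – the retake supersedes the 3
print(gb.needs_retake("Aiko", 5))     # [] – both standards at/above the bar
```

The key design choice is that the unit of record is the (student, standard) pair rather than the task: reporting can then describe growth per standard, and a ‘redo’ is simply one more piece of evidence.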

What would a three-dimensional gradebook look like?

Here is Rick again, describing another approach to a 3D gradebook:


Making Feedback Visible: Four Levels Experiment

This quick brain-dump is based on ideas from Hattie’s Visible Learning for Teachers, Wiliam’s Embedded Formative Assessment and the pdf of The Power of Feedback (Hattie & Timperley) linked below. 

I spent much of today trying to grade a large project (Describing the Motion of the Rokko Liner, our local train), which was assessed for MYP Sciences criteria D, E, F. Based on some of our Student Learning Goal work on helping students cope with data presentation and interpretation, the lab had been broken into stages (almost all completed in-class), spread across A4 and A3 paper and GoogleDocs in Hapara.

Hattie & Timperley, Four Levels of Feedback. Click for the pdf of ‘The Power of Feedback’ (the figure is on p87).

The result: a lot of visible learning in that I could keep track of each student, see their work in progress and comment where needed. A lot of verbal feedback was given along the way, with some worked examples for students. Breaking the large assignment into stages helped keep it authentic and manageable for students, with some days dedicated to individual strands of the assessment criteria.

The challenge: a Frankenstein’s monster of a grading pile, part paper, part digital and all over the place. After trying to put comments on the various bits of paper and Google Docs I gave up, realising that I would be there for many hours and that potentially very little would be read carefully by students or actioned in the next assignment. I turned to Hattie (and Wiliam). Visible Learning for Teachers has a very useful section on Feedback (d=0.73, though formative assessment is d=0.9) and so I spent some time making up the following document, with the aim of getting all the feedback focused and in one place for students.

It is based on the four levels of feedback: task-level, process-level, self-regulation and self. In each of the first three sections I have check-boxed a few key items, based on things I am looking for in particular in this task and the common advice that I will give based on a first read through the pile. A couple of boxes will be checked for each student as specific areas for improvement, with the ‘quality’ statements explained in person. There is space under each for personal comments where needed. I fudged the ‘self’ domain a bit for the purpose of student synthesis of the feedback they are given – really making it a reflective space, geared towards the positive after the preceding three sections of constructive commentary.

Once I got the sheets ready, I chugged through the grading, paying attention most closely to the descriptors in the rubric, the task-specific instructions to students and then the points for action. However, I put very little annotation directly on the student work, instead focusing on this coversheet. It was marginally quicker to grade overall than the same task would have been normally, but the feedback this time is more focused. The double-sided sheet was given to them in class, attached to the paper components of their work, with the feedback facing out and the rubrics with grades hidden behind. This is a deliberate attempt to put feedback first. We spent about 25 minutes explaining and thinking through this in class.

Importantly, students were given time to think carefully about why certain notes had been made and boxes checked on their sheet. I asked them to respond to the feedback in the ‘self’ section, and make additional notes in the three sections of task-level, process-level and self-regulation. In discussion with individual students, we identified which were most pertinent – for some higher-achieving students they can take action in more detail at the task level, whereas others need to focus more on self-regulation. At the end of the lesson, the sheets and work were collected back, so I can read the feedback and use this to inform next teaching of lab skills.

The purpose of all this is to make it explicit where they need to focus their efforts for the next time, without having to wade through pages of notes. It hopefully serves to make the “discrepancy between the current and desired” performance manageable, and a sea of marking on their work will not help with this. I will need to frame this carefully with students – some need work on many elements, but I will not check or note them, instead focusing on the few that are most important right now. Incidentally, it also allows me to more quickly spot trends and potentially form readiness groupings based on clusters of students needing work on individual elements in the following lab.

At the end of the task I asked students for feedback on the process. They generally found the presentation of feedback in this way easier to manage than sifting through multiple multimedia components, and will keep this document as a reference for next time. A couple of higher-achieving students asked for more detailed feedback by section in their work, which is something I can do on request, rather than perhaps by default; I know these students will value and take action on it.

Here’s the doc embedded. If it looks as ugly on your computer as it does mine, click here to open it.

If you’ve used something like this, or can suggest ways to improve it without taking it over one side per section, I’d love to hear your thoughts in the comments or on Twitter. I’ll add to the post once I’ve done the lesson with the students.

UPDATE (2 December): Feedback-first, peer-generated

Having read that adding grades to feedback weakens the effect of the feedback, I’ve been thinking about ways to get students to pay more attention to the feedback first. For this task, a pretty basic spring extension data-processing lab, I checked the labs over the weekend and wrote down the scores on paper. In class I put students in groups of three and asked them to share the GoogleDoc of the lab with their partners. They then completed a feedback circle, using the coversheet below to identify specific areas for improvement and checking them. If they could suggest an improvement (e.g. a better graph title), they could add this as a comment.

This took about 15-20 minutes, after which students completed the process-level and self-regulation sections and returned the form to me, before continuing with the day’s tasks. Before the next class, I’ll add their grades to the form (rubrics are on the reverse of the copy I gave students) and log them in Powerschool. Delaying communication of the grade this way should, I hope, have helped students engage more effectively with the feedback – I learned last week that making changes in Powerschool resulted in automatic emails to students.

I was wary of doing this first thing on a Monday, but the kids were great and enjoyed giving and receiving feedback from peers. Of course some goofed off a little, but they were easy to get back on track. For the high-flyers who enjoyed the method less the first time, this gave them a chance to really pick through each others’ work to give specific feedback for improvement.

Here is the document:



The Power of Feedback (pdf). John Hattie & Helen Timperley, University of Auckland. DOI: 10.3102/003465430298487

1 Comment

First Unit Reflections: Is It Working?

Today we took the opportunity in the IBBio class to reflect on the unit we have just completed, including the tasks and assessment. As always with CA students, the results were constructive, positive and useful, with a general affirmation of the value of what we are doing as a class. The feedback included our personal GoogleSites project, with most students keen to continue and feeling it helped them learn, and with some interesting alternatives suggested by those for whom it did not.

This kind of feedback is really useful once the class has settled in. They are open enough to be able to be honest, but it is early enough to change practices where needed. We will make some adjustments, though we are generally on the right track with this group. I’m really looking forward to seeing the process and products of the students who have elected to become science writers instead of GoogleSiters.

Here are the results in a summary presentation.


You Can’t Differentiate Mediocrity.

Good teaching is differentiation: knowing our students, knowing our curriculum, knowing and using a range of strategies and finding opportunities to give students what they need. It is knowing who is learning what and how and it is knowing our impact as the teacher in the classroom. An excellent differentiated curriculum and classroom needs to be first excellent, then differentiated: you can’t differentiate mediocrity. Differentiation depends on effective collaboration between teachers and between students and faculty. It needs an atmosphere of respect and inclusion and a common goal of student learning. 


A concept map for differentiating instruction, from Leadership for Differentiating Schools & Classrooms by Carol Ann Tomlinson and Susan Demirsky Allan.

Over this week at CA we have had Sandra Page come back in from ASCD to help teachers level up from last year’s work, which was largely an introduction to differentiation and establishing a common language and set of strategies around it. Then over the weekend I attended a separate JCIS weekend workshop at Osaka International School, led by Naomi Nelson (part of Bill and Ochan Powell’s Education Across Frontiers), on ‘Differentiation: Making Inclusion Happen.’ It was a powerful week of PD, with Naomi’s weekend sessions being particularly useful for me as a coordinator. With so much professional learning taking place – as MYP Co, science teacher and HOD Science – it will be a challenge to summarise it all in one post, and you will likely recognise many of the ideas here.

An Overview

The differentiation content in each session was largely based on the work of Carol Ann Tomlinson, with the common language of differentiating Product, Process, Content (and Affect) by Readiness, Interest and Learning Profile. As a focus at the school we have been working mainly on building teachers’ readiness in Readiness, Process and – to a much lesser extent – Product (assessment). The work we have been doing has been supported by resources on the school’s faculty guide and in the ATLAS planners, as well as department-based sessions with Sandra.

The Curriculum-Students Balance

Naomi did a great job of crystallizing the connections between an excellent concept-based curriculum and the practices of teaching in the differentiated classroom. Building on Tomlinson’s work, the mantra became an excellent differentiated classroom is first excellent, then differentiated. We need to build on a strong knowledge of an excellent curriculum, and the process of building and articulating that excellent curriculum is the foundation of progress. As part of this curriculum, we need to be aware of the greater conceptual understandings of our unit and the minimum acceptable evidence of understanding for our students to be successful in the unit. We must know where we need to go, and then think about how we might bring in readiness and interest to get there.

A strong curriculum doesn’t, however, mean a slavish devotion to content over all else. We are educators, not fact transmitters, and must ensure that the students remain in the balance. By knowing our students – their interests and readiness as a group and as individuals, and what makes a successful learning environment – we can start to meet their needs as learners. We should use formative and summative assessment data as a regular part of our own teaching feedback cycle. A differentiated classroom is responsive; the opportunities to respond are planned.

A good differentiated classroom encourages inquiry, but does not lose the curriculum in the balance: a classroom too student-oriented doesn’t easily help progression or maintain ‘standards’ (and as a result, open inquiry as curriculum ranks pretty low on Hattie’s impacts). However, if we focus only on the content, insisting that all students must meet our personal standards at the same time in the same way in order to be ‘successful’, then we are doing our learners a huge disservice.

“Differentiate Up”

A successful differentiated classroom does not sacrifice standards or make things ‘easier’ for students. We don’t give everyone an undeserved top grade because they worked hard or we feel bad for them. We certainly don’t compromise fairness in our grading. Instead, we ‘differentiate up’ by making clear our expectations of all students and providing extension that takes the most ready to the next level. We do not differentiate the significant concepts, unit questions or key content by readiness – instead we make it clear how students can go beyond, to extend themselves. We ensure that students sit in the zone of proximal development, an area of tension where they learn not through giant leaps but through an invitation to challenge and to flow. For those less ready we can provide more process support, scaffolding, coaching and clarification. When all students are clear on what they are required to understand, know and do, then we have a solid foundation for differentiation.

By differentiating up, we avoid dumbing down.


Developing a Repertoire of Strategies for Effective Differentiation

Strategies for differentiation might be a good entry point for teachers who want to see it in action, and to learn to see the benefit of putting the learner at the centre of learning, though they can only go so far if we are not also thinking critically about curriculum development. Both Sandra and Naomi had plenty of strategies to share – here are a few that I have tried and know to be effective in my own classes, which largely hinge on formative assessment, feedback and adjusting my practice, student groupings or learning processes. If you have read this far, you might want to put some of your favourites in the comments.

  • Exit Tickets
    • 1-minute essay (summary of learning)
    • 3,2,1 (3 things I learned, 2 I will practice, 1 question I have)
    • Response to a conceptual or challenge question
  • Socrative Space Races
    • Usually used as a warmer to get groups working together
  • Quia Quizzes
    • Strictly formative, these are for practice and immediate feedback
    • Based on content or skills of the lesson/ subtopic
    • Results help me decide – before class – who needs what help and who needs extension
  • Think-Pair-Share, Headlines, and other Making Thinking Visible Routines
  • Drafting stages of assignments (and feedback, through GoogleDocs and Hapara), to differentiate assignment-based lessons by readiness in terms of completion, skills to develop further or content-based understandings
  • Interest-based choices for students in topics for assignments, essays, research questions

Some strategies I want to try more: 

  • “Tell Me Something” paired reading
  • Cognitive Coaching in classes
  • Round Robin Reflections
  • More effective use of different ‘entry points’ to units as part of the tuning-in process

Respectful Tasks ≠ Labeling Students

A differentiated classroom feels like a community of learners, rather than rows of pupils. With flexible grouping and respectful tasks built on a supportive learning environment and a genuine care for students, we can differentiate to meet students’ needs. It is often raised in differentiation sessions that teachers are wary of stigmatising students with the label of being ‘needy’ – and ‘not labeling students’ is a high-impact strategy in Hattie’s meta-analysis. However, giving students what they need, in a manner that encourages growth, is not the same as permanently or obviously labeling a student. If we manage students effectively in a caring environment, we can ensure that students are given an appropriate level of challenge (and they will appreciate it).

If we differentiate by readiness only, all the time, we run the risk of creating a ‘tracking’ system in the class – but there are many ways to keep the groups flexible – by interest, level of completion of a task, preference of style (where appropriate, such as direct instruction, reading, problem-solving) or just simply through random groupings.

Students like to know why they have been grouped and in a supportive learning environment, it is OK to share our reasoning. Teacher-student relationships are high-impact on Hattie’s scale, and effort spent in cultivating them is energy well spent.

Differentiation as a Collaborative Process

One of the strengths of Naomi’s workshop was the focus on collaboration as a foundation of effective differentiation. We spent time looking at student responses in groups, trying to deduce students’ thought processes, and it was a really useful task to look at the problems from others’ perspectives. She gave an overview of and time to practice Cognitive Coaching techniques, as well as an opportunity to use case studies in groups to think about the seven norms of collaborative work:

  • pausing (the ‘gift of time’)
  • paraphrasing (“So you’re saying…”)
  • putting inquiry at the centre (of the issue)
  • probing for specificity (“Tell me more about…”)
  • putting ideas on the table (and knowing when to take them off)
  • paying attention to self and others
  • presuming positive intentions (one of my favourites and one if which we must always be mindful)

I wonder what the novice differentiators made of these sessions that were a step away from the direct student-teacher practice of differentiation, but I could really see the value of them as an MYPCo and HOD. I think we could use up-skilling as HODs in these practices in order to run more effective, supportive and collaborative meetings in our departments.

Where Would I Like to Go Next?

As a coordinator in the school, I tend to see lots of opportunities for development. A small breakthrough for me over the last couple of weeks (in part due to attending IB School Visiting Team Member Training) is seeing how we can develop the practices of differentiation and collaboration in step with curriculum review and strengthening. I would like to have sessions and differentiated PD that build on our work in ATLAS to really connect curriculum to practice, strengthening our curriculum and assessment while developing strategies for formative assessment and differentiation. I really want to open up classrooms and build a stronger community around professional learning and peer support. We should form vertical curriculum groups, including elementary teachers, to look critically at the standards underlying our curriculum.

I think if we were to have Naomi come to the school next year, she could work with the whole faculty on differentiation strategies and student learning goals and with the HODs on collaboration, cognitive coaching and leading effective meetings. In ongoing Wednesday-afternoon PD we can continue to focus on practices and building an excellent, concept-based, rigorous curriculum and careful collaborative moderation of student work.

I really want to develop our ties with local IB schools more carefully – a shared PD day with OIS would give an opportunity to have day-long job-alikes and a keynote, and if we go a step further we can implement the model we used in IBDunia in Indonesia for the IB teachers’ conference, drawing on the wealth of talent our community has in the classrooms to teach the teachers.


We’ve come a long way as a school over the last few years – we’re ready to really level-up and MYP: Next Chapter is the perfect opportunity to do this by thinking carefully about who we are, what we teach and how we get there. Finally, the graphic below is an attempt to communicate (in a single slide) how we can use readiness and interest most easily in MYP and DP.

An attempt to capture how we can differentiate by readiness and interest in the MYP and DP. This is in response to teachers’ concerns about how we get started and avoid ‘dumbing down’, or work within the boundaries of our curriculum framework and assessment regulations.

Using personal GoogleSites for learning, assessment & feedback in #IBBio

Click to see an example of how the GoogleSite was set up.

This is reposted from my blog. To comment, please go there.


Over the last two years, my IB Bio class has been keeping individual GoogleSites as records and reflections of their learning. Based on this experience and their feedback, I have tweaked the project to try to make it more effective as a learning tool.


With the bulk of our resources online (here, on Slideshare and elsewhere), as well as a 1:1 laptop and GoogleApps environment, it doesn’t make much sense to be using too much paper. The aim of this project was to empower students to build skills and knowledge connected to the IB Biology course, whilst making their thinking visible to me as a teacher. Through this process, students are able to track their progress, stay on top of their grades and prepare at their own pace (especially if they are working ahead).
