Do they understand what I taught?

Slides

Context

Perhaps the biggest thing for new teachers to learn is the difference between "I taught it" and "they learned it".

It is very easy to bring into teaching a subconscious assumption that your job is to stand at the front of the room and pass on words of wisdom that are gobbled up by your audience.

If only teaching were that easy...!

The overarching goal of this training session, "Do they understand what I taught?", is to embed awareness that a big responsibility in teaching is checking for understanding, and to give trainees some practical tools for doing so.

Video

Trainer notes

"Does that make sense?" (slide 3)

This is probably one of the most tempting things to say as a teacher, but there is almost always a better alternative available.

The intention with this slide is to pose the question - why might this not be the best thing to ask (assuming we sincerely care about checking whether what we said made sense to students)?

The reasons why I think this is often not the best question:

1. It's leading - it pushes towards a yes (and silence is interpreted as "yes" - lots of false positives).

2. It's often used as a classic signal for "I want to move on [and don't necessarily actually care about whether or not you have questions]" - particularly because the absence of questions is often followed up with "Okay, great!", putting anybody who wants to ask a question in the position of being an irritation / barrier.

3. Even setting aside the strong pressure it puts on students not to say anything, students are not good judges of their own understanding - e.g. the classic case in coding of a student who watches a YouTube video, confidently tells you they understand it, but cannot reproduce anything from the video.

So, as a rule of thumb, trainees should avoid relying on self-report (like "Any questions?") wherever possible.

Check understanding through assessment (Slides 4-5)

Rather than asking "Does that make sense?", "Got it?" or "Any questions?", trainees can do better by asking themselves, "What precisely is it I want to check whether or not they understand?", and asking a question designed to test that.

In the example on Slide 4, we could ask the students questions designed to assess whether they understand what has been asked, capturing data that is objective.

So, in Slide 5, we're putting that into practice - assessing whether trainees understand this concept by asking them to come up with questions to check understanding for the array instruction.

(The original intention was for this to be done in two different Slack threads, but this could instead be done by Zoom chat, or by giving the trainees a couple of minutes and choosing a couple to share.)
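Slide 4's actual array instruction isn't reproduced in these notes, so the snippet and question below are a hypothetical illustration of my own, not material from the slides. The point is what an "objective assessment" question can look like: instead of self-report, show a concrete snippet and ask students to predict its output.

```python
# Hypothetical check-for-understanding question (not the actual Slide 4
# content). Instead of "Does that make sense?", show this snippet and
# ask: "What will this print?" - the answers give objective data on
# whether students understand list indexing and append.
numbers = [10, 20, 30]
numbers.append(40)
print(numbers[1])  # a correct prediction is 20 (append does not shift elements)
```

A wrong prediction (e.g. "40") tells you precisely which misconception to address - something "Any questions?" would never surface.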

Build self-monitoring through self-assessment followed by objective assessment (Slides 6-8)

The idea of these slides is not necessarily for trainees to leave with a good understanding of metacognition, but to have a specific 'aha!' moment.

Put up slide 6 and invite trainees to read it, and then rate how well they think they understand metacognition - either through Slack emoji voting or a 1-3 'vote by fingers' [1 is low, 3 is high].

After the results are up, move to slide 7, and see if anybody can give a good definition of metacognition...!

The 'aha' that this is meant to trigger: trainees were probably relatively blasé about rating themselves a 2 or 3 on understanding metacognition, but - when actually put on the spot and required to define it - most will realise that they understand it less well than they self-assessed.

Reflections (Slide 10)

I would typically ask them to spend 1-2 minutes thinking about this, and then invite a couple of trainees to share.