🚽 #LearningInTheLoo: Efficient Grading Practices

This week’s Learning In the Loo is a remix & continuation of this blog post I wrote a couple of months ago:

As always, all my past editions (including this one) of Learning in the Loo can be found here.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Learning Skills & End of Course Reflections #OntEd

Here in Ontario we have to report on learning skills at report card time. Especially when teaching virtually rather than face to face, some of the things we’re meant to assess under learning skills are not always visible to me. So I like to ask my students to complete a self-reflection about their learning skills in my class, using a rubric I created years back from the Growing Success document.

Here are the learning skills criteria outlined in Growing Success (page 17):

So I took that document and created a rubric with the 4 levels of achievement students see for learning skills on their report card: Excellent, Good, Satisfactory & Needs Improvement.

I ask students to read it over and choose the one description in each row that best describes their work habits in our course. This is useful feedback for me because sometimes I don’t realise, for example, how much help a student is seeking outside of class time to persevere with the course material (tutoring, homework club, etc.).

Then on the back (I do this as a paper task when we’re face to face for in-school instruction) I ask students to review their evidence record that I’ve emailed them and tell me what level they think they’re currently achieving in the course, as well as which expectations are strong or weak:

This accomplishes a few things:

  • Allows me to see how accurately the student understands their level of achievement in the course thus far. When a student sitting around a 1- tells me they think they’re getting a level 3, then I know I need to have a conversation with that student where we look over their evidence record together to clarify what it’s telling us.
  • Asking them to identify the curriculum expectations where they have not yet demonstrated a passing level of achievement ensures they recognize the gaps they need to fill in their evidence of learning before the end of the course (redo assignments or propose another way to show me evidence of their learning).
  • Identifying their strengths & weaknesses by curriculum expectation helps me when writing report card comments, as we are asked to provide one comment about a strength, one about a weakness, and one for next steps. I’m not a fan of the comment bank provided to us (it’s still based on the categories in the achievement chart instead of the curriculum expectations), so I write my own comments (using very similar wording to those in our comment bank) based on the curriculum expectations, since we have transitioned to grading by curriculum expectation (a.k.a. standards-based grading).

The last section asks students to reflect on certain aspects of the course, which provides me with feedback for the next semester (or quadmester now). These can obviously be adapted to match the work done in your class as well as the criteria you want feedback on:

When we teach face to face I print this document on legal-size paper & give one to each student to fill out, writing their name at the top. When virtual, I assign it in Google Classroom, choosing to make a copy for each student, which automatically puts their name in the title. If you want anonymous feedback for that last part, to ensure they’re willing to give you the honest goods, you can instead turn that last set of questions into a Google Form that doesn’t collect identifying info.

As always, here’s the whole document (the virtual teaching version – find the face to face version in the Version History) so you can make a copy & edit as you see fit. Are there reflection questions you love asking at the end of a course that you don’t see here? I’d love to hear what they are in the comments below!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Grading Tips for Efficiency (aka why the Twitter community is so awesome)

This year is tough. No doubt about it. Colleagues are having conversations about different ways to assess in the virtual online teaching environment to ensure students aren’t cheating by, for example, using an app like Photomath to solve equations for them. And often, when you move to more open-ended prompts that allow for variety & no single exact solution, the marking takes longer. So I thought this question from Sabrina on Twitter was timely:

Have a look through all the replies but I thought I would feature a few that stood out to me here. Love this idea from Karen about giving herself a timeline for returning work, graded or not. It happened so often that I trucked piles of marking home, night after night, to wind up not even taking it out of the bag because “I’ll do it tomorrow before class when I get to school & am fresh with renewed energy & motivation”.

Remember that here in Ontario our final grade is meant to be based on observation, conversation and product: all three. Most of us tend to rely too much on product, me included. It’s tough to record a level based on observation & conversation because they happen in the moment & you don’t necessarily have time to stop & record a level on a checklist. But if you can’t do it in real time, setting aside a chunk of time after class to use some well-planned observational rubrics, like Meaghan suggests here, and record levels of achievement or anecdotal notes based on your observations & conversations from that class, can be helpful:

I appreciated this response from Krista because I could really see myself in it. My style is to mark an entire batch at once. But as Krista points out, it’s hard to carve out that magical block of time. She suggests chunking the work down into smaller sets. For me, that means marking question #1 across the whole pile, then coming back to mark question #2 across the whole pile, and so on.

If you’re like me and watch a lot of YouTube videos (or listen to podcasts) at 1.25 up to 2 times normal speed, then this suggestion from Michael might appeal to you. Hold small-group Meets & record them. You could open several Meets & move between them to supervise after hitting record in each. Then rewatch them at a later date, sped up (click the settings cog along the bottom of the video saved to Drive & choose Playback Speed to adjust as desired):

In my first practicum, in a grade 8 science classroom, I planned a ton of hands-on labs, because what’s better than hands-on science, right? My associate teacher smiled and asked if I planned to mark each of the lab reports. Of course, I said! Oh man!!! Her smile should have told me she knew something I didn’t: all those labs are sooooo much work to read through & mark. So I love this suggestion from Audra about asking for only certain sections of the lab report for each lab. This strategy could be applied to assignment styles beyond science lab reports too:

Another strategy I’ve played with over the years is audio feedback. For a while I was uploading images of student Math tests into Explain Everything & posting a personal video to each student with their feedback as I circled and pointed to that part of their work in the image on the screen. I don’t know if it saved me time or if I spent the same amount of time giving more detailed and informative feedback, but either one is a win. So I liked this reminder from Melanie about audio feedback:

Have you got tips for being a more efficient marker/grader? How to give better or more detailed feedback in the same time or less? Leave a comment below – I’d love to hear your strategies!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Reflecting on our First Test

My grade 10 applied class this year has some students with serious gaps in their math abilities & knowledge. We had our first test last week (which is late – about 5 weeks in – too many interruptions to class so far: assemblies, etc.). For the first time I tried Howie Hua’s strategy with my class:

I asked my Tweeps whether they use visibly random groupings (VRG) for this or let students choose. Almost everyone said they let students choose. I may try VRG next time, as there were a couple of students who didn’t get up to talk to anyone. I’ll be asking them for feedback today about whether they thought it helped them (or not).

Unfortunately, on test day an assembly ran long that morning and took 10 minutes away from my period. A number of students had trouble finishing. I struggle with that b/c I think many of them want more time but simply spend it staring at the page, not making progress on solving. This class is mostly ELLs though (more than usual), and in the past when that’s been the case & I have slower test takers, I have made shorter, more frequent tests.

So normally I test every 2 to 3 weeks, once we’ve done activities & practice that cover 4 or 5 of the 9 overall expectations for the course. The test is then 2 double-sided pages, with each side of a page covering 1 overall expectation (usually one or two problem-solving tasks). In the past I’ve changed that to testing every 1 to 1.5 weeks on 2 of the 9 expectations instead. I think that’s what I’ll need to do here, so that if a student needs more time they can have it within that class period.

I haven’t yet returned their marked tests (I put feedback only on the test & they receive their grade separately a day later on their evidence record via email; research shows that mark + feedback results in students caring only about the mark, not the feedback). Yesterday I sketched on the board the same triangle-based prism they’d had in a Toblerone-bar question on the test, but with different dimensions. I asked them to find surface area & volume (the dimensions were such that they needed to use the Pythagorean Theorem to find the height of the triangular base). Most groups took almost the entire period to solve this!!! One group never got beyond the Pythagorean Theorem part. I ran around like a chicken with my head cut off trying to facilitate, correct misconceptions, etc.

As an aside: a colleague came by to watch (he said he’s been meaning to for a while now) and I had to ask him not to write on the students’ boards or tell them how to do the next step. It reminded me how hard it is to teach other teachers the skill of not always telling students the answers, but instead asking questions that help them figure it out for themselves. He said, “But they’re nodding, so they understand what I’m showing them.” I explained that I want them doing the math, not him. I asked him to talk with them but not do the math for them.

I also got a short video of the groups getting started on the problem if you’re interested:

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

#LearningInTheLoo: actionable feedback strategies

Inspired by this tweet …

I asked my PLN to share their strategies for getting students to take action on the feedback we leave them on their work:

Their responses are compiled in my latest edition of Learning in the Loo:

Learning in the Loo

The archive of my past editions can be found here in case you want to put some up in the bathrooms of your school too!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Captive Audience: #LearningInTheLoo

Do you ever read a great article or blog post and think, “I HAVE to share this with my colleagues!”? So you email everybody the link & say they have to read it. And then maybe 1 or 2 people actually do?

I find so many great things on Twitter & blogs (#MTBoS) that I want to share with my colleagues, but they often don’t have (or make) the time to check them out. So when I happened upon a tweet about Learning in the Loo I thought it was genius – a captive audience!

So I have made it a habit to create & post a new Learning in the Loo 11×17″ poster in each staff toilet in our school every 1 or 2 weeks this semester. I curate the amazing things I learn about online & turn them into quick-read how-tos or ideas to read while you … “go”. And it just occurred to me that I should have been posting them to my blog as I made them. But now you can get a whole whack of them at once, and next year I’ll try to remember to post them as I make them.

The whole collection so far can be found here with printing instructions.
Feel free to make a copy (File → Make a copy). The sources of images & ideas are in the notes of the doc above too.

Here they are:

  • Learning in the Loo Assessment Feedback
  • Learning in the Loo Cell Phone Work Life Balance
  • Learning in the Loo EdPuzzle
  • Learning in the Loo Adobe Spark Video
  • Learning in the Loo Twitter
  • Learning in the Loo Google Classroom
  • Learning in the Loo Grouping Strategies
  • Learning in the Loo Kahoot
  • Learning in the Loo Google Docs

What would you share in your school’s first Learning In The Loo poster?

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

OAME sketchnotes

At the start of May I attended the OAME conference in Barrie – my 2nd year attending. I was disappointed to have my session cut due to low enrollment 5 weeks before registration even closed, but c’est la vie! Next year in Kingston I have an idea of how to better “sell” my session in the description. Fingers crossed I don’t get the final session block on the Saturday either – that drags your numbers down for sure.

The food was the definite low point of the trip. Georgian College offered a poor continental breakfast in the residence and OAME provided all vegetarians with gluten free bread that wasn’t suited for human consumption. Let’s hope the Kingston organizers manage something a notch above.

I thought I would share some sketchnotes I made to summarize my new learning. Let’s start with the Ignite sessions, which are my highlight of the conference each year. Ignite speakers get 20 slides that auto-advance every 15 seconds – 5 brief minutes total – to try & get a strong message across.

OAME Ignite 2016 Part 1

OAME Ignite 2016 Part 2

I was pretty active on the Twitter feed for the conference as well:

Lastly, I usually make an effort to seek out OAME sessions by teachers I can’t see or work with at home, but my colleague Lynn Pacarynuk’s session on test design & assessment made me think harder about my own practices. So much so that I summarized some of her ideas in 2 different sketchnotes:

OAME Test Design Process – Lynn Pacarynuk

OAME Shifts in Assessment & Test Design – Lynn Pacarynuk

Until next year, OAME!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

What Does Level 4 Mean? Making better rubrics.

A couple of years ago I was teaching Géographie to the French Immersion students at my school. I had just recently attended a workshop by Garfield Gini-Newman on Critical Thinking skills and was trying to approach my Geography curriculum through questions that would require my students to think critically.

My tests changed to open-ended essay questions in order to allow students to show me their own learning based on the investigations they undertook in class (which may have been different from the student sitting next to them). But I discovered that they were not very good at answering open-ended questions. They kept giving me short answers with no real explanation or justification, and often repeated the same idea multiple times in different words. In my search for a remedy, I discovered the 11-sentence paragraph, which looks like this:

Opening sentence
Paragraph 1 (3 sentences):
  • 1st point
  • Explanation w/ more detail
  • Example/quote/analysis
Paragraph 2 (3 sentences):
  • 2nd point
  • Explanation w/ more detail
  • Example/quote/analysis
Paragraph 3 (3 sentences):
  • 3rd point
  • Explanation w/ more detail
  • Example/quote/analysis
Concluding sentence

After working on this answer structure, my students’ marks increased significantly because it prompted them to really explain their thinking and justify it with specific examples from what we’d been learning.

I’ve also used a similar idea when prepping my students for the grade 10 Ontario literacy test (a graduation requirement – and not just the English teacher’s job to get our students ready!). For the short paragraph answers, I encourage them to write in this format:

Full sentence answer that contains the question within it. Because . . . . For example . . . .

In reading through the summative projects for one of my non-Math classes this past week, I realized that I did not spend enough time teaching them HOW to express their knowledge in order to earn a level 4 (the highest level of achievement). For example, in their exit interview, my students were asked the following question:

How have you used teamwork to become a better leader this year?

Many gave vague, sweeping statements such as “teamwork is really important because, without each of us helping, we couldn’t run the events that we do”. While true, this sort of answer doesn’t tell me much about their personal teamwork experience, nor does it explain how teamwork made them a better leader than working alone would have.

I should have taught them the 11-sentence paragraph. Or spent more time on the “answer … because … for example …” format (which we actually did cover in class). And my rubric should have reflected this. Here is the rubric (well, a checkbric really) that I used:

The descriptors are those given in the curriculum documents themselves. But I’m not sure they really tell the students what a level 4 answer entails. Not that you have to give away the answer to the question, but they should know what a “considerable” response entails in order to get a level 3. In math class I teach my students through exemplars; we assess solutions together as a class (moderated marking) so that we all have a clear understanding of what each level of achievement looks like.

Here is what I will use as a rubric next time:

So now there will be no question about what I am looking for in their answers.

I know it’s a busy time of year, with exams and summatives to mark as well as prepping our new classes for Monday, but I think it’s so important for us to also take time to reflect on what worked and what didn’t this past semester. So as I complete my marking, I write “next time …” notes to myself to remind me how I can improve based on what I’m observing now.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

No blank answers on tests, please!

Why did it take me 9 years of teaching to figure this out???  . . . It seems so simple now that I am doing it.

Maybe I only just noticed how big of a problem this is because I taught ALL applied level courses this year:

BLANK ANSWERS!

It’s like my students just gave up. On test problems I KNEW they could answer.

“But I’ve seen you do this in class!” I’d cry, “Why did you leave it blank?”.

And yet only now, during exam week, did I find a solution for this. And now that I’ve tried it, I can’t believe I didn’t think of it years earlier . . . it seems so simple:

“Do you have blank answers still? Then I’m not taking your exam. I’ll give you a hint if you like . . .”

And that’s how it went. As each student handed me their exam, I leafed through each page and if I saw any problem unanswered I would hand it back.

“But I have no clue how to do it!” they protested.

So I gave them hints. Showed them similar problems. Drew diagrams for them. I wrote each hint right on their test paper in pen to remind myself when I’m marking of just how much help they needed.

But every student answered every problem.

And that’s so important – because I can’t find evidence of learning when you’ve written nothing.

And the only thing I can’t figure out now is why it took me 9 years to start doing this?! Thanks to Mary Bourassa for the inspiration!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Evidence Records

Part 4 of 4 in a series of blog posts about Standards-Based Grading

No more MarkBook

We’ve arrived at the last piece in the A&E (assessment & evaluation) puzzle: the evidence record. This is where you record the levels each student received for the various expectations on each test or task. MarkBook (a marks-management software) no longer meets our needs, as it simply calculates an average based on a weighting of each test that is set out ahead of time and stays the same for every student. An average is not necessarily the best representation of what a student knows in a course. Growing Success describes how to determine a student’s final grade as follows:

“Determining a report card grade will involve teachers’ professional judgement and interpretation of evidence and should reflect the student’s most consistent level of achievement, with special consideration given to more recent evidence.”

Most consistent, more recent.

The idea is that if a student performs poorly at the beginning of a course but puts in the work needed to catch up and learn the material, they should be assigned a mark that reflects their ability and knowledge at the end of the course (despite the rough start). An average penalizes this student by always pulling their grade down due to the low marks early in the course.

For example, consider the marks that skydiving students received for packing parachutes:

All 3 students have the same average. But when asked “which student would you want to pack your parachute?” I believe we would all give the same answer. So clearly, an average does not always give us the most accurate picture of student achievement.
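To make the parachute idea concrete, here’s a minimal sketch in Python with made-up marks (not the actual numbers from the example above): three students share the same plain average, while a recency-weighted view, a rough stand-in for “most consistent, more recent”, tells them apart.

```python
# Hypothetical marks for three parachute-packing students (out of 10).
# All three sequences share the same plain average but tell very
# different stories about current ability.
marks = {
    "improving": [2, 4, 6, 8, 10],   # rough start, strong finish
    "declining": [10, 8, 6, 4, 2],   # strong start, fading
    "erratic":   [6, 10, 2, 6, 6],   # inconsistent throughout
}

def plain_average(scores):
    return sum(scores) / len(scores)

def recency_weighted(scores):
    # Weight later evidence more heavily (weights 1, 2, 3, ...),
    # so more recent marks count more toward the result.
    weights = range(1, len(scores) + 1)
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

for name, scores in marks.items():
    print(name, plain_average(scores), round(recency_weighted(scores), 1))
```

The linear weights are just one simple choice; the point is only that a plain average hides the trend that professional judgement would catch.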

Evidence Record Template

Below is an example of an evidence record for a math course. This is printed as one page per student, which does mean a binder full of these sheets to track each student’s grade and progress. Some teachers are not keen on using a paper-and-pencil tool. Electronic versions of evidence records are starting to be made: the OCDSB is piloting the online MaMa+ version this semester, and Bruce McLaurin over at Glebe C.I. has created some Excel spreadsheets that populate the marks into evidence records for you. However, I have found that trying evidence records for the first time with pencil and paper gives you a much better feel for how they work & what they represent. By handwriting a student’s achievement onto this paper record, I notice trends in the student’s marks much better than if I were relying on a program to create the evidence record from my class list of marks.

Math evidence record

Update (2014.12.04): This fall, OCDSB teachers have access to the MaMa (Marks Manager) software online which allows us to keep electronic evidence records.
Here are some screenshots of what it looks like.
Inputting test marks by class list:
MaMa then automatically populates each student’s evidence record with the marks from the test:

How to use evidence records

So a completed evidence record might look something like the following:

Evidence Record U

We can see, for example, the student’s quiz marks (code Q) and test marks (code T). Each code may appear in more than one row because a test can evaluate multiple expectations. The number attached to each code indicates the chronological order of the tests: T1 = Test 1, T2 = Test 2, and so on. The exam mark is labeled with code E in the grey row above each strand. Notice the exam receives 3 separate marks, one for each strand of the course. The summative, code ST, is recorded in its own row below the term work.
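For readers who think in code, here’s a minimal sketch of an evidence record as a data structure. The expectations, codes, and levels are made up for illustration, and the helper is a hypothetical heuristic only, not a replacement for the teacher’s professional judgement:

```python
# A sketch of an evidence record as a data structure. Each expectation
# holds a chronological list of (task code, level) pairs, using the
# codes described above (Q = quiz, T = test, E = exam, ST = summative).
evidence_record = {
    "Expectation 1 (linear relations)": [("Q1", 2), ("T1", 2), ("T2", 3), ("E", 3)],
    "Expectation 2 (measurement)": [("Q2", 1), ("T1", 2), ("T2", 2), ("ST", 3)],
}

def most_recent_consistent(entries, window=3):
    # A rough heuristic: among the last few pieces of evidence, take
    # the level that appears most often. The real call is always the
    # teacher's professional judgement.
    recent = [level for _, level in entries[-window:]]
    return max(set(recent), key=recent.count)

for expectation, entries in evidence_record.items():
    print(expectation, "->", most_recent_consistent(entries))
```

Even this toy version shows why a flat average misleads: the first expectation ends at a solid level 3 despite two early level 2s.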

What final level of achievement would you assign to the student with the above evidence record?

Here’s a video clip of the type of discussion one has with colleagues as we learn to interpret these evidence records, and later when we collaborate to interpret evidence that is inconsistent and therefore difficult to assign a final grade. Note that they are using a different evidence record in the video than the one I showed above:

This is the type of moderation I have teachers try when I offer workshops to my colleagues about this new A&E. I find it really useful to hear my colleagues interpret the evidence presented, and comforting to see how consistent we usually are in determining a final grade.

Levels VS Percents

Unfortunately, at this time, the Ministry of Education still requires us to convert each student’s final overall level of achievement into a percent for the report card. The OCDSB has provided a “peg chart” in order to do so:

It’s not ideal to be converting the levels back to percents in the old style, but it will do until the Ministry changes report cards to match the levels of achievement from their curriculum documents.
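As a sketch of what such a level-to-percent conversion looks like in code: the percents below are NOT the actual OCDSB peg chart values (they are illustrative picks only), though they stay inside the Ontario achievement-chart ranges (level 1: 50–59, level 2: 60–69, level 3: 70–79, level 4: 80–100).

```python
# Illustrative peg table only -- not the real OCDSB chart. Each level
# (with +/- sublevels) maps to a single representative percent.
PEG = {
    "4+": 95, "4": 87, "4-": 80,
    "3+": 78, "3": 75, "3-": 70,
    "2+": 68, "2": 65, "2-": 60,
    "1+": 58, "1": 55, "1-": 50,
}

def level_to_percent(level: str) -> int:
    # Look up the report-card percent pegged to an overall level.
    return PEG[level]

print(level_to_percent("3+"))  # prints 78 under this illustrative table
```

Any real conversion should of course use the board’s published chart.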

Here are a couple more completed evidence records to look at, for practice determining an overall final grade. Note that MT stands for Makeup Test.

evidence record Z
… and another …

evidence record E

The more familiar we become with this template, and the more conversations we have with our colleagues, the more comfortable we become with using our professional judgement to determine a student’s final grade. This professional judgement replaces our reliance on software-calculated averages that feel accurate, but that I don’t think are any more accurate than my own well-informed professional judgement (but that’s a debate for another blog post!).

The End! But it’s only the beginning . . . 

So here we are at the end of my blog series about the new A&E in the OCDSB. Really, it’s only the beginning: this process will evolve as we implement it in our classrooms. We’re learning how to best use these templates to inform not only our evaluation practices, but our instructional practices as well.

How are you feeling about the new A&E?

Start a conversation with your colleagues in your school or in your department.

Update (2014.12.04): Here is what the OCDSB published for parents this fall on the topic of their assessment & evaluation policy:
Parent Guide to Assessment, Evaluation and Reporting – Grades 9-12

I would love to hear your comments, questions, concerns . . . Leave a comment below or get in touch via Twitter!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)