Reflecting on our First Test

My grade 10 applied class this year has students with some serious gaps in their math knowledge and skills. We had our first test last week (which is late – about 5 weeks in – thanks to too many interruptions to class so far: assemblies, etc.). For the first time I tried Howie Hua’s strategy with my class:

I asked my Tweeps if they do VRG (visibly random groupings) for this or let students choose. Almost everyone said they let students choose. I may try VRG next time, as there were a couple of students who didn’t get up to talk to anyone. I’ll be asking the class for feedback today about whether they thought it helped them, and how.

Unfortunately, on test day an assembly ran long that morning and took 10 minutes away from my period. A number of students had trouble finishing. I struggle with that b/c I think many of them want more time, but simply spend it staring at the page rather than productively solving. This class has more ELLs than usual though, and in the past when that’s been the case & I have slower test takers, I have made shorter, more frequent tests.

So normally I test every 2 to 3 weeks, once we’ve done activities & practice that cover 4 or 5 of the 9 overall expectations for the course. The test is then 2 pages, double-sided; each side of a page covers 1 overall expectation (usually one or two problem-solving tasks). In the past I’ve changed that to testing every 1 to 1.5 weeks on 2 of the 9 expectations instead. I think that’s what I’ll need to do here, so that if a student needs more time they can have it within that class period.

I haven’t yet returned their marked tests (I put feedback only on the test; they receive their grade separately a day later on their evidence record via email, since research shows that mark + feedback results in students caring only about the mark, not the feedback). Yesterday I sketched on the board the same triangle-based prism they’d had in a Toblerone-bar question on the test, but with different dimensions. I asked them to find surface area & volume (the dimensions were such that they needed the Pythagorean Theorem to find the height of the triangular base). Most groups took almost the entire period to solve this!!! One group never got beyond the Pythagorean Theorem part. I ran around like a chicken with my head cut off trying to facilitate, correct misconceptions, etc.
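If you want to try a similar prism problem yourself, here’s a quick sketch of the calculation in Python, with made-up dimensions (not the ones from my board sketch): an isosceles triangular base, Toblerone-style, where the triangle’s height has to come from the Pythagorean Theorem.

```python
import math

# Hypothetical dimensions (not the ones from the board):
# an isosceles triangular base with two 5 cm sides and a 6 cm base,
# on a prism 12 cm long.
side, base, length = 5.0, 6.0, 12.0

# The height of the triangular base isn't given; dropping it from the
# apex splits the base into two right triangles, so Pythagoras gives:
tri_height = math.sqrt(side**2 - (base / 2)**2)  # sqrt(25 - 9) = 4 cm

# Volume = (area of triangular base) x (length of the prism)
base_area = 0.5 * base * tri_height              # 12 cm^2
volume = base_area * length                      # 144 cm^3

# Surface area = 2 triangular ends + 3 rectangular faces
surface_area = 2 * base_area + (2 * side + base) * length  # 216 cm^2

print(tri_height, volume, surface_area)
```

The Pythagorean step is exactly where one of my groups stalled: the height isn’t printed on the diagram, so students have to see the right triangle hiding inside the isosceles base.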

As an aside: a colleague came by to watch (he said he’d been meaning to for a while now) and I had to ask him not to write on the students’ boards or tell them how to do the next step. It reminded me how hard it is to teach other teachers the skill of not always telling students the answers, but instead asking questions that help them figure it out for themselves. He said, “but they’re nodding so they understand what I’m showing them”. I explained I want them doing the math, not him. I asked him to talk with them but not to do the math for them.

I also got a short video of the groups getting started on the problem if you’re interested:

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

#LearningInTheLoo: actionable feedback strategies

Inspired by this tweet …

I asked my PLN to share their strategies for getting students to take action on the feedback we leave them on their work:

Their responses are compiled in my latest edition of Learning in the Loo:

Learning in the Loo

The archive of my past editions can be found here in case you want to put some up in the bathrooms of your school too!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Captive Audience: #LearningInTheLoo

Do you ever read a great article or blog post and think I HAVE to share this with my colleagues! So you email everybody the link & say you have to read this. And then maybe 1 or 2 people actually read it?

I find so many great things on Twitter & blogs (#MTBoS) that I want to share with my colleagues, but they often don’t have (or make) the time to check them out. So when I happened upon a tweet about Learning in the Loo I thought it was genius – a captive audience!

So I have made it a habit this semester to create & post a new Learning in the Loo 11×17″ poster in each staff toilet in our school every 1 or 2 weeks. I curate the amazing things I learn about online & turn them into quick-read how-tos or ideas to read while you … “go”. It just occurred to me that I should have been posting them to my blog as I made them. But now you can get a whole whack of them at once, and next year I’ll try to remember to post them as I make them.

The whole collection so far can be found here with printing instructions.
Feel free to make a copy (File –> make a copy). Also the sources of images & ideas are in the notes of the doc above too.

Here they are:

Learning in the Loo Assessment FeedbackLearning in the Loo Cell Phone Work Life BalanceLearning in the Loo EdPuzzleLearning in the Loo Adobe Spark VideoLearning in the Loo TwitterLearning in the Loo Google ClassroomLearning in the Loo Grouping StrategiesLearning in the Loo KahootLearning in the Loo Google Docs

What would you share in your school’s first Learning In The Loo poster?

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

OAME sketchnotes

At the start of May I attended the OAME conference in Barrie. This was my 2nd year attending. I was disappointed to have my session cut due to low enrollment 5 weeks before registration closed, but c’est la vie! Next year in Kingston I have an idea of how to better “sell” my session in the description. Fingers crossed to not get the final session block on the Saturday either – that drags your numbers down for sure.

The food was the definite low point of the trip. Georgian College offered a poor continental breakfast in the residence and OAME provided all vegetarians with gluten free bread that wasn’t suited for human consumption. Let’s hope the Kingston organizers manage something a notch above.

I thought I would share some sketchnotes I made to summarize what I learned. Let’s start with the Ignite sessions, which are my highlight of the conference each year. Ignite speakers get 20 slides that auto-advance every 15 seconds – 5 brief minutes to try & get a strong message across.

OAME Ignite 2016 Part 1

OAME Ignite 2016 Part 2

I was pretty active on the Twitter feed for the conference as well:

Lastly, I usually make an effort to seek out OAME sessions by teachers I can’t see or work with at home, but my colleague Lynn Pacarynuk‘s session on test design & assessment made me think harder about my own practices. So much so that I summarized some of her ideas in 2 different sketchnotes:

OAME Test Design Process – Lynn Pacarynuk

OAME Shifts in Assessment & Test Design – Lynn Pacarynuk

Until next year, OAME!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

What Does Level 4 Mean? Making better rubrics.

A couple of years ago I was teaching Géographie to the French Immersion students at my school. I had just recently attended a workshop by Garfield Gini-Newman on Critical Thinking skills and was trying to approach my Geography curriculum through questions that would require my students to think critically.

My tests changed to open-ended essay questions in order to allow students to show me their own learning based on the investigations they undertook in class (which may have been different from the student sitting next to them). But I discovered that they were not very good at answering open-ended questions. They kept giving me short answers, with no real explanation or justification and often repeated the same idea multiple times using different words. In my search for a remedy, I discovered the 11-sentence paragraph which looks like this:

Opening sentence
Paragraph 1 (3 sentences):
  • 1st point
  • Explanation w/ more detail
Paragraph 2 (3 sentences):
  • 2nd point
  • Explanation w/ more detail
Paragraph 3 (3 sentences):
  • 3rd point
  • Explanation w/ more detail
Concluding sentence

After working on this answer structure, my students’ marks increased significantly because it prompted them to really explain their thinking and justify it with specific examples from what we’d been learning.

I’ve also used a similar type of idea when prepping my students for the grade 10 Ontario Secondary School Literacy Test (a graduation requirement; and it’s not just the English teacher’s job to get our students ready!). For the short paragraph answers, I encourage them to write their answer in this format:

Full sentence answer that contains the question within it. Because . . . . For example . . . .

In reading through the summative projects for one of my non-Math classes this past week, I realized that I did not spend enough time teaching them HOW to express their knowledge in order to earn a level 4 (the highest level of achievement). For example, in their exit interview, my students were asked the following question:

How have you used teamwork to become a better leader this year?

Many gave vague, broad-sweeping statements such as “teamwork is really important because, without each of us helping, we couldn’t run the events that we do”. While true, this sort of answer doesn’t tell me much about their personal teamwork experience, nor does it explain how teamwork made them a better leader than if they had done it alone.

I should have taught them the 11-sentence paragraph. Or spent more time on the “answer … because … for example …” format (we actually did cover that one in class). My rubric should have reflected this, too. Here is the rubric (well, a checkbric really) that I used:

The descriptors are those given in the curriculum documents themselves. But I’m not sure it really tells the students what a level 4 answer entails. Not that you have to give away the answer to the question, but they should know what a “considerable” response entails in order to get a level 3. In math class I teach my students through exemplars; we assess solutions together (moderated marking) as a class so that we all have a clear understanding of what each level of achievement looks like.

Here is what I will use as a rubric next time:

So now there will be no question about what I am looking for in their answers.

I know it’s a busy time of year, with exams and summatives to mark as well as prepping our new classes for Monday, but I think it’s so important for us to also take time to reflect on what worked and what didn’t this past semester. So as I complete my marking, I also write “next time …” notes to myself to remind me how I can improve based on what I’m observing now.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

No blank answers on tests, please!

Why did it take me 9 years of teaching to figure this out???  . . . It seems so simple now that I am doing it.

Maybe I only just noticed how big of a problem this is because I taught ALL applied level courses this year:


It’s like my students just gave up. On test problems I KNEW they could answer.

“But I’ve seen you do this in class!” I’d cry. “Why did you leave it blank?”

And yet only now, during exam week, did I find a solution for this. And now that I’ve tried it, I can’t believe I didn’t think of it years earlier . . . it seems so simple:

“Do you have blank answers still? Then I’m not taking your exam. I’ll give you a hint if you like . . .”

And that’s how it went. As each student handed me their exam, I leafed through each page and if I saw any problem unanswered I would hand it back.

“But I have no clue how to do it!” they protested.

So I gave them hints. Showed them similar problems. Drew diagrams for them. I wrote each hint right on their test paper in pen to remind myself when I’m marking of just how much help they needed.

But every student answered every problem.

And that’s so important – because I can’t find evidence of learning when you’ve written nothing.

And the only thing I can’t figure out now is why it took me 9 years to start doing this?! Thanks to Mary Bourassa for the inspiration!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Evidence Records

Part 4 of 4 in a series of blog posts about Standards-Based Grading

No more MarkBook

We’ve arrived at the last piece in the A&E puzzle: the evidence record. This is where you will record the levels each student received for the various expectations on each test/task. MarkBook (a marks-management software) no longer meets our needs, as it simply calculates an average based on a weighting of each test that is set out ahead of time and remains the same for every student. An average is not necessarily the best representation of what a student knows in a course. Growing Success describes how to determine a student’s final grade as such:

“Determining a report card grade will involve teachers’ professional judgement and interpretation of evidence and should reflect the student’s most consistent level of achievement, with special consideration given to more recent evidence.”

Most consistent, more recent.

The idea is that if a student performs poorly at the beginning of a course but puts in the work needed to catch up and learn the material, they should be assigned a mark that reflects their ability and knowledge at the end of the course (despite the rough start). An average penalizes this student, always pulling their grade down due to the low marks early in the course.

For example, let’s look at the marks that skydiving students received for packing parachutes:

All 3 students have the same average. But when asked “which student would you want to pack your parachute?” I believe we would all have the same answer. So clearly, an average does not always give us the most accurate picture of student achievement.
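To make the arithmetic concrete, here’s a small Python sketch with invented marks (the actual numbers from the parachute image aren’t reproduced here): three students share the same average, but a weighting that favours more recent evidence, in the spirit of Growing Success, tells them apart.

```python
# Hypothetical parachute-packing marks out of 12, in chronological order
# (invented for illustration; all three students average 9).
marks = {
    "Student A": [9, 9, 9, 9, 9],    # consistent throughout
    "Student B": [5, 7, 9, 12, 12],  # rough start, strong finish
    "Student C": [12, 12, 9, 7, 5],  # strong start, declining
}

for name, scores in marks.items():
    average = sum(scores) / len(scores)
    # "Most consistent, more recent": weight later evidence more heavily
    # (weights 1..n are just one simple way to model that emphasis).
    weights = range(1, len(scores) + 1)
    recent = sum(w * s for w, s in zip(weights, scores)) / sum(weights)
    print(f"{name}: average {average}, recency-weighted {recent:.1f}")
```

The weighting here is only a model; Growing Success calls for professional judgement, not a formula. The point is that the plain average hides exactly the trend we care about.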

Evidence Record Template

Below is an example of an evidence record for a math course. This would be printed out as one page per student. It does mean a binder full of these sheets in order to track each student’s grade and progress, and some teachers are not keen on a paper-and-pencil tool. Electronic versions of evidence records are starting to be made: the OCDSB is piloting the online MaMa+ version this semester, and Bruce McLaurin over at Glebe C.I. has created some Excel spreadsheets that populate the marks into evidence records for you. However, I have found that trying evidence records for the first time with pencil and paper gives you a much better feel for how they work & what they represent. By handwriting a student’s achievement onto this paper record, I notice trends in the marks much better than if I were relying on a program to create the evidence record from my class list of marks.

Math evidence record

Update (2014.12.04): This fall, OCDSB teachers have access to the MaMa (Marks Manager) software online which allows us to keep electronic evidence records.
Here are some screenshots of what it looks like.
Inputting test marks by class list:
MaMaMarkEntry
MaMa then automatically populates each student’s evidence record with the marks from the test:
MaMaEvidenceRecord

How to use evidence records

Here’s a video clip explaining how to take the levels of achievement earned on the test and record that on your evidence record.
(skip to 2:50 for evidence records)

So a completed evidence record might look something like the following:

Evidence Record U

We can see, for example, the student’s quiz marks (code Q) and test marks (code T). Each code may appear in more than one row because a test can evaluate multiple expectations. The number attached to each code indicates chronological order: T1 = Test 1 and T2 = Test 2. The exam mark is labelled with code E in the grey row above each strand. Notice the exam receives 3 separate marks, one for each strand of the course. The summative task, code ST, is recorded in its own row below the term work.

What final level of achievement would you assign to the student with the above evidence record?

Here’s a video clip of the type of discussion one has with colleagues as we learn to interpret these evidence records, and later as we collaborate to interpret evidence that is inconsistent and therefore difficult to assign a final grade to. Note that they are using a different evidence record in the video than the one I showed above:

This is the type of moderation I have teachers try when I offer workshops to my colleagues about this new A&E. I find it really useful to hear my colleagues interpret the evidence presented, and comforting to see how consistent we usually are in determining a final grade.
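If it helps to see the shape of the data, an evidence record is essentially a list of levels per overall expectation, in chronological order. Here’s a minimal sketch in Python, with invented expectation names, codes, and levels:

```python
# A hypothetical evidence record for one student. Each overall
# expectation collects (code, level) pairs in chronological order:
# Q = quiz, T = test, E = exam, ST = summative task.
evidence = {
    "Solve linear systems":   [("Q1", 2), ("T1", 2), ("T2", 3), ("E", 3)],
    "Model linear relations": [("T1", 1), ("T2", 3), ("E", 3), ("ST", 4)],
    "Analytic geometry":      [("T2", 2), ("T3", 2), ("E", 2)],
}

for expectation, entries in evidence.items():
    levels = [level for _, level in entries]
    # "Most consistent, more recent" is professional judgement, not a
    # formula, but listing the levels in order makes the trend visible.
    print(f"{expectation}: {levels}")
```

A teacher reading the second row would likely judge a level 3, maybe higher, on the strength of the recent evidence, even though an average would be dragged down by the early 1.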

Levels VS Percents

Unfortunately, at this time, the Ministry of Education still requires us to convert each student’s final overall level of achievement into a percent for the report card. The OCDSB has provided the following “peg chart” in order to do so:
PegChartOCDSB

It’s not ideal to be converting the levels back to percents in the old style, but it will do until the Ministry changes report cards to match the levels of achievement from their curriculum documents.
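For those curious what the conversion looks like in practice, a peg chart is essentially a lookup table. The percents below are invented for illustration (the actual OCDSB peg values aren’t reproduced here); they simply stay inside Ontario’s usual bands (level 1 = 50s, level 2 = 60s, level 3 = 70s, level 4 = 80–100).

```python
# Illustrative level-to-percent pegs (NOT the official OCDSB values).
PEGS = {
    "4+": 95, "4": 87, "4-": 82,
    "3+": 78, "3": 75, "3-": 71,
    "2+": 68, "2": 65, "2-": 61,
    "1+": 58, "1": 55, "1-": 51,
}

def report_card_percent(final_level: str) -> int:
    """Convert a final overall level of achievement to a report card percent."""
    return PEGS[final_level]

print(report_card_percent("3+"))  # 78 with these illustrative pegs
```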

Here are a couple more completed evidence records to look at and practice determining an overall final grade. Note that MT is Makeup Test.

evidence record Z

. . . and another . . .

evidence record E

The more familiar we become with this template, and the more conversations we have with our colleagues, the more comfortable we become with using our professional judgement to determine a student’s final grade. That judgement replaces our reliance on software-calculated averages, which feel accurate but which I don’t think are any more accurate than my own well-informed professional judgement (but that’s a conversation / debate for another blog post!).

The End! But it’s only the beginning . . . 

So here we are at the end of my blog series about the new A&E in the OCDSB. It’s really only the beginning; this process will evolve as we implement it in our classrooms. We’re learning how to best use these templates in order to inform not only our evaluation practices, but our instructional practices as well.

How are you feeling about the new A&E?

Start a conversation with your colleagues in your school or in your department.

Update (2014.12.04): Here is what the OCDSB published for parents this fall on the topic of their assessment & evaluation policy:
Parent Guide to Assessment, Evaluation and Reporting – Grades 9-12

I would love to hear your comments, questions, concerns . . . Leave a comment below or get in touch via Twitter!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Tests/Tasks & their Rubrics

Part 3 of 4 in a series of blog posts about Standards-Based Grading.

Making a test, task, or project

The first step is to decide which overall expectation(s) [OEs] you will evaluate with the current task. You may evaluate more than one OE at a time. Then you’ll need to design a task or a set of test questions that allow your students to demonstrate their ability or knowledge for that skill or content. I find it helpful to organize my test so that all of the questions for a given OE are together (as opposed to grouping them by category – K/U, T, C, A – as we used to do). For example, in the grade 9 math test below, each page has questions for one of the 4 OEs being evaluated. Additionally, I have organized each page so that the simpler problems (more K/U or level 1-2 type questions) are near the top of the page, and the more complex problems (more Application or level 3-4 type questions) are at the bottom.

mfm1P testa

mfm1P testb


PDF version of above Math test: Test 3 no answers

“Is it necessary to make a new, different rubric for each and every test or task?”

No, it is not. In fact, you could simply print out the achievement chart rubric from your curriculum document and attach that to your task or test.

achievement chart

However, I find that rubric too wordy (the students don’t bother reading it) and sometimes too vague for a specific project.

“Are we simply getting rid of the categories; K/U, T, C, A? Why include them in the assessment plan if we’re not using them to organize our tests?”

We are not getting rid of the categories. We will embed them into our test questions and tasks, and often even use them to build our rubrics. For example, for the math test above, I used three of the categories from the achievement chart to build my rubric for the test. Notice also that I repeat the rubric 4 times, once for each of the OEs being evaluated on that test:

Math rubric

I check off the appropriate level for each of the 3 categories, which then allows me to determine an overall level for that OE. I do this 4 times, once for each OE (which also happens to be once per page of the test, since each page corresponds to a separate OE). At the end, I return the test & rubric to the student with the 4 levels. There is no “overall average” on the test; the student attempted 4 different skills/expectations and so receives 4 separate levels of achievement.

How do we record these levels in our mark book? For that, we have Evidence Records.

More examples of tests/tasks w/ rubrics:

Gr. 9 geography test (using checkbrics): 3 test human geo

Do you have a test w/ a rubric you’d be willing to share here for colleagues in your subject area to see?
Get in touch:

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

The Assessment Plan

Part 2 of 4 in a series of blog posts about Standards-Based Grading.

We all learn about backwards planning in teachers’ college: for a given unit, you start by creating the unit test (deciding which skills & topics are most important to evaluate) and then build learning & assessment activities that lead appropriately towards that test. And this makes all the sense in the world.

But then the real world of teaching smacks you in the face; three new courses to prep, very little free time what with all the extra-curriculars, etc. you’re helping with . . . all of a sudden it’s all you can do to prep something for the next day’s lessons, let alone backwards plan your entire unit for each of your courses.

The Assessment Plan (AP) brings us back to this idea of backwards planning. We use the AP to plan the number of evaluation tasks & tests we will have in a semester and to outline which of the overall expectations (OEs) and achievement chart categories (K/U, App, T&I, Comm) each test covers. The intent is to have a variety of task types that cover a variety of OEs and categories. If your AP is a good one, there should be no gaps in the OEs evaluated or in the categories they incorporate.

Let’s have a look at a completed AP for a course, in this case a math course:
AP math

For a closer look, here is a pdf version: MPM2D Assessment Plan 2012 rev

The AP is broken into two sections of columns: the first section lists all of the overall expectations, divided into their strands, while the second lists the category expectations (K/U, App, T&I, Comm). Each row represents an evaluation task or test. The left-most column is where you write the title of your evaluation tool (e.g. “Quiz 1 – solve by graphing”). There are checkmarks for each of the OEs being evaluated by that tool, and also for each of the category expectations built into the tool.

For example, let’s look at “Test 3” in the above AP. The title of the test is written in the first column along with the code “T3”, which will be used to record marks on each student’s evidence record. Then there are symbols, in this case *, marking which of the OEs are covered on the test. So Test 3 covers 3 of the 4 overall expectations in the first strand; we can see the 3 *s under those OEs. We can also see that this test incorporates all of the Knowledge & Understanding (K/U) and Application category expectations, as well as some of the Thinking and Communication expectations.

What we can easily see is that each of the overall expectations is evaluated more than once over the course of the semester and that all of the category expectations have been met across the span of the course as well.

Here’s a video by the OCDSB that might help illustrate how the assessment plan fits into the overall framework:

Homework: Here is my suggestion for the best way to practice using an assessment plan before next September.

  1. Pick one course that you are currently teaching and feel most comfortable in.
  2. Get a copy (electronic or printed) of the AP for that course. You can find the AP for any course you teach on OCDSB’s Desire to Learn platform:
    Go to the Desire to Learn site and sign in using the same username & password you use to sign on to any OCDSB computer.
    d2L AP
    Under “Teacher Resources” on the right hand side of the page, click on “Secondary Assessment Plan templates by Subject”.
    From there you will click on the folder for your subject area and click on the AP for your course.
    The AP will open in a viewing window.
    Click on the download button in the bottom right-hand corner to save the file to your own computer – and from there you can open the file, edit the file, and print a copy.
    I suggest you print out a copy.
  3. Fill it out with the quizzes, tests, & tasks that you have been using so far this semester for that course; anything for which you recorded a mark in your mark-book. Start by listing the tests/tasks in chronological order down that left-hand column.
  4. Next, check off the OEs that it touched on as well as the categories that it touched on. If you need a refresher on the OEs or categories for your course have a read through your curriculum documents. I can’t emphasize enough how you need to be intimately familiar with your curriculum documents.
  5. Finally, have a look at your AP as it stands & take a moment to reflect:
    • Have you checked off each of the overall expectations at least once?
      • If not, which ones are missing? What sort of test/task could you add in order to evaluate them?
      • If so, have you provided multiple opportunities for each overall expectation? How can you build your tests/tasks so that students have more than one chance to provide you with evidence of their learning on a particular OE?
    • Have you checked off each of the achievement chart categories at least once?
      • If not, which ones are missing? How could you modify/amend your tests/tasks to incorporate them all? Alternatively, what sort of test/task could you add in order to touch on those that have been missed?

Don’t panic if you notice big gaps in your assessment plan’s checkmarks. This is the time to notice the gaps and work to fill them in. The AP is a handy tool to help you ensure that you have a well-balanced set of evaluation tools, which will allow your students to provide the best evidence possible of their learning across the expectations of your course.
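If you keep your AP electronically, the gap check in steps 4 & 5 of the homework above can even be automated. Here’s a rough Python sketch with invented task names, OE codes, and categories:

```python
# A sketch of the assessment plan as data (task names, OE codes and
# categories below are invented for illustration): each evaluation
# tool records which overall expectations and categories it covers.
plan = {
    "Quiz 1 - solve by graphing": {"oes": {"A1"},             "cats": {"K/U"}},
    "Test 1":                     {"oes": {"A1", "A2"},       "cats": {"K/U", "App"}},
    "Test 3":                     {"oes": {"A1", "A2", "A3"},
                                   "cats": {"K/U", "App", "T&I", "Comm"}},
}

ALL_OES = {"A1", "A2", "A3", "A4"}
ALL_CATS = {"K/U", "App", "T&I", "Comm"}

covered_oes = set().union(*(t["oes"] for t in plan.values()))
covered_cats = set().union(*(t["cats"] for t in plan.values()))

print("OEs never evaluated:", ALL_OES - covered_oes)
print("Categories never touched:", ALL_CATS - covered_cats)

# OEs evaluated only once (no second chance to show learning):
counts = {oe: sum(oe in t["oes"] for t in plan.values()) for oe in covered_oes}
print("Single-opportunity OEs:", {oe for oe, n in counts.items() if n == 1})
```

With this toy plan the check flags A4 as never evaluated and A3 as evaluated only once – exactly the kind of gap the reflection questions ask you to notice.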

Next up: Creating tests/tasks & the rubrics to go with them.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Standards-Based Grading: the Framework

Part 1 of 4 in a series of blog posts about Standards-Based Grading

I have spent several years shifting into a Standards-Based Grading (SBG) model of assessment & evaluation (A&E), gradually adopting what I consider to be the best practices of colleagues within my school board, of teachers from other boards I’ve met at educational conferences, and of teachers in the educational blogosphere.

Many OCDSB teachers, however, have simply heard that the school board’s A&E policies & documentation are changing, effective September 2014. I’ve heard many express anxiety over what they perceive to be a lack of training with these new documents. The school board recently put in place some online training modules, but it is also hoping that the teachers who have been implementing this type of A&E over the last few years will lead a bottom-up approach to getting their colleagues on board.

What we were doing before:
Teachers taught their courses by unit. Sometimes those units came from the curriculum documents; sometimes they came from the textbook, which divides the course into chapters. We taught each unit and, at its end, assigned a test on that unit or chapter. Those tests were divided into 4 sections of problems/questions: Knowledge/Understanding, Application, Thinking, & Communication (the achievement chart categories, found at the beginning of each curriculum document).

achievement chart

Students received 4 different marks, one for each category. Those marks were then input, by “bin”, into grading software like MarkBook, which would weight the categories the way we wanted and calculate an average for us.
Marks were collected & recorded by unit/chapter and by the 4 categories.

What we’re doing now:
We now use the overall expectations in our curriculum documents to divide up our teaching and our evaluation. We test whether or not a student is proficient at a certain curriculum expectation, and we evaluate that proficiency using levels (R, 1, 2, 3, 4).
OE vs SE

The overall expectations (OEs) are what we need to evaluate or test. The specific expectations (SEs) are what we need to teach. Of course we will evaluate some of the SEs since they make up the OEs, but we do not need to test students on every single SE.
“All curriculum expectations must be accounted for in instruction, but evaluation focuses on students’ achievement of the overall expectations. A student’s achievement of the overall expectations is evaluated on the basis of his or her achievement of related specific expectations (including the process expectations). The overall expectations are broad in nature, and the specific expectations define the particular content or scope of the knowledge and skills referred to in the overall expectations. Teachers will use their professional judgement to determine which specific expectations should be used to evaluate achievement of the overall expectations, and which ones will be covered in instruction and assessment (e.g., through direct observation) but not necessarily evaluated.” – Gr. 9-10 Mathematics Curriculum in Ontario

Most curricula have anywhere from 9 to 12 overall expectations. These OEs are the “standards” against which you will evaluate your students. For example, in my mathematics curriculum I might evaluate my students’ ability to meet the expectation of “solving a linear equation”.

One thing I love about this is that it forces us, as teachers, to really become familiar with our curriculum documents. This is in contrast to a reliance on textbooks, which are created by companies for profit and are not necessarily well-matched to the curriculum (but that’s a conversation for another day).

Another thing that is so great about SBG is the ability to pinpoint the topics/skills (by OE) that a student is strong in or in which they need to improve. Even better is when you have the students track their progress too so that they always know what areas they need to work on. More to come on this when we talk about Evidence Records later on.

The placemat:

The OCDSB has created what they call “the placemat” which is meant to give an overview of the documents we will be using to support this shift in our assessment & evaluation practices:


placemat (in pdf)

Get reading:

The best way to really get a good grip on the framework and philosophy behind SBG is to read a lot about it. That’s what I did to really understand what we were trying to do beyond the A&E documents the school board is focusing on (Assessment Plan & Evidence Record).

Here’s a reading list about standards-based grading … jump in!
Daniel Schneider
Frank Noschese
Sam Shah
Jason Buell
Shawn Cornally
Dan Meyer
Jim Pai
#SBGchat on Twitter

Stay tuned for part 2: The Assessment Plan.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)