Captive Audience: #LearningInTheLoo

Do you ever read a great article or blog post and think, "I HAVE to share this with my colleagues!"? So you email everybody the link & say, "You have to read this." And then maybe 1 or 2 people actually read it?

I find so many great things on Twitter & blogs (#MTBoS) that I want to share with my colleagues, but they often don’t have (or make) the time to check them out. So when I happened upon a tweet about Learning in the Loo I thought it was genius – a captive audience!

So I have made it a habit this semester to create & post a new Learning in the Loo 11×17″ poster in each staff toilet in our school. I curate the amazing things I learn about online & turn them into quick-read how-tos or ideas to read while you … "go". It just occurred to me that I should have been posting them to my blog as I made them. But now you can get a whole whack of them at once, and next year I'll try to remember to post them as I make them.

The whole collection so far can be found here with printing instructions.
Feel free to make a copy (File -> Make a copy). The sources of images & ideas are also in the notes of that doc.

Here they are:

  • Learning in the Loo: Assessment Feedback
  • Learning in the Loo: Cell Phone Work Life Balance
  • Learning in the Loo: EdPuzzle
  • Learning in the Loo: Adobe Spark Video
  • Learning in the Loo: Twitter
  • Learning in the Loo: Google Classroom
  • Learning in the Loo: Grouping Strategies
  • Learning in the Loo: Kahoot
  • Learning in the Loo: Google Docs

What would you share in your school’s first Learning In The Loo poster?

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

OAME sketchnotes

At the start of May I attended the OAME conference in Barrie. This was my 2nd year attending. I was disappointed that my session was cut due to low enrollment 5 weeks before registration even closed, but c'est la vie! Next year in Kingston I have an idea of how to better "sell" my session in the description. Fingers crossed that I don't get the final session block on the Saturday either – that drags your numbers down for sure.

The food was the definite low point of the trip. Georgian College offered a poor continental breakfast in the residence, and OAME provided all vegetarians with gluten-free bread that wasn't suited for human consumption. Let's hope the Kingston organizers manage something a notch above.

I thought I would share some sketchnotes I made to summarize what I learned. Let's start with the Ignite sessions, which I think are my highlight of the conference each year. Ignite speakers get 20 slides that auto-advance every 15 seconds, giving them 5 brief minutes to try & get a strong message across.

OAME Ignite 2016 Part 1

OAME Ignite 2016 Part 2

I was pretty active on the Twitter feed for the conference as well.

Lastly, I usually make an effort to seek out OAME sessions by teachers I can't see or work with at home, but my colleague Lynn Pacarynuk's session on test design & assessment made me think harder about my own practices – so much so that I summarized some of her ideas in 2 different sketchnotes:

OAME Test Design Process Lynn Pacarynuk

OAME Shifts in Assessment & Test Design Lynn Pacarynuk

Until next year, OAME!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Khan Academy in the Math Classroom


This year I decided to try & integrate Khan Academy (KA) into my Math classes. This semester I have 2 sections of grade 10 applied math. Past experience has shown me that many of these students come in with significant skill gaps & little to no willingness to do any homework. On the 2nd day of school I took my classes to the computer lab & had them all sign up for KA accounts & add me as their “coach”.

When students sign up, KA assesses what they know by having them answer a variety of questions, after which it tailors the practice problems to their level. I can also recommend practice problem sets as their coach, which my students will see at the top of their list of exercises to work on.

Here’s how we’ve been using KA so far:

Lesson Plan for Supply Teachers:

I am out of the classroom a lot. This semester I am teaching at the University of Ottawa each Monday (and therefore not in my high-school classroom that day), I'm a member of the digital learner's advisory panel, which meets a few times throughout the year, and I'm mentoring a new teacher, which comes with meetings & workshops to attend … the list goes on. I am often away for 2 of the 5 school days in a week. Now when I am planning for supply teachers, I more often than not book us a computer lab, a mobile cart of Chromebooks, or a set of iPads and TA-DAAAAA; lesson planned! Students work on the recommended problem sets and the supply teacher can help students as needed.

Homework:

Applied students are not into doing homework for the most part, but some will ask for extra work or problems to do at home. KA to the rescue. I have recommended that each student spend 20 minutes per night on KA doing practice problems. There is no consequence for not doing so, but we've talked about why it's beneficial. I can see how much time each student has spent doing problems via my coach's "dashboard", and I can also recommend a problem set to my whole class (it pushes it out to their accounts) if I want them to practice something specific after the day's activities.

Differentiation & Remediation:

I spent much of this past weekend marking tests. One of my students was having difficulty using the surface area formula, in part because they weren't following "order of operations" when simplifying the expression after substituting the appropriate values. With KA I was able to suggest a problem set on "order of operations" for this one student only. The student will see it at the top of their list of exercises the next time they log in. This is useful because order of operations is not part of the grade 10 curriculum, so I do not need to suggest it to all of my students in that class.

Things I like about Khan Academy so far:

  • Differentiation: Each student works on what they need to work on (according to both my & KA’s assessments) – which may be skills that they should have learned before grade 10 math.
  • Coach’s recommendations: I can recommend exercises to students based on what we’ve done in class, or a skill I’ve noticed them struggling with.
  • Immediate right/wrong feedback: It tells them right away if they have the right or wrong answer.
  • Hints & videos: If students are stuck, they can see hints of the next step to take or watch a video of how to solve similar problems.

What I don’t like so far:

  • Too many fractions: KA puts fractions (and not easy ones like 1/2 or 1/4, but 8/3 type fractions) into exercises for solving equations. My students are panicked by these complicated fractions, aren’t sure how to work with them, and often give up when they see them.
  • Irrational numbers expressed as square roots: My applied students haven’t learned how to work with a number like 5√2. Because KA uses these in even their simplest trig practice problem sets, I can’t use them with my students.
  • Display on cell phones: The practice problems don't always display nicely on a cell phone, although it's better if you hold your phone in landscape mode. Some interactive graphics, where you have to drag things around on the graph, don't work well on smartphones either. This means I need to have some iPads handy when students are working on these types of problems.

How do you use Khan Academy in your classroom? Let me know in the comments below!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

 

No blank answers on tests, please!

Why did it take me 9 years of teaching to figure this out???  . . . It seems so simple now that I am doing it.

Maybe I only just noticed how big of a problem this is because I taught ALL applied level courses this year:

BLANK ANSWERS!

It’s like my students just gave up. On test problems I KNEW they could answer.

"But I've seen you do this in class!" I'd cry, "Why did you leave it blank?"

And yet only now, during exam week, did I find a solution for this. And now that I’ve tried it, I can’t believe I didn’t think of it years earlier . . . it seems so simple:

“Do you have blank answers still? Then I’m not taking your exam. I’ll give you a hint if you like . . .”

And that’s how it went. As each student handed me their exam, I leafed through each page and if I saw any problem unanswered I would hand it back.

“But I have no clue how to do it!” they protested.

So I gave them hints. Showed them similar problems. Drew diagrams for them. I wrote each hint right on their test paper in pen, to remind myself when marking just how much help they needed.

But every student answered every problem.

And that’s so important – because I can’t find evidence of learning when you’ve written nothing.

And the only thing I can't figure out now is why it took me 9 years to start doing this?! Thanks to Mary Bourassa for the inspiration!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Evidence Records

Part 4 of 4 in a series of blog posts about Standards-Based Grading

No more MarkBook

We've arrived at the last piece in the A&E puzzle: the evidence record. This is where you will record the levels each student received for the various expectations on each test/task. MarkBook (a marks-management software) no longer meets our needs, as it simply calculates an average based on a weighting of each test that is set out ahead of time and remains the same for every student. An average is not necessarily the best representation of what a student knows in a course. Growing Success describes how to determine a student's final grade:

"Determining a report card grade will involve teachers' professional judgement and interpretation of evidence and should reflect the student's most consistent level of achievement, with special consideration given to more recent evidence."

Most consistent, more recent.

The idea is that if a student performs poorly at the beginning of a course, but puts in the work needed to catch up and learn the material, they should be assigned a mark that reflects their ability and knowledge at the end of the course (despite the rough start). An average penalizes this student by always pulling their grade down because of the low marks early in the course.

For example, let's look at the chart below of the marks that skydiving students received for packing parachutes:

All 3 students have the same average. But when asked “which student would you want to pack your parachute?” I believe we would all have the same answer. So clearly, an average does not always give us the most accurate picture of student achievement.
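The chart itself is an image, so as a stand-in here is a minimal sketch with made-up marks (the names and numbers are hypothetical, not the data from the original chart) showing how three very different trajectories can produce the same average:

```python
# Hypothetical parachute-packing marks (illustrative only - not the data
# from the original chart).
students = {
    "Student A": [90, 90, 90, 60, 30],  # started strong, has fallen off lately
    "Student B": [30, 60, 90, 90, 90],  # rough start, consistently strong now
    "Student C": [60, 90, 30, 90, 90],  # inconsistent throughout
}

for name, marks in students.items():
    average = sum(marks) / len(marks)
    print(f"{name}: {marks} -> average {average:.0f}%")

# All three averages come out to 72%, yet only Student B's "most consistent,
# more recent" evidence suggests they can pack a parachute reliably today.
```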

Evidence Record Template

Below is an example of an evidence record for a math course. It would be printed out as one page per student, which does mean keeping a binder full of these sheets in order to track each student's grade and progress. Some teachers are not keen on this being a paper-and-pencil tool. Electronic versions of evidence records are starting to appear: the OCDSB is piloting the online MaMa+ version this semester, and Bruce McLaurin over at Glebe C.I. has created some Excel spreadsheets that populate the marks into evidence records for you. However, I have found that trying evidence records for the first time with pencil and paper gives you a much better feel for how they work & what they represent. By handwriting a student's achievement onto this paper record, I notice trends in the student's marks much better than if I were relying on a program to create the evidence record from my class list of marks.

Math evidence record

Update (2014.12.04): This fall, OCDSB teachers have access to the MaMa (Marks Manager) software online which allows us to keep electronic evidence records.
Here are some screenshots of what it looks like.
Inputting test marks by class list:
MaMaMarkEntry
MaMa then automatically populates each student's evidence record with the marks from the test:
MaMaEvidenceRecord

How to use evidence records

Here's a video clip explaining how to take the levels of achievement earned on the test and record them on your evidence record.
(skip to 2:50 for evidence records)

So a completed evidence record might look something like the following:

Evidence Record U

We can see, for example, the student's quiz marks (code Q) and test marks (code T). Each code may appear in more than one row because a test can evaluate multiple expectations. The numbers attached to each code indicate the chronological order of the tests: T1 = Test 1 and T2 = Test 2. The exam mark has been labeled with code E in the grey row above each strand. Notice that the exam receives 3 separate marks: one for each strand of the course. The summative, code ST, has been recorded in its own row below the term work.
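For anyone who likes to see the structure spelled out, here is a minimal sketch of how one strand of an evidence record is organized. The OE descriptions, task codes, and levels below are made up for illustration; they are not taken from the record pictured above:

```python
# Hypothetical slice of one strand of an evidence record.  Each overall
# expectation (OE) gets its own row; each entry pairs a task code with the
# level earned (R, 1, 2, 3, 4), in chronological order.
evidence_record = {
    "OE1 - solve linear systems":      [("Q1", "2"), ("T1", "3"), ("T3", "3"), ("E", "3")],
    "OE2 - model quadratic relations": [("T1", "1"), ("Q2", "2"), ("T2", "3"), ("E", "3")],
    "OE3 - solve problems with trig":  [("T2", "2"), ("T3", "2"), ("ST", "3"), ("E", "2")],
}

for oe, entries in evidence_record.items():
    trail = ", ".join(f"{code}: level {level}" for code, level in entries)
    print(f"{oe} -> {trail}")

# Note: the final grade is NOT calculated from these entries. The teacher reads
# the trend ("most consistent, more recent") and applies professional judgement.
```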

What final level of achievement would you assign to the student with the above evidence record?

Here's a video clip of the type of discussion we have with colleagues as we learn to interpret these evidence records – and, later on, as we collaborate to interpret evidence that is inconsistent and therefore difficult to translate into a final grade. Note that they are using a different evidence record in the video than the one I showed above:

This is the type of moderation I have teachers try when I offer workshops to my colleagues about this new A&E. I find it really useful to hear my colleagues interpret the evidence presented, and comforting to see how consistent we usually are in determining a final grade.

Levels VS Percents

Unfortunately, at this time, the Ministry of Education still requires us to convert each student's final overall level of achievement into a percent for the report card. The OCDSB has provided the following "peg chart" in order to do so:
PegChartOCDSB

It’s not ideal to be converting the levels back to percents in the old style, but it will do until the Ministry changes report cards to match the levels of achievement from their curriculum documents.

Here are a couple more completed evidence records to look at and practice determining an overall final grade. Note that MT is Makeup Test.

evidence record Z
. . . and another . . .

evidence record E

The more familiar we become with this template, and the more conversations we have with our colleagues, the more comfortable we become using our professional judgement to determine a student's final grade. That professional judgement replaces our reliance on software-calculated averages that feel precise but, I'd argue, are no more accurate than my own well-informed professional judgement (but that's a conversation / debate for another blog post!).

The End! But it’s only the beginning . . . 

So here we are at the end of my blog series about the new A&E in the OCDSB. It’s really only the beginning; this process will evolve as we implement it in our classrooms. We’re learning how to best use these templates in order to inform not only our evaluation practices, but our instructional practices as well.

How are you feeling about the new A&E?

Start a conversation with your colleagues in your school or in your department.

Update (2014.12.04): Here is what the OCDSB published for parents this fall on the topic of their assessment & evaluation policy:
Parent Guide to Assessment, Evaluation and Reporting – Grades 9-12

I would love to hear your comments, questions, concerns . . . Leave a comment below or get in touch via Twitter!

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Tests/Tasks & their Rubrics

Part 3 of 4 in a series of blog posts about Standards-Based Grading.

Making a test, task, or project

The first step is to decide which overall expectation(s) [OEs] you will be evaluating with the current task. You may evaluate more than one OE at a time. Then you’ll need to design a task or a set of test questions that allow your students to demonstrate their ability or knowledge for that skill or content. I find it helpful to organize my test so that all of the questions for a given OE are together (as opposed to grouping them by the categories – K/U, T, C, A – as we used to do). For example, in the grade 9 math test below, each page has questions for one of the 4 OEs being evaluated. Additionally, I have organized each page so that the simpler problems (more K/U  or level 1-2 type questions) are first, near the top of the page. The more complex problems (more Application or level 3-4 type questions) are at the bottom of the page.

mfm1P testa

mfm1P testb

 

PDF version of above Math test: Test 3 no answers

“Is it necessary to make a new, different rubric for each and every test or task?”

No, it is not. In fact, you could simply print out the achievement chart rubric from your curriculum document and attach that to your task or test.

achievement chart

However, I find that rubric too wordy (the students don’t bother reading it) and sometimes too vague for a specific project.

"Are we simply getting rid of the categories: K/U, T, C, A? Why include them in the assessment plan if we're not using them to organize our tests?"

We are not getting rid of the categories. We will embed them into our test questions and tasks, and often even use them to build our rubrics. For example, for the math test above, I used three of the categories from the achievement chart to build my rubric for the test. Notice also that I repeat the rubric 4 times: once for each of the OEs being evaluated on that test:

Math rubric

I check off the appropriate level for each of the 3 categories – which then allows me to determine an overall level for that OE. I do this 4 times: once for each OE (which also happens to be once for each page of the test, since each page corresponds to a separate OE). At the end, I return the test & rubric to the student with the 4 levels. There is no "overall average" on the test: the student was evaluated on 4 different skills/expectations and so receives 4 separate levels of achievement.

How do we record these levels in our mark book? For that, we have Evidence Records.

More examples of tests/tasks w/ rubrics:

Gr. 9 geography test (using checkbricks): 3 test human geo

Do you have a test w/ a rubric you’d be willing to share here for colleagues in your subject area to see?
Get in touch: laura.wheeler@ocdsb.ca

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

The Assessment Plan

Part 2 of 4 in a series of blog posts about Standards-Based Grading.

We all learn about backwards planning in teachers' college; for a given unit, that means you start by creating the unit test (deciding which skills & topics are the most important to evaluate) and then build learning & assessment activities that lead appropriately towards that unit test. And this makes all the sense in the world.

But then the real world of teaching smacks you in the face: three new courses to prep, very little free time what with all the extra-curriculars you're helping with, and so on . . . all of a sudden it's all you can do to prep something for the next day's lessons, let alone backwards plan an entire unit for each of your courses.

The Assessment Plan (AP) brings us back to this idea of backwards planning. We use the AP to plan the number of evaluation tasks & tests we will have in a semester and to outline which of the overall expectations (OEs) and achievement chart categories (K/U, App, T&I, Comm) each test or task covers. The intent is to have a variety of task types that cover a variety of OEs and categories. If your AP is a good one, there should be no gaps in the OEs evaluated or in the categories they incorporate.

Let’s have a look at a completed AP for a course, in this case a math course:
AP math

For a closer look, here is a pdf version: MPM2D Assessment Plan 2012 rev

The AP is broken into two sections of columns: the first section lists all of the Overall Expectations, divided into their strands, while the second section lists the category expectations (K/U, App, T&I, Comm). Each row represents an evaluation task or test. The left-most column is where you write the title of your evaluation tool (e.g. "Quiz 1 – solve by graphing"). There are checkmarks for each of the OEs being evaluated by that tool, and also for each of the category expectations built into the tool.

For example, let's look at "Test 3" in the above AP. The title of the test is written in the first column along with the code "T3", which will be used to record marks on each student's evidence record. Then there are symbols, in this case *, marking which of the OEs are covered on the test. Test 3 covered 3 out of the 4 overall expectations in the first strand – we can see the 3 *s under those OEs. We can also see that this test incorporated all of the Knowledge & Understanding (K/U) and Application category expectations, as well as some of the Thinking and Communication expectations.

What we can easily see is that each of the overall expectations is evaluated more than once over the course of the semester and that all of the category expectations have been met across the span of the course as well.

Here’s a video by the OCDSB that might help illustrate how the assessment plan fits into the overall framework:
https://docs.google.com/a/ocdsb.ca/file/d/0B5hknKM3CbwyZ0NDY1BkNWVyUXc/edit

Homework: Here is my suggestion for the best way to practice using an assessment plan before next September.

  1. Pick one course that you are currently teaching and feel most comfortable in.
  2. Get a copy (electronic or printed) of the AP for that course. You can find the AP for any course you teach on OCDSB’s Desire to Learn platform:
    Go to http://ocdsb.elearningontario.ca and sign in using the same username & password you use to sign on to any OCDSB computer.
    d2L AP
    Under “Teacher Resources” on the right hand side of the page, click on “Secondary Assessment Plan templates by Subject”.
    From there you will click on the folder for your subject area and click on the AP for your course.
    The AP will open in a viewing window.
    Click on the download button in the bottom right-hand corner to save the file to your own computer – from there you can open the file, edit it, and print a copy.
    I suggest you print out a copy.
  3. Fill it out with the quizzes, tests, & tasks that you have been using so far this semester for that course; anything for which you recorded a mark in your mark-book. Start by listing the tests/tasks in chronological order down that left-hand column.
  4. Next, check off the OEs and the categories that each test/task touched on. If you need a refresher on the OEs or categories for your course, have a read through your curriculum documents. I can't emphasize enough how important it is to be intimately familiar with your curriculum documents.
  5. Finally, have a look at your AP as it stands & take a moment to reflect:
    • Have you checked off each of the overall expectations at least once?
      • If not, which ones are missing? What sort of test/task could you add in order to evaluate them?
      • If so, have you provided multiple opportunities for each overall expectation? How can you build your tests/tasks so that student have more than one chance to provide you with evidence of their learning on a particular OE?
    • Have you checked off each of the achievement chart categories at least once?
      • If not, which ones are missing? How could you modify/amend your tests/tasks to incorporate them all? Alternatively, what sort of test/task could you add in order to touch on those that have been missed?

Don't panic if you notice big gaps in your assessment plan's checkmarks. This is the time to notice the gaps and work to fill them in. The AP is a handy tool to help you ensure that you have a well-balanced set of evaluation tools, which will allow your students to provide the best possible evidence of their learning across the expectations of your course.

Next up: Creating tests/tasks & the rubrics to go with them.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Standards-Based Grading: the Framework

Part 1 of 4 in a series of blog posts about Standards-Based Grading

I have spent several years shifting into a Standards-Based Grading (SBG) model of assessment & evaluation (A&E). I have gradually adopted what I considered to be the best practices of colleagues within my school board, of teachers from other boards whom I met at educational conferences, and of teachers in the educational blogosphere.

Many OCDSB teachers have simply heard that the school board's A&E policies & documentation are changing, effective September 2014. I've heard many teachers express their anxiety over what they perceive to be a lack of training with these new documents. The school board recently put in place some online training modules, but it is also hoping that the teachers who have been implementing this type of A&E over the last few years will lead a bottom-up training effort to get their colleagues on board.

What we were doing before:
Teachers taught their courses by unit. Sometimes those units came from the curriculum documents; sometimes they came from the textbook, which divides the course into chapters. We taught each unit and, at its end, assigned a test on that unit or chapter. Those tests were divided into 4 sections of problems/questions: Knowledge/Understanding, Application, Thinking, & Communication (the achievement chart categories, which can be found at the beginning of each curriculum document).

achievement chart

Students received 4 different marks, one for each category. Those marks were then entered by "bin" into grading software like MarkBook, which would weight the categories the way we wanted and calculate an average for us.
Marks were collected & recorded by unit/chapter and by the 4 categories.
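For contrast with what comes next, here is a minimal sketch of the kind of weighted category average that software like MarkBook computed for us. The weights and marks are hypothetical, since each department set its own weighting:

```python
# Hypothetical category weights and term marks (illustrative only - every
# department chose its own weighting in MarkBook).
weights = {"K/U": 0.35, "App": 0.35, "T": 0.15, "C": 0.15}
category_marks = {"K/U": 72, "App": 65, "T": 58, "C": 70}  # percent

term_mark = sum(weights[cat] * category_marks[cat] for cat in weights)
print(f"Weighted term average: {term_mark:.1f}%")  # roughly 67%
```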

What we’re doing now:
We use the overall expectations in our curriculum documents to divide up our teaching and our evaluation. We test a student on whether or not they are proficient at a given curriculum expectation, and we evaluate their proficiency using levels (R, 1, 2, 3, 4).
OE vs SE

The overall expectations (OEs) are what we need to evaluate or test. The specific expectations (SEs) are what we need to teach. Of course we will evaluate some of the SEs since they make up the OEs, but we do not need to test students on every single SE.
"All curriculum expectations must be accounted for in instruction, but evaluation focuses on students' achievement of the overall expectations. A student's achievement of the overall expectations is evaluated on the basis of his or her achievement of related specific expectations (including the process expectations). The overall expectations are broad in nature, and the specific expectations define the particular content or scope of the knowledge and skills referred to in the overall expectations. Teachers will use their professional judgement to determine which specific expectations should be used to evaluate achievement of the overall expectations, and which ones will be covered in instruction and assessment (e.g., through direct observation) but not necessarily evaluated." – Gr. 9-10 Mathematics Curriculum in Ontario

Most curricula have anywhere from 9 to 12 overall expectations. These OEs are the "standards" against which you will evaluate your students. For example, in my mathematics curriculum I might evaluate my students' ability to meet the expectation of "solving a linear equation".

One thing I love about this is that it forces us, as teachers, to really become familiar with our curriculum documents. This is in contrast to a reliance on textbooks created by companies for profit but not necessarily well-matched to the curriculum (but that’s a conversation for another day).

Another thing that is so great about SBG is the ability to pinpoint the topics/skills (by OE) that a student is strong in or needs to improve. Even better is when you have the students track their progress too, so that they always know what areas they need to work on. More to come on this when we talk about Evidence Records later on.

The placemat:

The OCDSB has created what they call “the placemat” which is meant to give an overview of the documents we will be using to support this shift in our assessment & evaluation practices:

placemat

placemat (in pdf)

Get reading:

The best way to really get a good grip on the framework and philosophy behind SBG is to read a lot about it. That’s what I did to really understand what we were trying to do beyond the A&E documents the school board is focusing on (Assessment Plan & Evidence Record).

Here’s a reading list about standards-based grading … jump in!
Daniel Schneider
Frank Noschese
Sam Shah
Jason Buell
Shawn Cornally
Dan Meyer
Jim Pai
#SBGchat on Twitter

Stay tuned for part 2: The Assessment Plan.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)

Assessment & Evaluation in the OCDSB

A big shift is happening in my school board, the Ottawa-Carleton District School Board (OCDSB), with respect to our assessment and evaluation policies. A lot of teachers are feeling anxious about this shift – not only about how to implement the new policies and supporting documentation, but also about the reasons behind it. The new approach is modelled (I feel) on a framework called standards-based grading.

I don't purport to be an expert in this new approach. I have been shifting towards this method of A&E over the last 4 or 5 years and want to share my understanding of it with you. My hope is to start a conversation about what this new policy looks like in each of our classrooms.

Over the course of several blog posts I will delve into 4 different aspects of the new A&E policy (I will add links here as the posts get written):

1. The framework and philosophy behind standards-based grading.

2. The Assessment Plan

3. Your assessment & evaluation tools (“what will my tests look like?”)

4. The Evidence Record

"Determining a report card grade will involve teachers' professional judgement and interpretation of evidence and should reflect the student's most consistent level of achievement, with special consideration given to more recent evidence." – Growing Success

Update (2014.12.04): Here is what the OCDSB published for parents this fall on the topic of their assessment & evaluation policy:
Parent Guide to Assessment, Evaluation and Reporting – Grades 9-12

If you have comments, questions, and/or ideas you'd like to share on the topic, please leave a comment below or connect via Twitter @wheeler_laura.

– Laura Wheeler (Teacher @ Ridgemont High School, OCDSB; Ottawa, ON)