Consider what it's like to drive through a heavy morning fog. It may be a busy highway where brake lights blink in and out of the haze, or a neighborhood road where the familiar details of buildings and trees are obscured by a gray curtain that implies shapes without clear form. Such conditions bring tension as we search for any detail that will keep us on the road, away from other cars, and ready for a sharp turn.
Assessment fog holds similar dangers of miscalculation when we try to diagnose student needs accurately. Unlike road fog, which is obvious (you know when you're in it), assessment fog can be invisible: you have to be looking for it to notice it. Once it's revealed, differentiating to support learners becomes both straightforward and clearly necessary. Consider these three guidelines for finding and eliminating assessment fog so that you can meet the needs of all students.
1. Identify and Communicate Clear Learning Targets
Scenario: Tammy, a classroom teacher, is frustrated. Her students are highly engaged in the activities, but produce work that appears to demonstrate superficial understanding. With high-stakes tests coming in the spring, she's very concerned about the level of her students' content knowledge.
The heart of quality learning is having clear learning targets. This probably sounds obvious, yet an abundance of assessment tools give students only a vague understanding of the intended outcomes. Academic criteria lists are created by unpacking unit standards or curriculum outcomes into specific skills and concepts. Consider the following:
- Use the academic criteria list as a filter to align all assessment strategies such as observations and rubrics.
- Coach students to understand what is expected of them. Get student feedback about what they think the expectations are.
- Unpack curriculum and standards into "I can" statements. This process helps you evaluate how clearly each assessment tool aligns with the academic outcome.
- Use a Learning From Student Work protocol with colleagues to reflect and revise assessment tools. When used during team and staff meetings, it helps clear assessment fog regarding intended academic outcomes and how students may interpret expectations.
2. Separate Logistical Guidelines from Academic Learning Targets
Scenario: A sixth-grade math teacher reviewed student grades for the semester. He's concerned that the results may not accurately reflect his students' growth.
Points should not be taken off because a student didn't follow directions about where to place an answer or how to format a paper, or because of his or her level of contribution to classroom activities. These may be important non-academic skills -- work ethic, following directions, being responsible -- and they can be coached. But when these logistical requirements are factored into formative and summative assessments, they obscure the truth:
- Tom has strong math skills, and needs support in his work ethic and collaboration.
- John is reliable with completing tasks and collaborates, but his level of content understanding is low.
This information can be shared with students and their stakeholders as a separate report, outside of the grade book. The #SBLChat community on Twitter discusses this critical topic live on Wednesdays at 8 p.m. EST.
Logistical requirements are important for shaping the students' task. Have students revise their work to meet those logistics before you assess it. Those requirements should not be confused with the learning targets that students are practicing. For example, one Social Studies final contained ten assessment indicators, only one of which was academic rather than logistical. Such assessment fog makes it difficult to know who needs help or enrichment.
3. Provide Students with Different Options to Demonstrate Their Learning
Scenario: Sara volunteers answers to complex questions in Physics with 90 percent accuracy. Her lab reports are difficult to read, but her presentations about content and analysis are high quality. Sara's final essay exam result was a D-, which confuses both her and the teacher.
Assessment construction is as important as the assessment itself. A single assessment format can include obstacles that are irrelevant to the learning targets. Students like Sara may fail to demonstrate content understanding because of low writing skills. For others, the obstacle may be the reading level of the test itself. Such obstacles create a skewed data picture and lead to misdiagnosis of students' needs and supports. Consider offering alternative options that students can choose for the assessment, or as a follow-up to an unsatisfactory result. Students need options that are respectful of how they process information effectively, and those options must be cleanly targeted to the skills that students will demonstrate.
No Fog, No Harm
When doctors take the Hippocratic Oath, they are promising, in essence, to "do no harm." Teachers have a similar charge in teaching and assessing learners for growth, and clean data is crucial to fulfilling it. With the best of intentions, some current practices obscure data, creating misleading impressions of what students know and can do. The results lead to misdiagnosis of students' true needs. It's difficult to differentiate effectively for all learners if the data does not focus wholly on academic learning targets. Clean assessment data benefits us all.
In This Series
- Myth-Busting Differentiated Instruction: 3 Myths and 3 Truths
- 3 Guidelines to Eliminating Assessment Fog
- 3 Ways to Plan for Diverse Learners: What Teachers Do
- 15+ Readiness Resources for Driving Student Success
- How Learning Profiles Can Strengthen Your Teaching
- Learner Interest Matters: Strategies for Empowering Student Choice
- There's No Time to Differentiate: Myth-Busting DI, Part 2
- Igniting Student Writer Voice With Writing Process Strategies
- Empowering Student Writers
- 50+ Tools for Differentiating Instruction Through Social Media
- Differentiation Is Just Too Difficult: Myth-Busting DI Part 3
- Quality Instruction + Differentiation: Beyond the Checklist
- 4 Paths to Engaging Authentic Purpose and Audience
- Teachers Are in Control: Myth-Busting DI, Part 4