Larry Ferlazzo’s Websites of the Day…

…For Teaching ELL, ESL, & EFL

“Instead of seeing students as Far Below Basic or Advanced, we see them as learners”


The English Department at our high school does a pretty impressive job of teaching and evaluating writing. I invited Lara Hoekstra, one of the leaders of this effort, to write a post describing our approach. One of its key elements is the use of an "improvement rubric," which Lara discusses here. You can read more about it in one of my previous posts and in my book, Helping Students Motivate Themselves. Lara is also contributing to its upcoming sequel.


Guest post by Lara Hoekstra

Six or seven years ago, the English department at Luther Burbank High School decided to put together a schoolwide writing assessment. We settled on using past versions of the University of California's entrance assessment, the AWPE (Analytical Writing Placement Examination). In this assessment, students read a passage (it could be from any discipline) and then write an essay in response to a prompt. In every case, students must read and understand the passage and then write a response that is connected to, or based on, the reading. All students, 9th through 12th grade, take an AWPE exam twice a year, once in the fall and once in the spring.

Once the assessments have been collected, a group of English teachers gets together and creates a norming packet. We look for examples of high, medium, and low papers. We also look for trends in the writing and pull out samples of papers that show particular skills. Then, for two days in the fall and two days in the spring, our department assembles, norms, and scores the assessments using an improvement rubric from the California Writing Project's ISAW (Improving Students' Academic Writing) program.

The whole process is shaped by the rubric, and it has shifted the conversation in our department. The point isn't to give the paper a single score; it's not a summative assessment. The point is to show the student, and the teacher, where on the continuum he or she lies in twelve specific areas, and, by the spring assessment (scored on the same rubric using a different color of highlighter), to show progress in several of those areas.

The language of the rubric is carefully chosen and works well; deficit language is not used. On most rubrics, especially for summative assessments, students who score in the passing range can see what they did well, while students who fall below the passing range see only what they did not do. If you look at the table below, you can see the difference in the language: the score on the left is based on deficit language, what the student cannot, or did not, do. With an improvement rubric, like the box on the right, the deficit language is removed and the rubric states what the student did. The point is to show students where they are; it gives them credit for what they are doing while acknowledging that there is still work to do.

**Examples of deficit language are based on the CAHSEE rubric. Examples from the improvement rubric are based on the CWP's ISAW rubric.

After we norm and score, teachers review their own papers and look for trends. The next time we meet as a department, we chart out what we are seeing in the students' writing across grade levels. We frame our conversation in terms of what we see students doing, what they are attempting to do, and what is missing. You can look at the chart below to see our observations from this year's fall assessment.

The rubric has changed our conversation as a department. Instead of talking about what students can’t do, we are talking about what they are doing and what we need to do to move them to the next level. Once we have charted out what we are seeing, then we meet in grade level groups to discuss focus areas for each unit. Which bands of the rubric do we want to focus on? In the unit we are teaching, are we giving students enough quality practice time with that particular strand? We set goals and then meet throughout the unit to share ideas and assess what we are doing. We bring in student work and talk about what we can do to continue moving students forward.

In the spring we give a second assessment and measure progress. Again we map out what we are seeing in the students' writing. This time we are looking to see if, as a department, we have moved students. We look at which strands show improvement and which ones don't, and then we set goals for the following year, when we will start the process over again.

This process is a tremendous amount of work but well worth it. Our department is constantly talking about students, their writing, and their progress. We aren't afraid to talk about what isn't working and to admit when we need help. The information gleaned from the writing assessments gives a better picture of our students as readers and writers than the standardized test scores do. Instead of seeing students as Far Below Basic or Advanced, we see them as learners. We get a snapshot of them in twelve specific areas of reading and writing, and we can measure and see the progress.

Even more importantly, we see that the skills are sticking with students across grade levels. The 12th graders start out doing more than the 11th graders, and the 11th more than the 10th. This year we saw students using academic language and sentence frames we had introduced in previous years, tools that had not yet been taught this year at the time of the first assessment. It's rewarding to see your teaching reflected in students' reading and writing, and it drives our department to work harder in planning and implementing our curriculum.


Author: Larry Ferlazzo

I'm a high school teacher in Sacramento, CA.


