Two very talented educators (Ted Appel, the extraordinary principal at our school, and Kelly Young, creator of much of the engaging curriculum we use through his Pebble Creek Labs) brought up the same point in separate meetings with teachers at my school this week: the importance of not being “data-driven” and, instead, being “data-informed.”

These conversations took place in the context of discussing the results of state standardized tests that came out last week. Here’s the point made by Ted:

If schools are data-driven, they might make decisions like keeping students who are “borderline” between algebra and a higher level of math in algebra so that they do well on the algebra state test. Or, in English, teachers might focus a lot of energy on teaching a “strand” that is heavy on the tests, even though it might not help the student become a life-long reader. In other words, the school can tend to focus on its institutional self-interest instead of what’s best for the students.

In schools that are data-informed, test results are just one more piece of information that can be helpful in determining future directions.

I’ve been thinking about these conversations. Here is an example of how the perspective of being data-informed plays out in my own teaching practice.

Typically, students in my classes show high growth in state test results. This growth comes without “teaching to the test” (in fact, that is strongly discouraged at our school) and, instead, from focusing on developing life-long learners (which, again, is our school-wide policy). I will spend thirty minutes or so teaching test-taking strategies, but that’s about it.

This past year, most of my students continued to demonstrate high growth in the state test results. That was true everywhere except in my ninth-grade mainstream English class.

It was a hard class. Regular readers might remember it from my post titled Have You Ever Taught A Class That Got “Out Of Control”?

The first semester was very difficult: lots of student transience, family problems, economic issues, the works. Finally, I was able to get things under control at the beginning of the second semester. I thought their subsequent work was good, but in the spirit of being data-informed, I can see that I might have lowered my standards. Perhaps I was just so thrilled that everybody was doing their work, seemed engaged, and was getting along that I “settled” for that. I don’t think that was the case, but it’s possible. In addition, the fact that the first semester was so chaotic meant that they received a full semester of less-than-high-quality instruction.

Reviewing the test results sparked this kind of reflection entirely on my own. I certainly have not received any pressure from our data-informed administrators.

As a result of this reflection, which was informed by data, I’ve made two decisions:

* I’m going to use the classroom management program I shared in my previous post from the first day of my ninth-grade class. If it took six weeks to move from extrinsic to intrinsic motivation after a semester of chaos, I suspect it will take far less time at the beginning of the year.

* I’m going to make visits to the homes of most, if not all, of my ninth-grade students. I usually make a lot of home visits, but the past two years they’ve been primarily to the homes of my ESL students. This year, I’m going to switch the focus.

Other than these two actions, I’ll continue to do what I’ve always done in my class, though I also believe I become a better teacher each year with more experience.

Something tells me that a “data-driven” culture would have resulted in pressure to do something considerably different.

What about you — is your school culture data-driven or data-informed?