Two very talented educators — Ted Appel, the extraordinary principal we have at our school, and Kelly Young, creator of much of the engaging curriculum we use at our school through his Pebble Creek Labs — brought up the same point in separate meetings with teachers at my school this week: the importance of not being “data-driven” and, instead, being “data-informed.”
These conversations took place in the context of discussing the results of state standardized tests that came out last week. Here’s the point made by Ted:
If schools are data-driven, they might make decisions like keeping students who are “borderline” between algebra and a higher level of math in algebra so that they do well on the algebra state test. Or, in English, teachers might focus a lot of energy on teaching a “strand” that is heavy on the tests — even though it might not help the student become a lifelong reader. In other words, the school can tend to focus on its institutional self-interest instead of what’s best for the students.
In schools that are data-informed, test results are just one more piece of information that can be helpful in determining future directions.
I’ve been thinking about these conversations. Here is an example of how the perspective of being data-informed plays out in my own teaching practice.
Typically, students in my classes show high growth in state test results. This growth comes without “teaching to the test” (in fact, that is strongly discouraged at our school) and, instead, by focusing on developing lifelong learners (again, which is our school-wide policy). I typically will spend thirty minutes or so teaching test-taking strategies, but that’s about it.
This past year, most of my students continued to demonstrate high growth in the state test results — everywhere, that is, except for my ninth-grade mainstream English class.
It was a hard class. Regular readers might remember this class by having read my post about it titled Have You Ever Taught A Class That Got “Out Of Control”?
The first semester was very difficult. Lots of student transience, family problems, economic issues — the works. Finally, I was able to get things under control at the beginning of the second semester. I thought their subsequent work was good, but in the spirit of being data-informed, I can see that I might have lowered my standards. Perhaps I was just so thrilled that everybody was doing their work, seemed engaged, and was getting along that I “settled” for that. I don’t think that was the case, but it’s possible. In addition, the fact that the first semester was so chaotic meant that they received a full semester of less than high-quality instruction.
Reviewing the test results sparked this kind of reflection — on my own. I certainly have not received any kind of pressure from our data-informed administrators.
As a result of this reflection, which was informed by data, I’ve made two decisions:
* I’m going to begin the classroom management program that I shared in my previous post from day one in my ninth-grade class. If it took six weeks to move from extrinsic to intrinsic after a semester of chaos, I suspect it will take far less time at the beginning of the year.
* I’m going to make visits to the homes of most, if not all, of my ninth-grade students. I usually make a lot of home visits, but the past two years they’ve been primarily to the homes of my ESL students. This year, I’m going to switch the focus.
Other than these two actions, I’ll continue to do what I’ve always done in my class — though I also believe I become a better teacher each year with more experience.
Something tells me that a “data-driven” culture would have resulted in pressure to do something considerably different.
What about you — is your school culture data-driven or data-informed?
Larry,
Your blog provides a wonderful example of the point about proper use of data. The lower scores by themselves tell you little, but combined with your observations and understanding of what went on in the classroom, you have multiple indicators that seem consistent and you have a plan for improving. You do us all a service and do credit to the profession by opening your practice to readers in this way.
Excellent observations, Larry. This isn’t only a problem in schools, but in business, government, and I suspect countless other organizations. Too many times I see professors and business leaders stick their heads in charts and data and in turn say, “This is what the chart says to do, so this is what I will do.”
That’s not to say the data itself is not valuable; in fact, it’s the exact opposite: good, unbiased data is immensely valuable. It’s easy to forget, however, that past data is not necessarily predictive of future events.
Using data in combination with observation, intuition, and anecdotes will likely lead to better and more informed decisions.
(Don’t forget, though: observation and intuition are data too! Just because something isn’t in a chart or table doesn’t mean it isn’t data.)
Of course the data gives you information, but the efficacy of that data may be limited. If students are taking a multiple-choice test that they know is solely for administrative purposes (i.e., not part of their grade), they may not take it seriously.
Also, on multiple-choice tests you never really know what went through their minds.
Larry,
Your post brings up important distinctions in how we view learning. There’s been a relentless push for standardized data ever since A Nation At Risk, as if standardized tests alone can tell what’s important about students.
It’s also good to remember that “data” can be many things, such as evidence of problem solving, inquiry, and collaboration — and a disposition toward lifelong learning.
My school is also looking at the tension between skills-based assessment and “skills-driven” assessment. And of course we have to ask what’s more of a priority: 1) effective paragraph transitions, or 2) a love of reading and writing.