Next February, this blog will be celebrating its ten-year anniversary! Leading up to it, I’m re-starting a series I tried to do in the past called “A Look Back.” Each week, I’ll be re-posting a few of my favorite posts from the past ten years.
In 2009, I was able to do a pretty interesting ed tech experiment and teach one ELL United States History class with students using computers every day and another U.S. History class that almost never used computers. Here is the final post I wrote about the results:
As some readers might remember, Holly Coyle (my exceptional student teacher) and I taught two United States History classes with English Language Learners this year — one entirely in the computer lab, and the other, for all practical purposes, entirely out of the computer lab (but using what is, in my mind at least, a very engaging curriculum).
We did assessments and evaluations at the beginning of the year, at mid-year and at the end of the year. You can read more about this — and download the actual assessments — at my post Mid-Year Results Of My “Experiment”. Two of the assessments tested basic knowledge of U.S. History (admittedly, pretty low on Bloom’s Taxonomy) and the third was a student evaluation of the class.
In January, the results showed that student achievement gains were about equal, though students in the technology-oriented class seemed more engaged and interested in U.S. History.
The two June assessments that tested U.S. History knowledge at a basic level showed, as in January, just about the same results in both classes.
The one where students evaluated the class itself — whether they liked it, whether it made them want to study more U.S. History, etc. — was a bit of a surprise. As I wrote in January, even though both classes evaluated the course positively, the zero period (tech) class was more engaged. In the year-end evaluation, both classes again rated it positively, but this time the non-tech class was much more engaged. The only area where the tech class rated the course more positively was computer skills: they clearly felt they had developed more of them, which was to be expected.
I’m not surprised that the knowledge levels were similar, but I am surprised that the non-tech class felt they liked the class more and got more out of it. The fact that the tech class met an hour before regular school began, and that students repeatedly complained about having to get up early, might account for some of the difference, but the gap was striking enough that it’s unlikely to have been the only factor (by the way, all students voluntarily chose to take the early class).
In retrospect, I would have done two things differently:
1) I wish we had given a straight pre- and post-assessment of English comprehension. Based on the data from our family literacy home computer project, I would have expected those in the computer lab to show a greater increase in English understanding, though I might very well have been proven wrong.
2) I would have put more time into figuring out how the tech class could have connected more with our International Sister Classes. We started out strong in that regard — for example, students corresponded with an EFL class in Spain to learn how the Spanish Conquest of the New World is taught in that country — but we ended up succumbing to the impulse to “cover the curriculum,” and those connections fell by the wayside. I suspect that with a little more strategic planning on my part, that kind of cooperation could have been integrated.
Feedback is welcome. Again, you can download the assessments by going to my January post.