Beyond the rubric: Think-alouds as a diagnostic assessment tool for high school writing teachers

Beck, S. W., Llosa, L., Black, K., & Trzeszkowski-Giese, A. (2015). Beyond the rubric: Think-alouds as a diagnostic assessment tool for high school writing teachers. Journal of Adolescent & Adult Literacy, 58(8), 670–681.


Here is an assessment that attempts to look at students’ writing “in process” rather than only examining a finished product, as most writing assessments do.

These authors devised a protocol for teachers to observe students as they wrote an argumentative essay. Students “thought aloud” while writing the essay, and teachers recorded what they observed on the protocol, called the Think-Aloud Protocol (TAP). The TAP is provided in its entirety in Figure 1 of the article (pp. 672-673), which I think was essential so that readers could form an impression of how useful the TAP might be in their own practice.

My first thought was that the TAP could be a useful scaffold for teachers wanting a view of how their students progress through the process of writing. The TAP could provide formative data that would help a writing teacher see where students hit learning “snags” and address those problem areas. The authors “explored the usefulness of the TAP” (p. 671; I cite this to underline that the study is very exploratory and preliminary in nature) with five high school teachers, three of whom were regular English Language Arts (ELA) teachers and two of whom were English as a Second Language (ESL) teachers. Each of the five teachers selected three students with whom to implement the TAP, for a total of 15 students.

The teachers were interviewed about the students they chose prior to using the TAP. Then they used the TAP with the students and were audiotaped as they did so. Finally, the teachers were interviewed after using the TAP. Not surprisingly, the protocol definitely increased the number of inferences teachers were able to make about students and the writing process. The protocol helped them focus on various aspects of the writing process, something that might be hard to do without such a scaffold. The authors also claim that teachers focused more on student strengths after using the TAP than before. Tables presenting these data are provided.

Regarding strengths, I wonder whether the protocol, and the fact that it required teachers to put their thoughts into a written record, was a factor in the more strength-based inferences seen with the TAP. When we have to put our thoughts in writing and let others see them, we may tend to be a little more thoughtful about what we write and about its effects on those who might read it. We also might be conscious of what others think of us as professionals based on what we write on such a protocol. I don’t think that kind of thoughtfulness is a negative thing; in fact, I think we need to encourage it. However, in a study like this one we also need to be mindful of the possibility that teachers couched their inferences in rosy terms because they believed strength-based responses were what the researchers wanted.

There are some gaps in this research report that I wish had been filled. The biggest one is that we are given almost no information about the actual think-aloud process with the high school students and how it worked. How did the teachers in the study prompt the students to think aloud as they wrote? Were prompts of any kind provided? I think it might be difficult to get students to think aloud as they wrote, and it might be difficult for a teacher to think of ways to elicit think-aloud responses. Plus, the teachers could have been very busy managing the protocol and recording whatever students did say. I am going to need much more detail about how these sessions were actually conducted to be convinced of the usefulness of the TAP.

It also is entirely possible that attributes of the various teachers could contribute to how much data, and what kinds of data, could be gathered with such an assessment. Would a more extroverted teacher be able to get more from students than an introverted one? What about a teacher with more experience versus a less experienced teacher? Some teachers might have a deeper knowledge of the writing process than others, or some might have a better relationship with students than others. All of these attributes, and probably others, could make a difference in the data that come out of the TAP assessment as it was implemented here. The only such attribute addressed was the possible difference in knowledge between the ELA teachers and the ESL teachers.

I wondered how the teachers in the study were trained to use the TAP. We are given almost no information on that, and it could be important. Without training, there could be reliability issues, with some teachers interpreting the meanings of items on the protocol in ways that would differ from the way other teachers interpreted them. Training might help smooth out some of the differences due to teachers’ personal attributes that I discussed above, though those differences could never be completely eliminated. It could be that there was good training in this case, but we just don’t know without more information than was provided in the report.

This study was definitely exploratory and small in scale, so there are areas to work out with the TAP, and I hope the researchers will do so. Even with the gaps I discussed above, I see the potential of an assessment like this. In many ways it reminds me of in-depth reading assessments like miscue analysis, which look at a reader’s processes “in-flight” during reading. Like the TAP, miscue analysis in its full form is too intensive and time-consuming for most teachers to do with all students. But for students having difficulties that we have not been able to figure out, these kinds of in-depth, in-process assessments can help us look more deeply at our students, find their literacy strengths, and then build upon those.
