You know that awkward moment when you’ve been blogging for
about two weeks, and you look at your phone to see that one of the most
prestigious English literacy gurus has tweeted at you to tell you you’re wrong?
The first thing you do is turn your head and look at her
books on your shelf that you’ve been using to guide your instruction – oh, and
the curriculum development for the district – and then you completely freak
out. And, yes, you start to question yourself.
But here’s the thing. In our last podcast, we took issue
with Sara Holbrook’s assertion that the STAAR test should not be asking readers
to infer author’s purpose from text structure or other literary elements. We
argued, basically, that this was one of the key features of text analysis,
that, in fact, it broadens and enriches the reader’s experience of the text,
and that it helps readers become writers. You can listen to our full
conversation here.
We knew there would be a response to this. What we didn't
expect was such a prestigious one! Stephanie Harvey, luminary educational consultant and author
of many respected books on education (seriously,
buy her books; she's awesome, and I've hyperlinked the ones we own here),
disagreed with us that evidence-based inferencing is a way to gauge college
readiness. On a larger scale, she seems to support the position that you can
think a piece of art is about whatever you want, but when the author tells you
what it's really about, then that's the end of it. Georgia O'Keeffe, it turns
out, really was JUST painting flowers all this time.
But here's the thing. This flies in the face of accepted
critical literacy skills, as we know them at least. Webb's Depth of Knowledge,
the framework widely used by curriculum writers, explicitly references "determining the author's purpose" as a
strategic thinking skill. Stephanie, in your
own book "Strategies That Work," you cite "reading like a writer" (i.e.,
inferring the author's intent behind structural and literary technique choices) as a,
well, strategy that works. It is all over the Texas state standards as a
readiness standard, not to mention the Common Core and College and Career Readiness
Standards. And there is a reason for that. Determining author's purpose is
explicitly and repeatedly reconfirmed as an essential literacy skill.
You asked for our research backing up our claim that
author's purpose is a critical literacy skill. But what we ask you is: where is
your research showing that it is NOT? The state standards, Common Core, and College and
Career Readiness Standards all explicitly address this skill as necessary.
These standards have been vetted. The STAAR exam itself has been independently
validated to be accurately and reliably measuring these standards. These items
are field tested, validated, and combined with multiple other readiness
standards in order to get a general picture of the student. And to be
considered "ready," students only have to answer about 50% of the items correctly.
The STAAR test itself is not ALL author's intent or
craft, but those higher-order thinking skills are a part of the test, along with
lower-level testing of vocabulary
understanding, dictionary skills, summarizing, and paraphrasing. Would we
prefer to see the higher-order thinking skills assessed in a short-answer format
rather than a more limiting multiple-choice question? Sure. Unfortunately,
thanks in large part to complaints like these, the more authentic short-answer
response questions are now gone.
But since we criticize multiple-choice assessments here,
and since you asked for research, how about this: critical reading and
multiple-choice questions about author's intent can increase students'
multiple-cause thinking, a key cognitive area of need according to Columbia
University's Kuhn and Holman. And this is the key point:
using these assessments to drive instruction toward critical literacy and
away from traditional reading comprehension is an equity issue in education. Traditional
reading comprehension tests, which do not delve into analysis tools like
author's intent, are far too dependent on background knowledge. This gives an
unfair advantage to wealthy students, with their increased cultural capital and
academic prep, and it is part of what causes tests to skew along demographic
lines.
And it is this inequity in education that, ultimately, all
of us are here to solve. Our experience as educators working under the TAKS
framework, which skewed more toward reading comprehension skills, and now
under the STAAR framework bears witness to that bias. Critical literary
analysis questions, even ones as flawed as those that can appear on the STAAR, are the
most equitable way to assess and identify students who are in NEED of
intervention. Watch this space for a follow-up analysis of these same STAAR
questions for an example.