Shanahan on Literacy
Information for teachers and parents on teaching and assessing reading, writing, and literacy.
Early Childhood Literacy
The Connecticut Council for School Reform asked me to speak in Hartford on April 9, 2015. My presentation reviewed and responded to some of the complaints or concerns about teaching young children to read, and considered several issues in expanding preschool literacy opportunities. It was based largely on the Report of the National Early Literacy Panel and a handful of other individual studies that I wanted to highlight.
Response to a Complaint about What Works Clearinghouse
I have recently encountered some severe criticism leveled at reviews and reviewers from the What Works Clearinghouse (see http://www.nifdi.org/research/reviews-of-di/what-works-clearinghouse). I am concerned about recommending this site to teachers as a resource for program evaluations. I'm wondering if you agree with the criticisms, and if so, where you would recommend teachers go for evidence-based program reviews.

I know that the NELP and NRP reports are possibilities, but they are also static documents that do not get updated frequently with new findings, so some of the information really isn't current. Perhaps the Florida Center for Reading Research is an alternative? Do you have others that you would recommend?
I don’t agree with these criticisms and believe What Works Clearinghouse (WWC) has a valuable role to play in offering guidance to educators. I often recommend it to teachers and will continue to do so. It is the best source for this kind of information.
WWC is operated by the U.S. Department of Education. It reviews research claims about commercial programs and products in education. WWC serves as a kind of Good Housekeeping seal of approval. It is helpful because it takes conflict of interest out of the equation. WWC and its reviewers have no financial interest in whether a research claim is upheld or not.
I am an advisor to the WWC. Basically, that means I’m available, on a case-by-case basis, to help their review teams when questions come up about reading instruction or assessment. Such inquiries arise 2-3 times per year. I don’t think my modest involvement in WWC taints my opinion, but the whole point of WWC is to reduce the commercial influence on the interpretation of research findings, so it would be dishonorable for me not to be open about my involvement.
I wish the “studies” and “reports” you referred me to were as disinterested. The DI (Direct Instruction) organization has long been chagrined that the WWC reviews of DI products and programs haven’t been more positive.
That the authors of these reports have a rooting interest in the results should be noted. Unlike the disinterested reviews of the Clearinghouse, which follow a consistent, rule-based set of review procedures developed openly by a team of outstanding scientists, these reports are biased, probably because they are aimed at trying to poke a finger in the eye of the reviewers who were unwilling to endorse their programs. That’s why there is so much non-parallel analysis, questionable assumptions, biased language, etc.
For example, one of the reports indicates how many complaints have been sent to the WWC (62 over approximately 7 years of reviewing). This sounds like a lot, but what is the appropriate denominator: is it 62 complaints out of X reviews? Or 62 complaints about the X decisions included in each of those X reviews? Baseball umpires make mistakes, too, but we evaluate them not on the number of mistakes but on the proportion of mistakes to decisions. (I recommend WWC reviews, in part, because they will re-review the studies and revise as necessary when there are complaints.)
Or, another example: these reports include a table citing the “reasons for requesting a quality review of WWC findings,” which lists the number and percentage of times that complaints have focused on particular kinds of problems (e.g., misinterpretation of study findings, inclusion/exclusion of studies). But there is no comparable table showing the disposition of these complaints. I wonder why not. (Apparently, one learns in another portion of the report that there were 146 specific complaints, 37 of which, roughly a quarter, led to some kind of revision, often minor changes in a review for the sake of clarity; that doesn’t sound so terrible to me.)
The biggest complaint leveled here is that some studies should not have been included as evidence since they were studies of incomplete or poor implementations of a program. The problem with that complaint is that issues of implementation quality only arise when a report doesn’t support a program’s effectiveness. There is no standard for determining how well or how completely a program is implemented, so for those with an axe to grind, any time their program works it had to be well implemented, and when it doesn’t, it wasn’t. Schoolchildren need to be protected from such scary and self-interested logic.
http://www.shanahanonliteracy.com/2015/04/early-childhood-literacy.html
Introduction to Turbo Charged Reading YouTube
A practical overview of Turbo Charged Reading YouTube
How to choose a book. A Turbo Charged Reading YouTube
Emotions when Turbo Charged Reading YouTube
Advanced Reading Skills

Perhaps you’d like to join my FaceBook group?
Perhaps you’d like to check out my sister blogs:
www.innermindworking.blogspot.com gives many ways for you to work with the stresses of life.
www.ourinnerminds.blogspot.com which takes advantage of the experience and expertise of others.
www.happyartaccidents.blogspot.com just for fun.

To quote Dr. Seuss himself: “The more that you read, the more things you will know. The more that you learn, the more places you'll go.”