The other night we had an opportunity to share student achievement data with the school improvement committee. Our testing regimen at Hudson consists of a battery of tests designed to complement one another, verify our results, and, if formative, serve as a guidepost to shape instruction. The results we most often share come from our battery of summative assessments, which include the Iowa Assessments (formerly known as the Iowa Tests of Basic Skills) and the MAP tests (Measures of Academic Progress). Summative assessments are given at the end of a unit or course and are most helpful for determining whether students learned the material covered in class. These types of tests are usually not very helpful when it comes to shaping instruction because by the time they are given, the instruction is over. Formative assessments, on the other hand, are very helpful in diagnosing problems in instruction because they are given during instruction. Their sole purpose is to determine what adjustments are needed to ensure student mastery of the content. They are not used to measure student progress.
Summative assessments, then, are designed to determine student progress, and in Iowa's case, student progress on the Iowa Core Curriculum. This is the material that is supposed to be covered in class, and as such our testing battery should measure the Iowa Core, right? The trouble is that these tests do not align very well to the Iowa Core. Imagine you were going to get a license to drive a school bus. In preparation for that test, I told you it was necessary to learn about air brakes, the difference between the amber and red flashing lights, and the components of the walk-around inspection. You would probably study pretty hard for that test and maybe even work with one of our drivers to make sure you have the material well in hand. You are ready for the test, right? Now imagine going to take the test and finding a bunch of questions on the operation of a motorcycle. It wouldn't be fair, would it? There would be an alignment problem.
That is essentially what we have with the Iowa Assessment: the alignment is out of sync. There are several studies to back up this claim, but we need look no further than our own testing regimen to see it played out. Remember above where I stated that the entire testing protocol is designed to complement and verify results? That is where we can really begin to see the misalignment.
So at our meeting the other night, we shared the results from the MAP test, and we shared the results of the Iowa Assessments. Both tests are designed to measure the same thing, and both claim to be aligned to the Iowa Core. The trouble is that if I were to place the results in front of you without the name of the school at the top, you would be convinced you were looking at two different schools. The two sets of results in no way resemble one another, and using one to verify the other is impossible.
The main question we are trying to answer with these assessments is how much students have grown, that is, how much they have learned. How many students met targeted growth? Now, in fairness, the administration of each instrument is a bit different (but the measurement is the same). The MAP test is given twice a year. Shortly after school starts (in fact, by the time you are reading this we will already be starting MAP testing), the MAP test is given. This gives us a baseline number for each student, and from that number we can determine how much that particular student should grow over the course of the academic year. At the end of the year, we take the measure again and from that determine how much each student has grown. This seems to work pretty well for us and provides adequate information.
The Iowa Assessment, on the other hand, is given annually. Each school in Iowa is afforded the option of giving the test in fall, winter, or spring; traditionally, Hudson has given the test mid-year. The same principles apply: we compare the standard scores from the test the students take this year with the scores from last year, and from that we measure how much each student has grown. In a normal world, the results of the two instruments would be similar. If student 'A' showed 15 points of growth on one assessment, then a similar result should be suggested by the other. That is not at all the case. In some instances, student 'A' may show 15 points of growth on one, but on the other show no growth at all, or even a regression!
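To make the mismatch concrete, here is a minimal, purely illustrative sketch in Python. The student names and scores are invented and neither vendor reports data this way; the point is only that when growth is the change in standard score from one administration to the next, two instruments that claim to measure the same thing can tell very different stories about the same child.

```python
# Purely hypothetical numbers: (prior score, current score) for each student
# on each instrument. Growth is simply the change in standard score.
scores = {
    "Student A": {"MAP": (205, 220), "Iowa Assessment": (231, 229)},
    "Student B": {"MAP": (198, 206), "Iowa Assessment": (224, 238)},
    "Student C": {"MAP": (212, 212), "Iowa Assessment": (240, 247)},
}

for student, tests in scores.items():
    print(student)
    for test, (prior, current) in tests.items():
        growth = current - prior  # change in standard score between administrations
        print(f"  {test}: {prior} -> {current} (growth {growth:+d})")
```

If the two instruments were truly measuring the same thing, the growth figures in each pair would at least move in the same direction; in our actual results, they frequently do not.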
This brings us to our legislative priority #3: Support continued progress in the development of rigorous content standards and benchmarks consistent with the Iowa Core focused on improving student achievement, including the development of high quality summative and formative assessments, aligned to the skills students should know and be able to do to succeed globally and locally.
Here is another weakness in both the MAP and the Iowa Assessments: both are essentially multiple-choice instruments. For practicality and standardization purposes, the assessments can be scored very quickly and folded into a statistical model on a traditional bell curve. It is pretty difficult to produce a high-quality, rigorous instrument using only multiple-choice items. What about writing skills? How about asking students why they selected a particular answer, or to prove their response is correct? Norm-referenced tests like these make it difficult to determine what a student actually knows. What they do is rank and order students. That does not tell us whether the student has learned the material; rather, it tells us how they did compared to their peers. So not only should the test be aligned to the Iowa Core, it should also be criterion-referenced. After all, what is more important to you: whether your child can read proficiently, or whether they can read better than the student in the next district over?
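To illustrate the difference between the two approaches, here is another small, hypothetical sketch (again in Python, with invented scores and an invented proficiency cut score; neither test reports results this way). A norm-referenced view ranks students against one another, while a criterion-referenced view compares each student against a fixed standard.

```python
# Invented raw scores and an invented proficiency cut score, for illustration only.
scores = {"Ava": 62, "Ben": 71, "Cam": 55, "Dee": 80, "Eli": 68}
proficiency_cut = 70

# Norm-referenced view: rank and order students against their peers.
for place, name in enumerate(sorted(scores, key=scores.get, reverse=True), start=1):
    print(f"{place}. {name} ({scores[name]})")

# Criterion-referenced view: compare each student to the standard itself.
for name, score in scores.items():
    status = "proficient" if score >= proficiency_cut else "not yet proficient"
    print(f"{name}: {score} -> {status}")
```

The first list will always produce a top and a bottom no matter what anyone actually knows; the second answers the question that matters: did each child clear the bar?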
Quality assessments that are properly aligned will go a long way toward creating world-class schools in Iowa and will also provide valuable information to schools about the academic progress of students.