The Florida Department of Education (FLDOE) publishes a technical report annually following state testing. In general, these reports review information about the construction of the statewide assessments, statistical analysis of the results, and the meaning of scores on these tests. In early April 2016, the FSA Technical Report FINALLY was published.
"Looking For What Isn't There," the Brief Version:

Almost since the day the Department of Education signed the contract with test developer American Institutes for Research (AIR), the validity of the Florida Standards and the Florida Standards Assessments (FSA) has been questioned. First, we were told the tests had been field tested and validated in Utah. When no documentation was forthcoming, legislators demanded an independent validity study. They hired Alpine Testing Solutions, a company anything but independent from FSA creator AIR, and the validity study was released on 8/31/15. The results were anything but reassuring, evaluating only 6 of 17 new assessments and finding the use of individual test scores "suspect" but, strangely, supporting the use of scores for the rating of teachers, schools and districts. At the time, we were advised to "look for what isn't there," and we found no evidence the tests were shown to be valid, reliable or fair for at-risk sub-populations of students.

"Looking for what isn't there" seemed like good advice, so when the 2015 FSA Technical Report was released, I started looking… In a nutshell, despite its 897-page length, there seems to be a LOT that isn't in the 2015 FSA Technical Report. To summarize, here is a list of the most obvious omissions:

- Results from the required "third-party, independent alignment study" conducted in February 2016 by HumRRO (You guessed it! They are associated with AIR and they have a long history with the FLDOE).
- Missing attachments to the report, including multiple reported appendices containing statistical data regarding test construction and administration, any validity documents or mention of the SAGE/Utah field study, any documentation of grade-level appropriateness, and any comparison of an individual's performance on both a computer-based and a paper-based test.
- Despite the previous FCAT 2.0 Technical Report's concerns questioning "the inference that the state's accountability program is making a positive impact on student proficiency and school accountability without causing unintended negative consequences," no evaluation of these implication arguments is made for the FSA (and I don't believe that is because there are, in fact, no unintended consequences).
- The use of student scores to evaluate teachers (via VAM scores) is completely ignored and is left off the "Required Uses and Citations for the FSA" table, despite such use being statutorily mandated.
- The use of test scores to evaluate schools and districts is mentioned, but there is no evidence those uses were ever evaluated.
- Though the report clearly states that validity is dependent on how test scores will be used, this report seems to evaluate only the validity of the use of scores at the individual student level.

Who is responsible for these documents that create more questions than they answer? Why aren't they being held accountable? Why, if virtually our entire accountability system is dependent on the use of test scores, isn't it a top priority to ensure these tests are fair, valid and reliable? When Jeb Bush said "If you don't measure, you don't really care," was he speaking of test validity? Because it appears the FLDOE really doesn't care.

Want more details? Our full blog is here: FSA Technical Report: Looking For What Isn't There