Common Assessments Lead to Easier and Better Data Use


As schools across the nation transition to assessments tied to Common Core State Standards (CCSS), the focus has been – understandably – on steps leading up to each test's administration. Nonetheless, a look at what will likely happen after the assessments are administered can bring relief to educators familiar with the challenges of performance data use.

 

Currently, there is considerable evidence that educators often analyze data incorrectly, despite above-average intellect, professional development, and varied district supports. Though many explanations exist, the manner in which data is displayed for educators has proven to be a significant part of the problem. Most teachers, principals, and other stakeholders have viewed a data report and thought, “This shouldn’t be so hard to understand.” They are right. Educators are ideal consumers of data, given their higher-than-average schooling, intellect, and care for the jobs they do. If data is misunderstood by such consumers, that data needs to be made easier to understand.

 

The data systems and reports that communicate data to educators can do much to improve the use of assessment results if they present data in an “over-the-counter” format, which means data is made easier for educators to understand and use. This involves good design practices applicable to all data reports, as well as offering embedded data analysis guidance. When educators are using different assessments, it becomes harder for data system/report providers (DSRPs) to offer the best displays and supports each assessment warrants.

 

Label

 

According to a quantitative study of 211 educators of varied backgrounds, adding a label (with report-specific, assessment-specific data analysis guidance) to a report made educators’ data analyses 307% more accurate (Rankin, 2013). The label is a footer or other annotation of one to three lines of text, placed directly on the data report to help educators understand its specific data and avoid common analysis errors.
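To make the idea concrete, here is a minimal sketch (in Python) of how a reporting layer might attach such a label as a report footer. The ReportLabel structure, the render_report function, and the guidance wording are hypothetical illustrations, not code or text from the Rankin (2013) study or any particular data system.

```python
from dataclasses import dataclass

@dataclass
class ReportLabel:
    """One to three lines of analysis guidance printed directly on a report."""
    lines: list[str]

# Hypothetical guidance for an interim assessment report; a real label would
# come from the assessment's own interpretation guidelines.
INTERIM_MATH_LABEL = ReportLabel(lines=[
    "Interim results are one measure; combine with classroom evidence.",
    "Claim-level scores are less precise than the overall scale score.",
    "Do not compare interim results directly to summative cut scores.",
])

def render_report(title: str, rows: list[dict], label: ReportLabel) -> str:
    """Render a plain-text report with the label appended as a footer."""
    body = "\n".join(str(row) for row in rows)
    footer = "\n".join(f"NOTE: {line}" for line in label.lines)
    return f"{title}\n{body}\n{footer}"

print(render_report(
    "Grade 5 Math Interim Results",
    [{"student": "A. Smith", "scale_score": 2510}],
    INTERIM_MATH_LABEL,
))
```

The point of the sketch is simply that the guidance travels with the report itself, rather than living in a separate document the educator may never open.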

Each data report’s label should be tailored specifically to the data being reported. When educators use unique state assessments in addition to diverse local assessments, DSRPs are less likely to provide customized labels that support correct analyses of those assessments’ results. Now that Smarter Balanced Assessment Consortium (SBAC) or Partnership for Assessment of Readiness for College and Careers (PARCC) assessments are being used in most states (with California using the former), and many educators are also using related interim resources, more of educators’ data reports will feature the same types of data. DSRPs can then more easily provide labels that support the proper use of these common assessments’ results.
 


Supplemental Documentation

 

According to the same study, educators’ data analyses are 205% more accurate if a reference sheet accompanies the report or 273% more accurate if a reference guide accompanies the report. Research-based templates (www.jennyrankin.com/templates) can be used to create effective reference sheets and guides, which help educators understand and use a specific report.

 

Creating supplemental documentation takes more time than writing a brief report footer. Thus the standardization of state assessments is especially important in helping DSRPs offer supplemental documentation more frequently and successfully. Effective supplemental documentation reflects a keen understanding of the data being reported. The widespread use of SBAC and PARCC assessments increases the likelihood that DSRPs will be familiar with these assessments and the guidelines for interpreting their results. Such familiarity will better facilitate quality supplemental documentation in data systems and report suites.
 


Help System

 

According to van der Meij (2008), a shorter, targeted manual or a user-friendly help system causes users to need 40% less training time and to successfully complete 50% more tasks than they would have accomplished with access to only a full-sized manual. An edtech data system can feature a help system with not only task-based lessons on how to use the technology, but also topic-based lessons that help educators use the data the technology displays.

 

For example, imagine the first time a teacher tries to use SBAC interim assessment results in a formative way. He or she could have questions like:

  • Using these results as one of multiple measures, how can I group my students for differentiated intervention, how can I personalize each student’s learning, and where can I find interventions appropriate to each student?
  • How can I use this assessment as part of a year-long assessment plan?
  • What basics should I know if I am new to using data?

 

Other resources, such as CCSS websites, might provide some answers to some of these questions. However, those answers are scattered. A help system can offer a centralized location for lessons addressing each of the teacher’s key needs, right where the teacher already is: in the data system.
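As a rough sketch of what that centralization could look like, a help system’s content might be stored with a simple task/topic distinction so both kinds of lessons surface in one place. The HelpLesson structure and the lesson titles below are illustrative assumptions; the topic titles simply mirror the teacher questions above, and the lesson bodies are placeholders.

```python
from dataclasses import dataclass

@dataclass
class HelpLesson:
    kind: str   # "task" = how to operate the system; "topic" = how to use the data
    title: str
    body: str

LESSONS = [
    HelpLesson("task", "Run an interim assessment report", "..."),
    HelpLesson("topic", "Grouping students for differentiated intervention", "..."),
    HelpLesson("topic", "Fitting interim results into a year-long assessment plan", "..."),
    HelpLesson("topic", "Data-use basics for educators new to data", "..."),
]

def lessons_of_kind(kind: str) -> list[HelpLesson]:
    """List all lessons of one kind so the help menu can show tasks and topics side by side."""
    return [lesson for lesson in LESSONS if lesson.kind == kind]

for lesson in lessons_of_kind("topic"):
    print(lesson.title)
```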

 

A U.S. Department of Education (2009) study found 59% of teachers report using data systems on their own time. This increases the need for a help system to serve as a virtual data coach. Now that most states will be assessing students’ mastery of CCSS, DSRPs can use CCSS-based examples in help lessons to make content more familiar to educators and more applicable to their undertakings.

 


Package/Display

 

Common assessments make it easier (and thus more likely) for DSRPs to tailor a report’s display – or the way its data is ‘packaged’ – to the assessment’s reporting needs. For example, since graphing every result dilutes the benefit of using graphs at all, a DSRP familiar with an assessment is more likely to know which data is worth graphing, as well as which graph types and features best facilitate understanding of that assessment’s data. For educators, well-designed data reports foster improved understanding and use of the data being displayed.
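One way to picture this kind of tailoring is a mapping from a report’s purpose to a recommended display, rather than charting everything by default. The purposes and chart choices below are illustrative assumptions only, not prescriptions from any assessment consortium.

```python
# Illustrative mapping only; a real DSRP would base these choices on the
# specific assessment's reporting guidelines.
RECOMMENDED_DISPLAY = {
    "compare_student_groups": "horizontal bar chart",
    "growth_across_administrations": "line chart",
    "distribution_of_scale_scores": "histogram",
    "individual_student_roster": "table (no chart)",
}

def display_for(purpose: str) -> str:
    """Choose a display suited to the reporting purpose instead of charting everything."""
    return RECOMMENDED_DISPLAY.get(purpose, "table (no chart)")

print(display_for("growth_across_administrations"))  # line chart
```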

 


Content

 

Each report’s content should be selected based on what will make the assessment results easiest to understand and use appropriately. For example, changing legislation related to the assessment data can warrant changes to table headers (such as new demographic labels), changes to graphs (such as how proficiency status is determined and shown), changes to calculations (such as how the assessments are factored into federal accountability for school districts), and more. Thus the more familiar DSRPs are with an assessment, the more likely they are to know when content ‘expires’ and needs to be replaced. Likewise, this familiarity can help DSRPs craft report content most helpful to each report’s audience.
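To illustrate, a report’s content might be driven by a per-assessment configuration that records when its headers, proficiency rules, and calculations were last validated, so the DSRP can flag content that has ‘expired.’ The configuration fields, cut scores, and dates below are hypothetical placeholders, not values from SBAC, PARCC, or any accountability system.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ReportContentConfig:
    assessment: str
    table_headers: list[str]            # e.g., current demographic labels
    proficiency_cuts: dict[str, int]    # minimum scale score per level (placeholder values)
    valid_through: date                 # review content after this date

SAMPLE_CONFIG = ReportContentConfig(
    assessment="Grade 5 Math (interim)",
    table_headers=["Student group", "Tested", "Mean scale score", "Percent at/above Level 3"],
    proficiency_cuts={"Level 2": 2455, "Level 3": 2528, "Level 4": 2579},
    valid_through=date(2015, 8, 1),
)

def content_expired(config: ReportContentConfig, today: date) -> bool:
    """Flag report content that is due to be re-checked against current rules."""
    return today > config.valid_through

print(content_expired(SAMPLE_CONFIG, date(2015, 9, 1)))  # True: time to review headers and cuts
```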

 


A Hope

 

Presently, most data systems do not present data in an over-the-counter (i.e., easy to understand and use) format. Given the varied assessments used from state to state, the varied home states of DSRP vendors striving to stay familiar with so many assessments, and the fact that DSRP companies are not typically staffed by educators who have used all of these varied assessments, the challenges of tailoring each report’s design and supports to its particular assessment are understandable.

 

The more common assessments become from one state to the next, the more familiar those who report the assessments’ results can become with the best ways to display and support the use of that data. The standardized, widespread nature of CCSS assessments will make it easier for data systems and reports to more actively facilitate accurate data use.

 

Now it is up to DSRPs (e.g., vendors) to rise to the occasion. When they do, educators will have an easier time using data successfully. This means they will have a powerful complement to all they do to help students.

 

Visit www.jennyrankin.com/subscribe to subscribe to this blog.