Parent Perspective: State Assessments Under ESSA

About state assessment results, parents have said they want to know and understand:
  • How does this information help me and help teachers to provide a better learning environment for my child?
  • What does this mean? Is she doing well or not doing well? 
  • How will teachers get the information they need?
(Note to DESE: Look at, and think about, what you're going to say to parents!)

One of the things ESSA has done is give States more flexibility* in education governance. It's not a perfect system; it's a series of trade-offs and choices.

Those trade-offs and choices begin with the question: What do you want the test to do? If one goal is to spend less time on testing, are we trading off content coverage, which may yield less reliable data than a longer test would? How well is the assessment aligned to the state's standards? Both questions are striking, given that Massachusetts is in the midst of developing its 2.0 test.

(I'm told every testing vendor, if asked, can give an analysis of how well the questions line up with state standards. Note to self!) 

B: Align to Standards (ELA/Literacy) Criteria (L) & Evidence (R)
Also important is understanding what makes a high-quality assessment. The CCSSO criteria outline principles for evaluating test quality, including testing for "depth of knowledge" (B4). Constructed responses, writing tasks, and performance assessments are all item types associated with high-quality assessment. (They also take more time and cost more money to score; hence, the trade-offs!)

It seems obvious to state this, but apparently it's something frequently overlooked until quite late in the development process: To ensure a test is actually assessing higher order thinking skills, some questions should ask students to do the kinds of things we actually want them to do in classrooms.

Questions this parent still has:
  • What information will be reported?
  • How will information be reported?
  • When will assessment results be shared with parents, teachers, and policymakers? (If teachers don't get info on how their students did until November, it really isn't going to do them any good in terms of making changes to their practice or lesson plans.)
  • How transparent and accessible will those results be?
  • Are we monitoring everything we're doing?
  • How?
  • Do we understand the capacity of our local districts?
    • Do they have enough computers/devices?
  • Is there sufficient bandwidth? (Many times parents feel as though the testing sessions go on for weeks and weeks; even though it's not happening to every kid, the perception is that no other learning is taking place in the school.)
  • Is DESE ready for new reporting requirements by subgroup ("N-size") to include military students?
Upshot: Where state assessments are concerned, DESE can't do it all and can't do it all well. Test developers decide which content standards to prioritize (and the Board didn't establish a list of content standard priorities). Given that state assessments are going to continue, and that the test represents the particular skills and knowledge the test developer has chosen to assess, I'll be referring to the above questions as the Board considers its ESSA decisions concurrent with the development of the MCAS 2.0.
-----
*ESSA also provides for additional flexibility under the Innovative Assessment Pilot, but Massachusetts isn't taking this on at this time.

US Department of Education (July 2016): ESSA Assessment Fact Sheet

NASBE (January 2016): The State Education Standard

Ensuring Equity in ESSA: The Role of N-Size in Subgroup Accountability