The Smarter Balanced Assessment Consortium (SBAC) launched a field test for students during the spring of 2014. The year before this launch, the Smarter Balanced Practice Test was implemented “to give students a grade-specific (3–8 and 11) experience taking a test that was similar in format and structure to Smarter Balanced assessments. The practice tests include a variety of items (approximately 30 items each in ELA and math).” After the initial launch, new features, including Spanish glossaries and American Sign Language videos, were added.
The Smarter Balanced Training Tests were instituted as a field test to prepare test administrators, as well as students, for standardized tests developed to address Common Core requirements. These tests are also provided for English language arts (ELA) and math. Per the Practice and Training Test Information Page, “It is designed to provide students with an opportunity to become quickly familiar with the software and interface features that will be used in the field test. These training tests are available by grade band (3–5, 6–8, high school) and have approximately eight to nine math and six ELA items per grade band. The training tests also provide students with an opportunity to experience all of the universal tools, designated supports and accommodations that are available on the field test. A complete list of these resources is available at www.smarterbalanced.org.” The site also offers a handy FAQ section.
Smarter Balanced recommends that all students take both computer-based tests, practice and training, to “become familiar with the system’s functionality and interface (training test).” The training test is not scored.
The testing interface follows the typical rules of standardized tests: students are required to answer all test items on a page before they can proceed to the next page. A nice feature is the ability for students to flag questions to which they would like to return, and to review and change previously answered items (although there is a bit of a negative here that I will address later). Students may not return to a segment (in those tests with multiple segments) after it has been completed.
The negative to which I referred in the previous paragraph is the concern over what Smarter Balanced calls “Pause Rules.” A pause rule is applied for several reasons: the student pauses the test, the Test Administrator pauses the test, or a technical issue occurs, such as a power outage or network failure. These situations cause a student to be logged out and, therefore, required to log back in to resume testing. The upside is that when logging back in, the student is taken immediately either to the first page that contains an unanswered item or to where he or she was prior to the shutdown. The downside is that in a non-performance task (non-PT), the student is unable to resume if the 20-minute pause limit has been reached. Performance Tasks (PT) do not have a pause time limit. To familiarize yourself with all the rules, see page 13 of the Test Administrators’ Guide, Spring 2014. This guide is a 108-page instruction manual with many additional highlighted sites where a Test Administrator can find answers about the interface.

As a teacher/professor who has administered numerous computer tests, I question the ease of use of this interface. My doubts were also echoed and documented by John Fleischman, Assistant Superintendent, Technology Services for the Sacramento (CA) County Office of Education, in what he calls “An Unscientific Anecdotal Review of the SBAC Scientific Pilot Test.” In their review, Mr. Fleischman and the review board identified the following concerns: user interface or design; computer hardware; network connectivity; and technical support. Specific to the user interface or design was frustration with the lack of word-processor-style functionality and the absence of “cut,” “paste,” “undo,” “redo,” and “select all” options in the browser. Text highlighting was also unavailable.
Additionally, the “pause & rewind” function was “challenging for some.” Differences between small and large screen displays required students to do more scrolling, and in some cases the extra time led to their being “timed out” (per the pause rule), because scrolling is not recognized as active involvement by the test taker. And where headset volume needed adjusting, users had to exit the test.
Although computer-based testing is undoubtedly the best way to implement standardized testing, programs such as those applied by SBAC are the only way to perfect its implementation for a computer-savvy student population. I applaud this “run-through” of the Practice and Training Tests, which prepares students to recognize the interfaces of standardized tests and yields the resulting corrections and improvements.