Rachel Rich, a member of CAPE (Community Alliance for Public Education; check them out on Facebook) in Eugene, has been doing some great, comprehensive work lately exposing Smarter Balanced Assessment Consortium (SBAC) testing as the destructive presence it is. Here is a compilation of her recent testimony on Senate Bill 351, legislation that seeks to straighten out the botched audit of SBAC, which was far too sketchy and nowhere near comprehensive enough. Rich’s testimony is long but worth the read. It is followed by her updated numbers after some further research.
I urge the passage of SB 351 to more thoroughly audit the costs of state mandated Smarter Balanced standardized tests – in finances, time and other resources. I also endorse SB 354 to provide parents consistent, prompt and objective information on whether each standardized test benefits their own individual student and how to opt out if it doesn’t.
Last year’s House audits of Smarter Balanced stated they wouldn’t discuss classroom impacts or test quality, which should be central. A new audit should also address costs beyond ODE contracts for test administration, such as the enormous expansion of ODE personnel, training and infrastructure, as well as districts’ costs for adding test-related positions, substitutes to set up and proctor, computer hardware and software, data management, technical services and training.
A new audit should also propose more solutions to the problems it finds. From last year’s audit report, document #2016-21, I pulled out a partial list of flaws in the test’s design and execution. Here they are in order and verbatim:
- “results are not well-suited to inform instruction or individual educational decisions at the student level
- students taking between 18-23 hours to take the tests
- additional staffing and resource demands on the entire school
- (needing) new staff or substitutes
- training displaces professional development
- test administration can take up meeting time at schools
- testing tied up computers for months
- less instruction time, fewer support services, and less access to common resources for all students during testing
- multiple reports of computers freezing and accommodations, such as text-to-speech, not working properly
- work …lost
- anxiety or pressure
- disruption and stress
- challenges exacerbated by the length of the test
- impacts fall hardest on vulnerable populations”
No test in history has ever caused so many problems or yielded such unusable results, yet the ODE justifies this for the sake of “systems level … accountability”. All other tests report specific skills or standards for specific students that help teachers target instruction, such as whether a kid understands decimals. Instead, Smarter Balanced only reports “Claim #2 – Student can solve a range of complex, well-posed problems”. There is no excuse for this. Other tests are helpful both for instruction and for comparing individual or group performance “for accountability.” We’ve known this since the SBAC pilot in 2013. It’s time for a change.
Further, the new audit must count all testing costs. Not finding any ODE discussion of this, I asked Representative Susan McLain’s office for school budget records between 2010 and 2014 covering likely test-related budget items, pre- and post-Smarter Balanced. (I chose budget items based on Smarter Balanced test manuals, an AFT study, etc.) 2010 was the starting point since OAKS alone was in use then, and 2014 represents the full implementation of Smarter Balanced, as well as the last set of complete records for all Oregon districts at that time. I used public records.
Here’s one example of what I found: additional substitutes needed to set up and proctor the new tests raised expenditures this way:
121 – Substitutes-licensed $3,887,787
122 – Substitutes-classified $2,594,894
In total, districts’ test-related budget items increased by $86 million between 2010 and 2014. That averages out to about $22 million annually, so over the past seven years schools themselves have spent roughly an extra $154 million to both develop and administer the new tests.
Additionally, an ODE contract with AIR test administrators (annual or biennial?) was for $27.5 million, as opposed to the old test, which Rob Saxton said totaled only $3.5 million. SBAC has been administered statewide from 2013 to now, meaning that since then there have been two to four AIR contracts for test fees and administration totaling about $55-110 million. The ODE says its contract is now a few million less, but that is a drop in the bucket of the total costs.
Further, a 2009 ODE grant application to develop Smarter Balanced outlines their own expanded infrastructure, personnel and training for $202 million. The maintenance of this expansion continued after the grant expired – for added staff, equipment, bandwidth, data storage and training. At an average of $50 million yearly, the subsequent three years after the grant would total about $150 million.
To summarize, depending on whether you count only out-of-pocket expenditures or include the grant as part of total test-related spending, and whether the AIR test administration contract is annual or biennial, Smarter Balanced has increased state expenditures by approximately $350-600 million!! Those are only some of the increases, not total costs. They also don’t include the new ESD, district or school personnel required for this more elaborate testing system.
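That $350-600 million range can be reproduced directly from the figures above. Here is a minimal sketch in Python, my own back-of-the-envelope check rather than anything from the audit, assuming the per-item estimates quoted in this testimony (the contract cadence and the treatment of the grant are the two unknowns that set the low and high ends):

```python
# Reproducing the rough $350-600 million range from the figures cited
# above. These are the testimony's estimates, not audited numbers.
# All amounts in millions of dollars.

AIR_CONTRACT = 27.5        # per AIR contract (test fees and administration)
DISTRICTS = 154            # extra district spending, ~7 years at ~$22M/yr
ODE_OUT_OF_POCKET = 150    # post-grant maintenance of ODE expansion
ODE_WITH_GRANT = 350       # ODE expansion if the $202M grant is counted

# Low end: the AIR contract is biennial (two contracts since 2013)
# and only out-of-pocket ODE costs are counted.
low = 2 * AIR_CONTRACT + DISTRICTS + ODE_OUT_OF_POCKET

# High end: the contract is annual (four contracts) and the grant
# is included in total test-related spending.
high = 4 * AIR_CONTRACT + DISTRICTS + ODE_WITH_GRANT

print(f"${low:.0f}-{high:.0f} million")  # falls in the $350-600M range
```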
Are we getting our money’s worth? The most highly respected national test, the NAEP, shows Oregon has made no improvement under the tutelage of Smarter Balanced. Worse, teacher training outside of the testing theme is down by $11 million, and funding for key student services like reading support, talented and gifted, and psychological services is also in the red, suggesting that although we test to identify needs, we aren’t committed to addressing them.
Further, the shift in emphasis, funding and personnel towards math and English and away from electives has created a new set of problems: students feel they have few avenues for developing their own talents and interests outside those dictated by adults obsessed with data and accountability.
In short, spending so much time and so many resources on testing is not only futile, but it robs students of their love of learning. I think we need SB 351 and 354 to hold the adults accountable.
Rachel Rich’s Assessment of SBAC budget costs.
How much does your state spend on testing? What are the trade-offs? Rachel Rich, 4-10-17
Since starting Smarter Balanced, Oregon’s spending on standardized testing has exploded by over $550 million! That’s according to one ODE contract, a grant and their own budget spreadsheets. Yet key student services have risen less than one feeble million and teacher training is down many millions! How can we expect more from students and teachers when we support them less?
Two years ago I had never heard a peep about testing costs, so I decided to investigate. Rustling through Oregon Department of Education records, I found this:
A 2014 contract shows AIR charged Oregon $27.5 million for two years of Smarter Balanced. That’s an average of $14 million annually, while the previous computerized test was a mere $3.5 million. After Smarter Balanced went statewide in 2013, it continued racking up fees totaling over $50 million!
Further, the ODE 2010-14 Race to the Top grant shows they expanded their office personnel, training and infrastructure by a whopping $202 million. The grant expired, but nobody fired the new help or threw the equipment out the window, so costs continued. At an average of $50 million annually, seven years of maintaining ODE expansion now total over $150 million out-of-pocket, or $350 million overall!
Next, school districts themselves spent extra to accommodate longer and more elaborate tests. According to Smarter Balanced testing manuals, an American Federation of Teachers study, two state audits, and friends who are administrators, school board members, test coordinators and tech specialists, school districts had to add:
- Extra substitute teachers to set up and proctor
- Test-focused professional development
- New hardware and software
- Increased technical services
- Additional data management
Using those as a reference, I requested ODE spreadsheets of related budget items. A friendly legislator expedited the process. The first spreadsheet was for 2010, when the state used the old test and started designing the new. The last spreadsheet was for 2014, when Smarter Balanced was already underway, which happened to be the last year of available data. Here is the difference between test-related spending before and after Smarter Balanced, listed by budget item:
121 – Substitutes-licensed – to act as proctors: $3,887,787
122 – Substitutes-classified – prepare for and proctor tests: $2,594,894
470 – Computer software – system updates for testing: $26,804,376
480 – Computer hardware – additional computers: -$1,095,277 loss (grants offset costs for new computers)
380 – Technical services – typically for testing: $8,262,137
390 – Other tech services – typically for testing: $23,896,065
2660 – Technology services – typically for testing: $14,430,357
2210 – Improvement of instruction – typically test PD: $4,084,000
2240 – Staff development (paid) – typically for testing: $982,613 (mostly covered by ODE grant until 2014; test-focused regular staff meetings not included)
2630 – Information services – manage increased SB data: $1,500,554
2230 – Assessment and testing – other than state tests: $614,948
2670 – Records management: $93,225
Increased test-related expenditures for all Oregon districts, 2010-14: $86,055,679
This averages out to an increase of $22 million annually or more than $150 million since 2010!
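Those budget items sum exactly to the stated $86,055,679 once the computer hardware line is read as a decrease (its grants note says the costs were offset). A quick check, my own sketch rather than part of the testimony:

```python
# Verifying the district spending table: summing each budget item's
# change from 2010 to 2014. Computer hardware (480) is entered as a
# decrease, since grants offset those costs.

changes = {
    "121 Substitutes-licensed":        3_887_787,
    "122 Substitutes-classified":      2_594_894,
    "470 Computer software":          26_804_376,
    "480 Computer hardware":          -1_095_277,  # offset by grants
    "380 Technical services":          8_262_137,
    "390 Other tech services":        23_896_065,
    "2660 Technology services":       14_430_357,
    "2210 Improvement of instruction": 4_084_000,
    "2240 Staff development":            982_613,
    "2630 Information services":       1_500_554,
    "2230 Assessment and testing":       614_948,
    "2670 Records management":            93_225,
}

total = sum(changes.values())
print(f"${total:,}")  # $86,055,679 -- matches the stated total
```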
That doesn’t count the extra personnel schools needed! Smarter Balanced manuals show districts were expected to add testing personnel or else divert staff away from their primary duties. As a consequence, district and school test coordinators, test administrators, regional ESD partners, data managers, and technology specialists popped up like dandelions on a baseball field.
Summary of Oregon’s increased testing-related expenditures since Smarter Balanced:
AIR Contract – $14 million x 4 years – $50+ million (since 2013)
ODE expansion – $50.5 million x 7 years – $350+ million (since 2010)
District expenditures – $22 million x 7 years – $150+ million
District test personnel – ? – ?
Totals – about $87 million annually – $550+ million ($350+ million out of pocket)
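The multiplication in that summary can be checked in a few lines. This is my own sketch using the annual figures stated above (the author rounds totals down, so $563.5 million is reported as “$550+ million”):

```python
# Checking the summary table's arithmetic, using the annual figures
# stated above. All amounts in millions of dollars.

items = {
    "AIR contract":          (14.0, 4),   # since 2013
    "ODE expansion":         (50.5, 7),   # since 2010
    "District expenditures": (22.0, 7),
}

annual_total = sum(per_year for per_year, _ in items.values())
grand_total = sum(per_year * years for per_year, years in items.values())

print(f"annual: ${annual_total}M")  # $86.5M/year, i.e. "about $87 million"
print(f"total:  ${grand_total}M")   # $563.5M, i.e. "$550+ million"
```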
Meanwhile, expenditures for key student services (like nurses and summer school) grew by only half a million dollars in four years. Given that we have half a million students, that’s an increase of $1 per pupil, or just 25 cents per pupil per year!!! Clearly, over-spending on tests is choking out services. Check out the difference between 2010 and 2014, listed by state budget item:
1113 – Elementary extra-curricular $50,059
1122 – Middle school extra-curricular $3,912
1132 – High school extra-curricular $160,875
1140 – Pre-K $226,488
1210 – Talented and Gifted -$196,181 loss
1220 – Restrictive programs for disabled $442,655
1250 – Less restrictive programs for disabled $1,006,129
1260 – Early Intervention (SPED) $974,795
1271 – Remediation $518,935
1272 – Title I -$6,228,523 loss
1291 – English Language Learners $467,229
1400 – Summer school $4,120
2130 – Health services $691,049
2120 – Guidance services $1,560,981
2140 – Psychological services -$376,844 loss
2150 – Speech pathology and audiology $919,983
2190 – Services Directions and Student Support $285,341
Total increases for Oregon’s key student services from 2010-14: $505,003
At an average of $125,000 annually, school services have increased a mere $875,000 since 2010!
Notice: Psychological Services, Talented and Gifted, and Title I reading assistance actually fell by nearly $7 million! Summer school, which helps kids who’ve fallen behind, rose by a lousy $4 thousand! And that doesn’t even include losses to electives like music, art, shop, PE, and civics!
Also, teacher training that’s not test-related is down $11 million! Yet any professional needs updates on best practices. Budget item 310 – Professional development: -$10,830,571 loss
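The per-pupil figures and the combined losses above can be reproduced from the table. Again, this is my own quick check of the arithmetic, assuming the round half-million student count the author uses:

```python
# Checking the student-services arithmetic with the figures above.

total_increase = 505_003   # all key student services, 2010-14
students = 500_000         # "we have half a million students"

per_pupil = total_increase / students
print(f"${per_pupil:.2f} per pupil over four years")  # about $1.01
print(f"${per_pupil / 4:.2f} per pupil per year")     # about $0.25

# The three services that actually shrank:
losses = {
    "Talented and Gifted":    -196_181,
    "Title I":              -6_228_523,
    "Psychological services": -376_844,
}
print(f"combined loss: ${sum(losses.values()):,}")  # $-6,801,548, i.e. nearly $7M
```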
Clearly, out-of-control test spending is robbing schools of vital programs and services! Why do we accept this? Why do we test, supposedly to identify needs, when we aren’t committed to addressing them?