We keep hearing about data, and how data analysis is going to help education chart its own course toward salvation. I’ve been swimming in a sea of data lately, trying to make out some landmarks.
When the accrediting process for independent schools added a kind of data requirement a couple of years back, a real one this time, there was an expectation that schools, seeing where the bus was going, would merrily hop aboard. Most schools already had some measurement tools in place–ERB scores, admission tests, PSAT and other college-admission testing records–and so it was just a matter of making some correlations and seeing what changes, to the math program or the English curriculum, were going to be needed. Easy-peasy.
Shortly after this change to the accreditation standards, I did a project for the National Association of Independent Schools on the processes schools were using to go about their data-informed decision-making, and the results were a little surprising. Schools had the data, mostly, but very few had any idea what to do with it or how to create protocols for analyzing it or applying the results. Apparently, with all the proliferation of new kinds of administrative positions, schools had forgotten to sign on data analysts or institutional researchers. Except in rare cases, few schools looked closely even at their ERB or PSAT scores as reflections of their programs.
NAIS has been working to help schools build the infrastructure and the capacity to use data to understand themselves better. Much of this work has been done on the business side, with dashboards and even bigger data sets to show for their efforts.
Me, I’ve been trying to figure out what use to make of things like SSAT and ISEE scores, breaking down my school’s admission testing data into all kinds of categories and permutations. The SSAT helpfully offers projected SAT scores, so I’ve been looking at how those work out (considering they prognosticate in score bands of 50 to 140 points, it’s hard to know their exact value) and then how incoming students’ percentile scores (normed for a population applying to selective schools) match up with their percentile scores on tests like the ACT and SAT (normed for much larger groups). I’m also tracking the projections from kids’ 8th-grade EXPLORE to 10th-grade PLAN to senior ACT scores, just (naturally) as the ACT folks are about to change their whole system. (Quick takeaway: Individual classes do seem to have their own collective character, confirming what teacher instinct has held for ages.)
It is work worth doing, even if we’re still in the shallow end. It helps us see whether we are doing as a school what we claim to be doing, and whether there are kids or groups whose performance differs from that of our students generally. We’ve now offered the CWRA for three years, and in time we hope to make better use of its results, since it seems to measure capacities we say we focus on. Earlier this week we administered the High School Survey of Student Engagement, the HSSSE, through a joint effort of Indiana University and NAIS. The HSSSE has the advantage of being easy to give and to complete, not terribly expensive (about four bucks a kid for us, using the paper version and including the reports), and of providing an in-the-moment picture not only of how engaged students are with school and their work but also of what their lives are like as students in our community. It’ll be interesting to see what we learn.
I’m confident that we are learning something in all of this, but I have to remind myself to take a deep breath every time I spot a provocative trend or what looks like a disturbing anomaly–the sky is not falling, and few data sets amount on their own to an Aha! moment. But there is definitely stuff we can use to make our school a better place.
What keeps me grounded as I burrow around is my constant awareness that our sample sizes are small and that each little number represents the efforts of a person. Public school teachers, parents, and students have begun to resist the waves of standardized testing that seem to be transforming public schools into giant data-collection sites. The testing regimes under which kids and teachers in many places live are a vacuum sucking the heart out of education, processing alleged evidence of student learning–bubble test upon bubble test–into nothing more than numbers that can be used not only to “prove” almost anything but also to reduce the students themselves to data points, even as they are the ones whose time, energy, and anxieties feed this monster. No wonder dystopian fiction is so popular with kids these days.
No, kids are not data points, not at my school nor anywhere else. The moment anyone at a school loses sight of that distinction is the moment someone needs to start questioning this whole data thing. Kids are kids, and the data they offer us–especially from limited, snapshot instruments like the HSSSE and standardized tests–is at best a poor measure of who they truly are and what they will accomplish in the fullness of time.