Updated: Dec 1, 2019
Let's think back to 1993 for a moment.
Jurassic Park was the biggest hit of the summer.
Meat Loaf's "I'd Do Anything for Love" was the top-selling single.
Ann Richards was the governor of Texas.
And this guy was all the rage:
Also in 1993, the 73rd Texas Legislature was in session. Education was a major point of emphasis due to continued issues related to the Edgewood ISD v. Kirby decision a few years prior.
As a result of the session, Senate Bill 7 overhauled school finance and accountability, creating several of the systems and processes we still have today (e.g. Chapter 41, annual assessments beginning in 3rd grade). While many of these systems already existed in part, what SB7 created more closely resembles current school finance and accountability in Texas.
SB7 also set a few broad goals for public schools, one of which seems like it could have been written yesterday:
The achievement gap between educationally disadvantaged students and other populations will be closed.
That was written 26 years ago.
We're still fighting the same battle today.
It's difficult to do a direct comparison of academic gaps in 1993 versus today, primarily because we're two major test iterations---TAKS and STAAR---removed from the TAAS test; however, one thing we can do quite easily is assess whether the achievement gap between educationally disadvantaged students and other populations (i.e. non-educationally disadvantaged students) still exists.
...the gap remains.
To be sure of this fact, I pulled the 2019 district and campus accountability data tables, specifically focusing on STAAR performance. The question I wanted to answer is this:
Does STAAR performance correlate to economically disadvantaged rates at campuses and districts?
Pulling district data for this purpose was fairly straightforward. For campus data, however, given the vast size of Texas and the wide variety of grade-level arrangements across small and large districts, I focused on campuses with the more traditional grade brackets: K-5 for elementary, 6-8 for middle school, and 9-12 for high school.
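As a rough sketch of that filtering step (the column names here are hypothetical and purely illustrative, not TEA's actual field names), keeping only campuses with traditional grade brackets might look like this:

```python
import pandas as pd

# Hypothetical campus-level data; TEA's actual accountability
# tables use different field names than these illustrative ones.
campuses = pd.DataFrame({
    "campus": ["A", "B", "C", "D"],
    "low_grade": ["KG", "06", "09", "PK"],
    "high_grade": ["05", "08", "12", "08"],
})

# Traditional grade brackets: K-5 elementary, 6-8 middle, 9-12 high school.
traditional = {("KG", "05"), ("06", "08"), ("09", "12")}

# Keep only campuses whose grade span matches one of those brackets.
mask = campuses.apply(
    lambda row: (row["low_grade"], row["high_grade"]) in traditional, axis=1
)
traditional_campuses = campuses[mask]
```

Campus D (a PK-8 configuration) falls out of the subset, which is exactly the kind of non-traditional arrangement this filtering is meant to exclude.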
The data that follows is based purely on STAAR performance, not overall district A-F ratings as I discussed in one of my previous posts. My purpose here was simply to determine whether, 26 years after our legislators set a goal to close the achievement gap, it is in fact closed.
A brief aside for any readers from non-education fields:
The analysis that follows looks at three performance levels for STAAR assessments: Approaches Grade Level, Meets Grade Level, and Masters Grade Level.
If you want a full description of performance levels, you can read TEA's full description here, or you can read my partial discussion of STAAR performance levels on this post.
The short version: Approaches Grade Level is what most people understand as "passing" a test. While that is an oversimplification of what Approaches actually means, that understanding is at least a good starting point.
Meets simply means a student answered a larger percentage of questions correctly on their STAAR test, and Masters means they answered an even larger percentage correctly.
While, again, this is a gross oversimplification of what these three performance levels tell us about what a student does and does not know within specific content areas, for our purposes today---identifying an achievement gap---a basic understanding is all we need.
Also, another quick aside: you may have noticed that the legislators set a goal using the phrase "educationally disadvantaged" rather than "economically disadvantaged." In most instances, these terms are used interchangeably when discussing legislative issues.
Let's start with a higher-level analysis of how STAAR performance stacked up against economically disadvantaged rates for districts across the state.
At first glance, it is fairly obvious that there is at least some correlation between a district's overall economically disadvantaged rate and how they perform on STAAR.
I've included the R-squared values for these charts in the upper right-hand corner so that we can get a sense of how these correlations stack up against each other.
For those unfamiliar with R-squared values, you can read a great summary here of the strengths and weaknesses of using them. In short, an R-squared value tells you how much of the variation in one variable is explained by its relationship with another. The closer the R-squared value is to 1.0, the stronger the correlation.
This sort of analysis doesn't take into account any potential biases in the data or other outside factors that may need to be considered. However, at a base level, we can at least use these values to see that as we move up the performance level range, our correlation becomes stronger.
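For readers curious about the mechanics behind those values, here's a minimal sketch of computing R-squared for a simple linear fit. The numbers are purely illustrative stand-ins, not the actual TEA data:

```python
import numpy as np

# Illustrative data only: EcoDis rate (%) and percent of students
# scoring at a given STAAR performance level for a handful of
# hypothetical districts.
ecodis_rate = np.array([5, 20, 40, 60, 80, 95], dtype=float)
pct_meets = np.array([72, 65, 55, 48, 40, 33], dtype=float)

# Fit a least-squares line, then compute R-squared: the share of
# variation in performance explained by the fitted line.
slope, intercept = np.polyfit(ecodis_rate, pct_meets, 1)
predicted = slope * ecodis_rate + intercept
ss_res = np.sum((pct_meets - predicted) ** 2)   # residual sum of squares
ss_tot = np.sum((pct_meets - pct_meets.mean()) ** 2)  # total sum of squares
r_squared = 1 - ss_res / ss_tot
```

With data this cleanly linear, R-squared lands close to 1.0; the real district-level scatter is far noisier, which is why values like 0.39-0.47 still represent meaningful correlations.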
When we look at Approaches against district economically disadvantaged (EcoDis) rates, we can see that districts with lower EcoDis rates have a higher percentage of students scoring at the Approaches grade level and districts with higher EcoDis rates have a lower percentage of students scoring that level (R-squared = 0.3911).
As we progress up the performance level range, however, this correlation only strengthens, with low-EcoDis districts having a higher percentage of students scoring at Meets and Masters than high-EcoDis districts (R-squared = 0.4731 and 0.4632, respectively). Of the two dozen districts with EcoDis rates below 10%, only three have Masters rates below 30%. Compare that against the dozens of districts with EcoDis rates over 80%, a group that included only two districts with Masters rates above 30%.
According to TEA, students who score at Masters will need "little or no academic intervention" in order to succeed in the next grade level. When one district---one with an EcoDis rate under 10%---has 40-50% of their students needing little or no academic interventions and another district---one with an EcoDis rate over 80%---only has 10% of their students who need little or no academic interventions, we have pretty solid proof that an academic achievement gap still exists between students of poverty and non-economically disadvantaged students.
We could use this as a stopping point and simply acknowledge the continued existence of an achievement gap; however, when we look at campus data, we see that the gap is even greater at the elementary and middle school levels.
As you can see, the correlation between campus EcoDis rates and STAAR performance exists regardless of the grade level, with the greatest correlation occurring at middle school campuses.
I mean, look at the R-squared values for Meets and Masters at middle school: 0.66. Take some of our most vulnerable, hormone-ravaged, impressionable youth and toss the stresses of poverty on top...it shouldn't be a shock that things get out of whack academically.
No doubt some of you will notice that the correlation breaks down a bit (but doesn't disappear) when you get to high school. There could be a number of reasons for this, but my first assumption is that it has something to do with three things:
1) An increased amount of retesting, which gives students more opportunities to pass their assessments,
2) Low overall Masters rates across the state, including at low-EcoDis campuses, and
3) A higher number of "academy"-type campuses at the upper secondary levels (e.g. Early College High Schools, fine arts academies).
Regardless, the correlation exists. The achievement gap remains.
So...now what? What do we do with this information?
Well, for one: hope. Hope that our legislators follow through on some of the initiatives they've put in place via HB3: increased funding for early literacy, increased funding for teacher incentive pay, and increased funding for compensatory education (which, in theory, directly addresses the achievement gap).
The promises are there.
However, as I browsed the Legislative Reference Library during the Thanksgiving break---yes, I am that kind of nerd...Hello legislative records dating back to 1846!---one thing was painfully clear: each and every legislative session comes with promises.
And more often than not, those promises are broken. Or revised. Or removed altogether.
I'd like to think we're in the midst of an educational revolution. An awakening of sorts, I guess. An era where educators use the communication powers of social media to hold our lawmakers accountable for fulfilling what they've promised.
I started this blog, in part, for that very purpose.
Here's to continued accountability, not just for educators, but for everyone.