The Worthless Innovation of Dropout Prediction Algorithms
An algorithm that can predict the likelihood of young students completing high school on time sounds like a great idea. The AI could identify high-risk students early enough for real intervention. Educators could refocus their resources on the students who need the most help. And a school system could lower its dropout rates over time.
Great in theory but abysmal in practice.
One such system was piloted in Wisconsin (the state I grew up in) starting in 2012 (my 9th- and 10th-grade years). It’s called the Wisconsin Dropout Early Warning System, or DEWS:
DEWS is an ensemble of machine learning algorithms that are trained on years of data about Wisconsin students—such as their test scores, attendance, disciplinary history, free or reduced-price lunch status, and race—to calculate the probability that each sixth through ninth grader in the state will graduate from high school on time. If DEWS predicts that a student has less than a 78.5 percent chance of graduating (including margin of error), it labels that student high risk.
Twice a year, schools receive a list of their enrolled students with DEWS’ color-coded prediction next to each name: green for low risk, yellow for moderate risk, or red for high risk of dropping out.
Our investigation found that after a decade of use and millions of predictions, DEWS may be negatively influencing how educators perceive students of color. And the state has known since 2021 that the predictions aren’t fair. – The Markup
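To make the mechanics concrete, here’s a minimal sketch of how a cutoff like that turns a model’s predicted probability into a color-coded label. Only the 78.5 percent high-risk threshold comes from the reporting; the yellow/green boundary, the function name, and the example value are my own placeholders.

```python
# A minimal sketch of the thresholding The Markup describes. Only the 78.5%
# high-risk cutoff comes from the reporting; the moderate/low boundary and
# the model producing the probability are hypothetical stand-ins.
HIGH_RISK_CUTOFF = 0.785      # from The Markup's reporting
MODERATE_RISK_CUTOFF = 0.90   # hypothetical yellow/green boundary

def risk_label(p_graduate: float) -> str:
    """Map a predicted on-time graduation probability to a color-coded label."""
    if p_graduate < HIGH_RISK_CUTOFF:
        return "red"      # high risk of not graduating on time
    if p_graduate < MODERATE_RISK_CUTOFF:
        return "yellow"   # moderate risk
    return "green"        # low risk

# Example: a student with an 80% predicted chance of graduating on time
print(risk_label(0.80))  # -> yellow
```

Notice how blunt this is: a student at 78% and a student at 20% land in the same red bucket, and everything downstream depends on how educators interpret that color.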
WTF? Following Worthless AI Predictions
Nearly a decade into its use across the state, the data shows how badly this system fails. And specifically how badly it fails students of color.
When DEWS predicts that a student will graduate, it’s usually right – 97% of the time those students graduated in the standard four years, according to the 2021 validation test. But when DEWS predicts that a student won’t, it’s usually wrong – 74% of the time those students graduated on time anyway, according to the same test.
An internal Department of Public Instruction equity analysis conducted in 2021 found that DEWS generated false alarms about Black and Hispanic students not graduating on time at a significantly greater rate than it did for their white classmates.
The algorithm’s false alarm rate – how frequently a student it predicted wouldn’t graduate on time actually did graduate on time – was 42 percentage points higher for Black students than white students. The false alarm rate was 18 percentage points higher for Hispanic students than white students. – Chalkbeat
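The “false alarm rate” in that analysis has a precise meaning: among students the model predicted wouldn’t graduate on time, the share who actually did. Here’s a minimal sketch of that computation, overall and broken out by group. The records are invented toy data to show the mechanics; the numbers it prints are not DPI’s.

```python
# Sketch of the error rate quoted above, on invented toy data.
# "False alarm rate" follows the article's definition: among students the
# model flagged as unlikely to graduate, the share who graduated on time.
from collections import defaultdict

# Hypothetical records: (predicted_will_graduate, actually_graduated, group)
records = [
    (True, True, "white"), (True, True, "Black"), (True, False, "white"),
    (False, True, "Black"), (False, True, "Black"), (False, False, "white"),
    (False, True, "white"), (False, False, "Black"),
]

def false_alarm_rate(rows):
    flagged = [r for r in rows if not r[0]]      # predicted won't graduate
    if not flagged:
        return None
    graduated = sum(1 for r in flagged if r[1])  # but actually graduated on time
    return graduated / len(flagged)

print(f"overall: {false_alarm_rate(records):.0%}")

# Breaking the same rate out per group is how a disparity like DPI's shows up.
by_group = defaultdict(list)
for r in records:
    by_group[r[2]].append(r)
for group, rows in by_group.items():
    print(f"{group}: {false_alarm_rate(rows):.0%}")
```

The point of the per-group breakdown is that an aggregate number can look merely sloppy while hiding the fact that the errors fall disproportionately on Black and Hispanic students.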
The Markup spoke with many Wisconsin teachers and found that most were given these reports but no guidance on how to use the data to actually make a difference.
As a result, not much has changed since the system was implemented. In Wisconsin, the gap between Black and white students’ reading and math test scores has been the widest of any state in the nation for over a decade. And dropout rates have remained almost the same.
The question I have: Is this system predicting or reinforcing dropout bias?
Hits Close to Home
Growing up in Milwaukee, Wisconsin, I was exposed to the realities of the racial divide. It’s a city where the crime rate is scarier than the poverty rate. The city took away the mall, the movie theater, and all the other places where we could go to be normal teens.
My Black peers and I were an afterthought. Fewer than 10% of the Black male peers who started high school with me finished it with me.
Were they labeled high risk? Were they bound to drop out regardless, or were they victims of bias? Did our teachers see that they were flagged as high risk and decide not to pay closer attention to their progress?
It also makes me wonder how I was labeled (or how I would’ve been), given my socioeconomic status, single mother, race, propensity to skip classes for self-taught entrepreneurial ventures, and occasional run-ins with the deans. What about my two younger siblings?
I’ll never know. But what I know for certain is that DEWS wouldn’t have predicted I’d finish high school at 17 – a year earlier than all of my peers.
Biased and discriminatory algorithms are a widely documented problem in machine learning. From discriminating in hiring (HBR) to predicting criminal recidivism (New York Times) to deciding which patients to treat (Washington Post), AI is reinforcing many of the same biases present in humans.
When a machine learning model learns only from historical data, it’s bound to repeat the mistakes of the past.
What’s sad about DEWS is that systems like it are still in use in Wisconsin and at least 11 other states, including Hawaii, Kentucky, Maryland, Massachusetts, Michigan, Montana, Nevada, New Jersey, North Carolina, Rhode Island, and West Virginia.
This is why making it easier for more people to design and deploy algorithms for good is so crucial. Because clearly, the ones in place today aren’t ideal.