Privacy SOS

Predicting student outcomes by data-mining their information: What are the risks?


You might have read lately that elementary, middle, and high school administrators are using high-tech tools to track and monitor students as they move through public school systems. But for students who go on to university, that tracking doesn’t end at high school graduation.

Georgia State is one university that feeds many points of student data into algorithms that alert administrators and professors when a student appears to be falling behind.

The Atlantic reports:

At Georgia State University, algorithms alert advisers when a student falls behind in class. Course-planning tools tell students the classes and majors they’re likely to complete, based on the performance of other students like them. When students swipe their ID cards to attend a tutoring or financial-literacy session, the university can send attendance data to advisers and staff.

Institutions can track what students say in online class forums, who downloads the lecture notes, and how long they spend reviewing online material. Institutions can record when and where students swipe their ID cards to follow their physical movements, from the dining hall to the health center.

To engage in these practices, institutions typically build or purchase software—the Knewton Platform, for example—that analyzes every keystroke a student makes to figure out his or her learning style. “The NSA has nothing on the ed tech start-up known as Knewton,” Politico wrote earlier this year. Some of the data these learning applications collect doesn’t fall under the federal government’s definition of “educational record,” and thus doesn’t fall under laws that restrict the kind of information colleges can and cannot share with third parties.

Universities are supposed to be places where young people can share ideas and test theories without fear of censure or retribution. As education technology corporation executive Matt Pittinsky warns in an interview with the Atlantic, these kinds of tracking mechanisms and predictive tools might stifle experimental thought exactly where our society and youth most need it, and expect to find it.

Worse still, Pittinsky says, basing academic decisions on finely detailed data may have the effect of inappropriately “red lining” young people without sufficient cause. Have you ever made a mistake? Shown up to class late? Or maybe even dropped a class in the middle of the semester because you lost interest, or had too much work to do? In a world where every data point adds up to a prediction of your likelihood to succeed, just one mistake may mean you’re denied opportunities afforded to others. “What begins as the notion of pacing education to each learner’s abilities at the time can very quickly become a solidified view of what someone is able to do and what someone is not able to do, with very heavy-handed direction given to them about what they then have access to,” Pittinsky warns.
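To make the concern concrete, here is a deliberately simplified sketch of how this kind of system can work. The signals, weights, and threshold below are all invented for illustration; they do not come from Georgia State, Knewton, or any real vendor. The point is structural: when routine behaviors are summed into a single score, one heavily weighted event, like a single dropped class, can flip a student into the “flagged” category.

```python
# Toy illustration (invented numbers, not any real system): a weighted
# "risk score" over behavioral signals like those described above.

RISK_WEIGHTS = {
    "late_arrivals": 0.5,    # each late arrival adds a little to the score
    "dropped_classes": 2.0,  # a single dropped class weighs heavily
    "skipped_tutoring": 1.0, # missed ID-card swipes at tutoring sessions
}
FLAG_THRESHOLD = 2.0  # hypothetical cutoff that triggers an adviser alert

def risk_score(signals: dict) -> float:
    """Sum the weighted behavioral signals into a single number."""
    return sum(RISK_WEIGHTS[name] * count for name, count in signals.items())

def flagged(signals: dict) -> bool:
    """True once the score crosses the alert threshold."""
    return risk_score(signals) >= FLAG_THRESHOLD

# One late arrival alone does not trigger a flag...
print(flagged({"late_arrivals": 1, "dropped_classes": 0, "skipped_tutoring": 0}))  # False
# ...but adding a single dropped class crosses the threshold.
print(flagged({"late_arrivals": 1, "dropped_classes": 1, "skipped_tutoring": 0}))  # True
```

A real predictive-analytics product is far more complex, but the underlying dynamic Pittinsky describes is the same: once the score hardens into a label, the label, not the student, drives what opportunities are offered.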

Read more about the difficult questions raised by predictive analytics in the classroom.

© 2024 ACLU of Massachusetts.