
When a student misses a class, does poorly on a quiz, fails to turn in an assignment or otherwise seems in danger of flunking a class, what should a professor do?

At Lycoming College, a small, private liberal arts college in Pennsylvania, worried faculty members simply talk to their students or pick up the phone to call the dean and share their concerns.

In an era when many colleges and universities are using high-tech early-alert systems for monitoring and improving student performance -- many promising to boost retention and graduation rates through proprietary student tracking tools and powerful predictive data analytics -- Lycoming’s solution seems remarkably quaint.

Some institutions, such as Georgia State University, have had remarkable success with heavily data-driven student success systems. Others, including Lycoming, have not. Despite becoming well-established tools over the past decade at colleges and universities hyperfocused on improving student performance and retention, state-of-the-art early-alert systems haven't been embraced at every institution. Some institutions have resisted the push for in-depth data reporting and gone back to their old ways of measuring student performance. Others have switched frequently between providers, seemingly unable to settle on one that works.

Andrew Kilpatrick, associate dean of student success at Lycoming, said he sometimes had doubts about the college’s old-school approach. A few years ago, college administrators seriously considered purchasing early-alert systems from leading vendors such as Starfish, EAB and Civitas Learning.

“We saw value and benefit in a lot of them but didn’t find one that was exactly what we needed -- there wasn’t a slam dunk,” said Kilpatrick.

Instead, administrators decided to create their own solution in-house. The system, created by the IT department, prompted instructors to answer a series of questions about their students’ performance during the third and sixth weeks of class. The system was used during the 2016-17 academic year, but it was soon obvious that this solution “was not nearly as effective as what we had been doing all along,” said Kilpatrick. Rather than raising the alarm as soon as they spotted an issue, faculty members were waiting to report problems, he said.

And while most faculty members were willing to engage with the new system, not everyone was on board, said Kilpatrick. Telling faculty members when and how to report on student progress “made some people feel like they were teaching high school,” he said.

There were also other hurdles, said Phil Sprunger, provost and dean of the college.

The data collected at both the third and sixth weeks “became too much to handle -- it became a distraction,” he said.

The homegrown early-alert system also produced a lot of false positives, flagging students as at risk when “they were probably going to be OK,” he said. “It was making people feel bad.”

Though Lycoming largely went back to its old way of doing things -- asking faculty members to report provisional grades at the sixth week only -- Sprunger recognizes the college's approach wouldn't work for every institution. Lycoming has around 1,300 students, and relationships between instructors and staff who work in residential life and advising are closer than they might be at a larger institution.

“If we had 5,000 or 10,000 students, our system wouldn’t work,” he said.

Sandra Kingery, Logan A. Richmond Professor of Spanish at Lycoming, said in an email that the college's current system works very well. Faculty members have, since 1994, been asked to give early-assessment grades at the six-week mark to first-year students, transfer students and students who are deemed at risk.

"It's quick and easy to do -- faculty enter the grade a student is getting at that time, and there is a checklist of reasons we can use to explain poor grades (e.g., poor attendance, missed assignments, poor grades on exams etc.)," she said.

When Lycoming tried adding reporting at the third week in addition to the sixth week, results indicated that "having two early assessments didn't seem to improve student success more than having just one," said Kingery. "For that reason, we went back to the single early assessment in the sixth week as we had done previously."

If a problem arises before that six-week mark, faculty members will pick up the phone or write an email to notify the appropriate dean, who is very effective at identifying what's going wrong, said Kingery.

"I would also copy their student adviser, and when applicable, his or her coach," she said.

Kingery said she had never tried using a more high-tech early-alert system, "but I have a hard time believing that kind of a program would be an improvement to what we're doing already at Lycoming."

"It doesn't seem to me that technology would make us any more effective than we already are at identifying problems and notifying the appropriate people about our concerns," said Kingery. "In fact, I would think any software system would actually reduce our personal connection with our students."

Finding an early-alert system that fits an institution’s culture is key to its success, said Fox Troilo, senior research adviser for higher education at Hanover Research. Hanover works with institutions to look at which factors might be causing student attrition by surveying current and past students who have dropped out of classes. These data are then used to create a predictive model that identifies students who are likely to be most at risk. (Note: Hanover does some survey work for Inside Higher Ed.)

Universities largely have the same goals for their early-alert systems: they want to improve student retention rates and ensure more students graduate. But which data need to be tracked and collected to achieve these goals will vary by institution, Troilo said.

An important factor for success is ensuring that faculty are engaged in the data-collection process. “There has to be faculty buy-in for these systems to work,” he said.

Feleccia Moore-Davis, provost at Tallahassee Community College, knows this firsthand. Earlier this year at the Achieving the Dream annual conference, where member colleges of the nonprofit organization meet to discuss different student success initiatives, she told attendees that the college had started using a solution from vendor Starfish in 2012, but after a few years decided to switch to a homegrown solution when the system failed to improve student retention.

EdSurge reported that Moore-Davis told conference attendees that faculty members “hated” the Starfish system. “They didn’t understand why they were doing it and they didn’t get any feedback,” she said.

The Florida college modified its IT help desk system last year to keep track of alerts. The platform, called TeamDynamix, allows faculty members to create a work order that tells staff in advising or the financial aid office that a student needs help. Alerts must be responded to within 48 hours, and faculty members receive an automatic notification when someone from the college completes the request.

The college is planning to stick with the TeamDynamix system for now.

“Faculty like that they can see into the process,” said Moore-Davis. The new system is also saving the college money. The Starfish subscription was around $73,000 a year; the new solution costs under $3,000, she said.

Moore-Davis is somewhat skeptical of the impact that early-alert systems can have on student retention.

“I think the best tool that we have ever used and that we still have access to are our faculty,” she said. “They are in the best position to retain our students.”

Howard Bell, senior vice president for higher education student success at Starfish, said that since Tallahassee joined Starfish in 2012, technology and approaches to student success have changed significantly.

“Today the way forward for student success in higher ed requires the use of guided pathways,” he said. This means that colleges shouldn’t take a “single-bullet approach” to student success, but make an “intentional effort” to ensure students and staff have access to multiple resources, including skills assessment, career exploration, academic planning and analytics, in addition to early alerts. This comprehensive suite of resources (all included in the Starfish offering), along with shared data and insights from a network of 467 institutions, is what sets Starfish apart from homegrown solutions like Tallahassee’s, said Bell. “That’s why it costs more,” he said.

“Many early adopters of student-success systems have come to realize that change management at the school is just as important an element as the adopted technology,” said Bell. He added that in recent years, Starfish and other organizations have been spending more time “helping schools with cultural shifts in campus mind-set and policies that are significant barriers to the successful integration and use of tools by staff and faculty.”

Martin Balinsky, a professor of earth science and vice president of the United Faculty of Florida’s Tallahassee Community College union chapter, said he was somewhat surprised to hear Moore-Davis report that faculty prefer the current system over Starfish.

“In my experience, the exact opposite is true,” he said. “Starfish was much more user-friendly, because you could record attendance daily and there was an automated process to flag students to let them know if their grade was below a certain average, or if their attendance had not been good.”

By contrast, the new system requires faculty members to contact students twice before they can make a referral, and there is no mechanism for recording attendance.

“It is ludicrous that faculty can only make a referral if they have reached out to a student twice already,” said Balinsky. “Faculty do not have time to spend constantly calling or emailing students who have not been attending class.”

Balinsky believes the system is “babying” students and is detrimental to student success because it doesn’t teach them about “the way the real world works.”

“A community college faculty member’s primary job should be preparing the best-quality product to give the students, delivering it to them and answering their questions in a classroom or office-hour setting,” he said. “Let us do our jobs, let the students learn from both their good and bad choices, and that will be the true key to student success.”

Frank Baglione, a professor of history at Tallahassee, also said that he had preferred the Starfish system but acknowledged that when it was introduced, many faculty members disliked it because it required them to take attendance -- an activity that was not previously mandated.

Though Moore-Davis said that the new system gave faculty members greater insight into what happens after they flag a student, Baglione said he was still in the dark about what assistance students receive.

A common response that faculty members receive from the system is “could not get ahold of student,” Balinsky said. “So it’s pretty meaningless.”

Farrah Jackson Ward, associate vice chancellor for academic affairs at Elizabeth City State University in North Carolina, said her institution had experienced encouraging results with its early-alert system but noted “they don’t work out of the box.”

Elizabeth City State has worked with EAB’s early-alert solution since 2015. Initially, faculty members didn’t understand the system, and only about 50 percent of them engaged with it, Ward said. They said they were too busy and forgot to submit reports because they weren’t reminded.

Now faculty are prompted twice a semester by email to flag any students with issues. The university is also developing training sessions, videos and handouts to ensure that faculty understand how to use the system, what happens when they flag a student and why it’s important. As a result, the university now has a 99 percent response rate from faculty.

Margery Coulson-Clark, a professor of social and behavioral sciences at Elizabeth City State, said she likes the EAB system -- particularly how easy it is to use. EAB sends out an email with a link to the system that doesn’t require faculty members to log in. It also sends prompts about upcoming reporting deadlines. Training on how to use the system is “accessible and easily followed,” she said, making the use of EAB “less cumbersome for those who are new to it.”

Nancy Biggio, associate provost for administration at Samford University in Birmingham, Ala., said her institution also uses EAB’s solution and did a lot of work to encourage faculty members, particularly those who work part-time, to use the system.

“Faculty are an inquisitive population -- they ask lots of questions and want to know the reasons for things,” she said.

Biggio said the system has become indispensable to staff working in advising and has improved communication across the institution between staff and faculty members. Mary McCullough, chair of the Faculty Senate at Samford, said she personally had "positive experiences with the early-alert system" and believes that it has improved communication between faculty members and advisers on campus.

"I believe it's easy to use, and when I flag a student, I am confident that the Academic Success Center, the student's adviser and I can work together to follow up with the student in question," said McCullough.

Ana Borray, director of professional learning at Educause, said that when it comes to making early-alert systems successful, “technology can only do so much.” Engaging faculty members to make reports is important, but so is “closing the loop” to let them know that their reports have been acted on, she said.

Borray said it’s not uncommon for institutions to switch between early-alert providers. And a lot of new players are entering the market -- including traditional enterprise resource planning providers. As early-alert systems have evolved, they have also become more complex and costly.

“It wouldn’t be uncommon for a university to be paying $200,000 a year,” said Borray.

Borray said most institutions that use early-alert systems see some positive impact on retention rates -- though the increases are often more modest than expected. Some institutions set unrealistic goals, such as increasing retention by 10 percent in a year, she said. And they often don’t realize how much time it takes for a system to start working.

“There is no magic,” she said.
