Who is really responsible for identifying and tracking student outcomes?
How are colleges and universities using data and analytics to capture and support efforts to improve student outcomes? And whose job is it, anyway?
A report released earlier this year, based on a study by three industry groups, explores the roles of three key stakeholder groups whose responsibilities, the researchers found, overlap more often than anticipated when it comes to measuring student success. Those groups are institutional research, student affairs and information technology.
"No one part of the field owns student success," said D. Christopher Brooks, director of research at the Educause Center for Analysis and Research (ECAR), during a panel session Wednesday discussing the findings at the association’s annual meeting in Denver. "It requires partnerships across the institution, breaking down the silos and looking for ways we can collaborate from the beginning of the process."
Researchers from ECAR along with the Association for Institutional Research (AIR) and NASPA-Student Affairs Administrators in Higher Education evaluated more than 900 responses to the same set of questions from members of those three stakeholder groups. They looked at the types of data projects underway to measure student success, the structures in place to do so, the level of coordination and how those outcomes were influencing campus policies.
Below, we share four takeaways from the report, "Institutions' Use of Data and Analytics for Student Success," and the panel discussion.
How much — and what kind — of data is enough?
Student success metrics can range from easy-to-collect attendance and demographic measures to factors such as motivation and self-efficacy that can be harder to capture, said Amelia Parnell, vice president for research and policy at NASPA. As colleges put more emphasis on quantifying students' proficiency in soft skills, being able to clearly and consistently show that progress will become more important.
Capturing data in order to do just that requires strategic planning, Parnell said, noting the study indicated respondents didn't agree on what constituted an "appropriate" amount of data collection. After all, institutions are targeting different student profiles, which will inform different interventions. She recommends basing the data collected on the types of interventions to be implemented.
Brooks agreed, advising colleges to avoid "collecting everything for the sake of collecting it."
Colleges have a narrow focus
One of the report's major findings is that colleges are focusing student success studies primarily on first-year students. More than half of institutions surveyed said they created studies to determine the academic progress and success of first-year, first-generation and transfer students.
The researchers noted that further studies would explore the "why" behind their findings. One likely factor: colleges are placing more emphasis on recruiting first-generation and transfer students, who often face barriers to success. For example, several Ivy League universities have implemented 1vyG, a support network for first-generation college students that can include textbook donations and mentoring.
Adult learners are another group colleges are taking a closer look at, though surveyed institutions rated their focus on "nontraditional students" lower than their focus on most other groups (except student athletes and LGBTQIA students). However, research indicates some colleges could be doing more to recruit and retain this population, and experts recommend several ways to improve outreach and success tracking.
Gathering data about students and using it to tailor their academic experience to their perceived needs raises important questions around data privacy, Parnell said, as well as challenges around effective execution. "We now can identify which students would benefit most from our resources," she said. "How can we communicate that to them and measure it appropriately?"
She cited "nudges" and other notifications as one approach. While such reminders are increasingly common learning management system (LMS) integrations, colleges designing them should be mindful of how they may be perceived by students, EdSurge reported.
Data governance will only become more important
As colleges collect more data, ownership of that information is becoming increasingly important. Especially on bigger campuses, having several disparate points of data collection can lead to redundancy and poor user experience — in addition to data security concerns.
Fewer than 20% of respondents across all three stakeholder groups the survey covered have a data governance policy in place that would be considered "a fairly robust regime that can handle pretty much whatever gets thrown at it," Brooks said.
A larger share — 45% — have a policy but said it needs improvement, he added. And the remaining 35% said such a policy is not in place, regardless of whether they think they need it.
Although the report found overlap in responsibilities and activity among the three groups, IT's role stands out. "This is where IT plays a crucial role on the front end of data collection and management," he said.
Measurement is lacking
The researchers found a not-insignificant share of respondents — anywhere from about one-half to nearly two-thirds, depending on the variable — aren't doing much to measure return on investment in student success initiatives or to track their costs.
"Perhaps these things are so complicated, it's difficult to keep your finger [on them]," Brooks said, adding this is an area where data governance could help.
Reasons for the lack of tracking, according to respondents, include difficulty filling data and analytics roles, lack of a fully developed data infrastructure and uncertainty as to what data is needed.
As for which groups are tracking costs, institutional research rated highest for descriptive analyses, the provost's office for predictive measures, and student affairs for early-alert systems.
The use of courseware tools beyond the LMS, such as publisher-provided resources, can make capturing comprehensive data challenging. Possible solutions, according to Brooks and Parnell, include working with vendors to ensure their programs are compatible with popular APIs, as well as standardizing sources of student data and success tools.
"I don't think the solution is to have them all live in one place so much as it is to have adequate middleware," Parnell said. "I don't think it will happen quickly, though."
Follow Hallie Busta on Twitter