Educational Intelligence: How one university uses data to improve student outcomes
Southern Connecticut State University is one of five institutions in the country to receive the 2017 Excellence in Assessment designation from NILOA and its partners—a recognition for successfully using student learning data to improve student performance. SCSU was commended in 2017 by its regional accreditor, NEASC, for its essay on educational effectiveness.
Dr. Michael Ben-Avie serves as the university’s Interim Associate Vice President for Institutional Effectiveness. Kevin Michielsen, CEO of Watermark, caught up with Dr. Ben-Avie to discuss SCSU’s innovative use of data to improve student and institutional outcomes.
What is SCSU’s philosophy around data use?
We think of data in terms of educational intelligence: taking a more holistic view of data collection so that we can connect data points and gain meaningful insights into how to improve institutional effectiveness and student outcomes.
In 2007, we launched a longitudinal cohort study to discern important predictors of students’ persistence, academic success, graduation, and post-graduation outcomes. The study identified a psychological-educational factor, future orientation, that is amenable to change and helps explain differences in student outcomes. Analyzing ten years of study data with tools including Watson Analytics, SPSS, and machine learning techniques, we found that students’ scores on an assessment measuring the quality of their learning and development accurately predicted whether they would stay at the university, graduate, transfer to a different institution, or withdraw from higher education.
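To make the idea concrete, here is a minimal, purely illustrative sketch (toy data and thresholds, not SCSU's actual dataset or model) of how assessment-score bands can be compared against retention outcomes in a cohort:

```python
# Hypothetical illustration: bucket students by assessment score band,
# then compare how often each band stayed at or graduated from the
# university. All records and cutoffs below are invented for the example.
from collections import defaultdict

# Toy cohort records: (assessment_score, outcome)
cohort = [
    (82, "retained"), (45, "withdrew"), (77, "retained"),
    (51, "transferred"), (90, "graduated"), (40, "withdrew"),
    (68, "retained"), (59, "transferred"), (85, "graduated"),
]

def score_band(score):
    """Bucket scores into coarse bands for comparison."""
    return "high" if score >= 70 else "low"

outcomes_by_band = defaultdict(list)
for score, outcome in cohort:
    outcomes_by_band[score_band(score)].append(outcome)

# Share of each band that stayed at (or graduated from) the university
stay_rate = {}
for band, outcomes in outcomes_by_band.items():
    stayed = sum(o in ("retained", "graduated") for o in outcomes)
    stay_rate[band] = round(stayed / len(outcomes), 2)

print(stay_rate)  # e.g. {'high': 1.0, 'low': 0.2} for this toy data
```

In practice the study would use far richer models (the interview mentions machine learning and SPSS), but the core move is the same: relate a malleable measure to later persistence outcomes.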
This led us to focus on “that which is amenable to change” instead of students’ incoming profiles, which shifted our mindset and how we talk about data on our campus. We now say, “We need evidence, not anecdotes!”
I love that mantra! What evidence are you collecting to provide insight into how you can improve outcomes?
The university participates in the Multi-State Collaborative to Advance Quality Student Learning (MSC), a national effort led by AAC&U and SHEEO to create a scalable way to assess essential skills based on coursework. Students’ final papers for courses are scored by faculty using AAC&U’s written communication, quantitative literacy, and critical thinking rubrics. In this process, we are able to home in on “that which makes a difference” in students’ preparedness to demonstrate the competencies employers seek. Results have informed the restructuring of the university’s access programs, developmental math curriculum, general education, and writing across the curriculum program.
To collect and score these data, we use Watermark for the MSC and for all certification programs on campus to promote educational intelligence. We follow students from New Student Orientation through graduation, or through subsequent enrollment at other colleges, and these results are added to the cohort studies. As each incoming class enters the university, a cohort dataset is established containing demographic information from our SIS, such as high school rank, high school GPA, SAT scores, gender, ethnicity, and residential status. Each year, new data are added, including MSC scores, earned credits, cumulative GPA, registration status, and scores on surveys and direct assessments. Each cohort dataset now holds over 1.8 million data points.
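A cohort dataset of this shape can be sketched as one record per student, seeded with SIS demographics at entry and extended with new measures each year. The field names below are hypothetical, not SCSU's actual schema:

```python
# Hypothetical cohort-dataset sketch: baseline SIS demographics at entry,
# plus yearly additions (MSC scores, credits, GPA, status).
# All field names and values are illustrative assumptions.

def new_cohort_record(student_id, sis_data):
    """Baseline record captured when the student enters the cohort."""
    return {
        "student_id": student_id,
        "hs_rank": sis_data.get("hs_rank"),
        "hs_gpa": sis_data.get("hs_gpa"),
        "sat": sis_data.get("sat"),
        "gender": sis_data.get("gender"),
        "ethnicity": sis_data.get("ethnicity"),
        "residential": sis_data.get("residential"),
        "years": {},  # yearly additions keyed by academic year
    }

def add_yearly_data(record, year, measures):
    """Merge one year's measures (MSC scores, credits, GPA, status)."""
    record["years"].setdefault(year, {}).update(measures)

cohort = {}
cohort["S001"] = new_cohort_record("S001", {"hs_gpa": 3.4, "sat": 1150})
add_yearly_data(cohort["S001"], 2017,
                {"msc_written_comm": 3, "earned_credits": 28,
                 "cum_gpa": 3.1, "status": "registered"})
```

Keeping yearly measures nested under the student record is one simple way to let each cohort grow year over year without reshaping the baseline demographics.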
1.8 million data points is incredible! What systems do you use to manage and analyze this?
I joke that I created an academic discipline called “forensic assessment” because programs stored data all over the place and in widely different formats. It took so much energy for programs to gather and organize all the data needed for accreditation reports. Now, they have time and energy to reflect on the data because we use Watermark to collect student artifacts, assess them with rubrics, and report on learning outcomes achievement. Using Watermark has helped us evaluate essential skills and competencies across the institution, integrate this performance data with LMS and SIS data, and then disaggregate it by key demographics to analyze scores by criteria.
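The disaggregation step mentioned above can be sketched as grouping rubric scores by a demographic field and averaging per criterion. The groups, criteria, and scores here are invented for illustration:

```python
# Hypothetical sketch of disaggregating rubric scores by a demographic
# field to compare mean performance on each criterion across groups.
from collections import defaultdict

# Toy assessment rows: (demographic group, rubric criterion, score 0-4)
rows = [
    ("first_gen", "critical_thinking", 2),
    ("first_gen", "critical_thinking", 3),
    ("continuing_gen", "critical_thinking", 3),
    ("continuing_gen", "critical_thinking", 4),
    ("first_gen", "quant_literacy", 2),
    ("continuing_gen", "quant_literacy", 3),
]

# (group, criterion) -> [running sum, count]
totals = defaultdict(lambda: [0, 0])
for group, criterion, score in rows:
    totals[(group, criterion)][0] += score
    totals[(group, criterion)][1] += 1

# Mean score per (group, criterion) pair
means = {key: s / n for key, (s, n) in totals.items()}
```

Gaps between groups on a specific criterion are what point a program toward a targeted curricular change rather than a blanket one.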
One of the hardest parts often is making meaning of the data. Based on the insights you’ve gathered so far, what improvements have you made and what comes next for SCSU?
SCSU continues to use data to identify low-cost, high-impact strategies. For example, when our data indicated that students were withdrawing due to financial concerns, we created a financial literacy coordinator position to intervene before it was too late.
For the future, we plan to expand our datasets and continue rethinking the kinds of metrics we use to define success. Doing so will help us reach an even more sophisticated level of using educational intelligence to improve our outcomes.