Tuesday, October 28, 2008

Educause 2008 - Academic Analytics

Day number 1 at Educause and again I am amazed at the sheer size of the Higher Education technology "business" (not to mention the size of the Orange County Convention Center that is hosting the conference). My first session is "Academic Analytics: Using Institutional Data to Improve Student Success". It is a half-day pre-conference session that actually costs additional money, so my expectations are relatively high.

The session was hosted and conducted by Kimberly Arnold and John P. Campbell of Purdue University. Purdue has developed and implemented an Academic Analytics process over the past two years. They are focusing at the course level, and their goal is to improve retention by identifying at-risk students in freshman "gateway" courses. (I believe these are what we used to affectionately call "weed-out" courses.) This focus was shared by many of the 40 participants in the seminar, most of whom were also aiming to increase student retention.

The seminar was well organized. First, an overview and definition of Academic Analytics was presented. From there, we covered the support options related to this type of project. One point stressed repeatedly throughout the session was that analytics are much more useful when tied to a well-defined goal. Collecting data for its own sake is a deep pit.

After we level-checked the participants and reviewed some of the pitfalls, it was time to learn how to develop a model for academic analytics. We first discussed what types of data are appropriate to include in the model. They provided a list of several dozen possible data elements; demographic, course-related, and institution-level data were all considered. From there we brainstormed a conceptual model. I focused on groups of data to consider (a toy sketch of how these might combine follows the list). These include:
  • Pre-Matriculation Preparation - relatively static data that the student brings to campus can help predict their proclivity for success. (SAT scores, HS grades, science GPA, etc.)
  • Recent University Effort - more recent, more dynamic data that can change from week to week. (Homework assignments, number of logins, number of chat posts, recent grades from pre-req courses, etc.) NOTE: Consider this as a moving average over the student life cycle.
  • Help Seeking Behavior - metrics from outside the course, drawn from other university areas. (Tutoring/counseling sessions attended, health services visits, campus security incidents, IT help desk calls, etc.)
  • Student Self-Awareness Survey - the student may also provide valuable information about their potential success. (Subjective view of effort and understanding, existing thoughts on and interest in specific course topics, self-reflection on health of mind, body, and spirit, an overview of social life, etc.)
  • Peer or Cohort Evaluations - data from one's peers may be included in a formula as a predictor of success. (Impressions of social interaction, changes in behavior, etc.)
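
To make the brainstorming concrete, here is a minimal sketch (in Python) of how signals from these five groups might combine into a single risk score. This is not Purdue's actual model; the feature names, weights, and scaling below are hypothetical placeholders I chose for illustration.

    # A toy risk-score combiner -- NOT Purdue's model. Feature names,
    # weights, and scaling are hypothetical placeholders for illustration.
    # Each input signal is assumed pre-normalized to 0.0-1.0, where
    # higher values mean higher risk.

    WEIGHTS = {
        "pre_matriculation": 0.25,  # SAT scores, HS grades (static)
        "recent_effort":     0.35,  # logins, homework (moving average)
        "help_seeking":      0.15,  # tutoring, counseling, help desk
        "self_awareness":    0.15,  # survey responses
        "peer_evaluation":   0.10,  # cohort impressions
    }

    def risk_score(signals):
        """Weighted sum of the five group signals, yielding 0.0-1.0."""
        return sum(WEIGHTS[group] * value for group, value in signals.items())

    # Example: a student with strong preparation but weak recent effort.
    student = {
        "pre_matriculation": 0.2,
        "recent_effort":     0.9,
        "help_seeking":      0.5,
        "self_awareness":    0.4,
        "peer_evaluation":   0.3,
    }
    print(risk_score(student))  # ~0.53

In practice the weights would be fit against historical outcomes rather than picked by hand, which is exactly what the validation step described below is for.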
Finally, once the model is in place and has been validated against historical data, reporting methods and intervention policies need to be created. Purdue used a stoplight metaphor for reporting: yellow meant the student was in potential trouble, and red meant there was significant concern about the student's potential success. Several examples of e-mail text for interventions were also provided.
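
As a rough illustration of the stoplight idea, a risk score like the one sketched above might be bucketed as follows. The 0.4 and 0.7 cutoffs are my own guesses, not the thresholds Purdue actually uses.

    # Map a 0.0-1.0 risk score to a stoplight color. The 0.4 and 0.7
    # cutoffs are assumptions for illustration, not Purdue's values.
    def stoplight(score):
        if score >= 0.7:
            return "red"     # significant concern -- intervene now
        if score >= 0.4:
            return "yellow"  # potential trouble -- monitor and nudge
        return "green"       # on track

    print(stoplight(0.53))   # yellow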