The faculty adviser as a student’s GPS

At Augustana, we have always believed in the importance of faculty advising.  And we have solid evidence to support this approach.  In addition to the many proud stories of students who have blossomed under faculty tutelage, our recent senior survey data and our prior NSSE data both suggest that overall, our students are satisfied with the quality of our advising.  In fact, other NSSE data suggests that faculty ask students important advising questions about career aspirations more often than faculty at similar institutions.

Yet many of us share a gnawing sense that we can, and need to, do better.  Even though these average scores roughly approximate general satisfaction, the variability that lurks beneath them hides an uncomfortable truth.  For each advising relationship that inspires a student to excel, there are students who gain little substantive benefit from their advising interactions.

One way to strive for improvement with some measure of confidence is to collectively apply a theoretically grounded framework of advising with a formative assessment feedback mechanism to guide our advising conversations and hone them over time.  One theory of advising, often called developmental advising, positions the adviser as a guide to help students both select and weave together a set of curricular and co-curricular experiences to attain important learning outcomes and post-graduate success.  In many ways, it harkens back to the artisan/apprentice model of learning placed in the context of the liberal arts.  In our senior survey, we included a set of questions informed by this framework to assess the degree to which students experience this kind of advising.  The table below reports the average responses to these questions among students who graduated in 2012.

Question | Mean | St.Dev.
My adviser genuinely seemed to care about my development as a whole person.* | 4.13 | 1.003
My adviser helped me select courses that best met my educational and personal goals.* | 3.98 | 1.043
How often did your adviser ask you about your career goals and aspirations?** | 3.55 | 1.153
My adviser connected me with other campus resources and opportunities (Student Activities, CEC, the Counseling Center, etc.) that helped me succeed in college.* | 3.44 | 1.075
How often did your adviser ask you to think about the connections between your academic plans, co-curricular activities, and your career or post-graduate plans?** | 3.31 | 1.186
About how often did you talk with your primary major adviser?*** | 3.47 | 1.106

The response options are noted below.

*1=strongly disagree, 2=disagree, 3=neutral, 4=agree, 5=strongly agree
**1=never, 2=rarely, 3=sometimes, 4=often, 5=very often
***1=never, 2=less than once per term, 3=1-2 times per term, 4=2-3 times per term, 5=we communicated regularly throughout the term


First, I think it’s useful to consider the way that each question might enhance student learning and development.  In addition, it is important to note the relationship between questions.  It seems that it would be difficult for a student to respond positively to any specific item without responding similarly to the previous item.  Taken together, this set of questions can function as a list of cumulative bullet points that advisers might use to help students construct an intentionally designed college experience in which the whole is more likely to become qualitatively greater than the sum of the parts.

Second, the data we gather from these questions can help us assess the nature of our efforts to cultivate our students’ comprehensive development.  Looking across the set of mean scores reported above, it appears that our students’ advising experiences address optimal course selection more often than they help students connect their own array of disparate experiences to better make the most out of college and prepare for the next stage of their lives.

Yet, if we were to adopt this conception of advising and utilize future senior survey data to help us assess our progress, I am not sure that continuing to convert each question’s responses to a mean score helps us move toward that goal.  The variation across students, programs, student-faculty relationships, and potential pathways to graduation doesn’t lend itself well to such a narrowly defined snapshot.  Furthermore, suggesting that we just increase an overall mean score smells a lot like simply adding more advising for all students instead of adding the right kind of advising at just the right time for those who need it the most.

A more effective approach might be to focus on reducing the percentage of students who select particular responses to a specific item.  For example, in response to the question, “How often did your adviser ask you to think about the connections between your academic plans, co-curricular activities, and your career or post-graduate plans?” 25% of the 2012 graduating students indicated “never” or “rarely.”  It is entirely possible to reduce that proportion substantially without markedly increasing an average score.  For example, if we were to find a way to ask every student to consider the questions outlined in the senior survey once per term while at the same time focusing less on whether students indicate “often” or “very often,” we might find that the proportion of students indicating “never” or “rarely” drops considerably while the mean score remains about the same.  More importantly, I would suggest that at the end of the day we might have become more effective (and dare I say more efficient) in making the advising relationship a positive and influential piece of the educational experience without exhausting ourselves in the process.
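The arithmetic behind this point can be sketched with two hypothetical response distributions.  The counts below are invented for illustration (they only roughly echo the 2012 pattern, not actual survey data), but they show how the share of "never" or "rarely" responses can fall substantially while the mean score barely moves:

```python
def mean_and_low_share(counts):
    """Given counts of responses on a 1-5 scale (index 0 = response "1"),
    return the mean score and the share answering 1 ("never") or 2 ("rarely")."""
    total = sum(counts)
    mean = sum(score * n for score, n in enumerate(counts, start=1)) / total
    low_share = (counts[0] + counts[1]) / total
    return mean, low_share

# Hypothetical baseline cohort of 400 students: 25% answer "never" or "rarely".
baseline = [40, 60, 120, 110, 70]
# Hypothetical target: low-end responses shift to "sometimes"; the top barely moves.
target = [10, 30, 220, 100, 40]

b_mean, b_low = mean_and_low_share(baseline)
t_mean, t_low = mean_and_low_share(target)
print(f"baseline: mean={b_mean:.2f}, never/rarely={b_low:.0%}")
print(f"target:   mean={t_mean:.2f}, never/rarely={t_low:.0%}")
```

In this invented example the mean shifts only from about 3.28 to about 3.33, while the "never"/"rarely" share drops from 25% to 10% — exactly the kind of progress a mean score alone would fail to register.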

As we embark on our HLC Quality Initiative to improve our advising efforts, I hope we will think carefully about the way that we utilize our data to understand our progress.  Our goal is to improve our educational effectiveness – not just move a number.

Make it a good day,

Mark