Lest we rest on our laurels . . .

Last week I noted an important data point from our recently completed participation in the Wabash National Study of Liberal Arts Education (WNS) that suggested an increase in our seniors’ level of academic challenge.  This finding was particularly gratifying because when we instituted Senior Inquiry we had hoped that it would help us maintain the increased academic challenge that we had infused into our freshman year through the AGES curriculum several years before.  Our 2009 NSSE data had shown a marked increase since 2006 in the academic challenge benchmark among freshmen, but the parallel measure among seniors showed no change, suggesting that we might be taking our foot off the academic gas pedal after the freshman year.  This new WNS data provided evidence that we are indeed making progress toward a sustained level of academic rigor across our students’ four years.

But as with all good assessment data, the WNS data provides additional nuances that can help us continue to improve what we do even as we might (and should) celebrate our successes.  So I’d like to introduce two other data points from the WNS regarding academic challenge and student learning, consider them in the context of optimizing faculty work/life balance, and see if there might be something here worth thinking about.

College impact researchers have found that when students are (1) challenged to push their intellectual capacities through substantive assignments and (2) supported in that process with encouragement, direction, and precise and timely feedback, students are more likely to maximize their learning and growth.  In the WNS, two of the scales that address important aspects of challenge measure (a) the frequency of higher-order exams and assignments and (b) the degree to which faculty communicate and maintain high expectations for student performance.  Likewise, two other scales capture crucial aspects of support by assessing (a) the quality of students’ non-classroom interaction with faculty and (b) the frequency of prompt feedback on assignments and performance.

The two tables below report our students’ scores on the two challenge metrics and the two support metrics at the end of the first year and compare those scores to the average scores from the other similar small colleges in the WNS.  The asterisks indicate where the difference score is statistically significant (in other words, the “+” or “-” sign doesn’t necessarily mean anything by itself).

2009 Spring – Challenge Metrics

                                                    Augustana   Comparison Institutions   Difference Score
Frequency of higher-order exams and assignments     71.2        68.8                      +2.3 *
Challenging classes and high faculty expectations   69.3        66.8                      +2.5 *

2009 Spring – Support Metrics

                                                    Augustana   Comparison Institutions   Difference Score
Quality of non-classroom interaction with faculty   66.7        67.2                      -0.5
Prompt feedback                                     61.7        60.9                      +0.8

Essentially, this data suggests that in comparison to the other participant institutions in the WNS, we challenge our students during the first year a bit more while we support them at levels similar to other small colleges.

Now, look at what happens to these metrics by the end of our students’ senior year.  Again, remember the function of the asterisks in these tables.

2012 Spring – Challenge Metrics

                                                    Augustana   Comparison Institutions   Difference Score
Frequency of higher-order exams and assignments     72.8        75.1                      -2.3
Challenging classes and high faculty expectations   71.5        72.5                      -1.0

2012 Spring – Support Metrics

                                                    Augustana   Comparison Institutions   Difference Score
Quality of non-classroom interaction with faculty   83.4        77.8                      +5.6 *
Prompt feedback                                     71.1        67.6                      +3.5 *

Interestingly, the pattern in the fourth year data is reversed.  By the end of the senior year we appear to challenge our students at levels similar to the other institutions while supporting our students at levels that are significantly higher (statistically) than the other small colleges in the WNS dataset.

So what should we make of this?  I’ve got a couple of thoughts, although I’d love to hear what strikes you (if anything) about these data points.

First, it is worth parsing some of the aspects of academic challenge that impact learning.  The academic challenge measure I described last week asks questions about the number of assignments and the amount of time spent on assignments.  Obviously, it’s tough to push students to learn if they aren’t being asked to put in the time and regularly produce substantial work.  However, the degree to which any workload can effectively impact learning is powerfully influenced by how much of the work requires complex, higher-order thinking (as opposed to simple memorization and regurgitation) and how high faculty set and communicate their expectations of quality.  Otherwise, time on task often devolves into mind-numbing busy work, and there isn’t a more effective strangler of student motivation than the perception that homework is nothing more than a black hole of directionless wheel-spinning.  Our WNS data suggests that among first year students we’ve ramped up both the amount of work expected AND the complexity of the assignments and faculty expectations (and therefore the educational potential).  However, it appears that while we’ve increased the amount of work expected of our seniors, we haven’t necessarily matched that increase with a similarly expanded expectation of educational complexity.  I suspect that we might be able to improve on our already impressive learning gains if we could find ways to distinguish the nature of our seniors’ academic challenge in a manner similar to what seems apparent in our freshman data.

By contrast, the inverted pattern of change in student support for learning between the first and fourth years suggests both reasons to celebrate and opportunities to improve.  We have ample evidence that the quality of our support for students in their latter years plays a pivotal role in their development.  However, as we look for ways to push the success of our first year students ever higher (and thereby increase our retention rates), it seems that we might benefit from considering ways to increase student support in the first year to match the level of challenge that we’ve already attained.  Although it would be nice to find a singular solution (the discussion of a Center for Student Success as well as the new mechanism for math placement and remediation may well make a profound impact), I suspect that we might find additional ways to improve by further examining the clarity and uniformity of the LSFY experience and the partnership between the curricular and co-curricular experiences during the first year.

Lastly, I wonder if taken together these findings might provide an insight into a way that we might improve our faculty work/life balance even as we maintain – or even increase – student learning.  Right now our balancing of challenge and support seems to tip toward challenge in the first year and support in the fourth year.  I wonder whether there might be an opportunity to adjust this balance slightly by adding mechanisms for support in the first year and challenge in the fourth year.  In so doing, I wonder if we might find that leaning just a bit less on student-faculty interaction for our upperclass students might allow some of our work to be not quite so time intensive.

At present, it is clear that our efforts are working – but they are clearly time intensive and come at a cost.  I am in no way suggesting that we should somehow become more cold or unfeeling toward our students.  However, as I see the burden of our efforts take its toll again during the spring term, I sincerely wonder if there are ways to reduce the amount of time we spend burning the candle at both ends.  It seems to me that caring for our students shouldn’t necessitate killing ourselves to do so.

I think we owe it to our students and ourselves to at least consider this possibility.

Make it a good day.

Mark