We Are Improving a Key Aspect of the Academic Feedback Loop (And We Can Prove It!)

A few years ago, we began asking our freshmen about the degree to which they received academic feedback early enough in the term to adjust their study habits. The survey item read:

  • “I had access to my grades or other feedback early enough in the term to adjust my study habits or seek additional academic help.”

Students could respond by selecting:

  • strongly disagree
  • disagree
  • neutral
  • agree
  • strongly agree

One of the reasons we began asking this question was that we wanted to gather more information on the nature and scope of the feedback our freshmen received during their first term. Given the wealth of research on the critical impact of regular, clear feedback, we have always known that this is an important aspect of an ideal learning environment. However, we had been surprised by the number of struggling first-year students who claimed to be unaware of how poorly they were doing in their classes.

Upon reviewing our first round of data near the end of the 2013/14 academic year, we had to swallow hard: 46.0% of respondents disagreed or strongly disagreed with the statement, while only 34.1% agreed or strongly agreed.

During the 2014/15 academic year we had serious, and at times even tense, conversations about these findings. Even though some students who claimed to be unaware of their grades may simply have chosen not to check grades posted for exactly this purpose, these conversations led to several faculty development workshops and prompted considerable reconsideration of how student assignments are scheduled and what kind of feedback students receive. In addition, a number of conversations delved deeper into the degree to which students need to be shown how to use the feedback they receive and how to approach learning at Augustana differently than they may have approached it in high school.

Over the subsequent two years, we’ve seen substantial movement on this item. In 2014/15, 36.7% of respondents disagreed or strongly disagreed (a decrease of roughly 10 percentage points from the prior year) and 38.6% agreed or strongly agreed (an increase of 4.5 percentage points). Although this was encouraging to see, many instructors had already planned their courses for that year by the time we began discussing the prior year’s findings.

Now that faculty have had a full year to contemplate this concept and infuse it into course syllabi, our 2015/16 data suggest that freshmen are experiencing a substantially improved learning environment. Among the 515 freshmen who responded (a 75.8% response rate), only 24.9% disagreed or strongly disagreed, while 52.8% agreed or strongly agreed.

In two years, we’ve seen a swing of roughly 20 percentage points toward an improved learning environment for our students: disagreement fell from 46.0% to 24.9%, while agreement rose from 34.1% to 52.8%. There are certainly plenty of reasons to drill deeper and continue to improve the ways that we cultivate a vibrant feedback loop between instructor and student (i.e., the instructor gives the student feedback, the student applies that feedback to improve their academic work, the instructor sees evidence of improvement in subsequent work, the instructor gives feedback that notes the improvement and points to further opportunities to improve, and so on). Still, I think we deserve to take a moment and recognize that we’ve just accomplished something that many colleges only dream of but rarely get to see: actual evidence of improvement in the act of educating. This data provides concrete evidence that we identified an opportunity to get better, did the work to plug that finding into our daily efforts, and produced a real and significant change for the better.

I’m really proud of us.

Make it a good day,

Mark