Setting a high bar for equality in graduation

US News rankings have never been my favorite part of higher education. For many years these rankings did little more than con colleges and universities into an illusory arms race under the guise of increasing educational quality. But recently US News has started to use their data, power, and influence to prod more useful conversations that might lead to improvements at higher education institutions. Last week, US News released their rankings for “Which top-ranked colleges operate most efficiently.” Like last year, Augustana appeared near the top of the list among liberal arts colleges, suggesting that we apply our limited resources effectively to educate our students. Whether conversations about “efficiency” give you a warm fuzzy or a cold shudder, I don’t think it’s particularly controversial to say that such recognition is, at the very least, more good than bad.

But in keeping with their deified status in higher education, the US News rankings giveth and the US News rankings taketh away. A few weeks ago, they released another set of rankings that I found particularly intriguing given our recent campus discussions about equality and social justice. This set of rankings focused on the graduation rates of low-income students, contrasting the proportion of low-income students who ultimately graduate from each institution with that institution’s overall graduation rate. Based on these two numbers, US News identified colleges and universities that they called “top performers,” “over performers,” and “under performers.” Sadly, Augustana appeared in the under performer group with a 13-percentage-point gap between our overall six-year graduation rate (78%) and our six-year graduation rate for low-income students (65%). Just in case you’re wondering, these graduation rates come from students who entered college in the fall of 2007.

Because of the focused nature of this particular analysis, US News combined all institutions from their two national ranking categories (national universities and national liberal arts colleges) to create these three groups. The presence of several familiar institutions in each group suggests that we might learn something about graduating low-income students from similar institutions, lessons that could in turn help us narrow our own disparity in graduation rates.

The criterion for the “top performer” category was that the institution’s overall graduation rate be above 80% and that its graduation rate for low-income students be the same (or within a percentage point). While there were numerous national liberal arts colleges on the list, they were generally highly ranked institutions with well-known pedigrees. However, two familiar institutions appeared in this category that seemed worth highlighting.

  • St. Olaf College – overall grad rate: 88%, low-income grad rate: 87%
  • Gustavus Adolphus College – overall and low-income grad rate: 82%

The criterion for the “over performer” category was simply that low-income students graduated at a higher rate than the overall student population. There were several institutions in this group that are not too different from us, particularly based on their US News overall ranking (remember, Augustana was ranked #105 this year). These institutions include:

  • Drew University (#99) – overall grad rate: 69%, low-income grad rate: 76%
  • College of the Atlantic (#99) – overall grad rate: 69%, low-income grad rate: 75%
  • Knox College (#81) – overall grad rate: 79%, low-income grad rate: 83%
  • Lewis & Clark College (#77) – overall grad rate: 74%, low-income grad rate: 79%
  • Beloit College (#61) – overall grad rate: 78%, low-income grad rate: 83%

Interestingly, there were also some institutions in the over performer group that probably wouldn’t dare to dream of a ranking approaching the top 100. In other words, they would probably trade their place for ours in a heartbeat. A few to note include:

  • Oglethorpe University (#148) – overall grad rate: 62%, low-income grad rate: 67%
  • Illinois College (#155) – overall grad rate: 64%, low-income grad rate: 68%
  • Warren Wilson College (#165) – overall grad rate: 51%, low-income grad rate: 60%
  • Ouachita Baptist University (#176) – overall grad rate: 60%, low-income grad rate: 80%
  • Wisconsin Lutheran College (#178) – overall grad rate: 64%, low-income grad rate: 75%

Finally, the under performer group comprised institutions where low-income students graduated at rates lower than the overall graduation rate. Some similar/familiar liberal arts colleges in this group included:

  • Augustana College (#105) – overall grad rate: 78%, low-income grad rate: 65%
  • Washington College (#105) – overall grad rate: 68%, low-income grad rate: 49%
  • Hampden-Sydney College (#105) – overall grad rate 62%, low-income grad rate: 43%
  • St. Mary’s College of Maryland (#89) – overall grad rate: 73%, low-income grad rate: 64%
  • Wittenberg University (#139) – overall grad rate: 63%, low-income grad rate: 49%
  • Alma College (#139) – overall grad rate: 61%, low-income grad rate: 44%

Although we ought to be careful not to jump to rash conclusions from this data alone, there are a few suppositions that this data seems to contradict. First, although the national graduation rates for low-income students consistently lag behind overall graduation rates, this is not necessarily so at every institution. Some institutions graduate low-income students at substantially higher rates than the rest of their students. Second, it does not appear that institutional wealth, prestige, or academic profile guarantees graduation equity. Institutions at both ends of the ranking spectrum manage to graduate low-income students at a higher rate than the rest of their students. Third, geographic location doesn’t necessarily ensure success or failure. Successful institutions can be found in both urban and rural settings.

I don’t know what makes each of these successful institutions achieve graduation equality. But in looking at our own disparity in graduation rates, it seems to me that we might learn something from these institutions that have found ways to graduate low-income students at rates similar to the rest of their students. We have set our own bar pretty high (our overall graduation rate of 78% is comparable to, or higher than, that of every institution I listed from the US News over performer category). Now it’s up to us to make sure that every student we enroll can clear that height. We shouldn’t be satisfied with anything less.

Make it a good day,

Mark

Some Key Findings from our Recent Alumni Survey

Every once in a while you get lucky enough to have multiple studies that all point pretty clearly to the same conclusions.  So in the spirit of Christmas, I give you a gift of confirmatory evidence that all of what we do at Augustana – in the classroom and outside of it – matters for student learning.  Special thanks should go to my student assistant, Melanie, who did all of the data analysis and even wrote the first draft of this post.  Thanks, Melanie!

The Recent Alumni Survey asks a cohort of graduates about their experiences in the nine months since they walked across the stage to receive their diploma. Three items in this survey are designed to get at some of the intended outcomes of an Augustana education.  Those items ask:

  • To what extent do you feel your Augustana experience prepared you to succeed in your current program?
  • To what extent do you feel your Augustana experience prepared you to succeed in your current position/job?
  • To what degree does your current professional/educational status align with your long term career goals?

The first two questions address our graduates’ perception of the quality of their preparation for their next step in adult life, be it graduate school or their first foray into the world of work. Because we care about the full arc of our graduates’ adult lives, the third question addresses the degree to which that “next step” – the one for which our mission demands that we play an important role in preparation and selection – aligns with their long-term goals.

To help us improve the quality of an Augustana education, we want to determine the nature of the relationship between college experiences that we already believe to be important (gleaned from our last senior survey) and our graduates’ lives nine months after they graduated. To this end, we linked responses from our 2013 senior survey to the responses those same individuals gave on our recent alumni survey in the winter and early spring of 2014. After identifying which senior survey items significantly predicted (in a statistical sense) these recent alumni outcomes, we expanded our analysis to account for several factors that might confound our findings: race, socio-economic status, gender, and cumulative GPA. The table below shows the experiences that emerged as statistically significant positive predictors for each outcome, organized by the nature of the environment in which those experiences occur.

| | To what degree does your current professional/educational status align with your long-term career goals? | To what extent do you feel your Augustana experience prepared you to succeed in your current program? (asked of alums in grad school) | To what extent do you feel your Augustana experience prepared you to succeed in your current position/job? (asked of alums in the workforce) |
|---|---|---|---|
| Co-curricular experiences | My out-of-class experiences have helped me develop a deeper understanding of myself | My out-of-class experiences helped me develop a deeper understanding of myself | My out-of-class experiences helped me develop a deeper understanding of how I relate to others |
| Advising | How often did your major adviser ask you about your career goals and aspirations? | How often did your major adviser ask you about your career goals and aspirations? | How often did your major adviser ask you about your career goals and aspirations?; My major adviser connected me with other campus resources |
| Experiences in the major | | | Faculty in this major cared about my development as a whole person; In this major, how frequently did your faculty emphasize making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions? |
| Overall curricular experience | | My one-on-one interactions with faculty have had a positive influence on my intellectual growth and interests in ideas | My one-on-one interactions with faculty have had a positive influence on my intellectual growth and interests in ideas |
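For anyone curious about the mechanics behind a table like this, here is a minimal sketch of the kind of model involved, assuming a file that links each graduate’s senior survey responses to their alumni survey responses. The file name, column names, and the specific predictor shown are illustrative stand-ins, not our actual variables.

```python
# Hypothetical sketch: does a senior survey experience item predict a
# recent-alumni outcome after controlling for race, SES, gender, and GPA?
import pandas as pd
import statsmodels.formula.api as smf

# One row per graduate, linking senior survey and alumni survey responses
# (illustrative file and column names).
df = pd.read_csv("senior_alumni_linked.csv")

# Outcome: "prepared to succeed in current position/job" (1-5 scale).
# Predictor: "my major adviser asked about my career goals" (1-5 scale).
model = smf.ols(
    "prepared_job ~ adviser_career_goals + C(race) + C(gender) + ses + cum_gpa",
    data=df,
).fit()

# The experience item counts as a significant positive predictor if its
# coefficient is positive and its p-value falls below the usual .05 threshold.
print(model.summary())
```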

Clearly there are multiple experiences across a range of settings that influence these three outcomes. Moreover, these findings are similar to the results of prior alumni data analyses and replicate findings from analyses of senior survey data.  In short, we can be confident that the experiences noted in the table above play a critical role in shaping the success of Augustana graduates.

These findings strongly emphasize the importance of quality and purposeful faculty interactions with students. The item, “my one-on-one interactions with faculty have had a positive influence on my intellectual growth and interests in ideas,” significantly predicted students’ sense of preparedness both for those entering graduate programs and for those who went into the workforce. This item focuses on more than the frequency of students’ interactions with faculty or the friendliness of those interactions. Instead, it emphasizes the nature of faculty influence: encouraging, inspiring, cajoling, pushing, prodding, and even challenging students to engage tough questions and complicated ideas while at the same time supporting students as they struggle with the implications and ramifications of their own evolving values, beliefs, and worldview.

Faculty influence was again evident in the advising relationship. The question, “How often did your major adviser ask you about your career goals and aspirations?” significantly predicted all three outcomes. In addition, for graduates in the workforce faculty attention to connecting students with other campus resources also influenced the graduates’ sense of preparedness. Furthermore, faculty impact on our graduates’ success is apparent in the major experiences that predicted students’ sense of preparation for their career. Two items were significantly predictive: “Faculty in this major cared about my development as a whole person,” and “In this major, how frequently did your faculty emphasize making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions?” In addition to confirming the caring aspect of quality and purposeful faculty interactions with students, this finding also highlights the value of classroom experiences that cultivate higher order thinking skills.

It is also worth noting the importance of out-of-class experiences in predicting our graduates’ success. Again, the importance of the developmental quality of these experiences is paramount. Instead of items that denote participation in particular types of organizations or activities, the items that proved predictive emphasize that the experiences that matter are ones that help students develop in two ways. First, they help students develop a deeper understanding of themselves.  Second, they help students develop a deeper understanding of how they relate to others. Obviously, these skills are critical for success in every manner of adult life.  The key for Augustana is to ensure that every out-of-class experience contributes – directly or indirectly – to this kind of growth.

The goal of this analysis was not to determine which experiences (faculty interactions or co-curricular experiences) play a larger role in shaping Augustana graduates’ outcomes. Instead, it is clear that all facets of the Augustana education contribute to our students’ success.  It is also clear that not all graduates experience Augustana in a way that maximizes the potential impact of quality and purposeful faculty interaction or developmental out-of-class activities.  Throughout the institution, we can use these findings as principled guidelines for improving the work that we do with our students.

Make it a good day (and a great holiday break),

Mark

When does a stereotype lose its margin of truth?

As we tumble down the back side of the fall term, I know that the potential value of a long and involved blog post drops like a stone in the face of the ten things that have to be solved right now(!).  So I’m gonna just roll out one big-picture data point and let you mull it over when you get a chance to breathe.

Remember that economic collapse that wrecked the economy and scared the pants off of tuition-dependent colleges like us?  Yeah, sorry – not the best way to start the week.  Although there are a few reasons to think that the American economy might be slowly pulling out of its nose dive, we all know that the ripple effects haven’t abated much.

I’ve been trawling our three years of senior survey data lately to look for trends that might be worth noting (see last week’s post on a couple of general education items).  This morning one set of numbers really jumped out at me: the increasing proportion of our students who qualify for Pell Grants – a group of students who almost certainly wouldn’t be at Augustana if it weren’t for need-based financial aid from the federal government and the state of Illinois through the MAP grant program.

The table below shows the increase in this aspect of our student population over the past three classes.  I’ve included the actual numbers in parentheses to add some perspective.

| Cohort | % Pell recipients as entering freshmen | % Pell recipients as graduating seniors |
|---|---|---|
| Fall of 2008 – Spring of 2012 | 12.5% (80/641) | 10.7% (51/476) |
| Fall of 2009 – Spring of 2013 | 17.5% (108/616) | 14.2% (66/465) |
| Fall of 2010 – Spring of 2014 | 24.6% (185/753) | 22.4% (119/532) |

You can see that over a relatively short period of time, we’ve roughly doubled the proportion of students who receive Pell Grants (and therefore also received MAP Grants from the state of Illinois).  If you look closely, you can also see a hint of the effect of lower socioeconomic status on retention and graduation: the proportion of these students shrinks over the course of four years.

Just in case you were wondering, 26.2% of our current overall student body (655/2,500) is receiving a Pell Grant.  And among our newest class of freshmen, 28.1% are receiving Pell Grants.

The headline of this blog post referred to the stereotype that we like to throw around about our students coming from the Chicago suburbs.  Sometimes I hear that stereotype dressed up with a healthy dose of wealth, homogeneity, and entitlement.  I’m not trying to suggest that those students don’t exist at Augustana.  Of course they do.  But here is the thing about these numbers that hit me:

  • We have a trend in our data suggesting that a growing proportion of our students (now over a quarter of our student body) don’t, at least on one critical dimension, conform to that stereotype.
  • Are we adapting our expectations of, and interactions with, our students to match what we know about where they come from?

Make it a good day,

Mark

What is the role of general education? Some ominous shadows in the data

Despite a genuine commitment to a liberal arts mission, at times living out that mission seems easier said than done. On one hand, public fretting (some of it well founded) about unemployed and apparently unemployable college graduates has led some to suggest that a college education should focus more of its coursework on preparation for a specific career. On the other hand, the proliferation of knowledge and sub-disciplines within many academic fields translates into more content that faculty often believe (sometimes rightly) needs to be added to the range of concepts covered within a particular major. Both of these tangible pressures bolster the argument for expanding the footprint of the major. By comparison, the counter-arguments for maintaining a robust general education program tend to be more abstract and, sadly, rarely stand a chance.

Two trends (one macro and one micro) highlight the declining clout of general education. First, the number of U.S. colleges classified as liberal arts colleges has dropped substantially in the last several decades (from 212 to 130). Most of this change involves institutions that expanded their educational offerings into more vocational and pre-professional programs. Closer to home, the proportion of Augustana students who earn at least two majors continues to increase (over 45% of graduates in 2014). Our double majors don’t stay in college longer than everyone else; they just concentrate more of the credits they earn in specific areas.

During last year’s conversations about the relative impact of general education and potential improvements that could be made, some seemed to suggest that our general education program was not in need of revisions. One of the questions posed was whether or not our senior survey data might provide evidence to inform this conversation. Now that we have a third year of senior survey findings, I thought it might be useful to explore the responses to the survey’s general education items and look for any patterns or hints of trends. I’m not sure that the findings below provide definitive answers, but I hope they will further inform the discussion and direction of the general education conversation at Augustana.

The Augustana Senior Survey includes six questions intended to assess the nature of students’ experiences in their non-major or general education courses. Interestingly, the lowest average response score over the last three years came from the 2014 seniors on five of the six questions. Further analysis indicated that the drop from highest to lowest score was statistically significant for four of those questions. They are listed in the table below.

| Senior Survey Gen Ed Question | 2012 | 2013 | 2014 |
|---|---|---|---|
| The skills I learned in my general education courses helped me succeed in my major courses. (response options – strongly disagree, disagree, neutral, agree, strongly agree) | 3.55 | 3.43 | 3.38 |
| My classes outside my major(s) challenged me to produce my best academic work. (response options – strongly disagree, disagree, neutral, agree, strongly agree) | 3.57 | 3.53 | 3.44 |
| In your non-major courses, about how often were you asked to include different perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments? (response options – never, rarely, sometimes, often, very often) | 3.50 | 3.52 | 3.41 |
| About how often did you discuss ideas from your non-major courses with faculty members outside of class? (response options – never, rarely, sometimes, often, very often) | 2.88 | 2.82 | 2.76 |
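For readers who want to see what “statistically significant” means concretely here, a minimal sketch of the kind of comparison involved is below. The response vectors are invented stand-ins; the actual test would use every individual senior response from each cohort.

```python
# Hypothetical sketch: is the drop in a gen ed item's mean response between
# the 2012 and 2014 senior cohorts larger than chance fluctuation would allow?
import numpy as np
from scipy import stats

# Illustrative stand-ins for individual 1-5 responses from each cohort.
responses_2012 = np.array([4, 3, 4, 5, 3, 4, 2, 4, 3, 4])
responses_2014 = np.array([3, 3, 2, 4, 3, 3, 4, 2, 3, 3])

t_stat, p_value = stats.ttest_ind(responses_2012, responses_2014, equal_var=False)

print(f"mean 2012 = {responses_2012.mean():.2f}, mean 2014 = {responses_2014.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < .05 suggests a real drop
```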

I fully admit that three years of data is not nearly enough to make predictive claims or produce some sort of smoking gun. However, it is enough data to begin triangulating these findings with others (everything from observational to rigorously quantitative data) and look for evidence of multiple findings moving in the same direction. This can be a particularly effective way to identify early “shadows” in the data and give us time to consider their implications in a less reactive environment.

It would be entirely reasonable to expect some fluctuation on average response scores for individual items across multiple years. But it struck me as curious that the responses to so many of the general education items – questions that I think represent the way that we imagine our general education courses functioning at a liberal arts college – moved together in a negative direction.

What might explain this phenomenon? Is it a function of our students feeling an increased pressure to focus on career preparation? Could it be a function of our own subtle leanings toward areas of our own expertise? Or could it be that we lack a clear sense of exactly how our general education requirements link together to form the kind of integrated breadth of understanding that would ultimately produce the ideal liberally educated student?

I don’t know the answer to any of these questions. But these findings did make me think again about our discussion of the role of general education and the degree to which we may need to revisit our commitment to 1) the role of general education and 2) the way we ensure that our general education program helps students develop all of the knowledge, skills, and dispositions that we know are critical to their success after graduation.

Make it a good day,

Mark


Retention, part deux: Freshman survey data to the rescue!

In last week’s Delicious Ambiguity post we dove into the deep end of the data pool regarding retention and the complexity of the problem. Our institutional data shows that our students’ decisions to withdraw or persist are influenced by characteristics that students bring with them to college as well as experiences that they have during their first year. Moreover, it’s clear that addressing this issue requires “all hands on deck” if we are to make any demonstrable progress.

But knowing that information by itself leaves us far short of actually knowing what to do differently. For us to improve our retention rates we need to know which student experiences matter most in shaping their decision to persist. We need to identify specific experiences over which we have substantial and concrete influence. Information about more general experiences, even if they are specific to one aspect of the college experience, is not enough. For example, both of the items below predict our students’ general sense of belonging on campus.

  • “My day to day experiences in my residence hall have helped me feel like I fit in at Augustana.”
  • “I know that my Community Adviser (CA) cares about how I am doing at Augustana.”

Although both findings might appear interesting, the item addressing our CAs’ impact on students’ sense of well-being is more specifically prescriptive, providing tangible guidance for designing the role of CAs as well as the way that we select, train, and assess their efforts.

Similarly, if we can collect and link granular experience data to bigger-picture retention data, we will be more likely to glean specific direction from our data analyses that ultimately helps us improve retention. This was our big aspiration when we altered our freshman survey’s design last year, gathering more specific experience data about academic acclimation and social integration midway through the first year. After analyzing last year’s responses, I’m excited to share several specifically actionable findings about experiences that appear to increase the likelihood of persistence.

After last week’s examination of some retention trends involving race, gender, socioeconomic status (SES), and pre-college preparation, we ran some statistical models that combined our institutional data, our student readiness survey, and our mid-year freshman survey. All of the findings I share below hold true after taking into account race, gender, SES, and pre-college preparation (ACT). We then added items from the freshman survey that might influence retention above and beyond those four pre-college characteristics. In the end, two items produced statistically significant effects.
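As a rough illustration of what those “statistical models” look like, here is a minimal sketch of a logistic regression of persistence on one freshman survey item plus the four pre-college controls. The file name and column names are hypothetical placeholders, not our actual variables.

```python
# Hypothetical sketch: does a mid-year freshman survey item predict
# first-to-second-year persistence beyond the pre-college characteristics?
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("freshman_retention.csv")  # illustrative file and columns

# retained: 1 if the student returned for the second year, 0 otherwise.
model = smf.logit(
    "retained ~ lsfy_success_strategy + C(race) + C(gender) + ses + act_score",
    data=df,
).fit()

# A positive, significant coefficient on the survey item means it predicts
# persistence above and beyond race, gender, SES, and ACT score.
print(model.summary())
```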

The first item producing a statistically significant positive effect on retention addressed a very specific aspect of the LSFY/Honors experience.

  • My LSFY/Honors instructor helped me develop at least one specific way to be a more successful college student.

Just as so many other researchers have found previously, students often need guidance in figuring out how to successfully navigate college. This means so much more than just knowing how to react to trouble in a class or with a roommate. Instead, this means knowing how to take control of the experience in order to make the most of it in preparation for life next month, next term, or next year.  Our finding suggests that LSFY and Honors instructors who taught students at least one specific way to proactively engage college as a student actually contributed directly to student persistence. And this finding held regardless of incoming ACT score, suggesting that this kind of learning is valuable for all students no matter their pre-college academic preparation.  As LSFY continually explores ways to make that course more effective, this finding seems well worth incorporating into that discussion.

The second item that produced a statistically significant effect addressed a more general sense of social integration.

  • I feel like I belong on campus.

Although this is interesting, we needed to dig further to come up with more concrete guidance toward future improvement. After peeling back another layer of the onion, we found that very kind of guidance.

This time, in addition to accounting for race, gender, SES, and ACT score, we decided to add comfort with social interaction to the mix. Interestingly, in the end comfort with social interaction still produced a statistically significant positive effect, suggesting that despite everything that we might do to influence students’ sense of belonging, more reserved students are likely to still feel less of a sense of belonging than more outgoing students at the midpoint of the first year. However, this effect appears to vanish by the end of the first year, supporting the contention that more reserved students may simply need more time to find their niche on campus.

Under these analytic conditions, we found granular experience items, both for faculty (as instructors and as advisers) and for student affairs professionals, that appear to influence students’ sense of belonging. The two items addressing faculty interactions with students were:

  • My first year adviser made me feel like I could succeed at Augustana.
  • How often have your instructors pointed out something you did well on an assignment or in class?

Again, these findings held even after accounting for incoming ACT score. In other words, regardless of a student’s academic “ability,” faculty communicating to students that they can succeed and pointing to something that they have done well appears to contribute to a student’s sense of belonging on campus. This doesn’t mean that faculty should pull punches or tell students that they are doing well when they are not. Instead, it suggests to me that faculty play a critical role in building students’ belief that they can succeed and then finding positive reinforcement to show them the way.

We found two items predicting a sense of belonging on campus that provide some concrete guidance for working with students outside of class.

  • Fall Connection provided the start I needed to succeed academically at Augustana.
  • I know that my Community Adviser cares about how I am doing at Augustana.

Although the item addressing Fall Connection (now called Welcome Week) seems fairly general, I think it further emphasizes the importance of the changes introduced this year to increase the emphasis on academic preparation. Based on last year’s data (prior to this elevated academic emphasis), this aspect of Fall Connection mattered significantly. In addition, I am particularly intrigued by the nature of the CAs’ impact on students’ sense of belonging. This kind of finding provides pretty clear direction in designing the nature of CAs’ conversations with students.

All of these findings together simply confirm that we all play a significant role in shaping our students’ decision to persist at Augustana College. I hope we can find ways to further convert these findings into concrete action. As with so many other aspects of college students’ experience, it’s not what they do; it’s how they experience what they do.

Make it a good day,

Mark

Let’s talk retention (a.k.a., how to start a fight on campus)

Ok, so that headline might sound a little dramatic. Yet even on the most collegial of campuses, a serious conversation about retention rates – especially if that number has gone in the wrong direction over the last year or two – can quickly devolve into a wave of finger pointing and rekindle a litany of old grudges.

We’ve all heard the off-handed comments. “If admissions would just recruit better,” “If faculty would just teach better,” “If student affairs would just help students fit in better,” “If financial aid would just award more money.” Lucky for me, all of these assertions are testable claims. And since we have the data . . . (cue maniacal laughter).

Yet using our data to find a culprit would fly in the face of everything that we are supposed to be about. We say that our students learn because of the holistic nature of the Augustana experience. And analyzing our student data by individual characteristics implies that our students are somehow one-dimensional robots. Most importantly, if we want to improve, then we have to assess with the specific intent of starting a conversation – not ending it. That means that we have to approach this question with the assumption that we are all critical contributors to retaining students.

This year’s first-to-second year retention rate isn’t great.  At 82.9%, it’s the lowest it’s been in five years. In the context of Augustana 2020’s target retention rate of 90%, there is certainly reason for raised eyebrows. To understand what is going on underneath that overall number, it’s worth looking at our data in a way that mirrors the students’ interaction with Augustana up until they decide to stay or leave. So let’s organize these trends into two categories: students’ pre-college demographic traits and the students’ first year experiences.

Scholars of retention research generally point to five pre-college demographic traits that most powerfully impact retention: gender, race, socioeconomic status, academic preparation, and first generation status. Below is five-year trend data across these categories.

| | 2009 | 2010 | 2011 | 2012 | 2013 |
|---|---|---|---|---|---|
| Overall | 87.8% | 87.6% | 84.4% | 84.9% | 82.9% |
| Female | 91.2% | 89.6% | 85.7% | 90.1% | 82.7% |
| Male | 83.6% | 85.0% | 82.8% | 78.6% | 83.2% |
| White | 88.1% | 89.0% | 84.4% | 85.8% | 84.2% |
| Multicultural | 87.0% | 82.6% | 85.6% | 81.3% | 78.4% |
| Pell grant recipient | 78.4% | 83.3% | 84.0% | 81.3% | 80.8% |
| Only qualified for loans | 88.7% | 87.4% | 83.5% | 83.3% | 81.2% |
| Did not qualify for need aid | 90.6% | 90.6% | 86.2% | 89.5% | 86.7% |
| First generation (data not collected until 2012) | – | – | – | 83.0% | 80.8% |
| ACT <= 22 | 77.8% | 82.1% | 83.3% | 75.0% | 78.6% |
| ACT 23-25 | 90.5% | 89.5% | 84.3% | 87.7% | 84.9% |
| ACT 26-27 | 92.4% | 88.2% | 83.0% | 90.4% | 83.8% |
| ACT >= 28 | 91.0% | 90.9% | 89.0% | 87.0% | 82.2% |
| Test optional | 76.7% | 75.0% | 75.8% | 81.3% | 84.8% |
| ACT top three quartiles | 91.0% | 89.9% | 85.3% | 88.3% | 83.9% |

As you can see, our own data suggests a more complicated picture. Although nationally women persist at higher rates than men, our data flipped last year when persistence among men actually eclipsed persistence among women. Our retention rate for multicultural students (our euphemism for non-white students) has trended steadily downward, a fact made more pressing by a steady increase in the number of multicultural students. Although we haven’t tracked first-generation status for more than a few years, this retention rate has also dropped while the number of first-generation students has increased. While our retention rate of Pell Grant recipients (those students with the highest need) has increased slightly, the retention rates of students who only qualified for loans have dropped steadily. At the same time, the retention rate of students from the highest socioeconomic status has dropped a bit.

Finally, academic preparation retention rates paint an interesting picture. The national data would suggest that our worst retention rates should be among those students who come from the lowest ACT quartile. At Augustana, those students’ retention rates are also lower and haven’t changed much. By contrast, the retention rates of students from each of the other three quartiles, although they are still higher than the lowest quartile, dropped substantially between 2012 and 2013. Interestingly, the retention rate of the students who applied test-optional has gone up almost 8 points over the past five years.

But students’ likelihood of persisting to their second year is not etched in stone before they start college. Another way to look at some of these trends is to examine the characteristics of the students who leave against those traits that might indicate an experience that differs from the mainstream in some important way – especially if we know that this difference in experience might affect the calculus by which the student determines whether it is worth the time, money, and emotional investment to stay. So as you look through this table, remember that these percentages are the proportion of departed students who fit each category.

| | 2009 | 2010 | 2011 | 2012 | 2013 |
|---|---|---|---|---|---|
| Cumulative GPA was below 2.5 | 57.3% | 50.5% | 47.3% | 43.4% | 39.3% |
| Male | 60.0% | 51.6% | 47.3% | 63.6% | 45.8% |
| Female | 40.0% | 48.4% | 52.7% | 36.4% | 54.2% |
| White | 68.0% | 73.1% | 78.2% | 71.7% | 69.2% |
| Multicultural | 9.3% | 21.5% | 16.4% | 28.3% | 30.8% |
| Pell grant recipient | 29.3% | 32.3% | 25.5% | 31.3% | 31.8% |
| Only qualified for loans | 36.0% | 38.7% | 47.3% | 46.5% | 42.1% |
| Did not qualify for need aid | 34.7% | 29.0% | 27.3% | 22.2% | 26.2% |
| First generation | – | – | – | 24.2% | 30.8% |
| ACT <= 22 | 32.0% | 28.0% | 20.9% | 30.3% | 25.2% |
| ACT 23-25 | 21.3% | 23.7% | 28.2% | 23.2% | 26.2% |
| ACT 26-27 | 12.0% | 19.4% | 20.9% | 11.1% | 15.9% |
| ACT >= 28 | 21.3% | 19.4% | 18.2% | 24.2% | 24.3% |

Much of this data corroborates what we saw in the examination of retention rates by pre-college characteristics in the case of gender, race, socioeconomic status, first-generation status, and academic preparation. However, one new trend adds some interesting nuance to the impact of the first year experience on retention. If we look at the proportion of departing students who were also in academic difficulty when they left, the share with a cumulative GPA below 2.5 has fallen by almost 20 percentage points over these five years. In other words, far fewer of our departing students are in a position where their grades might be the primary reason for their departure. This suggests to me that, if they aren’t departing because of grades, then there are other key elements of the first year experience that would be primary contributors to the decision to depart.

We aren’t going to answer the question of what is negatively impacting retention today. Even if we were to pinpoint a significant factor in a particular year, because the nature of each class differs, evidence from one year might be entirely useless the next. My point today is simply to highlight the degree to which all of us impact retention together.

I’d like to think that on some level we know that finger pointing is foolish. Yet in an environment where we are simultaneously immersed in our own silos and entirely dependent on the efforts of others (e.g., faculty don’t have a job if admissions doesn’t recruit anyone), it doesn’t seem all too surprising that such behavior (especially if budgets are under threat) might surface despite the best of intentions. So maybe if you hear someone grouse about retention rates and “rounding up the usual suspects,” you’ll remind them that we are in this together. If we fail to improve, it won’t be because someone didn’t do their job – it will be because we all didn’t pull together.

Make it a good day,

Mark


The gnarly problem of effective feedback

All too often we talk about feedback as if it’s something that either happens or doesn’t. Students get feedback or they don’t. Faculty give feedback or they don’t. Moreover, all too often I think it’s easy for people like me to unintentionally imply that if students would just get the right feedback at the right time, they would respond to it like budding valedictorians.

However, the concept we are really talking about is much more complicated than a simple piece of information handed back in response to a piece of student work. At its fullest, effective feedback encompasses a recursive sequence of precisely timed instructor actions intertwined with positive student responses that produces a change in both the quality of the student effort AND the quality of the student work. Yet despite our best efforts, we know that we have only partial control over this process (since the student controls how he or she responds to feedback) even as we agonize over our contribution to it. So it doesn’t help when it feels like what we hope for and what we get are two very different things.

In this context it’s no wonder that raising the issue of effective feedback can cut close to the quick. All of us do the work we do because we care about our students. To those who burn the midnight oil to come up with just the right comments for students, suggesting that we could improve the quality of the feedback we provide to students could easily come off as unfair criticism. To those who think that there isn’t much point in extended feedback because students today rarely care, raising the issue of faculty feedback seems like preaching to the (wrong) choir.

I, for one, have not always been precise enough in my own language about the issue of effective feedback. So I ought to start by offering my own sincere mea culpa. The conversations we’ve had on campus over the last month about gathering more comprehensive data on our students’ progress early in the term have helped me think much more carefully about the concept of feedback and the ways that we might approach our exploration of it if we are to get better at what we do. With that in mind, I’d like to share some recent data from our freshmen regarding feedback and suggest that we explore more deeply what it might mean.

For the last several years we’ve asked freshmen to respond to the statement, “I had access to my grades or other feedback early enough in the term to adjust my study habits or seek help as necessary.” The five response options ranged from “strongly disagree” to “strongly agree.”

Two events combined to start our consideration of a question like this. First, changes in federal financial aid law steepened the ramifications for dropping classes, making it critical that students know their status in a course prior to the drop date. In addition, we had been hearing from a number of people who work with struggling students that many of those students hadn’t realized they were struggling until very late in the term. Because willful blindness is so pervasive among many of those same struggling students, it took us a while to phrase this question in a way that at least allowed for the difference between students who simply never looked at their grades or other relevant feedback and students who never received a graded assignment until the second half of the term.

Here is the distribution of responses from last year’s mid-year freshman survey.

I had access to my grades or other feedback early enough in the term to adjust my study habits or seek additional academic help.

| Response | Count | Percent |
|---|---|---|
| strongly disagree | 62 | 16% |
| disagree | 111 | 30% |
| neutral | 75 | 20% |
| agree | 104 | 28% |
| strongly agree | 24 | 6% |

What should we take from this? Clearly, this isn’t the distribution of responses that we’d all like to see. At the same time, the meaning of this set of responses isn’t so easily interpreted. So here are some suppositions that I think are probably worth exploring further.

Maybe students are, in fact, regularly ignoring specific improvement-focused feedback that they get from their instructors. Maybe they assume that since the assignment is already graded, any comments from the instructor are not applicable to improving future work. Given the “No Child Left Behind” world in which our students grew up, it seems likely that they would need substantial re-educating on the way that they use feedback if the feedback we provide is specifically designed to guide and improve future work.

On the other hand, maybe students are getting lots of feedback, but it isn’t the kind of feedback that would spur them to recalibrate their level of effort or apply the instructor’s guidance to improve future work. Maybe the feedback they get is largely summative (i.e., little more than a grade with basic descriptive words like “good” and “unclear”) and they aren’t able (for whatever reason) to convert that information into concrete actions that they can take to improve their work.

Maybe students really aren’t getting much feedback at all until the second half of the term. If they are taking courses that are organized around a midterm exam, a final paper, and a final exam, then there would be no substantive feedback to provide early in the term. Given the inclination of some (i.e., many) students to rationalize their behaviors in the absence of hard evidence, this combination of factors could spell disaster.

Finally, maybe students are getting feedback that is so thoroughly developmental in nature that it is difficult for the student to benchmark their effort along a predictive trajectory.  In other words, maybe the student knows exactly what they need to do in order to improve a particular paper, but they don’t understand that partial improvement won’t necessarily translate into the grade that they wanted or believed they might receive based on the kindness and empowering nature of the instructor’s comments.

The truth is that all of these scenarios are reasonable and in no way suggest abject failure on the part of the instructor. And it is highly likely that all students experience some combination of these four scenarios through the academic year.

Whatever the reason, our own data suggests that there is something gumming up the works when it comes to creating a fluid and effective feedback loop in which students’ effort and performance are demonstrably influenced by the application of the feedback provided to them.

What should we do next? I’d humbly suggest that we dig deeper. To do that, we need to know more about the kind of feedback students receive, the way that they use or don’t use feedback, the ways that students learn to use feedback more effectively, and the ways that instructors can provide feedback more efficiently.  In other words, we need the big picture. Maybe the new mid-term reporting system will help us with that.  But even if it doesn’t, we still would do ourselves some good to look more closely at 1) the result that we intend from the feedback we give, and 2) the degree to which the feedback we give aligns with that intent.

If history is any predictor of our potential, I think we are more than capable of tackling the gnarly problem of effective feedback.

Make it a good day,

Mark

Studying at the CSL: Benefits that might exceed preparing for class

When Augustana faculty, staff, and administrators were discussing the possibility of a single building that combined student life offices, a dining hall, multiple academic services, and the Tredway Library under one roof, one of the suggested advantages to such a design was grounded in the potential for proximity and efficiency. The proximity argument asserted that students would take advantage of more opportunities and services because these offices were conveniently located together. The efficiency argument claimed that students who already intended to use many of these services would be able to do so more quickly and easily.

Unfortunately, we will never be able to produce iron-clad proof that the Center for Student Life (CSL) has lived up to its billing. That would require building an identical Augustana College campus on which we did NOT build a CSL connected to a Tredway Library so that we could compare student behaviors under both conditions (how’s that for a fund-raising challenge!). However, now that we have a year of Gävle gatherings and “all-you-care-to-eat” dining under our collective belts, we ought to be able to examine our student data to see if use of the CSL is contributing to student growth and success.

So let’s start with the aforementioned rationales for attaching the CSL to the Tredway Library.

  • The proximity of academic and student life offices and facilities would collectively boost student use of academic services and involvement in student groups.
  • The convenience of locating all these facilities and services in one place would help students engage the totality of the Augustana experience more efficiently.

In last year’s freshman surveys, we asked several questions that we can analyze together to test these assertions. The question central to today’s analysis asked, “How often did you study – by yourself or in small groups – in any part of the CSL/Tredway library building?” Although this question focuses on academic pursuits, if the prior assertions hold true, responses to this question ought to correlate positively with increased use of academic resources, increased involvement in student groups, and growth on some important developmental or learning outcome of the freshman year.

It turns out that the Center for Student Life appears to be functioning exactly as we hoped it would. Even after accounting for differences in gender, race, incoming academic ability, and socio-economic status, the frequency of students’ studying in the CSL or Tredway Library predicted a stronger response to the statement “I took advantage of academic support resources (faculty office hours, reading and writing center, tutors, study groups, etc.) when I could benefit from their help.” Likewise, the frequency of studying in the CSL or Tredway Library building predicted stronger agreement to the statement “I am participating in at least one student group/organization that interests me.”

Finally, students’ frequency of studying in the CSL/Tredway Library significantly predicted their agreement with the statement, “During the year I got better at balancing my academics with my out-of-class activities.” By comparison, students’ frequency of studying in their dorm room produced no such relationship.
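To make the phrase “even after accounting for” a bit more concrete, here is a minimal sketch of the kind of model this implies, with both study locations entered as predictors side by side. The file and column names are hypothetical placeholders rather than our actual survey variables.

```python
# Hypothetical sketch: does studying in the CSL/Tredway Library predict the
# "better at balancing" outcome, net of demographics, while studying in the
# dorm room does not?
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("freshman_survey.csv")  # illustrative file and column names

model = smf.ols(
    "better_balance ~ study_csl_freq + study_dorm_freq"
    " + C(race) + C(gender) + ses + act_score",
    data=df,
).fit()

# Compare the two study-location coefficients and their p-values.
print(model.params[["study_csl_freq", "study_dorm_freq"]])
print(model.pvalues[["study_csl_freq", "study_dorm_freq"]])
```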

So what does all of this mean? In short, the Center for Student Life seems to be cultivating the kind of student behavior patterns that improve multiple aspects of their engagement as well as a key aspect of their development. The more time students spend studying in the CSL/Tredway Library, the more likely they are to use the academic resources they need when they need them, find and join student groups that fit their interests, and improve their ability to balance all the in-class and out-of-class elements of the Augustana experience that we believe are important for learning. These findings suggest that we ought to take a hard look at students’ propensity to study in their dorm rooms (75% of last year’s freshmen spent at least half of their study time in their dorm rooms) and the ways that we guide them to make more effective use of space and place.

Moreover, this is the kind of guidance that students need to hear over and over. In many cases our students are coming from life experiences where they didn’t leave their homes – or even their rooms – to study. In addition, they may not know much about the importance of establishing effective behavior patterns or conditions most conducive to learning.  We know that a fundamental difference between high school and college success lies in the shift to a more assertive approach to learning, and the idea that one would find a distinct location to study is a longstanding example of such an approach to college.

Based on our freshman data, the benefits of the CSL seem pretty clear. This isn’t to say that the CSL is perfect or that there aren’t other things that we could do to improve the building or the way that we use it. But in terms of its effect on increasing the quality of our students’ experience and helping Augustana meet its educational mission, the CSL seems to be off to a good start.

But the building isn’t so effective that it will magically suck students into some sort of learning vortex. If they don’t use it, then it’s of little use. So I hope that you will strongly encourage your students to put themselves in a position to reap the benefits of the CSL and the Tredway Library.

Make it a good day,

Mark


It’s Nice When a Plan Comes Together

As we claw our way toward the finish line at the end of another spring term, it isn’t hard to look around and see proof of our passion for our students’ development.  But one disadvantage of working in the unusually autonomous environment of a small college is that we don’t often get the chance to step back and enjoy the totality of our collective efforts. So in my last post of the 2013-14 academic year, I hope this data I share below will give you a chance to revel in our success and take some real pride in what we have accomplished together.

A few weeks ago, Gallup released the summary report of its first large-scale study of college graduates (they hope to make this an annual study).  The project, titled the Gallup-Purdue Index, explored the relationship between undergraduate experiences and the nature of college graduates’ engagement at work and overall well-being.  You can read some of the reviews of these findings in The Chronicle and Inside Higher Ed, or read the actual report itself. Essentially, after surveying over 30,000 individuals across the country, Gallup found what we have known for a very long time: the quality of student-faculty interaction is fundamentally important to a college graduate’s long-term quality of life.

Interestingly, the various questions on which the Gallup findings are based look awfully familiar.  That is because we’ve been asking many of the same questions for years now, and using that data to inform our work and our perpetual effort to improve.  So I thought it would be nice to take a moment to step back, compare the responses from our students to those questions with the responses from the Gallup study participants, and smile.

Below I list each of the Gallup data points followed by a couple of similar Augustana data points.

Gallup-Purdue Index

  • I had at least one professor at [College] who made me excited about learning.
    • 63% strongly agree

Augustana Senior Survey

  • My one-on-one interactions with faculty have had a positive influence on my intellectual growth and interest in ideas.
    • 54% strongly agree + 37% agree
  • I really worked hard to meet my instructors’ expectations.
    • 45% very often + 39% often

Gallup-Purdue Index

  • My professors at [College] cared about me as a person.
    • 27% strongly agree

Augustana Senior Survey

  • Faculty in my major cared about my development as a whole person.
    • 52% strongly agree + 34% agree
  • My major advisor genuinely seemed to care about my development as a whole person.
    • 50% strongly agree + 28% agree
  • The faculty with whom I have had contact were interested in helping students grow in more than just academic areas.
    • 41% strongly agree + 48% agree

Gallup-Purdue Index

  • I had a mentor who encouraged me to pursue my goals and dreams.
    • 22% strongly agree

Augustana Senior Survey

  • Faculty in my major knew how to help me prepare to achieve my post-graduate plans.
    • 37% strongly agree + 38% agree
  • How often did your major advisor ask you about your career goals and aspirations?
    • 84% very often, often, or sometimes
  • I am certain that my post-graduate plans are a good fit for who I am right now and where I want my life to go.
    • 41% strongly agree + 36% agree

Gallup-Purdue Index

  • I worked on a project that took a semester or more to complete.
    • 32% strongly agree

Augustana Senior Survey

  • 100% participation in Senior Inquiry
  • My senior inquiry project challenged me to produce my best possible intellectual work.
    • 44% strongly agree + 34% agree
  • During my senior inquiry project I learned a lot about myself (work habits, handling setbacks, managing a larger project, etc.) in addition to the topic of my paper/project.
    • 42% strongly agree + 38% agree

Gallup-Purdue Index

  • I had an internship or job that allowed me to apply what I was learning in the classroom.
    • 29% strongly agree

Augustana Senior Survey

  • 60% Participation in an Internship
  • My out-of-class experiences have helped me connect what I learned in the classroom with real-life events.
    • 22% strongly agree + 54% agree
  • 65% of seniors work on campus

Gallup-Purdue Index

  • I was extremely active in extracurricular activities and organizations while attending [College].
    • 20% strongly agree

Augustana Senior Survey

  • How many student groups or clubs did you find that fit your interests?
    • 32% many + 48% some

Gallup-Purdue Index

  • I feel emotionally attached to [College].
    • 18% strongly agree

Augustana Senior Survey

  • I felt a strong sense of belonging on campus.
    • 24% strongly agree + 43% agree
  • If you could relive your college decision, would you choose Augustana again?
    • 40% definitely yes + 33% probably yes

While the senior survey data we highlighted in this table came from the 2013 senior class, these responses aren’t any different from the 2012 class or the class that will graduate this weekend.  So take a moment, even in the midst of all the last-minute craziness and stress that comes with finishing out the last term of an academic year, and pat a colleague on the back today.  Because we do this together.  And we have every right to be proud of ourselves, each other, and the work that we do.

Make it a good day,

Mark

Trying out some first year learning outcomes

This year we tried something new with our freshman survey. Instead of administering the entire survey in the spring term, we split it into two pieces: one part administered in the middle of the year and one part administered at the end of the year, concentrating our questions in the mid-year survey on academic and social acclimation and focusing the end-of-the-year survey on learning and development. This allowed us to get much better data from struggling students, who often are no longer enrolled in the spring. It also allowed us to link the conceptual emphases of each part of the survey with what students were more likely to be experiencing at the time when they took the survey.

Over the last couple of months I’ve shared some of the findings from the mid-year survey – findings that can help us improve our support of students’ acclimation to college life. In this post I’d like to share some early learning and development findings from the end-of-the-first-year survey. The two items below are among the new items we added to the end-of-the-first-year survey as potential outcomes of the first year. You’ll notice in the phrasing of the questions that we approached these outcomes developmentally. In other words, we don’t conceive of freshman year outcomes as absolute thresholds that have to be met at the end of the first year. Instead, we think of the first year as part of a larger process in which students move at different speeds and at different times. In the end, we are trying to get a sense of the degree to which freshmen believe that they have made progress toward one skill and one disposition that undergird a successful college experience.

During the year I got better at balancing my academics with my out-of-class activities.

| Response | Count | Percent |
|---|---|---|
| Strongly disagree | 8 | 5% |
| Disagree | 10 | 6% |
| Neutral | 35 | 21% |
| Agree | 84 | 51% |
| Strongly agree | 29 | 17% |

Over the past academic year, I have developed a better sense of who I am and where I want my life to go.

| Response | Count | Percent |
|---|---|---|
| Strongly disagree | 2 | 1% |
| Disagree | 15 | 9% |
| Neutral | 41 | 25% |
| Agree | 71 | 43% |
| Strongly agree | 37 | 22% |

In both cases, there appears to be reason to smile and reason to frown.  On one hand, about two-thirds of the freshman respondents agree or strongly agree with these statements. While one could quibble about whether some of these students were already fully capable of balancing their academic and out-of-class responsibilities or already had a strong sense of self, direction, and purpose, I think it is fair to suggest that students are more likely responding to the phrasing about their own perceptions of personal growth.

On the other hand, about a third of our students responded neutral, disagree, or strongly disagree to these questions. While there are probably more than a few potential explanations that are outside of our control, I suspect that future analysis, once all of our data is cleaned and recoded for analytic purposes (that is fancy talk for “turned into numbers that statistical geeks like to use to play in the statistics sandbox”), will help us understand some of the experiences that positively and negatively predict students’ responses to these items.
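For the curious, the “cleaned and recoded” step is usually no more mysterious than something like the sketch below, where Likert labels become 1–5 scores; the file and column names here are made up for illustration.

```python
# Hypothetical sketch of the recoding step: turning Likert labels into numbers.
import pandas as pd

likert = {
    "Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
    "Agree": 4, "Strongly agree": 5,
}

df = pd.read_csv("freshman_survey_raw.csv")              # illustrative file name
df["better_balance"] = df["better_balance"].map(likert)  # label -> 1-5 score
print(df["better_balance"].describe())
```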

Then comes the really hard work.  What do we do with that knowledge?  I hope we will do what we are learning to do more often: change our practices and/or policies to improve our students’ development and learning.  So stay tuned as we unpack all of our new data. And PLEASE PLEASE PLEASE if you have any freshmen in your classes, implore them to complete the freshman survey that has been emailed to them several times over the last few weeks.

Make it a good day,

Mark