What do you do when change finds you?

Welcome back from the short holiday break. Don’t tell the wellness folks, but I hope you got to eat all the pumpkin pie and whipped cream you could stand!

But just in case you thought that you were going to ease your way back into the comforting routine of winter term, I thought that now would be the perfect time to tell you about a nifty change that is coming down the pike.

What if you could see your IDEA feedback summary and student comments as soon as you submitted your grades for the term? It seems to me like that would be pretty awesome.

And what if you could get real-time IDEA feedback from your students in the middle of the term so that you could adjust on the fly? It seems to me like that would be pretty cool, too.

Remember a year or two ago when I reported that IDEA was going to phase out their paper forms some time in the next several years? Well, I’ve been informed that the paper forms will breathe their last collective breath in the spring of 2018. That means that, unless we want to go out onto the market and audition all of the other players in the course feedback survey industry (from whom I get phone calls or emails at least twice a week about their “exciting fully customizable online format”) or take on the monumental task of building a homegrown course feedback system, we need to plan on moving to a paperless IDEA system in the fall of 2018.

To be honest, I may have oversold this change a bit. In reality, it’s not going to change the daily life of an instructor much. You’ll still choose your learning objectives at the beginning of the term, and the students still complete the same set of items (albeit with some improvements that actually align better with our own college outcomes) at the end of the term. In many cases, you’ll likely opt to use class time for students to enter their responses on their phones, tablets, or laptops instead of coloring in little circles on a piece of paper. The potential problem that some students won’t have a device that can access the survey should be pretty simple to solve (one possibility would be to borrow a neighbor’s phone or laptop).

Now I suspect that some of you have questions about how this is going to work. And we wouldn’t be doing our jobs if we didn’t wonder about all of the ways that the paperless process could go horribly awry. But after quizzing the IDEA consultant, it seems as if they’ve found a myriad of ways to avoid the obstacles we most often worry about (e.g., low response rates and satisficing).

Nonetheless, we will host plenty of opportunities to answer questions about the details of this change – both in person and online. So what do we do when change finds us? Put your arms out wide and embrace the possibilities!

Seriously, what else are you going to do?

Make it a good day,

Mark

 

A Short Post for a Short Week!

Hello Everyone,

Since this week is a short one, and last week I rambled more than I should have, this week I’m going to keep it very short.

One way to assess the effectiveness of an office on campus is to look at how pervasively its services are recommended by others across campus.

We ask this very question of seniors regarding the ubiquity of recommendations they receive to use the services that CORE provides. If the frequency of recommendations is going up . . . that’s probably good. If the frequency of recommendations is going down . . . that’s probably not so good.

Here is the trend over the past four years of the proportion of seniors indicating that no one recommended CORE (or the CEC in 2013 and 2014) to them –

  • 2013  –  18.0%
  • 2014  –  18.9%
  • 2015  –  14.4%
  • 2016  –    7.1%

Looks like CORE is doing something right.

Make it a good day . . . and have a nice couple of days off.

Mark

The First Year Experience: A treasure trove and a quick peek behind the curtain

And they’re off!

It’s always a good thing when you can wear short sleeves, slacks, and sandals on the first day of winter term!  (I know nobody uses the word “slacks” any more, but I couldn’t resist the alliteration.)

Welcome back and good luck with the start of your winter term.  It was a pretty quiet break with nothing really going on . . . oh yeah, except for that.

Nonetheless, the IR office has finally finished putting together our two big reports from last year’s (2015-2016) Senior Survey data and First Year Survey(s) data.  These reports are both linked on the IR web page, so please help yourself to an overflowing spoonful of mean scores, standard deviations, frequency distributions, and bar graphs! (I know! So exciting.)  It’s a veritable smorgasbord of quantitative delectables.

In particular, if you scroll past the first nine pages of the 15-16 First Year Survey(s) Report, you’ll find a table that our student worker Katrina Friedrich created highlighting the statistically significant predictor variables of seven different intended outcomes of the first year:

  1. I feel a strong sense of belonging on campus.
  2. Over the past academic year, I have developed a better sense of who I am and where I want my life to go.
  3. If you could relive your college decisions, would you choose Augustana again?
  4. During the year I got better at balancing my academic with my out-of-class activities.
  5. I am certain that my choice of major(s) is a good fit for who I am right now.
  6. How often did you push yourself to work harder on an assignment even though the extra effort wouldn’t necessarily improve your grade?
  7. I found myself thinking about what I am learning in my classes even when I’m not in class or studying.

Although I’m sure we will spend more time this year digging into the various findings highlighted in this table, this post wouldn’t be complete without at least one guided exploration into one of the predictor variables that just keeps popping up.  So today I thought we’d kick it up a notch “Bam!” by exploring the backstory of one pesky predictor variable for first year students:

  • “Reflecting on the past year, I can think of specific experiences or conversations that helped me clarify my life/career goals.” (response options ranged from strongly disagree to strongly agree)

This item turned out to be the only variable that significantly predicted all seven of the outcome variables. (For all of you who gobbled up the nerd salad a long time ago, each of our regression equations included controls for race, gender, socioeconomic status, and pre-college academic preparation.) So the next question seems pretty important: What specific experience(s) might statistically predict students’ ability to recall specific experiences or conversations that helped them clarify life/career goals?

Just like the analyses that Katrina conducted to produce the initial table of results, to run a reasonably legitimate test I built a regression equation that took into account race, gender, socioeconomic status, and pre-college academic preparation. The basic reason to include these variables at the outset is to allow us to say with confidence that our findings apply to all students regardless of differences that might exist across those four demographic characteristics.

Then I added nine variables that might be relevant considering what other researchers have found about what might help a student find some clarity of purpose. The items I chose to add to this analysis are:

  • My first year adviser asked about my career goals and post-graduate aspirations.
  • My first year adviser connected me with other campus resources to help me thrive in college.
  • My first year adviser recommended specific on-campus activities to help me make the most of my college career.
  • My first year adviser pushed me to think about choosing courses as more than just checking boxes.
  • My out-of-class experiences involved me in community service off-campus.
  • My out-of-class experiences helped me connect what I learned in the classroom with real-life events.
  • How often did your instructors ask you to apply your learning to address societal problems or issues?
  • My one-on-one interactions with faculty have had a positive influence on my intellectual growth and interest in ideas.
  • Symposium Day activities influenced the way that I now think about real world issues.
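For the statistically curious, the kind of controlled regression described above can be sketched in a few lines. This is purely an illustration with synthetic data; every variable name, coefficient, and sample size here is a hypothetical stand-in, not our actual survey data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the demographic controls (race, gender,
# socioeconomic status, pre-college preparation) and two experience items.
female = rng.integers(0, 2, n)
student_of_color = rng.integers(0, 2, n)
low_income = rng.integers(0, 2, n)
act_score = rng.normal(25, 3, n)
faculty_one_on_one = rng.integers(1, 6, n)   # 1-5 agreement scale
adviser_recommended = rng.integers(1, 6, n)  # 1-5 agreement scale

# Outcome: agreement that specific experiences helped clarify life/career
# goals, built so the faculty item has a true effect of 0.3.
clarified_goals = 3 + 0.3 * faculty_one_on_one + rng.normal(0, 1, n)

# Controls and predictors sit in one design matrix, so each coefficient
# is estimated net of all the others.
X = np.column_stack([
    np.ones(n), female, student_of_color, low_income,
    act_score, faculty_one_on_one, adviser_recommended,
])
beta, *_ = np.linalg.lstsq(X, clarified_goals, rcond=None)
print(f"coefficient on faculty one-on-one item: {beta[5]:.2f}")  # near the true 0.3
```

Stacking the demographic controls into the same equation as the predictors of interest is exactly what lets us say a finding holds “regardless of” those demographic differences.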

Would you like to venture a guess which items popped (not the technical term, I know, but I’m trying to encourage some new slang amongst my people)?

Listed from largest to smallest effect size, these three items produced statistically significant positive effects:

  • One-on-one interactions with faculty positively influenced intellectual growth and interest in ideas.
  • Out-of-class experiences helped connect classroom learning with real-life events.
  • First year adviser recommended specific on-campus activities that would help make the most of one’s college career.

Refreshingly, these findings suggest that all of us can play a potentially key role in helping our first year students clarify their life and career goals. If you interact with students in a faculty role, then look for ways to create one-on-one interactions that engage substantive questions. If you interact with students outside the classroom, look for ways to help them connect their academic learning with real world events. And if you interact with students as an adviser, then make the effort to identify and recommend specific on-campus activities that align with, and might even augment, your student’s post-graduate aspirations and college goals.

Although we can’t guarantee that a student makes the most of their college experience, we can increase the odds that they choose the behaviors and activities that will point them in the right direction. And if we keep it up long enough, we will likely be a pretty damn good choice for the students who are lucky enough to come here.

Make it a good day,

Mark

It’s Hard to Argue with this Welcome Week Data

Good Morning!

It’s week 10!  The last week of the fall term!  You can make it!

This week I’d like to send a virtual shout-out to all of the folks who run Welcome Week for our new freshmen at the beginning of the fall term. This four-day whirlwind is a logistical Cirque du Soleil of social and academic acclimation.

But in many ways, it’s really more of an orientational triage. There are certain things that the students have to know by the time classes start or they’ll tank right out of the gate. Then there are other things that we’d love them to learn but we know these things might be a bridge too far. In reality, four days isn’t a lot of time, and the students’ ability to digest information is undercut by all of the anxieties that come with knowing, “Holy crap, I start college in a few days!” So the Welcome Week design team is faced with a stark reality: be very clear about the difference between what these new students have to know and what would be nice to know. Then teach them all of the first category and as much of the second as possible – knowing that too much time spent on any of the “would be nice to know” could cut into the “have to know” and then we’ve got a potential problem.

A few years ago, I highlighted the ways that the Welcome Week team has used some simple assessment design principles to improve the quality of the experience. But in that post, we only had anecdotal data to suggest that some good things were happening as a result. Now that we have a couple years of quantitative data, the evidence is pretty clear: Welcome Week has gotten even better at doing exactly what it is supposed to do.

A few weeks after the beginning of the fall term, we ask freshmen to complete a short online survey to find out their perception of Welcome Week. Specifically, we want to know the degree to which they think they learned the things we tried to teach them. I’d like to highlight four items that represent things that we think students have to know. Below each item is the average response score on a 1-5 scale (1=strongly disagree and 5=strongly agree) from each of the last four years. Notice the steady improvement.

My Welcome Week experience . . .

. . . helped me learn exactly how to get to the location of my classes.

  • 2013 – 3.55
  • 2014 – 3.79
  • 2015 – 4.18
  • 2016 – 4.21

. . . helped me find places on campus where I can study most effectively.

  • 2013 – 3.59
  • 2014 – 3.63
  • 2015 – 3.82
  • 2016 – 4.00

. . . taught me specific ways to make the best use of my time during the school day.

  • 2013 – 3.23
  • 2014 – 3.39
  • 2015 – 3.40
  • 2016 – 3.68

. . . emphasized the importance of finding places on campus where I can take time for myself.

  • 2013 – 3.51
  • 2014 – 3.60
  • 2015 – 3.69
  • 2016 – 3.84

As you can see, the Welcome Week team deserves some well-earned praise. They’ve stuck to the overarching design and philosophy of the program and used evidence to inform change. They have redesigned several parts of the experience, revised the way that they train peer mentors, and tackled some difficult logistical challenges to ensure that our new students are more likely to be as ready as possible for the first day of classes. Equally difficult (and probably even more impressive), they’ve stopped doing a number of things, no matter how strongly they believed in the potential of those activities, in order to concentrate more precisely on making the most of every minute of those four days.

Late last week I was playing with our freshly-collected freshman data from the end of the first term to see if we could see any lasting effects of the Welcome Week experience. As you might expect, the impact of Welcome Week tends to fade as subsequent fall term experiences become more influential in driving student success. However, one particularly gratifying finding popped when I tested whether any of the Welcome Week survey items might predict our students’ response to an item in the end of the first term survey, “Welcome Week provided the start I needed to succeed academically at Augustana.” Even though the data collected from the Welcome Week survey was gathered during the second week of the term and the end of the first term data was collected during weeks seven and eight, the item “My Welcome Week experience taught me specific ways to make the best use of my time during the school day,” proved to be a statistically significant positive predictor of our freshmen’s perception of the preparatory effectiveness of Welcome Week. Impressively, this is also one of the learning goals where the Welcome Week team seems to have made substantial strides in preparing our new students to succeed.

So congratulations to everyone involved in putting together and pulling off Welcome Week!  I hope you’ll take a moment to send a kudos to anyone you know, even yourself, who contributed to a great Welcome Week way back at the beginning of the term.

Make it a good day,

Mark

Men, Social Responsibility, Volunteering, and Some Troubling Data

Last week I shared the first round of findings from our study of the 2012 cohort’s intercultural competence development during their college career. One finding that jumped out was the disappointing difference in change between men and women. While women’s scores improved on both the cognitive and the behavioral scales, the men’s scores only improved on the cognitive scale. In addition, the women’s improvement on the cognitive scale was notably larger than the men’s, and the degree of women’s improvement on the behavioral scale almost doubled the advantage they started with over men four years earlier.

At the Board of Trustees meetings last week, I provided our annual Academic Quality Markers for the 2016 cohort to the Academic Affairs Committee. It’s pretty apparent that there is something troubling going on with male participation and engagement. Male participation in study abroad, service learning, and volunteering is significantly lower than women’s. This pattern continues in three student experience items that address our efforts to cultivate citizenship. Moreover, the other comparisons by race/ethnicity and socio-economic status don’t contain such repeated disparities between groups. The only other significant difference occurs where one would expect: white students report less encouragement to interact across difference compared to students of color. Given the substantially higher proportion of white students on campus, it would certainly take relatively less “encouragement” for students of color to find themselves interacting across difference.

I’m sure that the explanation for these differences between men and women are complex. However, we might have found something that could enlighten an effort to better educate our male students within the Global Perspectives Inventory (GPI) data that I shared last week and referenced above. One set of questions within this survey, the Social Responsibility Scale, is composed of statements that focus on the degree to which the respondent engages in the public sphere to affect change. As an example, two of the statements (to which the respondent indicates a level of agreement or disagreement) are: “I work for the rights of others,” and “I consciously behave in terms of making a difference.”

It might not surprise you to find out that male and female Augustana students from the 2012 cohort entered with different average scores, different enough that the gap would be considered marginally statistically significant.

  • Female: 3.76
  • Male: 3.62

But what surprised me was that over the course of four years, only the women had grown on this scale. Male students had on average remained unmoved.

  • Females: 2012 – 3.76, 2016 – 3.88
  • Males: 2012 – 3.62, 2016 – 3.59
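For anyone who wants to see the machinery behind a claim like “only the women grew,” here is a minimal sketch of the kind of two-sample comparison involved. The numbers are synthetic (the change-score means are loosely patterned on the figures above, while the spread and group sizes are invented), so treat this as an illustration of the method rather than a reproduction of our analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic senior-minus-freshman change scores on the 1-5 scale,
# roughly echoing the reported pattern (+0.12 for women, -0.03 for men).
women_change = rng.normal(0.12, 0.5, 300)
men_change = rng.normal(-0.03, 0.5, 200)

def welch_t(a, b):
    """Welch's t statistic for a two-sample comparison with unequal variances."""
    va = a.var(ddof=1) / len(a)
    vb = b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t = welch_t(women_change, men_change)
print(f"difference in mean change: {women_change.mean() - men_change.mean():.2f}")
print(f"Welch t statistic: {t:.2f}")
```

A large positive t here would say the women’s change outpaced the men’s by more than sampling noise alone would explain.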

Maybe this lack of male growth in prioritizing social responsibility partially explains the difference between men and women in volunteering and service learning participation. Maybe it partially explains the male deficit in getting something substantive out of Symposium Day. And maybe it partially explains the relatively lower sense among men that Augustana encouraged them to interact across difference.

If our goal, as our mission statement seems to suggest, is to graduate individuals who engage in both leadership and service, it appears that we may need to revisit the ways that we develop a service orientation among our male students.

Hmm . . . if only there were a major reconfiguration of the Augustana educational experience that would allow us to try something new based on these findings . . .

Make it a good day,

Mark

Does Augustana students’ intercultural competence improve during college?

In the fall of 2011 we set in motion a college-wide assessment plan where we would collect learning outcome data from each entering cohort, link this data to the various student experience surveys these students complete at different points during their four years at Augustana, then collect the same learning outcome data just before the cohort graduates. This plan allows us to track our students’ four-year change on a specific learning outcome and identify connections between student experiences and variations in the direction and degree of that change.

Obviously it would be logistically impossible (and a little stupid) to tackle all of the Augustana Learning Outcomes every year. So we decided to rotate annually through the three broad categories of learning outcomes starting with intrapersonal conviction, moving to interpersonal maturity, and finally addressing intellectual sophistication before returning to intrapersonal conviction. Because each category includes a variety of more specific outcomes, this framework allows us some flexibility in selecting outcomes that seem particularly pertinent to our students’ success while maintaining a more general pattern that keeps us tuned in to the totality of our learning goals.

The first cohort (starting in the fall of 2011 and graduating in the spring of 2015) provided data on orientations toward different types of motivation, something that undergirds the learning outcome that we have called “Wonder.” I wrote about some of our findings from that study last fall and last winter.

The freshmen who started in the fall of 2012 completed a survey called the Global Perspectives Inventory, an instrument designed to measure intercultural competence (an important aspect of the learning outcome category we call Interpersonal Maturity). In the spring of 2016 we collected the final set of data from this cohort. On September 16th, the Assessment for Improvement Committee (AIC) presented the first of three Friday Conversations (one in each term during the 16-17 academic year) intended to examine this data and explore what it might suggest. For those of you who were unable to attend the Friday Conversation on September 16th, I thought I would post the PowerPoint slides below. They give a brief description of intercultural competence, convey the nature of our students’ change on three aspects of intercultural competence as measured by the GPI, and pose some questions for us to begin thinking about what we might explore in preparation for our winter term Friday Conversation.

So click on this presentation of 4-year change at Friday Conversation 9/16/2016 and you will be able to scroll through the PowerPoint slides.

As you can see, we found that our students (at least this cohort of students) grew on two of the three elements of intercultural competence. Our students grew the most on the cognitive scale that assesses knowledge of cultures and the implications of differences between cultures. Our students also grew, albeit to a lesser degree, on the behavioral scale that attempts to capture the likelihood to enact behaviors that reflect intercultural competence. Finally, we found that our students made no statistically significant gains on the affective scale that assesses the attitudes that would motivate one to be interculturally competent.

In addition to examining the overall change, we also explored the change among several subgroups of students based on pre-college demographic characteristics. As represented by the bar graphs on several slides, this exploration discovered interesting differences in the intercultural competence growth between men and women, white students and students of color, and students whose ACT score suggested low and high academic preparation.

Reflecting on the changes that we see in our student data, the important next question becomes, Why? Why do our students grow in the way that they do?  Why do some students change differently than others? What experiences influence positive or negative changes in intercultural competence? In my mind, these are the more interesting questions to explore because they can point us toward concrete ways that we might improve the education we provide.

Of course, there are an almost infinite number of questions that we could ask of our data. Are there specific experiences from participating in distinct activities that improve intercultural competence? What about the possibility that a combination of experiences (especially in a specific sequence) might do more than any single experience? Finally, is it possible that a particular dynamic that pervades one’s college experience might transcend an individual experience or combination thereof?

Although we were able to solicit a long list of research questions to test from the folks in attendance at our first Friday Conversation, I’m sure there are many more that we have yet to consider. So please add a research question or two in the comments section below.  We will test as many as we possibly can.  And we will report back at the winter Friday Conversation and on this blog all of what we find.

So put on those hypothesizing caps, and send us your suggestions. If we can find a way to test it, we will!

Make it a good day,

Mark

Reading my way out of a sleepless weekend

Good morning, everybody!

That greeting is intentionally more peppy than I feel today.  Sometimes I have to try to con myself into a better place.  Although I’m not sure what to do with the fact that this approach actually works for me, today I don’t have the energy to quibble with the ends justifying the means.

The past few weeks at Augustana have been hard to watch. Getting ourselves to a deeper understanding of difference and how to communicate with each other despite those differences sometimes seems simultaneously deceptively obvious and painfully impossible. I’d love to whip out some perfect data point and, with a magician’s “abracadabra,” cast a healing rainbow of glitter across the campus. But reality is never presto-change-o with a dollop of whipped cream.

So instead, I’m going to share links to a couple of articles that gave me just a little bit of an uplift this morning.  I think there is something in both pieces that is worth pondering. Each of these articles were in the Chronicle of Higher Education – the first one just today and the second one earlier in the summer.

A Gorilla-Masked Student’s Attempt to Provoke is Met with Peace

Talking Over the Racial Divide

Sometimes you gotta “make it a good day,”

Mark

Retention is up. Great. But can we take any credit for this?

A couple of weeks ago I shared with everyone the eye-popping news of our most recent 1st-2nd year retention rate. The CliffsNotes/SparkNotes/Jiffynotes version (whatever happened to Reader’s Digest?) of that post is:

  1. 88.9% of the 2015 class came back this fall,
  2. this is the highest retention rate we’ve ever recorded (and more than a full point higher than the previous high set in 2010), and
  3. maybe we shouldn’t second-guess goals that seem at first to be too high.

Since we’ve made a concerted investment of people and resources toward improving our 1st-2nd year retention rate, this is nice to see. But since our efforts have been targeted to improve retention rates among several specific populations of students (i.e., students who have historically persisted at lower rates than the overall population), it makes sense to look and see whether those specific efforts are bearing fruit. After all, money doesn’t grow on trees at Augustana (although we’d be stinkin’ rich if it did!); it actually matters a lot whether or not our investments are paying off. So let’s dig a little deeper and examine the last four years of retention rates among four groups of students long known to leave Augustana at higher rates than the rest of the first-year class: students of color, low income students, less academically prepared students, and first generation students.

While it would be a mistake to think that these groups are somehow completely independent of one another (i.e., there are certainly individual students who fit into more than one of these categories), it is true that research on the factors that influence the decision to withdraw from college has found differences among these four groups. Students of color often feel relegated to the margins of a college community – especially when that community is mostly white. Low income students often find that financial constraints undermine their ability to access the college experience offered to, and touted by, mainstream students. Students who are less academically prepared often find themselves overwhelmed and without the resources that might help them adjust to the academic rigors of college. And first generation students often struggle with a lack of confidence in their ability to succeed in college and a lack of knowledge about navigating the unwritten rules and norms that the rest of us unconsciously perpetuate every day.

With these findings in mind, we have developed specific programs to address each of these issues for students who fit into these groups. So . . . are these programs working?

Below I’ve listed the retention rates for each student subpopulation over the last four years. Remember, the overall retention rates for each of the last four years are:

  • 2013 – 84.9%
  • 2014 – 82.9%
  • 2015 – 86.1%
  • 2016 – 88.9%

Over the same period, these are the retention rates for:

Students of Color

  • 2013 – 81.3%
  • 2014 – 78.4%
  • 2015 – 82.2%
  • 2016 – 86.1%

Low Income Students

  • 2013 – 81.3%
  • 2014 – 80.8%
  • 2015 – 83.4%
  • 2016 – 86.6%

Less Academically Prepared Students

  • 2013 – 75.0%
  • 2014 – 78.6%
  • 2015 – 77.4%
  • 2016 – 83.9%

First Generation Students

  • 2013 – N/A (we didn’t create an easy way to track these students until 2013)
  • 2014 – 80.8%
  • 2015 – 80.5%
  • 2016 – 85.3%

Clearly, the retention rates of students of color, low income students, less academically prepared students, and first generation students have improved. And although the nerdy PhD in me would like to see a few more years of retention data before announcing that we have a definitive trend, at the very least we can say that our investments of money, positions, and space into these programs are not not working. Frankly, I think it’s far more reasonable to suggest that our efforts seem to be working quite well.

Lest you forget what this means in terms of real money, the difference in net revenue between last year’s retention rate of 86.1% and this year’s retention rate of 88.9% isn’t chump change. If we conservatively assume that:

  1. term-to-term attrition rates don’t change (which in reality are almost sure to go down if the overall year-to-year retention rate goes up), and
  2. actual revenue per first year student will not be less than the five-year low of $14,251 (2014/15),

the estimated increased net tuition revenue to the college this year ends up at just over $270,000. Moreover, the estimated increased net comprehensive fee revenue (i.e., including housing and student fees in addition to tuition) – again using the five-year low in actual numbers – ends up closer to $432,800.
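If you’d like to see the shape of that back-of-the-envelope estimate, here it is in a few lines. The retention rates and the $14,251 per-student floor come straight from this post; the cohort size of 680 is my hypothetical round number, chosen only so the arithmetic lands near the figure quoted above:

```python
# Estimated added net tuition revenue from a retention-rate increase.
# Assumption: cohort_size = 680 is a hypothetical stand-in; the rates
# and the $14,251 five-year-low net tuition figure are from the post.
cohort_size = 680
net_tuition_per_student = 14_251
rate_last_year = 0.861
rate_this_year = 0.889

extra_students_retained = cohort_size * (rate_this_year - rate_last_year)
extra_revenue = extra_students_retained * net_tuition_per_student
print(f"~{extra_students_retained:.0f} additional returning students")
print(f"~${extra_revenue:,.0f} in additional net tuition revenue")
```

Swapping the net comprehensive fee per student in for net tuition is what pushes the estimate toward the larger $432,800-style figure.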

By the way, if you want to test my math you can find all of the numbers I referenced above on the college dashboard that is always posted on the Augustana Institutional Research web page, along with the detailed breakdown of retention rates.

Triangulating these data with all of the anecdotal evidence I’ve seen over the past year, I’m gonna go out on a pretty thick limb and say that I believe that what we have been doing is working. So the next time you see someone whom you think might be involved in this work (I think you can figure that part out on your own), thank them for all of the effort they have put into helping our students succeed. And if you hear someone grouse about additional resources being invested when things are tight all over, it might be worthwhile to remind them that sometimes those investments are worth it.

Not to mention that whole “educational mission” thing.

Make it a good day,

Mark

What if you could get your students to daydream about learning?

Rumor has it that in the late afternoon, after the students have all retreated to upper campus, you might catch a glimpse of a lone professor strolling under the leafy canopy, daydreaming of students who ponder their learning just for the fun of it. Although this might be an ever-so-slight exaggeration (it’s not THAT leafy), this vision of liberal arts nirvana isn’t just a fool’s paradise. When testing the effect of the first-year survey item, “I find myself thinking about what I’m learning in my classes even when I’m not in class or studying,” we regularly find that students who strongly agree with this statement also earn better grades (no matter their incoming ACT scores), say that they would definitely choose to come to Augustana again, and strongly agree that they can think of specific experiences that helped them clarify their life or career goals.

It appears that students who think about their learning when they don’t have to aren’t just a professor’s dream come true; this behavior is one indicator of a very successful student. Of course, I can already hear you blurting out the obvious, only semi-rhetorical, albeit entirely reasonable, next question.

“But we don’t have any control over that trait, do we?”

I can understand why you might ask that question, especially in that way. Sometimes it feels like all we do is implore students to embrace learning and truly engage the stuff we are trying to teach them. And sadly, all too often it can feel like those passionate pleas just bounce off the classroom’s back wall, reminding us of our inadequacies as the slap-back echo of our own voice hits us in the face.

But if there were some things that you could do, whether you are working with students in the classroom or outside the classroom, that might actually turn students into more intellectually curious, contemplative thinkers, would you do them? Sign me up!

We’ve just finished analyzing last year’s first-year student data and it looks like two items that we’ve recently introduced to the survey might point us toward some ways that could increase the degree to which students think about what they learn in class when it isn’t required. The first item that we found to be predictive of students’ thinking about their learning when they don’t have to asks students the degree to which they agree or disagree with this statement:

“My instructors recommended specific experiences outside of class (such as a lecture, forum, public meeting, demonstration, or other event) on campus or in the community that would complement or enhance my learning in class.”

Even after accounting for students’ sex, race, incoming ACT score, and socioeconomic status, as students reported these kinds of recommendations coming from their instructors more frequently, they also reported that they found themselves thinking about the things they learned in class even when they weren’t in class or studying.

In addition, we found a similar relationship between students’ thinking about learning and the degree to which they agreed with this statement:

“Symposium Day activities influenced the way that I now think about real world issues.”

It strikes me that these two items fit together perfectly. On Tuesday (that would be tomorrow!), we hold our first Symposium Day of the year. In addition to four fantastic featured speakers, a variety of faculty, staff, and students will give thought-provoking presentations that tackle one or more aspects of the deliberately broad theme for the day, “Crossroads.” Some crossroads are physical, some are ideological, and some are about values and standing up for a set of principles even when it might not be the most popular thing to do. No matter the angle you take, every one of us faces these sorts of choices every day. If we’re paying attention, these moments can bring powerful meaning into our lives.

So if you want your students to be more likely to think about what they are learning when they don’t have to, take advantage of the upcoming Symposium Day and encourage them to soak up the atmosphere and the opportunity to choose what they want to learn. Maybe find a few sessions that sound particularly intriguing or controversial and suggest that your students practice hearing out an idea that they might not initially agree with.

Who knows? By the end of tomorrow that rumored incident of meandering thinkers might include a healthy dose of students, too.

Make it a good day,

Mark

Not Much to Say . . . Except, “Wow!”

Although President Bahls announced it at last week’s faculty meeting, it’s possible that the news about our latest first-to-second year retention rates hasn’t quite made it out to everyone who reads this blog. So just in case you haven’t heard, let me share with you a little number that still has me shaking my head a little bit.

  • 1st-2nd year retention rate of Augustana’s 2015 freshman class – 88.9%

Wow. Just, wow.

So why am I so blown away by this number?

In the fall of 2010, we recorded a retention rate of 87.8%. At the time this was the highest retention rate we’d seen in 25 years of tracking the persistence of first-year students to the second year. For almost a quarter of a century, Augustana’s retention rate had bounced around somewhere between 82 and 87 percent. So in context, 87.8% was an awfully high number and more than a few of us (particularly me) didn’t think we’d be able to do much better than that.

But three years ago, while we were in the midst of developing the Augustana 2020 strategic plan, someone asked me to estimate (AKA guess with data) what might be the best possible retention rate that Augustana could achieve given our student profile and educational resources. After crunching some numbers, I suggested that if the stars aligned we might be able to hit a retention rate of 90% in a given year. In all honesty, I wasn’t convinced that we’d ever break 88%, since I’ve never seen the stars align outside of a Disney cartoon. Even in my most optimistic moments, I certainly didn’t think we’d crack 88% until we got all of the programming described in Augustana 2020 up and running and had worked out the kinks. If you had forced me to guess before the start of the fall term what our retention rate would be this year, I would have probably said something just short of 87%.

But we blew past 87%. We blew past 88%. We almost cracked 89%. Wow.

The part that is most surprising to me is that we have just started to get all of our programs aimed at first-year student success up and running. Folks have been working extremely hard, but I don’t think anyone would say that we have all hit our stride yet.

And as if all that weren’t enough, it appears that our retention efforts might just be spilling over to our second-year students. Our retention rate for 2nd-3rd year students this fall hit 94.4% – the highest for those students since we began tracking that number six years ago.

Now it wouldn’t be right if I didn’t acknowledge that this might be an anomaly; next year we could be lamenting a retention rate that is back within our familiar range. But maybe, just maybe, we might be on to something and all of the work that so many people have been doing over the last two years is starting to pay off.

Will we actually get to 90%? I don’t know. But the next time someone asks me to give them a ceiling prediction for what the Augustana community is capable of doing, I’m going to think twice before I tell anyone what I think we can’t do.

Congratulations to everyone who has worked so hard on behalf of our students. It’s humbling to be on the same team with all of you.

Make it a good day,

Mark