Micro-Retention: Do fall-to-winter term rates tell us anything?

Trying to identify the critical factors that influence our students’ decisions to persist or withdraw is a tricky business. In addition to tracking our overall fall-to-fall retention rates for first year students (the only retention number that is widely reported), we track the fall-to-fall retention rates for each of the other cohorts (even 5th year seniors). Furthermore, we break those cohort retention rates down by a variety of demographic categories (e.g., race/ethnicity, gender, socioeconomic status, incoming academic preparation, and first-generation status).

But tracking fall-to-fall retention rates tells only part of the story. The decision to persist or withdraw isn’t a simple or momentary one, and research clearly indicates that the major decision to stay or leave is preceded by a multitude of minor decisions that combine to pull the student toward (or push the student away from) the brink of this ultimate choice. So if we want to understand this series of decisions more fully, another way to look at it is to examine term-to-term retention rates. Although this approach is still based on evidence of the ultimate choice to leave Augustana, it might allow us to better understand the factors that influence students to leave after the fall term, winter term, and spring term (since we ask students who leave why they are leaving in an exit interview), thereby giving us the opportunity to see whether the reasons students give for leaving differ across these three departure points. It is this kind of knowledge that might help us figure out what kinds of interventions to prioritize over the course of the academic year.

Below are three sets of fall-to-winter retention rates for our traditional student cohorts. Please note that each of these percentages represents the proportion of students in each cohort who were enrolled during the prior term. These rates do not represent the proportion of an entering cohort that is still enrolled at Augustana.

Cohort       2015 Fall-Winter Retention     2014 Fall-Winter Retention
1st year     96.5%                          95.9%
2nd year     98.6%                          98.3%
3rd year     97.9%                          97.1%
4th year     99.4%                          97.4%
5th year     63.3%                          42.4%

By comparison, below are the four-year average fall-to-winter retention rates:

1st year – 96.6%
2nd year – 97.9%
3rd year – 98.3%
4th year – 98.3%
5th year – 54.9%
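For anyone curious how rates like these come together, here’s a minimal sketch of the computation; the student IDs, rosters, and cohort labels below are all invented for illustration (the real figures come from the college’s enrollment records):

```python
# Sketch of the fall-to-winter computation: the rate for a cohort is the
# share of its fall enrollees who are still enrolled in winter.
# The IDs and rosters below are invented for illustration.
fall_enrolled = {
    "1st year": {"s01", "s02", "s03", "s04"},
    "2nd year": {"s05", "s06", "s07"},
}
winter_enrolled = {"s01", "s02", "s03", "s05", "s06", "s07"}

def fall_to_winter_rate(cohort):
    """Percent of the cohort's fall enrollees who returned in winter."""
    fall = fall_enrolled[cohort]
    returned = fall & winter_enrolled  # set intersection
    return 100 * len(returned) / len(fall)

print(fall_to_winter_rate("1st year"))  # 75.0 in this toy data
print(fall_to_winter_rate("2nd year"))  # 100.0
```

Note that, just as the caveat above says, each rate is conditioned on the prior term’s roster, not on the original entering cohort.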

As you can see, our fall-to-winter retention rates increased for every cohort of students. In the case of the 1st through 4th year cohorts, I’d say this is a good thing. For the 5th year students, it’s more complicated (e.g., is the fact that more of them returned for the winter term a function of their particular choice of academic programs?  Or is it a function of our inability to offer them the courses they needed in a timely manner?).

What more are we to make of these numbers? By themselves, they seem to suggest what we already know – Augustana loses more first year students in the fall term than second, third, or fourth year students. This year, for example, we lost 24 first years, 9 second years, 10 third years, and 3 fourth years. While we might be able to improve among our first year students, it appears there might not be much more we could do systematically to increase fall-to-winter retention among all but the first year students. At the same time, if we are going to hang our hat on being a college that is very good at building relationships with all students, then those 22 non-first-year students each represent an opportunity for us to improve. The important thing to note about the first-year students’ departure patterns is that the vast majority of them didn’t even complete the first term. Although in some cases there may not be much we can do, this fact emphasizes the degree to which we need to build relationships with our students right away instead of waiting for them to open up or make the first move.

As you might expect, we are in the process of further analyzing our data, especially in connection with the freshman data we collected right before winter registration (i.e., about week 7). Rest assured, you will be the first to know if we find anything!

Make it a good day,

Mark

Some old-school advice about studying turns out to still be true

Although I’d love to think that I’m some sort of innovatus maximus, when students ask me for advice I’m pretty sure that I just repeat what somebody told me when I was in college. This is particularly true when it comes to study habits. I was emphatically told to study during the day and never study in my dorm. I suppose the reason I think this advice was so good is that when I didn’t follow it, my grades tanked. But just because some bits of sage advice have been around for a long time doesn’t necessarily mean that they are still accurate or applicable to everyone. Given the wealth of changes that have impacted undergraduate lives since I was in college (i.e., the late 1980s and early 1990s), it struck me that I’d better test these study-habit assumptions to see if they still hold.

Now I know that some of you might be champing at the bit to raise the “correlation doesn’t equal causation” fallacy. Maybe I was dumber than a bag of hammers when I was in college and no amount of studying would have helped. Or maybe students who come to college with a boatload of smarts can study anywhere at any time without any consequence. In all seriousness, given the vast changes in technology and the availability of library resources online, maybe the “where” isn’t all that important anymore.

Luckily, we have exactly the data necessary to test this question. By linking first-year student data collected prior to enrollment, during the first year, and after the spring term, we can look at the relationship between pre-college academic preparation, study habits involving “where” and “when” one studies, and first-year cumulative GPA.

To account for pre-college academic preparation, we used the student’s ACT score and their Academic Habits score from the Student Readiness Survey (a score derived from each student’s self-assessment of their academic habits; things like preparing for exams early instead of cramming the night before the test). To account for studying “where” and “when” we used responses to three questions on the end of the first-year survey:

  • Of all the time you spent studying this year, about how much of it was in your dorm room? (1=none, 2=a little, 3=about half, 4=most, 5=all)
  • How often did you study – by yourself or in small groups – in the CSL (Tredway Library, 4th floor study spaces, Brew, or Dining Hall)? (1=never, 2=rarely, 3=sometimes, 4=often, 5=very often)
  • I made sure to set aside time to study during the day so that I wouldn’t have to do it all at night. (1=never, 2=rarely, 3=sometimes, 4=often, 5=very often)

And to account for cumulative first-year GPA, we used the final cumulative GPA in the college’s dataset that is constructed after all grades from the spring term have been logged.

I’ve inserted the results of the regression equation below, marking the statistically significant results with asterisks.

Variable                     Coefficient   Standard Error   Significance
ACT Score***                 .067          .011             .000
Academic Habits              .087          .090             .337
Studying in Dorm*            -.099         .044             .028
Studying in CSL              -.033         .042             .433
Studying During the Day*     .096          .038             .014
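To make the significant coefficients concrete, here’s a back-of-the-envelope sketch of what they imply for predicted GPA. The coefficients come straight from the regression table; the two-student scenario is hypothetical:

```python
# Coefficients taken from the regression table; the two-student scenario
# is hypothetical, purely to illustrate the size of the effects.
coef_dorm = -0.099  # "studying in dorm" (1=none ... 5=all)
coef_day = 0.096    # "studying during the day" (1=never ... 5=very often)

# Student A: studies in the dorm "most" of the time (4) and studies
# during the day "rarely" (2). Student B: the reverse (2 and 4).
a = coef_dorm * 4 + coef_day * 2
b = coef_dorm * 2 + coef_day * 4
gpa_gap = b - a  # predicted difference in first-year cumulative GPA
print(round(gpa_gap, 2))  # about 0.39 grade points, all else equal
```

A gap of roughly four tenths of a grade point between otherwise similar students is, for a cumulative GPA, a practically meaningful difference.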

Based on these regression results, the old-school studying advice seems to have withstood the test of time. As we would expect, pre-college academic preparation predicts first-year cumulative GPA. But even after accounting for pre-college preparation, “where” one studies (or more specifically, where one does NOT study) and “when” one studies still matters. Studying in one’s dorm room is a significant negative predictor, meaning that the more one studies in his or her dorm room the lower the first-year cumulative GPA. Conversely, studying during the day is a significant positive predictor, meaning that the more one studies during the day the higher the first-year cumulative GPA.

Interestingly, the question about studying in the CSL didn’t produce a statistically significant result. This may be the result of the question’s lack of precision. Because there is such a range of study environments in the CSL, studying in the Brew may produce a much different effect than studying on the quiet floors of the library. In the end, the effects of those differences may well cancel each other out. Moreover, this possibility might further support the notion that the problem with studying in one’s dorm room isn’t the location itself, but rather the frequency and availability of distractions from friends, neighbors, TVs, game systems, and whatever else one might have stashed away in a dorm room.

It’s always nice to find that some sage old advice still holds true. But what I find compelling about these findings is the fact that they come directly from Augustana students who were first-year students in 2014-15. With this in mind, we can confidently tell our advisees that Augustana students who study away from their dorm room and study during the day earn better grades than similar students who study at night in their dorm rooms. In my recent experience, it appears that our students tend to respond to guidance supported by data more than they respond to sage old advice from the balding, middle-aged quasi-intellectual. Oh well.

Welcome back from Thanksgiving break, everyone! I’m looking forward to enjoying the holiday season on campus with all of you.

Make it a good day,

Mark

Peer Mentorship: A good thing that we might make even better

No matter what you do at Augustana, I hope you found some time to get away and recharge over the break. Now that we are all back for the winter term, I’d like to introduce a new feature of Delicious Ambiguity that has no plan other than knowing that it will happen from time to time starting today. (I can’t undermine the title of the blog by imposing some sort of precisely organized plan, right?). There are a number of folks on campus who have conducted interesting and thought-provoking studies of our students’ experience. This work needs the chance to be highlighted and shared broadly. So without further ado, here is a post from Dr. Brian Leech from a study he conducted last year.

_________________________________________________________________________

Guest Post by Brian Leech

Our college employs a number of students who provide mentorship to their peers, especially first-year students; yet these mentors tend to get overlooked when we talk about the first-year experience. Many faculty in particular know very little about how these programs can help students adjust to college. Each program either assists a specific segment of the student population or helps the general student population in a specific way.

Here is a brief run-down of some mentoring programs available on campus:

Peer Mentors: Work with faculty to help first-year students adjust to life at Augustana.

Global Ambassadors: Help newly-arrived students from other countries with culture shock.

Multicultural Ambassadors: Help students who often have trouble connecting to the Augustana community.

ACI/Chicago Network: Small group works with students from Chicago to help with adjustment.

Community Advisors: Coordinate programming at residence halls, provide emergency assistance, and perform many mentoring activities, including referrals and informal peer counseling.

Career Ambassadors: Help students with resumes and assist with career programming.

Reading/Writing Center Tutors/Fellows: Assist students with academic reading and writing. The campus also hosts a growing number of tutors in other subjects.

Admissions Ambassadors: Provide campus tours, help visitors, host overnights, and assist with visit days. Often essentially serve as mentors before students are even enrolled.

Interviewing both the people who manage these programs and a number of the students involved left me impressed. The fact that some students devote so much of their own time to helping their peers is quite admirable. The college does typically pay them, which I’m sure is a factor, but it’s not like these are particularly easy jobs. Students performing peer-mentoring duties are on the front lines of campus inclusion. Joining a majority-white community, for instance, often comes as a shock to many incoming students. The same can be said for international students arriving in the Midwest. No matter their background, many, if not all, undergraduates struggle to adapt to increasing academic expectations. Mentors in some campus jobs, such as math tutors or writing fellows for first-year classes, can therefore be of great importance to students and the faculty who teach them.

Yet these mentoring programs can also use a boost. Below are the top three areas for improvement, as identified by the people involved.

Problem: Lack of knowledge. Many people across campus simply don’t know what mentoring programs are available to students, whether students they know are in a particular mentoring program, or what the different mentoring programs do. Therefore, students who could really benefit from this help are often not getting it in time. Better information sharing across campus can fix this problem.

Problem: Over-committed student mentors. Student mentors tend to be over-involved and sometimes don’t see their mentoring duties as a priority. We therefore need to improve students’ connection to and belief in their group’s mission. In other words, mentor positions should seem as vital to the student experience as they actually are. Certainly praise can help, but faculty, staff, and administrators can do more. Faculty, for instance, could partner with certain mentoring groups, help with on-going professional development, or assist student leaders within each group.

Problem: Training. Many of the above groups provide extensive training to mentors before the academic year begins. Once the year starts, however, little time exists for busy students to squeeze in further professional development. It is therefore worth exploring how the college can create and better support training that is accessible, useful, and compelling. Would an online module help? A workshop that involves joint faculty-staff-student training in mentorship?

As our college tries to improve students’ first-year experience, we should keep in mind the many student mentors who sometimes have as much of an effect on incoming students’ lives as faculty, staff, and administrators, if not more.

_________________________________________________________________________

Thanks, Brian. This is an excellent example of one way that we could take advantage of existing programs to more fully integrate our students’ learning experience instead of adding something new.

If you’ve conducted a study of our students that you think the campus should know about, send me an email or meet me for coffee. I’d love the chance to share your work with the rest of the Augustana community.

Make it a good day,

Mark

Differences in a sense of belonging on campus by race and sex

One critical predictor of a student’s likelihood to persist (at the same college or university) is the degree to which that student feels like he or she belongs on campus. For this reason, many surveys of college students (including our own freshman and senior surveys) ask for a response to the statement “I feel a strong sense of belonging on campus” on a 5-point scale of strongly disagree, disagree, neutral, agree, and strongly agree. After collecting this data, we converted the responses to a numerical scale ranging from 1 (strongly disagree) to 5 (strongly agree) so that we could run a variety of statistical analyses. The average score from our most recent graduating class was 3.94, which roughly translates to an “agree” response. This would seem to suggest that things overall are pretty good. But looking deeper, we found some differences that might help us focus our continuing efforts to ensure that Augustana is indeed a truly inclusive campus.
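The conversion from labels to numbers is mechanical; here’s a minimal sketch (the sample responses are invented, while the real data comes from our surveys):

```python
from statistics import mean

# Map the Likert labels to 1-5, then average. The responses here are
# invented; the real data comes from our freshman and senior surveys.
scale = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}
responses = ["agree", "strongly agree", "neutral", "agree", "strongly agree"]
score = mean(scale[r] for r in responses)
print(score)  # 4.2 for this toy sample
```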

The first stage of this analysis involves parsing the responses by race/ethnicity. In last spring’s graduating class there were enough students in three race/ethnic categories to analyze separately – White, Black, and Hispanic. Although there were also graduates who identified as Asian, Native American, and multiracial (among others), the numbers in these categories were so small that we were obliged to organize them into one group for the purposes of this analysis.

When we generated average sense of belonging scores for each of these four groups (White, Black, Hispanic, and other), a particularly important difference appeared. Here are the average scores for each of the four groups.

  • White – 4.00
  • Hispanic – 3.91
  • Black – 3.29
  • Other – 4.16

Clearly, the Black students’ average response suggests a substantial gap between their sense of belonging on campus and that of each of the other groups. Further testing determined this difference to be statistically significant, validating what faculty and staff who interact closely with our Black students often report hearing about these students’ experiences at Augustana.

Since we often find that sex also plays a role in shaping our students’ experience, we added a second layer to our analysis to see if the interaction of race/ethnicity and sex would reveal even finer-grained differences in the data. Interestingly, we did find such a difference within one specific group. Here is how this additional stage of analysis played out.

Race/Ethnic Category     Male     Female
White                    3.90     4.06
Hispanic                 3.46     4.14
Black                    3.25     3.30
Other                    4.00     4.25

In addition to male and female Black students experiencing a lesser sense of belonging on campus, Hispanic men also expressed a lower sense of belonging on campus. This difference was statistically significant when compared to either the overall average or Hispanic women.
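The two-way breakdown above amounts to grouping responses by race/ethnicity and sex before averaging. A minimal sketch of that step, with fabricated records purely for illustration:

```python
from collections import defaultdict
from statistics import mean

# Fabricated (race/ethnicity, sex, belonging score) records, just to
# show the grouping step behind the two-way table above.
records = [
    ("Hispanic", "Male", 3.0), ("Hispanic", "Male", 4.0),
    ("Hispanic", "Female", 4.0), ("Hispanic", "Female", 4.5),
    ("White", "Male", 4.0), ("White", "Female", 4.0),
]

groups = defaultdict(list)
for race, sex, score in records:
    groups[(race, sex)].append(score)

means = {group: round(mean(scores), 2) for group, scores in groups.items()}
print(means[("Hispanic", "Male")])    # 3.5 in this toy data
print(means[("Hispanic", "Female")])  # 4.25
```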

So what should we make of these findings? First, I think it’s important to be reminded that the multiple dimensions of diversity within our student body play out in tangible ways that can profoundly shape our students’ sense of belonging at Augustana College. Second, these findings further affirm that race/ethnicity and sex are still influential lenses through which students see and experience this community. No matter what we would like to hope to be true, the sense of belonging on campus among Black students and Hispanic men at Augustana appears to differ in a way that can have powerfully detrimental consequences. Third, designing ways to help students who feel less of a sense of belonging is complicated. There are very few universal quick fixes, and the ones that exist were likely put into place a long time ago. Now our work requires a recognition of nuance and the degree to which different perspectives shaped before coming to college can impact students’ lives. Finally, all of us – students and educators – play a critical role in addressing this dynamic.

Make it a good day,

Mark

We Are Improving a Key Aspect of the Academic Feedback Loop (And We Can Prove it!)

A few years ago we began to ask our freshmen about the degree to which they received academic feedback early enough in the term for them to adjust their study habits. The survey item read like this:

  • “I had access to my grades or other feedback early enough in the term to adjust my study habits or seek additional academic help.”

Students could respond by selecting:

  • strongly disagree
  • disagree
  • neutral
  • agree
  • strongly agree

One of the reasons we began to ask this question was because we wanted to gather more information on the nature and scope of the feedback our freshmen received during their first term. Based on the wealth of research on the critical impact of regular and clear feedback, we have always known that this is an important aspect of an ideal learning environment. However, we had been surprised by the number of struggling first-year students who claimed to be unaware of how poorly they were doing in their classes.

Upon reviewing our first round of data near the end of the 2013/14 academic year, we had to swallow hard: 46.0% of respondents selected disagree or strongly disagree, while only 34.1% agreed or strongly agreed with the statement.

During the 2014/15 academic year we had serious, and at times even tense, conversations about these findings. Even though it was certainly possible that some students who claimed to be unaware of their grades had simply chosen to avoid checking the grades that were clearly posted for just this purpose, these conversations led to several faculty development workshops and a lot of reconsideration of the scheduling of student assignments and the nature of the feedback provided to students. In addition, a number of conversations delved deeper into the degree to which students need to be shown how to use the feedback they receive and learn how to approach learning at Augustana differently than they may have approached learning in high school.

Over the subsequent two years, we’ve seen substantial movement on this item. In 2014/15, 36.7% of respondents selected disagree or strongly disagree (a roughly 10 percentage point decrease from the prior year) and 38.6% agreed or strongly agreed (a 4.5 percentage point increase from the prior year). Although this was encouraging to see, many instructors had already planned their course for that year by the time we had begun to discuss the findings from the prior year.

Now that faculty have had a full year to contemplate and infuse this concept into course syllabi, our 2015/16 data suggests that freshmen are experiencing a substantially improved learning environment. Examining responses from 515 freshmen (a 75.8% response rate), only 24.9% disagreed or strongly disagreed while 52.8% agreed or strongly agreed.

In two years, we’ve seen a (roughly) 20 percentage-point swing toward an improved learning environment for our students. Although there are certainly plenty of reasons to drill deeper and continue to improve the ways that we cultivate a vibrant feedback loop between instructor and student (i.e., instructor gives student feedback, student applies feedback to improve academic work, instructor sees evidence of improvement in subsequent student work, instructor gives student feedback that notes the improvement and points to further opportunities to improve, etc.), I think we deserve to take a moment and realize that we’ve just accomplished something that many colleges only dream of but rarely get to see: actual evidence of improvement in the act of educating. This data provides concrete evidence that we identified an opportunity to get better, did the work to plug that finding into our daily efforts, and produced a real and significant change for the better.
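For the record, the “swing” here is just the change in response shares, measured in percentage points. All figures below come from the survey results reported above:

```python
# Survey shares from the three years, in percent of respondents.
disagree = {"2013/14": 46.0, "2014/15": 36.7, "2015/16": 24.9}
agree = {"2013/14": 34.1, "2014/15": 38.6, "2015/16": 52.8}

drop_in_disagree = disagree["2013/14"] - disagree["2015/16"]
gain_in_agree = agree["2015/16"] - agree["2013/14"]
print(round(drop_in_disagree, 1))  # 21.1 percentage points
print(round(gain_in_agree, 1))     # 18.7 percentage points
```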

I’m really proud of us.

Make it a good day,

Mark

“Close the Gap” Passes the Test

It’s no secret that students who choose to attend Augustana College (or for that matter, any other private liberal arts college like us) make a substantial financial commitment to their undergraduate education. As numerous economic trends over the last decade have combined to squeeze most families’ financial resources, this commitment has increasingly come under pressure to produce results. In examining our own data, it has become more and more clear that this financial pressure also contributes to students’ decision to persist or withdraw after the first year. In recent years, we’ve noted the large subpopulation of departing students who leave with a respectable, if not enviable, GPA after their first year. At the same time, we’ve seen an uptick in the number of students who claim that financial issues are a significant reason for their choice to depart.

In preparation for the incoming cohort of 2014, Augustana developed a financial aid program called “Close the Gap” to help those students who appeared to need some extra financial assistance to attend Augustana. By now, many of you know of this program. Many of you contributed to it. And the story of its success has been well documented, with about 100 freshmen in the class of 2014 receiving some assistance from this endeavor.

But with all warm and fuzzy stories of philanthropy comes the stickier question. Is this program actually effective? Does it affect more than the initial decision to attend Augustana? Specifically, would it have any impact on these students’ decision to return after their first year and continue toward graduation?

This is a tough thing to test because it’s hard to find a legitimate comparison group. We didn’t (and wouldn’t) create some sort of shadow “control” group within the first year class of students who needed the money but didn’t get it. And we can’t really compare these Augustana students with similar students at other institutions because 1) we don’t have access to those institutions’ data, and 2) those students didn’t choose Augustana so their first year experience isn’t similar. In the end, the only plausible and reasonable way to test the success of this program was to identify students from prior cohorts who would have likely been offered Close the Gap funds if such a program existed and then see if the first-to-second year retention rates of these students differed from the rate of the students who actually received Close the Gap funds.

This plan gets dicey, too, because very little stays exactly the same in the world of recruitment and enrollment. Scholarship amounts change, patterns of classifying the interest level of prospective students change, and the individuals who actually do the recruiting change. Nonetheless, although this approach might not get us an exact apples-to-apples comparison, it does get us within a pickpocket’s reach of the same fruit stand. (Yeah, I made that up.)

So here are the retention rates of students who likely would have received Close the Gap funds in 2012 and 2013, compared with the students who received those funds in 2014.

  • 2012 Cohort – 77.8%
  • 2013 Cohort – 77.3%
  • 2014 Cohort – 88.2%
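As a rough gut-check on whether a jump like that could be statistical noise, here’s a two-proportion z-test sketch. Important caveat: the cohort sizes below are assumptions, not actual figures; the text only tells us that about 100 students received funds in 2014, and I’ve guessed a similar-sized pool for the comparison cohorts. Treat the result as illustrative only:

```python
from math import sqrt

# Cohort sizes here are ASSUMPTIONS, not actual enrollment figures:
# about 100 students received funds in 2014, and I've guessed a
# similar-sized pool for the two comparison cohorts combined.
n_old, p_old = 200, 0.775  # 2012+2013 comparison students, pooled rate
n_new, p_new = 100, 0.882  # 2014 Close the Gap recipients

# Standard pooled two-proportion z statistic.
pooled = (p_old * n_old + p_new * n_new) / (n_old + n_new)
se = sqrt(pooled * (1 - pooled) * (1 / n_old + 1 / n_new))
z = (p_new - p_old) / se
print(round(z, 2))  # about 2.23; |z| > 1.96 implies p < .05
```

Under these assumed group sizes the difference clears the conventional significance bar, though the caveats below still apply.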

There are some pretty good reasons to take this finding with a grain of salt. First, we have instituted a number of other campus-wide programs and support systems to assist our retention efforts. Second, when we put the Close the Gap program in place we also set in motion an increased effort to track these students, which in turn likely increased our inclination to informally support these particular students during their first year. Third, every incoming class is different, and the overall makeup of the 2014 group may well have fortified the environment most conducive to these students’ success.

Yet, even with all of these caveats in mind, an 11 percentage point swing is big. In tracking the retention rates of many different subpopulations of students (e.g., race/ethnic categories, gender, first generation status, etc.), we never see a swing that large between two years, especially if the two years prior are almost identical.

I think it’s reasonable to suggest that the Close the Gap program has improved the retention rate of students with this particular level of need, and it appears that this improvement did contribute to an increase in our overall retention rate between last year and this year. This is certainly cause for celebration. We seem to be getting better at addressing the different needs of different types of students.

Yes, we’ve got plenty more work to do. And we are diving into those challenges, too. But for today, I think it’s o.k. to smile, celebrate some success, and give a shout-out to the folks who initiated and continue to raise the funds for this program. Thanks and Congrats!

Make it a good day,

Mark

Motivated Much? Meh . . .

Intellectual curiosity is a fundamental goal of a liberal arts education. So it’s no surprise that we included it as one of Augustana’s nine learning outcomes. In our own words we chose to call this outcome “Wonder,” describing it as “a life-long engagement in intellectual growth,” and describing the students who exhibit this attribute as individuals who “take responsibility for learning.” It seems pretty clearly implied in these descriptions that we believe the graduates who exemplify intellectual curiosity would have developed a motivational orientation toward learning that is:

  • optimistic about the potential that additional learning provides,
  • continually seeking to grow and develop,
  • and intrinsically driven to pursue deeper knowledge.

As an aspirational goal, all of that sounds bright and shiny and downright wonderful. But the realities of dealing with our students’ motivations aren’t always quite so dreamy. We are often keenly aware of our students’ tendency toward external rewards such as high grades, acceptance to a prestigious grad school, or the allure of a high-paying job. Most of us have seen the blank look on a student’s face when we extol the benefits of learning just because it’s interesting and even exciting to learn. Moreover, we all understand how much more difficult it is to shift a student’s motivational tendencies when they come to college after twelve years (or more) of high-stakes testing. In short, although we each might have had some flash of brilliance about how to stoke a student’s intrinsic motivation (or maybe in some cases just get a single flame to flicker), we know less about how to reliably team up with students to build that fire and keep it burning. If that weren’t enough, we’re not even sure about the degree to which we can influence a student’s motivational orientations at all. Maybe those orientations are mostly hard-wired by earlier life experience and aren’t really malleable again until well into adulthood.

Four and a half years ago, we decided to tackle this question in more depth by studying if, and how, our students’ motivational orientations change during their college career. As a part of our rolling outcomes assessment plan (our way of utilizing each incoming cohort to study how students change on a particular aspect of our learning outcomes), the 2011 cohort took a survey instrument assessing orientations toward three different types of motivation during Welcome Week. These three orientations approximate intrinsic, extrinsic, and impersonal (i.e., when one is motivated to avoid something) motivation. You can learn more about the instrument we used here. Last spring, those same students took the same survey as a part of the senior survey, allowing us to test how their responses changed over four years. In addition, we will be able to use their responses to the senior survey questions to explore which experiences might statistically predict change on any of these three motivational orientations.

The consensus understanding of how motivational orientations change suggests that as people age, they develop a stronger orientation toward intrinsic motivation and a weaker orientation toward both extrinsic motivation and impersonal orientation. These findings seem to match up with what we know about the maturation process as well as other research findings that suggest the way that people’s values shift over time. With these prior findings in mind, we tested our freshman and senior year data, hypothesizing that our students’ orientation toward intrinsic motivation would go up and their orientations toward extrinsic and impersonal motivation would go down.

Well, we were partially right. We had complete data from 397 students and included only those cases in the analysis presented below. The range for each orientation scale is 1-5. Three asterisks (***) indicate that the change between freshman year and senior year is statistically significant (for the stats junkies, p < .001).

                                          Minimum   Maximum   Mean     Std. Deviation
Freshman year – Intrinsic Orientation       2.88      5.00    4.1243      .37228
Senior year – Intrinsic Orientation         1.00      5.00    4.0783      .51475
Freshman year – Extrinsic Orientation       1.94      4.24    3.1235      .38384
Senior year – Extrinsic Orientation ***     1.00      4.06    2.9623      .46230
Freshman year – Impersonal Orientation      1.69      4.00    2.8638      .40125
Senior year – Impersonal Orientation ***    1.29      4.12    2.7108      .50168
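For anyone curious about the mechanics behind those asterisks, a comparison like this is typically a paired-samples t-test: the same students are measured twice, so the test asks whether the average within-student change differs from zero. Here is a minimal sketch in Python. The scores below are made-up stand-ins (not the actual 397-student dataset), and the sketch stops at the t statistic rather than computing an exact p-value.

```python
import math
from statistics import mean, stdev

# Illustrative per-student extrinsic-orientation scores (scale 1-5);
# these are hypothetical stand-ins, NOT the actual survey data.
freshman = [3.2, 3.0, 3.4, 2.9, 3.1, 3.3, 3.0, 3.2]
senior   = [2.9, 2.8, 3.1, 2.7, 3.0, 3.0, 2.8, 2.9]

# Paired test: each student is measured twice, so we test whether the
# mean of the within-student differences is distinguishable from zero.
diffs = [f - s for f, s in zip(freshman, senior)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))  # df = n - 1

print(f"t({n - 1}) = {t:.2f}")
```

With real data, the t statistic and its degrees of freedom would be looked up against the t distribution (or handed to a statistics package) to get the p-value reported above.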

Our data suggests an interesting, and potentially troubling, possibility. Although orientations toward both extrinsic and impersonal motivation dropped over four years, the orientation toward intrinsic motivation did not change significantly. This isn’t what we hypothesized, nor what prior research findings would have predicted. Furthermore, the finding that our students’ orientation toward intrinsic motivation hasn’t changed doesn’t match well with our goal of developing a more robust sense of intellectual curiosity.

There are numerous ways to explain this finding away as an anomaly. Maybe our students’ relatively high scores on the intrinsic motivation scale as freshmen left them little room to score much higher. But that doesn’t seem to comport with the opinion of many faculty on campus that most students lack intrinsic motivation. Maybe the 2011 cohort was just an unusual group, and changes in other cohorts would parallel other research findings. Yet our analysis of Augustana’s Wabash National Study data from our 2008 cohort revealed an even more troubling pattern, in which markers of intrinsic motivation dropped precipitously between the freshman and senior year. Or maybe the measurement instrument we used doesn’t really capture the construct we are trying to measure. However, this instrument has been validated repeatedly by a variety of researchers as a reasonable measure of these three aspects of motivation.

Cultivating intrinsic motivation is certainly not an easy thing. But if one of our core goals as a liberal arts college is developing young people who possess a stronger orientation toward intrinsic motivation at the end of their senior year than they had at the beginning of their freshman year, then it seems to me that this finding should give us pause. In future posts I’ll share the experiences that we find statistically predict an increase in intrinsic motivational orientation. If you can think of something that we should test, by all means shoot me an email and we’ll see what happens!

Make it a good day,

Mark

Celebrate another number – Zero!

It’s been almost two years in the making, but today we submit the completed Assurance Argument to the Higher Learning Commission in order to maintain our status as an accredited institution of higher learning.  The completed file is 33,839 words long and comes with a massive file of supporting documents.  These supporting documents include ten years of financial statements, meeting minutes from all of the most prominent faculty committees, a litany of strategic planning documents, all of the handbooks and catalogs that we use, and a host of other copies of emails, memos, spreadsheets, data summaries, and who knows what else. If that weren’t enough, we also submitted over 1,300 pages of documents to show that we’ve met the long list of federal compliance standards.

So after months of writes, rewrites, edits, re-edits, a couple of start-overs, and even a re-rewrite (and way too much wordsmithing – we are academics and we can’t help ourselves!), we have declared it to be done. Sometimes it’s entirely appropriate to celebrate selfishly. And for the IR office and Academic Affairs, this is just such a day. Yahoo! Zero more days of HLC assurance arguing!

Of course, there is no way that one office or one committee could possibly put all of this together.  Many of you contributed to this project by writing parts of these documents, finding evidence of a change that Augustana made at some point in the last ten years, editing big parts or small sections of these documents, identifying further evidence that might bolster an argument or better answer a question, or just reading over various parts of the text and telling us that things looked pretty good from your point of view.

Please accept a gigantic thank you on behalf of myself and all of us who’ve been consumed with this project for the last six months.

Lastly, there is no way that this project would look anything like its final iteration if it weren’t for Kimberly Dyer. Many of you know Kimberly already and know that she is the main reason the IR office is able to pull off all of the things that we do. She spent countless hours writing, editing, researching, and scouring the backwaters of our computer drives and servers for just the right documentation to go into the Assurance Argument. She also kept an amazingly complicated record of all the evidence items that were needed, requested, found, missing, entered, and ultimately linked in the text.

Next month (October 19 and 20 to be exact), a team of external reviewers will come to Augustana to follow up with us, ask additional questions of faculty, administrators, staff, and students, and conduct their due diligence to complete the process of the HLC accreditation cycle.  You’ll see a lot more information about that visit as we get closer to it, no doubt.  But for now, I just wanted to give everyone who contributed to creating the Assurance Argument a big shout out.  And if you see Kimberly around campus this week, don’t hesitate to applaud, bow, thank, or express gratitude in whatever way you choose. She deserves it.

Oh . . . what’s that? You say you’d like to read the whole thing? Really??? I believe that the entire file will still be available on the HLC site if you want to use the login and password that we set up last spring for anyone who wanted to review it. If that doesn’t work, let me know and I’ll find a way to get you a copy of the final documents. In the meantime, revel in the fact that no one is going to ask you to write an accreditation document for a long while!

Make it a good day,

Mark

Take a Moment to Be Happy about a Number!

With all the talk of a shrinking high school student population, changing demographics within that population, and the increasing number of college students who take courses online or transfer on a whim, it’s hard not to feel like the sky is about to come crashing down on higher education institutions like ours. Although I have no idea how to gauge the “threat level” given all of the external changes that are happening simultaneously (whatever happened to our good ol’ color-coded threat barometer from Homeland Security?), if you listen hard enough you can hear the entire system creaking and groaning like an old ship in tumultuous water. So even if it’s not the beginning of a fiery apocalypse, surviving all of this isn’t a foregone conclusion, and survival is not the same as coming out no worse for wear.

Yet in the midst of all this high anxiety, it’s easy to get so caught up in the fear of the unknown that we forget to notice moments worth celebrating. A big part of navigating change is keeping a balanced frame of mind and paying attention to evidence that we might be moving in the right direction. With this in mind, today I’d like to point to one number that is worth smiling about.

86.1%.  That is the proportion of the 2014 cohort of freshmen who returned for their second year at Augustana.  For short, we call that our retention rate.

That number is worth celebrating because over the last few years we’ve been retaining somewhere between 82.9% and 85.1% of freshmen to the second year.

There are certainly several reasons to keep this party to a dull roar. Retention rates fluctuate, and even though we have instituted several good programs to help different types of students find a niche and succeed, managing the decision-making patterns of 19-year-olds is not a precise exercise. But today, it is worth noting that our retention rate of first-to-second year students is higher than it has been in three years.

That is worth letting yourself smile for a moment. It’s even worth going to someone on campus who works with first year students – LSFY instructors, 100 and 200 level course instructors, first-year advisers, financial aid administrators, learning commons administrators, librarians, residence life staff, coaches, and student life administrators (you get the idea at this point . . . there are a lot of people who influence the lives of first-year students) – and congratulating them. If you are one of the many who play a role in first-year students’ lives, take a moment to smile and be proud of your effort.

Make it a good day,

Mark

Want to Improve Our Work Culture? Own Up to Your Blind Spots

Whether you want to call it employee “climate,” “culture,” “satisfaction,” or “engagement,” I think we all know the difference between a vibrant and a corrosive working environment. A vibrant work environment can make it feel like you love every minute on the job (believe it or not, that is actually possible!). A corrosive work culture makes it feel like you can’t get out the door fast enough. Even though we’d probably all like to think otherwise, if we’re honest I suspect we can all remember experiencing both kinds of workplace vibes in our professional lives, maybe even here at Augustana.

You might remember that last spring we conducted two surveys of Augustana employees to better understand the nature of our workplace culture and employee engagement. Although those of us who are here during the summer started mulling over the wide array of findings right away, now that everyone is back on campus the Employee Engagement Taskforce (full disclosure: I’m the chair of this Taskforce) has officially begun to delve deeper into the results of those surveys. Our charge from President Bahls is two-fold. First, we need to learn about the underlying factors that produced our employees’ responses by talking with people across all of the functional areas of the college. Second, after triangulating the data from our surveys with the insights gathered from these conversations, we need to identify a set of changes (recommendations that will almost certainly vary based on local circumstances) that we can make in ourselves, our policies, or our organizational structure that will help us improve the culture in which we all work, ultimately improving our overall level of employee engagement.

Yet while the Employee Engagement Taskforce is doing its work, it seems strange to me that we might all simply “keep calm and carry on” while waiting for some edict from on high. In fact, a wealth of research on the nature of organizations has found that it is the collective “we” that plays the dominant role in shaping employee culture, not the amorphous “they” (no matter how badly I’d love to blame someone else for my annoyances du jour). So if there were something that I could do right away, I wouldn’t want to wait to read it in a report.

It turns out that an influential predictor of a healthy work environment that repeatedly pops up in our own analyses is something that we could all plug into our work right away. Consistently, how often we thought that our co-workers tried to understand the perspectives of others on campus predicted higher perceptions of transparency and trust. In turn, higher perceptions of transparency and trust predicted workplace satisfaction. Even more specifically, while perceptions of the degree to which co-workers tried to understand the perspectives of others on campus mattered regardless of the role of the co-worker, this effect was most pronounced when respondents perceived that administrators exhibited this trait.

Both findings are important. First, all of us can make our campus a better place by purposefully trying to understand issues from the perspectives of others. This doesn’t mean that you have to change your mind about something or acquiesce to someone else’s wishes. It just means that it needs to be apparent to others that you’ve recognized a measure of legitimacy in their perspective. Second, if you are someone in an administrative role, the impact of adopting this behavior is potentially transformative. With the added capital that comes from a position of authority, the choice to genuinely show others that you want to understand their perspective – even if you ultimately choose to take a different course – goes a long way toward cultivating an environment that increases employee engagement across the board.

But in order to adopt this behavior, we all have to own up to our own blind spots. We’ve all got them, even if we aren’t so good at admitting it. In my case, I’ve got more than a few potential blind spots. For example, I can be overly (bordering on naively) optimistic. In addition, although I know something about student learning, I don’t have nearly the direct classroom experience of a seasoned faculty member. In fact, because I don’t interact with students nearly as much as most of you do, I am susceptible to confusing what I see in our quantitative data with the true breadth of our student population. Finally, I don’t know what it’s like to work anywhere at Augustana other than Academic Affairs. So even though I’ve held plenty of other jobs in my life, it would be easy for me to assume that I know more than I do about the working lives of our non-academic employees.

These are just a few of my blind spots. My perpetual challenge is to make sure that I own up to them and seek to understand how the world looks through the lens of others before starting to dream up possible solutions. One of the early exercises of the Employee Engagement Taskforce was to collectively own up to each of our potential blind spots and to realize that others on the committee can help shine a light for each other. Furthermore, to a person we recognized that we will have to go outside of our group often if we are to fully understand the nature of the employee experience at Augustana and to identify the right set of recommendations to improve our work culture and employee engagement.

What are your blind spots? If you can own up to them, you are that much closer to making Augustana a better place to work. If we can all do that together, hmmm . . . .

Uh, oh – I think my optimism might be kicking into overdrive!

Make it a good day,

Mark