What do we know about successful first-year Augustana students?

Good morning! I hope you took some time over the three-day weekend to relax and refuel. Before you know it, we’ll be watching the leaves turn, wondering where the warm weather went, and counting down the days until the end of fall term.

Although it seems like school has just started, some of our first-year students already feel like they might be in deeper water than they can handle. And even though you can tell them that the trimester current moves faster than the gentle drift of semesters, it doesn’t get real until the first wave hits them in the face. So week three is the perfect time to hammer home the behaviors that make first-year Augustana students successful.

Now that we have more than five years of data from first-year students that track their behaviors, experiences, and growth, we can start to make some pretty confident assertions about what successful students do. Based on repeated analyses that identify statistically significant relationships between specific student behaviors and outcomes like GPA, a sense of fitting in, and an increased sense of direction and purpose, successful first-year students do these three things.

  • Successful first-year students build a healthy network of friends, guides, mentors, and resources.

This doesn’t mean that successful students have a larger network of friends, guides, and mentors than less successful students. The key factor is the healthy nature of that network. This means that a successful student’s friend network brings out the best in each person and stretches every member of that network to make their community a better place. Likewise, successful students find at least one guide or mentor who both believes in them and challenges them to grow, mature, and think in more complex terms. Finally, successful students seek out the campus resources that they might need before they actually need them, and use them to get better instead of waiting until trouble bubbles up.

  • Successful first-year students dedicate themselves to studying smart.

Successful Augustana students have dedicated themselves to four rules that define the way they study. Data from Augustana students repeatedly indicates that these behaviors impact everyone regardless of their pre-college academic preparation or ability.

  1. Religiously use a planner. It’s not just for keeping track of what needs to get done – it’s for organizing and logging when to do each thing on that list.
  2. Study during the day. Just like an 8 to 5 job, get up early and make every minute of the day count – especially the time between classes. The impact of this behavior on stress, sleep, and the quality of academic work turns out to be sort of amazing.
  3. Don’t study in the dorm room. Even though first-year students might be used to studying in their rooms when they were in high school, the residence hall environment is pretty different from home in terms of visitor frequency, noise, and potential distractions. As with studying mostly at night instead of during the day, studying in one’s dorm room invites a level of inefficiency that often makes studying take longer and be less effective.
  4. Build a like-minded study group. Sometimes it is necessary to study alone, but other times it’s much more beneficial to study with a group. Successful students find like-minded students (not unlike the characteristics of a healthy network of friends) to study with when a group session might be particularly helpful.

If you want your students or your advisees to make the most of their first term at Augustana, tell them to grab hold of those four points and don’t let go.

  • Successful Augustana students take charge of their own growth.

It’s hard to get through a single day without seeing or hearing an invitation or exhortation to get involved in a student club, activity, organization, or event. And we’ve all seen the student email signature that lists membership in more groups than there is time in the day. But the most successful Augustana students aren’t the ones who are involved in a lot of stuff. Instead, the most successful students are the ones who focus on experiences that specifically impact their growth in learning more about themselves and learning more about how they can better relate to others. This bit of advice can get lost if we don’t emphasize it to our students – don’t just get involved in stuff, get involved in the right stuff.

In addition to choosing the right combination of involvement in activities, organizations, and events, successful first-year Augustana students connect with CORE right away. They recognize the importance of the relationship between the things they do right now and the person they want to be when they graduate. All the services that CORE provides help students embrace and develop a sense of purpose and fuel an increasing sense of momentum in that direction. As simple as it might sound, students who start building a resume or a grad school portfolio during their first year are more likely to have a job or a graduate school placement at graduation – regardless of their college GPA. This isn’t magic or assembly-line educating – it’s just that these students start considering and articulating the connection between what they are doing now and where they want to be four years from now.

So if you want to drop some knowledge on your students that is virtually guaranteed to make a difference, hit them with these three golden nuggets.

Make it a good day,

Mark

 

Three highlights from the 2016 Student Readiness Survey Results

As most of you know by now, we developed the Student Readiness Survey a few years ago to give us more nuanced information about key traits and dispositions that impact the nature of our students’ transition to college. Instead of basing our conclusions about readiness for college on indicators of a student’s academic preparation or intellectual strength, we wanted to zero in on the dispositions and traits that make a student successful in every aspect of the residential college experience. The results of this instrument have become a key piece of first-year advising and have turned out to be statistically predictive of numerous important developmental and learning outcomes.

The 36 statements on the survey each describe a trait or a disposition. For each item, the respondent chooses from a response set that ranges from “never like me” to “always like me.” As an example, one item states, “I like to cooperate with others.” The response that a student selects gives us a glimpse into the way they perceive themselves regarding an important interpersonal skill that will undoubtedly shape the transition to residential college life.

As you might suspect, most of our students’ responses tend toward the kind of traits and dispositions that we’d like to see (e.g., on the cooperation item listed above, scoring “never like me” = 1 and “always like me” = 5 produces an average across all incoming students of 4.26). However, there are some dips in scores on a few items that might be telling.
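For the curious, that scoring is nothing more than recoding the Likert labels to integers and averaging. Here is a minimal Python sketch; the intermediate labels and the responses are invented for illustration (the post only names the endpoints of the response set):

```python
# Recode Likert-style labels to a 1-5 scale and average them.
# The intermediate labels and the responses below are invented;
# the survey only tells us the endpoints of the response set.
SCALE = {
    "never like me": 1,
    "rarely like me": 2,
    "sometimes like me": 3,
    "often like me": 4,
    "always like me": 5,
}

def item_mean(responses):
    """Average numeric score for one survey item."""
    scores = [SCALE[r] for r in responses]
    return sum(scores) / len(scores)

responses = ["always like me", "often like me", "always like me",
             "sometimes like me", "always like me"]
print(round(item_mean(responses), 2))  # prints 4.4
```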

There are six groups of items that are organized into categories, or as a stats geek would call them, scales. The scales attempt to capture:

  • Academic Confidence
  • Academic Habits
  • Comfort with Social Interaction
  • Interpersonal Skills
  • Persistence and Grit
  • Stress Management

Interestingly, a gap appears in the average scale scores that splits these six scales into two groups. The scores for Academic Confidence, Persistence and Grit, and Interpersonal Skills each average between 4.11 and 4.25. By contrast, Academic Habits, Stress Management, and Comfort with Social Interaction each average between 3.76 and 3.85. Even at its narrowest (i.e., 3.85 to 4.11), this gap is statistically significant, suggesting it is more than random chance. I’m not sure I have any answers – or even hypotheses – as to why this might be, but it seems to me that there might be something more fundamental going on here.
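For readers who want to see the mechanics, one quick way to check whether a gap like that exceeds chance is a two-sample (Welch) t-test on the scale scores. The scores below are simulated for illustration only; they are not the actual Student Readiness Survey data:

```python
import math
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = math.sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

# Simulated scale scores for two groups of first-year students
# (illustrative only -- not the actual survey data).
confidence = [4.2, 4.0, 4.3, 4.1, 4.4, 4.1, 4.2, 4.3]
habits = [3.8, 3.7, 3.9, 3.8, 3.6, 3.9, 3.7, 3.8]

t = welch_t(confidence, habits)
print(round(t, 2))  # a t well above ~2 suggests the gap is not chance
```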

In addition, the three individual items with the lowest overall average scores all sit in the Academic Habits category.

  • When I am confused by an assignment, I seek help right away. (3.48)
  • I highlight key points when I read assigned materials. (3.39)
  • I start homework assignments early enough to avoid having to rush to complete them. (3.38)

Each of these items tries to capture an element of academic habits that would indicate self-efficacy and the wherewithal to take assertive action in response to a challenge. These items seem to me to fit into a larger conversation about the degree to which we need to move many students from thinking that education “happens to them” to thinking that “they make their learning happen.”

In your conversations with students this week, just as they are starting to feel the first wave of readings and homework fully wash over them, it might make sense to consider the degree to which your students still need to shift from thinking that education happens to them to actively making their learning happen. Sometimes it turns out that we have to tell our students how to do what we want them to do just as much as we have to tell them what we want them to turn in. I am realizing how much I have forgotten about that difference as I am teaching an FYI 100 section for the first time.

So hang in there with your students, even when they give you that glazed look of overwhelminghood (I know it’s not a word, but you get the idea).

Make it a good day,

Mark

Something tells me this is gonna be a great year!

Good morning everyone!

Welcome to campus – no matter if you can’t remember being anywhere else in late August or if you are the first person in your family to start your fall on a college campus! No matter how you got here, how long you’ve been here, or how soon you’ll be diving into your next great adventure, I’m really glad each of you are here right now.

Somehow you’ve stumbled onto a blog called “Delicious Ambiguity” written by me, Mark Salisbury. (Ok, so I emailed you the link and you clicked on it thinking it might be important). I’m the Director of Institutional Research and Assessment at Augustana College or, as some students have taken to calling me (a supreme compliment, IMHO) the Chief Nerd. I started writing this blog in 2011 as a column in the Faculty Newsletter. The goal then was to share snippets of Augustana data with everyone and hopefully encourage each of us to take a moment to ponder the implications of that data. Most of the time, it’s been statistical data (hence the name Chief Nerd), but sometimes it’s data that comes from interviews or focus groups. No matter the source, I try to explore data points that can help all of us – faculty, staff, and students alike – maximize our experience at Augustana. In case you’re wondering, if you ever think to yourself, “Why doesn’t Mark write about that?” send me an email or comment at the bottom of a blog post. If we’ve got the relevant data, I’ll try to write about it.

With every new group of students, be they traditional freshmen or non-traditional transfers, we gather a set of data points that help us better understand the breadth and depth of the diversity contained within that group. Tracking these data points is one way to remind all of us that cultivating a diverse and vibrant community is about exponentially more than just tracking skin color or biological sex.

Today I’d like to share two data tidbits from our incoming class that seem worth pondering.

First, 31.8% of our new students indicate that neither of their parents earned a four-year college degree. Certainly a substantial proportion of these students come from families where they are the first to go to any kind of college. Equally important, this is not a new phenomenon; this proportion has stayed near 30% since we began asking this question of incoming students in 2012, so as best we can tell, Augustana has long enrolled a substantial proportion of “first generation” college students. While we can certainly parse the nuances of this student category, our reality remains that many students may not grasp the unstated but oft-assumed implications of our liberal arts college culture, both the intentions behind various policies and the behaviors that many of us enact every day without a second thought. Moreover, many of these students likely harbor an additional layer of internal anxiety about whether or not they truly “belong” in college at all, let alone at a private institution like Augustana.

Second, Augustana’s growing enthusiasm for interfaith understanding in recent years couldn’t have come at a better time. Our incoming class is peppered with students from every kind of Western and non-Western faith. We have new students who self-identify as Muslim, Buddhist, Hindu, Jewish, Mormon, Greek Orthodox, Catholic, Episcopal, Lutheran, Presbyterian, Methodist, Baptist, Pentecostal, non-denominational, and a whopping 6% of students who categorized themselves as “other.” Oh, and to top it off, 16.7% of our incoming students identify as “no religious background” or “atheist.” I don’t know if this is a one-year phenomenon or if we’ve crossed a tipping point of some sort, but this year that group of students is larger than our incoming proportion of Lutheran students (14.4%).

These two data points hold important implications for the assumptions we make about individual students. All of us probably have some growing to do as we think about the way that we interact with each student. I certainly do. I’ve already made the mistake of assuming that someone I had just met came to Augustana from another country. Based on this faulty assumption, I made a comment that I wish I could take back because it might have been interpreted to reiterate the sense that I am a part of the “natural” in-group and they are still a member of a “probationary” out-group. I owe that person an apology, one that I intend to deliver soon.

I don’t say any of that to hold myself up as some grand example, but rather to suggest that adapting to this increasingly prevalent and multifaceted diversity is a process during which we are each likely to stumble. But in stumbling, depending on how we respond to it, we might just be able to communicate more clearly that we genuinely want to make Augustana a welcoming and inclusive place for everyone – no matter where they are from, who they are, or what they want to become.

Make it a good day,

Mark

Triangulating our assessment of quantitative literacy

Whether we like it or not, the ability to convey, interpret, and evaluate data affects every part of our personal and professional lives. So it’s not a surprise to find quantitative literacy among Augustana’s nine student learning outcomes. Yet, of all those outcomes, quantitative literacy may be the most difficult to pin down. First of all, this concept is relatively new when compared to other learning outcomes like intercultural competence or critical thinking. Second, there isn’t nearly the range of measurement mechanisms – surveys or otherwise – that capture this concept effectively. And third, quantitative literacy is the kind of skill that is particularly susceptible to social desirability bias (i.e., the tendency to believe that you are better at a desirable intellectual skill than you actually are).

Despite the obstacles I noted above, the Assessment for Improvement Committee (AIC) felt like this was an outcome ripe for the assessing. First, we’ve never really measured quantitative literacy among Augustana students before (it wasn’t addressed in the Wabash National Study when we participated between 2008 and 2012). Second, it isn’t clear that we know how each student develops this skill, as we have defined it in our own college documents, beyond what a student might learn in a “Q” course required by the core curriculum. As a result, it’s entirely possible that we have established a learning outcome for all students that our required curriculum isn’t designed to achieve. Uh oh.

In all fairness, we do have one bit of data – imperfect as it is. A few years ago, we borrowed an idea from the National Survey of Student Engagement (NSSE) and inserted a question into our senior survey that asked students to respond to the statement, “I am confident in my ability to interpret numerical and statistical quantities,” giving them five response options that ranged from “strongly disagree” to “strongly agree.”

Since we began asking this question, about 75% of seniors have indicated that they “agree” or “strongly agree” with that statement. Unfortunately, our confidence in that number began to wane as we looked more closely at those responses. For that number to be credible, we would expect to see that students from majors with no quantitative focus were less confident in their quantitative abilities than students from majors that employed extensive quantitative methods. However, we found the opposite to often be the case. It turned out that students who had learned something about how complicated quantitative methods can be were less confident in their quantitative literacy skills than those students who had no exposure to such complexities, almost as if knowing more about the nuances and trade-offs that can make statistics such a maddeningly imperfect exercise had a humbling effect. In the end, it appeared that in the case of quantitative literacy, ignorance might indeed be bliss (see the Dunning-Kruger effect – another bias with a funny naming story of its own).

So last year the AIC decided to conduct a more rigorous study of our students’ quantitative literacy skills. To make this happen, we first had to build an assessment instrument that matched our definition of quantitative literacy. Kimberly Dyer, our measurement ninja, spent weeks poring over the research on quantitative literacy and the survey instruments that had already been created to find something that fit our definition of this learning outcome. Finally, she ended up combining the best of several surveys to build something that matched our conception of quantitative literacy and included questions that addressed interpreting data, understanding visual presentations of data, calculating simple equations (remember story problems from grade school?), applying findings from data, and evaluating the assumptions underlying a quantitative claim. We then solicited faculty volunteers who would be willing to take time out of their upper-level classes to give their students this survey. In the end, we were able to get surveys from about 100 students.

As you might suspect, the results of this assessment project provided a somewhat more sobering picture of our students’ quantitative literacy skills. These are the proportions of questions within each of the aforementioned quantitative literacy categories that students who had completed at least one Q course got right.

  • Interpreting data  –  41%
  • Understanding visual presentations of data  –  41%
  • Calculating simple equations  –  45%
  • Applying findings from data  –  52%
  • Evaluating assumptions underlying a quantitative claim  –  51%

Interestingly, students who had completed two Q classes didn’t fare any better. It wasn’t until students had taken three or more Q classes that the proportion of correct answers improved significantly.

  • Interpreting data  –  58%
  • Understanding visual presentations of data  –  59%
  • Calculating simple equations  –  57%
  • Applying findings from data  –  65%
  • Evaluating assumptions underlying a quantitative claim  –  59%
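As a rough illustration of how one might check that an improvement like this (e.g., 41% to 58% on interpreting data) is more than noise, here is a two-proportion z-test sketch in Python. The sample sizes are hypothetical – the post only says roughly 100 students took the survey in total:

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic for comparing two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical counts: 41 of 100 one-Q-course students vs. 58 of 100
# three-plus-Q-course students answering an item category correctly.
z = two_prop_z(x1=41, n1=100, x2=58, n2=100)
print(round(z, 2))  # prints 2.4 (beyond the usual 1.96 cutoff)
```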

There are all kinds of reasons that we should interpret these results with some caution – a relatively small sample of student participants, the difficulty of the questions in the survey, or the uneven distribution of the student participants across majors (the proportion of STEM and social science majors that took this survey was higher than the proportion of STEM and social science majors overall). But interpreting with caution doesn’t mean that we discount these results entirely. In fact, since prior research on students’ self-reporting of learning outcomes attainment indicates that students often overestimate their abilities on complex skills and dispositions, the 75% of students who agree or strongly agree is probably substantially higher than the proportion of graduates who are actually quantitatively literate. Furthermore, since the proportion of students who took this survey was skewed toward majors where quantitative literacy is a more prominent part of that major, these findings are more likely to overestimate the average student’s quantitative literacy than underestimate it. Triangulating these data with prior research suggests that our second set of findings might paint a more accurate picture of our graduates.

So how should we respond to these findings? To start, we probably ought to address the fact that there isn’t a clear pathway between what students are generally expected to learn in a “Q” course and what the college outcome spells out as our definition of quantitative literacy. That gap alone creates the condition in which we leave students’ likelihood of meeting our definition of quantitative literacy up to chance. So our first question might be to explore how we might ensure that all students get the chance to achieve this outcome; especially those students who major in disciplines that don’t normally include quantitative literacy skills.

The range of quantitative literacy, or illiteracy as the case might be, is a gnarly problem. It’s not something that we can dump onto an individual experience and expect that box to be checked. It’s hard work, but if we are serious about the learning outcomes that we’ve set for our students and ourselves, then we can’t be satisfied with leaving this outcome to chance.

Make it a good day,

Mark

A Motherlode of Data!

It’s probably a bit of a reach to claim that the Institutional Effectiveness and Mission Fulfillment report (begrudgingly called the IEMF) is the cutting edge of data reporting, but it is true that this annual report is something that a lot of people work pretty hard on for several months at the end of each academic year. Unlike the college’s dashboard – a single page of data points that is supposed to cut to the quantitative quick – the IEMF is a motherlode of data and a treasure trove of information about Augustana College.

In past years we have posted the IEMF on the Institutional Research web page and hoped that people would look at it because, you know . . . nerd click-bait! Not since the first year that we produced this report have we hosted a public gathering to invite comment from anyone who might have an observation about the data and how it is conveyed. One thing I will not soon forget from that meeting was the degree to which data becomes political as soon as it becomes public, and therefore how important it is to convey precisely and anticipate how data presentations might be interpreted from different points of view.

With that in mind, I want to share with you the 2016 version of the IEMF. It is organized into nine sections that each cover different aspects of what and how we do what we do. For example, in the section titled Persistence, Graduation, and Attrition (p. 1) you might be interested in the distribution of reasons that students give for withdrawing and how those reasons might have changed over the last three years. Or, in the section titled Our Practices (p. 20) you might be interested in the rising costs to recruit a single student over the last three years. There are a lot of tidbits throughout the document that provide a glimpse into Augustana College – areas of strength, opportunities for growth, and how we compare to similar liberal arts colleges around the country.

Click on the link below and swim in a river of data to your heart’s content.

2016_IEMF_Report

Certainly, the IEMF isn’t a perfect snapshot. Even though it has improved considerably from its first iteration several years ago, there are plenty of places where we wish our data were a little better or a little more precisely able to show who we are and what we do. Most importantly, this document isn’t intended to be a braggart’s bible. On the contrary, the IEMF is designed to be an honest presentation of Augustana College and of us. We aren’t perfect. And we know that. But we are trying to be as good as we can be with the resources we have. And in more than a few instances, we are doing pretty well.

Before I forget, a special and sincere “thank you” goes out to everyone who played a role in hunting down this data and putting the document together: Kimberly Dyer, Keri Rursch, Cindy Schroeder, Quan Vi, Erin Digney, Angie Williams, Katey Bignall, Kelly Hall, Randy Roy, Lisa Sears, Matt Walsh, Sheri Curran, Robert Scott, Jeff Thompson, Dom Sullivan, Katrina Friedrich, Bonnie Hewitt, Scott Dean, Shawn Beattie, and Kent Barnds.

So have a look. If you have any questions or critiques or suggestions, please send them to me. I’m genuinely looking for ways to improve this document.

For starters . . . anyone got any catchy ideas for a better name?

Make it a good day,

Mark

 

Even more details regarding term-to-term retention

The more we dig into our retention data, the more interesting it gets. Earlier this term, I shared with you some of our findings regarding term-to-term retention rates. These data seem to suggest that we are slowly improving our within-year retention rates.

As always, the overall numbers only tell us so much. To make the most of the data we collect, we need to dig deeper and look at within-year retention rates for subpopulations of students that have historically left at a higher rate than their peers. Interestingly, this data might also tell us something about when these students are most vulnerable to departing and, as a result, when we might increase our focus on supporting their success.

The table below presents 2014-15 within-year retention rates of the five subpopulations of students that significantly deviated from the overall term-to-term retention rates. The percentages that are more than one point below the overall number are in red.

Student Demographic Group            Fall to Winter   Winter to Spring   Fall to Spring
Overall                                   96.6%            97.6%            94.3%
Males                                     94.4%            95.9%            90.5%
Multicultural Students                    98.7%            93.9%            92.7%
Gov’t Subsidized Loan Qualifiers          94.8%            97.6%            92.5%
Non-IL/IA Residents                       96.0%            90.0%            90.0%
First-Generation Students                 95.3%            96.7%            92.3%
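Mechanically, each cell in a table like this is just the share of an earlier term’s cohort that shows up in a later term’s enrollment. A minimal Python sketch with invented student IDs:

```python
# Term-to-term retention: the share of an earlier term's cohort
# still enrolled in a later term. Student IDs are invented.
def retention_rate(earlier, later):
    """Fraction of the earlier-term cohort present in the later term."""
    return len(earlier & later) / len(earlier)

fall = {101, 102, 103, 104, 105, 106, 107, 108, 109, 110}
winter = fall - {103}          # one student leaves after fall term
spring = winter - {108}        # another leaves after winter term

print(f"{retention_rate(fall, winter):.1%}")    # 90.0%
print(f"{retention_rate(winter, spring):.1%}")  # 88.9%
print(f"{retention_rate(fall, spring):.1%}")    # 80.0%
```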

The first thing I’d like to highlight is a pair of subpopulations that aren’t on this list. Analyses of older data would no doubt highlight the lagging retention rates of students who came to Augustana with lower ACT scores or who applied test-optional (i.e., without submitting a standardized test score). However, in the 2014-15 cohort these subpopulations retained from fall to winter (96.9% and 97.9%, respectively) and from winter to spring (96.8% and 97.9%, respectively) at rates similar to the overall population. The winter-to-spring numbers are particularly encouraging because that is when first-year students can be suspended for academic performance. Although it would be premature to declare that this improvement results directly from our increased student support efforts, these numbers suggest that we may indeed be on the right track.

In looking at the table above, the highlighted demographic groups are probably not a surprise to those who are familiar with retention research. However, this table gives us a glimpse into when certain groups are more vulnerable to departure. For example, male students’ retention rates are consistently lower than the campus average. By contrast, multicultural students were retained at a higher rate from fall to winter. But from winter to spring, our early success evaporated completely. Winter term might also play a role for non-IL/IA residents, who retain at rates similar to their peers from fall to winter but from winter to spring depart at a higher rate than the rest of the cohort.

Since this is only one year of data, I wouldn’t suggest making any emphatic claims based on it. But I do think that these findings should challenge us to think more deeply about the kind of support different types of students might need and when they might benefit most from it.

Make it a good day,

Mark

 

Applying a Story Spine to Guide Assessment

As much as I love my assessment compadres, sometimes I worry that the language we use to describe the process of continual improvement sounds pretty stiff. “Closing the loop” sounds too much like teaching a four-year-old to tie their shoes. Over the years I’ve learned enough about my own social science academic nerdiness to envy those who see the world through an entirely foreign lens. So when I stumbled upon a simple framework for telling a story called a “Story Spine,” it struck me that this framework might spell out the fundamental pieces of assessment in a way that just makes much more sense.

The Story Spine idea can be found in a lot of places on the internet (e.g., Pixar and storytelling), but I found out about it through the world of improv. At its core, the idea is to help improvisers go into a scene with a shared understanding of how a story works so that, no matter what sort of craziness they discover in the course of their improvising, they know that they are all playing out the same meta-narrative.

Simply put, the Story Spine divides a story into a series of sections that each start with the following phrases. As you can tell, almost every story you might think of would fit into this framework.

Once upon a time . . .

And every day . . .

Until one day . . .

Because of that . . .

Because of that . . .

Until finally . . .

And ever since then . . .

These section prompts can also fit into four parts of a cycle that represent the transition from an existing state of balance (“once upon a time” and “every day”), encountering a disruption of the existing balance (“until one day”), through a quest for resolution (“because of that,” “because of that,” and “until finally”), and into a new state of balance (“and ever since then”).

To me, this framework sounds a lot like the assessment loop that is so often trotted out to convey how an individual or an organization engages assessment practices to improve quality. In the assessment loop, we are directed to “ask questions,” “gather evidence,” “analyze evidence,” and “use results.” But to be honest, I like the Story Spine a lot better. Aside from being pretty geeky, the assessment loop starts with a vague implication that trouble exists below the surface and without our knowledge. This might be true, but it isn’t particularly comforting. Furthermore, the assessment loop doesn’t seem to leave enough room for all of the forces that can swoop in and affect our work despite our best intentions. There is a subtle implication that educating is like some sort of assembly line that should work with scientific precision. Finally, the assessment loop usually ends with “using the results” or, at its most complex, some version of “testing the impact of something we’ve added to the mix as a result of our analysis of the evidence.” But in the real world, we are often faced with finding a way to adjust to a new normal – another way of saying that entering a new state of balance is as much a function of our own adjustment as it is the impact of our interventions.

So if you’ve ever wondered if there was a better way to convey the way that we live an ideal of continual improvement, maybe the Story Spine works better. And maybe if we were to orient ourselves toward the future by thinking of the Story Spine as a map for what we will encounter and how we ought to be ready to respond, maybe – just maybe – we will be better able to manage our way through our own stories.

Make it a good day,

Mark

Some comforting thoughts about mapping

I hope you are enjoying the bright sunshine today. Seeing that we might crack the 70-degree mark by the end of the week makes the sun that much more invigorating!

As you almost certainly know by now, we have been focusing on responding to the suggestions raised in the Higher Learning Commission accreditation report regarding programmatic assessment. The first step in that response has been to gather curricular and learning outcome maps for every major.

So far, we have 32 out of 45 major-to-college outcomes maps and 14 out of 45 courses-to-major outcomes maps. Look at it as good or look at it as bad – at least we are making progress, and we’ve still got a couple of weeks to go before I need to have collected them all. More importantly, I’ve been encouraged by the genuine effort that everyone has made to tackle this task. So thank you to everyone.

Yet as I’ve spoken with many of you, two themes have arisen repeatedly that might be worth sharing across the college and reframing just a bit.

First, many of you have expressed concern that these maps are going to be turned into sticks that are used to poke you or your department later. Second, almost everyone has worried about the inevitable gap between the ideal student’s progress through a major and the often less-ideal realities of the way that different students enter and progress through the major.

To both of those concerns, I’d like to suggest that you think of these maps as perpetual working documents instead of some sort of contract that cannot be changed. The purpose of drawing out these maps is to make the implicit explicit, but only as a starting point from which your program will constantly evolve. You’ll change things as your students change, as your instructional expertise changes, and as the future for which your program prepares students changes. In fact, probably the worst thing that could happen is a major that never changes anything, no matter what changes around it.

The goal at this point isn’t to produce an unimprovable map. Instead, the goal is to put together a map that represents your best estimate of what you and your colleagues are trying to do right now. From there, you’ll have a shared starting point that will make it a lot easier to identify and implement adjustments that will in turn produce tangible improvement.

So don’t spend too much time on your first draft. Just get something on paper (or pixels) that honestly represents what you are trying to do and send it to me using the templates I’ve already shared with everyone. Then expect that down the road you’ll decide to make a change and produce a second draft. And so on, and so on. It really is that simple.

Make it a good day,

Mark

I so wish I had written this!

Hi Folks,

Yes, I’m late with my blog this week. And I’m sorry about that. But I’ve been busy thinking about ways to organize my desk. And that’s something.

Brian Leech shared this with me yesterday, so he deserves whatever credit someone is supposed to get when they share something with someone who then “borrows” it to present to his blog audience in place of something that actually required original work. So all thanks goes to Brian for enabling my slacker gene this week.

We all need to laugh at ourselves and the absurd parts of our work sometimes. So enjoy having a “go” at the assessment culture run amok and the weird world of Institutional Research.

RUBRIC FOR THE RUBRIC CONCERNING STUDENTS’ CORE EDUCATIONAL COMPETENCY IN READING THINGS IN BOOKS AND WRITING ABOUT THEM.

From Timothy McSweeney’s Internet Tendency blog.

Make it a great day!

Mark

So how do our retention numbers look now?

Early in the winter term, I wrote about the usefulness of tracking term-to-term retention. This approach is particularly valuable in evaluating and improving our efforts with first-year students, since they are the ones most susceptible to the challenges of transitioning to college and for whom many of our retention programs are designed. Now that we have final enrollment numbers for the spring term, let’s have a look at our term-to-term retention rates over the last five years and see if our increased student success efforts might be showing up in the numbers.

Here are the last five years of fall-to-winter retention rates for the first-year cohort.

  • 2011 – 94.1%
  • 2012 – 95.6%
  • 2013 – 97.0%
  • 2014 – 95.9%
  • 2015 – 96.6%

As you can see, we’ve improved by 2.5 percentage points over the last five years. This turns out to be real money: a 2.5 percentage-point increase in the number of first-year students returning for the winter term means that we retained an additional 17 students and added roughly $84,000 in revenue (assuming we use 3-year averages for the size of the incoming class and first-year net tuition revenue per term: 675 students and $4,940, respectively).

But one of the difficult issues with retention is that success is sometimes fleeting. In other words, retaining a student for one additional term might just delay the inevitable. Furthermore, in the case of first-year term-to-term retention, the fall-to-winter rates can be deceiving because we don’t impose academic suspensions on first-year students after the fall term. Thus students who are in serious academic trouble might hang on for one more term even though there is little reason to think that they will turn things around. Likewise, students who are struggling to find a niche at Augustana may begrudgingly come back for one more term even though they are virtually sure that this place isn’t the right fit. With that in mind, looking at our fall-to-spring retention rates would give us a more meaningful first glimpse of the degree to which our retention efforts are translating into a sustained impact. If the fall-to-winter gains are nothing more than a mirage, then the fall-to-spring retention rates would remain unchanged over the same five-year period. Conversely, if our efforts are bearing real fruit, then the fall-to-spring retention rates ought to reflect a similar trend of improvement.

Here are the last five years of fall-to-spring retention rates for the first-year cohort.

  • 2011 – 92.1%
  • 2012 – 93.1%
  • 2013 – 93.3%
  • 2014 – 93.5%
  • 2015 – 94.1%

As you can see, the improving fall-to-winter retention rate largely carries through to the spring term. That translates into more real money: approximately $69,100 in additional spring-term revenue. Overall, that’s about $153,000 that we wouldn’t have seen in this year’s revenue column had we not improved our term-to-term retention rates among first-year students.
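For anyone who wants to check the math, here is a quick back-of-the-envelope sketch of the revenue calculation. This is a Python illustration of my own (the constant and function names are mine, not any official college formula); the inputs are the 3-year averages cited above.

```python
# Back-of-the-envelope check of the retention revenue figures in this post.
# Assumed inputs (the 3-year averages cited above): an incoming class of
# 675 students and $4,940 in first-year net tuition revenue per term.

COHORT_SIZE = 675            # 3-year average incoming class
NET_TUITION_PER_TERM = 4940  # 3-year average first-year net tuition per term

def added_revenue(gain_pp):
    """Given an improvement in a retention rate (in percentage points),
    return the extra students retained (rounded to a whole student)
    and the additional revenue they represent for that term."""
    extra_students = round(gain_pp / 100 * COHORT_SIZE)
    return extra_students, extra_students * NET_TUITION_PER_TERM

# Fall-to-winter: 96.6% - 94.1% = 2.5 percentage points
winter_students, winter_revenue = added_revenue(2.5)
# Fall-to-spring: 94.1% - 92.1% = 2.0 percentage points
spring_students, spring_revenue = added_revenue(2.0)

print(winter_students, winter_revenue)    # 17 students, $83,980
print(spring_students, spring_revenue)    # 14 students, $69,160
print(winter_revenue + spring_revenue)    # $153,140 in total
```

The exact results land within a couple hundred dollars of the figures quoted above, which I rounded for readability.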

Certainly this doesn’t mean that we should rest on our laurels. Even though retaining a student into the second year gets them over the biggest hump in terms of the likelihood of departure, it still seems to me like small consolation if that student doesn’t ultimately graduate from Augustana. However, especially given the financial challenges that the state of Illinois has dumped in our lap, we ought to pat each other on the back for a moment and take some credit for our work to help first-year students succeed at Augustana. The data suggest that our hard work is paying off.

Make it a good day,

Mark