I so wish I had written this!

Hi Folks,

Yes, I’m late with my blog this week. And I’m sorry about that. But I’ve been busy thinking about ways to organize my desk. And that’s something.

Brian Leech shared this with me yesterday, so he deserves whatever credit someone is supposed to get when they share something with someone who then “borrows” it to present to his blog audience in place of something that actually required original work. So all thanks goes to Brian for enabling my slacker gene this week.

We all need to laugh at ourselves and the absurd parts of our work sometimes. So enjoy having a “go” at the assessment culture run amok and the weird world of Institutional Research.

RUBRIC FOR THE RUBRIC CONCERNING STUDENTS’ CORE EDUCATIONAL COMPETENCY IN READING THINGS IN BOOKS AND WRITING ABOUT THEM.

From Timothy McSweeney’s Internet Tendency blog.

Make it a great day!

Mark

So how do our retention numbers look now?

Early in the winter term, I wrote about the usefulness of tracking term-to-term retention. This approach is particularly valuable in evaluating and improving our efforts with first-year students, since they are the ones most susceptible to the challenges of transitioning to college and for whom many of our retention programs are designed. Now that we have final enrollment numbers for the spring term, let’s have a look at our term-to-term retention rates over the last five years and see if our increased student success efforts might be showing up in the numbers.

Here are the last five years of fall-to-winter retention rates for the first-year cohort.

  • 2011 – 94.1%
  • 2012 – 95.6%
  • 2013 – 97.0%
  • 2014 – 95.9%
  • 2015 – 96.6%

As you can see, we’ve improved by 2.5 percentage points over the last five years. This turns out to be real money, since a 2.5 percentage-point increase in the proportion of first-year students returning for the winter term means that we retained an additional 17 students and added roughly $84,000 in revenue (assuming we use 3-year averages for the incoming class size and the first-year net tuition revenue per term: 675 students and $4,940, respectively).

But one of the difficult issues with retention is that success is sometimes fleeting. In other words, retaining a student for one additional term might just delay the inevitable. Furthermore, in the case of first-year term-to-term retention, the fall-to-winter retention rates can be deceiving because we don’t impose academic suspensions on first-year students after the fall term. Thus students who are in serious academic trouble might just hang on for one more term even though there is little reason to think that they might turn things around. Likewise, students who are struggling to find a niche at Augustana may begrudgingly come back for one more term even though they are virtually sure that this place isn’t the right fit. With that in mind, looking at our fall-to-spring retention rates would give us a more meaningful first glimpse of the degree to which our retention efforts are translating into a sustained impact. If the fall-to-winter retention rates are nothing more than a mirage, then the fall-to-spring retention rates would remain unchanged over the same five-year period. Conversely, if our efforts are bearing real fruit, then the fall-to-spring retention rates ought to reflect a similar trend of improvement.

Here are the last five years of fall-to-spring retention rates for the first-year cohort.

  • 2011 – 92.1%
  • 2012 – 93.1%
  • 2013 – 93.3%
  • 2014 – 93.5%
  • 2015 – 94.1%

As you can see, it appears that the improving fall-to-winter retention rate largely carries through to the spring term. That translates into more real money: approximately $69,100 in additional spring term revenue. Overall, that’s about $153,000 that we wouldn’t have seen in this year’s revenue column had we not improved our term-to-term retention rates among first-year students.
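(For anyone who wants to check my arithmetic, here’s a rough back-of-the-envelope sketch in Python. It uses the 3-year averages cited above – 675 incoming students and $4,940 in net tuition per term – along with the retention rates from the two lists. Because the actual figures were presumably computed from whole-student headcounts, this approximation lands near, though not exactly on, the dollar amounts quoted above.)

    COHORT_SIZE = 675            # 3-year average incoming class (from this post)
    NET_TUITION_PER_TERM = 4940  # 3-year average first-year net tuition revenue per term

    def added_term_revenue(old_rate: float, new_rate: float) -> float:
        """Approximate extra revenue for one term from an improved retention rate."""
        extra_students = (new_rate - old_rate) * COHORT_SIZE
        return extra_students * NET_TUITION_PER_TERM

    winter = added_term_revenue(0.941, 0.966)   # about 17 extra students, roughly $83,400
    spring = added_term_revenue(0.921, 0.941)   # about 13.5 extra students, roughly $66,700
    print(round(winter + spring))               # on the order of $150,000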

Certainly this doesn’t mean that we should rest on our laurels. Even though retaining a student to the second year gets them over the biggest hump in terms of the likelihood of departure, it still seems to me like small consolation if that student doesn’t ultimately graduate from Augustana. However, especially facing the financial challenges that the state of Illinois has dumped in our lap, we ought to pat each other on the back for a moment and take some credit for our work to help first-year students succeed at Augustana. The data suggests that our hard work is paying off.

Make it a good day,

Mark

No matter how bad you have it . . .

Hi Everyone,

I suspect some of you are still scarred from my many requests for syllabuses in preparation for the HLC visit. I won’t soon forget the enormity of that project.

Yet I am outdone once again by a news report of a syllabus collecting project that makes my little effort seem like a game of pick-up sticks.

Thanks to Cyrus Zargar for sharing this with me – even if I am a little concerned by how quickly my name came to mind!

“What a Million Syllabuses Can Teach Us”

Ugh.

Make it a good day,

Mark

 

A short post for a short week!

It’s hard not to think that Thanksgiving might need a better lobbyist. Every year the warm anticipation of the holiday seems to get overrun by Christmas decoration sales that start the day after Halloween, and then drowned out by the deafening onslaught of Black Friday advertising during the week of the actual holiday.

I’m not looking to start a movement, but over the weekend I decided to take back my own Thanksgiving. So from my little corner of the world, I want to thank all of you, my friends and family at Augustana College. Thanks for inviting the Office of Institutional Research into your conversations, valuing our observations, and utilizing our insights. Thanks for trusting us with all of your data and for making us feel like genuine contributors to the life of the college.

I hope each of you gets the chance to find a moment of peace on Thanksgiving.

Make it a good day,

Mark

A casual and incomplete FAQ for our current employee survey

Even though this is my fifth year at Augustana, the concept of Muesday still throws me for a loop. Maybe this is because I don’t have to think about it much, counting beans in my little office all day every day like I do. Conversely, most faculty I know talk about it as if it’s the most normal concept in the world, no matter if they’ve taught at Augustana for a couple of years or a couple of decades. And even though I think I’ve developed a failsafe cover to hide my ignorance (toss my head back, laugh, lean in while I bat the air in front of my face, say emphatically, “of course, what was I thinking!” while rolling my eyes), it’s an annual reminder for me that the concepts each of us take for granted aren’t always so obvious to everyone else.

I’ve been reminded of this reality again as I’ve been inviting everyone to fill out the current Augustana College Employee Survey.  More than a few people have expressed concerns about anonymity and confidentiality.  A few have even floated impressive conspiracy theories of NSA-caliber data scrubbing.  So before I have to run off to my weekly administrator neural network reprogramming and empathy reduction session, I thought that I’d try to answer the anonymity and confidentiality questions in a little more detail. (Yes, I’m kidding. The administrator neural network reprogramming and empathy reduction sessions are every OTHER week and don’t meet this week because it’s MUESDAY!)

When I promise anonymity to everyone who responds to the Augustana College Employee Survey, that means that I don’t ask for your name or other information that directly identifies you. It also means that the software doesn’t collect your Augustana user ID or the IP address of the computer that you used to complete the survey. In order to do this, I turn off a setting in the Google Forms software that would normally add this information to the dataset.

Turning this feature off also means that the survey is publicly accessible – a potential downside to be sure. So it is technically possible that some of the 340+ survey responses I’ve received didn’t actually come from Augustana employees. But that would mean that somebody somewhere else has acquired the web address of the survey and has spent their days and nights filling the survey out over and over with just enough variation in answer choices to avoid suspicion.  Yeah, I doubt it.

Some folks have pointed out that there are enough demographic questions that there might be a way to identify some respondents. This is technically true: if someone had access to both the college’s employee database and the current employee survey dataset, one could probably figure out a way to be pretty sure about the identity of some of the respondents, particularly if one were to triangulate several demographic characteristics (e.g., race and age data) to pick out subgroups of employees that have only a few members. Of course, the only person on campus who has access to both of these datasets is, well, me. If you think that this is a likely explanation for how I spend my time … I guess I sort of doubt that you are even reading this post. Nonetheless, to be clear – I’m not trying to figure out what you said in your survey. And I’m not taking that information and slipping it under someone else’s door so that they can hire henchmen to come to your office and hide your keys. It’s not that I don’t care.  I’m just too busy.

All joking aside, this survey does ask some questions that can easily be perceived as risky to answer. So, if you are concerned about anonymity but want to respond to the survey, just leave blank any demographic question that cuts too close to the quick.  That way you don’t have to worry about having your anonymity violated. I think we’d rather be able to stir your opinion into the mix even if it might not get included in more complex analyses.

Confidentiality is a little different from anonymity. There are numerous student surveys where we promise confidentiality but not anonymity. We will often ask students for their ID number so that we can merge the data they provide with prior institutional data and take a longer view of our students’ four years at Augie, looking for patterns across the entirety of their college experience. Confidentiality specifically refers to how we will share any of our survey findings. When I promise confidentiality, I am promising that I won’t share the data in any way that might link your set of responses to you. Instead, all data findings will be shared as averages of groups, whether that be the entire group of respondents or smaller subgroups of respondents.
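(For the data-minded, here is a minimal sketch of what that ID-based merge looks like, assuming pandas and made-up column names; the point is that the identifier is used only to link records and is dropped before anything gets reported.)

    import pandas as pd

    # Hypothetical examples: survey responses and prior institutional records,
    # both keyed by student ID.
    survey = pd.DataFrame({"student_id": [101, 102], "q1": [4, 5]})
    institutional = pd.DataFrame({"student_id": [101, 102], "entry_year": [2012, 2013]})

    # Link the two sources on the ID, then drop the identifier so that
    # downstream reporting works only with de-identified, linked records.
    merged = survey.merge(institutional, on="student_id", how="left")
    deidentified = merged.drop(columns=["student_id"])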

This does again raise the question that some have asked about protecting the anonymity and confidentiality of those who are members of sparsely populated subgroups. When I promise confidentiality, I have to also consider the possibility that presenting data in all of the ways that it can be sliced and diced could lead to violating someone’s confidentiality. To allay this concern, I am ensuring confidentiality by simply not sharing any results in a way that might allow folks to reasonably infer any individual’s responses. I will not share any average responses to questions where the number of respondents in that particular subgroup is fewer than five. This makes it much less likely that anyone could determine the nature of someone’s individual responses based on the average responses from any particular subgroup of respondents. So, for example, if we have fewer than five respondents in the category of employees who have worked here between six and ten years, then we won’t share any results for any question broken out by the number of years employees have worked at Augustana.
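(Again for the data-minded, here is a minimal sketch of that suppression rule, assuming pandas and hypothetical column names; one sparsely populated cell means the entire breakdown for that grouping variable gets withheld.)

    import pandas as pd

    MIN_CELL_SIZE = 5  # no reporting for a breakdown containing a subgroup of fewer than five

    def subgroup_means(responses: pd.DataFrame, group_col: str, item_col: str):
        """Mean response to one survey item by subgroup, with small-cell suppression.

        group_col and item_col are placeholder names, not the survey's actual fields.
        """
        counts = responses.groupby(group_col)[item_col].count()
        if (counts < MIN_CELL_SIZE).any():
            # One sparse cell withholds the whole breakdown (the all-or-nothing rule above).
            return None
        return responses.groupby(group_col)[item_col].mean()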

Just like the anonymity question above, if you are worried that your confidentiality will be violated, don’t provide answers to those specific questions.

Even though we have already received many responses to this survey, we still need many more: the more responses we have, the more likely it is that we can look at subgroups of responses and analyze this data without violating anonymity or confidentiality.

Getting better as an organization is hard work. At its core, it requires that we all put something into it.  Completing this survey is a big first step. I hope you’ll all give it a shot.

Make it a good day,

Mark

 

Participation: A Prerequisite for Improvement

Usually I post to my blog on Monday mornings, but I hope you’ll indulge this early post and keep it in mind as you begin your week.

Sexual assault is a problem on virtually every college campus. Yet it is only very recently that colleges and universities, no doubt pushed by public outcry and increasingly stern federal action, have begun to face the need to more fully understand and address this issue.

Within the last year, Augustana substantially revised a host of policies regarding sexual assault. But other than those cases that are reported, we don’t know nearly enough about our students’ perception of, and experiences with, sexual assault on campus.

For the last two weeks we’ve been participating in a survey of campus climate and sexual assault conducted by the Higher Education Data Sharing Consortium. This upcoming Friday, March 27th, the data collection phase of our participation in this survey will end. Although we have repeatedly invited responses from all students, as of last Thursday we had only received responses from 570 individuals. While that means we’ve heard from almost 25% of our student body (good enough in statistical terms to make some inferences based on the results), we need to hear from as many students as possible. This is in large part because the most useful information is likely to come from those who are most reticent to share their experiences, making the total number of responses all the more important.

So I am asking – no matter if you interact with students as their instructor, their mentor, their work supervisor, or even their friend – that you encourage your students to complete this survey. Please remind them that participation in this survey is a prerequisite for improvement. In other words, we can’t improve what we do as a college if we don’t know what our students experience.

I know you have plenty of things on your mind as you prepare for this week. But your comments, even if they are brief, will demonstrate the degree to which Augustana is serious about facing this issue and eradicating sexual assault from our campus. I know that eliminating sexual assault might seem like a rather high bar; I just don’t know how we could aim for anything less.

So please mention this survey to your students. They all received an email on Sunday evening inviting those who had not yet participated to respond one last time. Their unique link to the survey was in that email. They can complete it any time this week, but after Friday the survey will no longer be accepting new data.

Thank you for your help.

Make it a good day,

Mark

 

On days like these, I love trimesters!

Just think: if we were on semesters, we’d be slogging through mid-terms instead of looking forward to a week of relative calm just over the horizon!  In the spirit of the final push to get through finals, I’ll leave you to your grading of final papers and exams.

Make it a good break,

Mark

This week’s gonna need some laughs!

Since it’s finals week, since it’s snowing (AGAIN!), and since you all are going to be busy grading and shoveling for the next several days, I decided this was as good a Monday as any to share some faux news stories that will hopefully make you laugh and momentarily forget the work piling up outside and inside.

An oldie but goodie from the Onion’s vault . . .

Professor Deeply Hurt by Student’s Evaluation

An Assessment Coordinator’s Dream from the Cronk News . . .

One Learning Outcome to Rule Them All

And finally, another Onion article that cuts it a little close . . .

University Implicated in Checks-for-Degrees Scheme

Hang in there everyone!  See you in a few weeks.

Make it a good day,

Mark

The Holiday Wish List for a Measurement Geek

Sincere apologies to anyone who tried to find a new post on my blog yesterday. Apparently our server went “walk-about” over the weekend and our IT folks have been working day and night to salvage everything that was no longer operational.  I think that we are in the clear today, so I’ll try to put this post up a day late.

______________________________________________________________________

This is the week where I can’t help but overhear all the talk of the holiday gifts that people are getting for their spouses, partners, kids, friends, or in-laws.  And it struck me that there aren’t nearly enough suggestions for measurement folks who need to just own their geekdom and go big with it.  So here are a few ideas, discoveries, and possibilities.

  • Statistics ties.  Any formula, pie chart, or dumb stats pun on a tie.  Because nothing bludgeons humor to death better than a stupid stats pun.
  • The children’s book Magnus Maximus, a Marvelous Measurer.  It’s a pretty fun book with wonderful illustrations.  And it’s never too early to stereotype your profession.
  • The world’s largest slide rule.  Of course, it’s located in Texas.
  • The complete DVD set of the TV show NUMB3RS. This show managed to tease my people with the hope that someday complex math skills could really save a life. And yet, to this day I’ve never been in a public venue where someone suddenly yelled frantically, “Is there a statistician in the house!?”
  • A Digicus. They were made in the late 70s and early 80s by the electronics company Sharp. Apparently many Japanese consumers were suspicious of the digital calculator when it was first introduced, so the Digicus was created to allow people to check their calculator results against an abacus. And you thought higher ed types were skeptical of change???
  • And last but not least, anything by the band Big Data. Yes, there is a band called Big Data. They describe themselves as a “paranoid electronic music project from the internet.”  Okey dokey.

Make it a good holiday break,

Mark

Expanding our Academic Challenge Distinction beyond the First Year

Since 2011, two national studies of successful learning outcome improvement through educational assessment have highlighted our efforts at Augustana College.  First, the National Institute for Learning Outcomes Assessment (NILOA) published a report detailing the ways that a small group of uniquely successful institutions developed and maintained a positive culture of assessment and improvement.  Second, the National Survey of Student Engagement (NSSE) conducted an in-depth study of eight institutions, chosen from an original pool of 534 colleges and universities that had made significant gains on various NSSE benchmark scores, to identify some of the organizational values and practices that allow these institutions to make such clearly demonstrable improvements in their educational environments.

The data point that most clearly jumped out to both research teams involved the degree to which our first-year scores on the NSSE Academic Challenge benchmark increased between 2003 and 2009.  This benchmark scale asks a series of questions about the amount of time and effort students must put into their coursework to meet academic expectations, and it has been a staple of both NSSE and the Wabash National Study.  As many of you know, we can pin our own improved Academic Challenge scores to the overhaul of our general education and LSFY programs about seven years ago, when a preponderance of earlier data simply didn’t comport with the kind of institution we wanted to be.  And even though we continue to note, discuss, and tweak perceived weaknesses that have emerged since implementing AGES, we shouldn’t let these more recently identified concerns detract from the fact that our earlier efforts were thoroughly successful in improving the educational quality of Augustana’s first-year experience.

Yet the evidence of an improved educational environment (as represented by an increase in the academic challenge experienced by our students) did not seem to extend beyond the first year.  In our 2009 NSSE report, despite a significant difference in first-year Academic Challenge scores between Augustana and a group of 30 similar residential liberal arts colleges, our fourth-year Academic Challenge scores were no different from those of other institutions.  Many of us were troubled by the possibility that the distinction in academic quality that we might have established in the first year could have eroded entirely by the end of the fourth year.  Although Senior Inquiry was intended to help us increase our level of academic challenge in the fourth year, the 2009 NSSE report did not reflect any impact of that effort (likely because SI was not fully implemented until 2010 or 2011).  So when we received our Wabash National Study four-year summary report a few weeks ago, I specifically wanted to examine our seniors’ overall score on the Academic Challenge scale to see if we’d made any progress on this rather important measure of educational quality.

(At this point, the empathetic side of my brain/soul/elven spirit/gaseous particles has guilted me into offering a pre-emptive apology.  I am going to talk about some numbers without giving you all the detailed context behind those numbers.  If you want more context, you know where to find me.  Otherwise, try to hang in there and trust that the changes these numbers represent are substantial and worth discussing.)

The Wabash National Study evidence suggests that, once again, our efforts to respond to assessment data with changes that will improve Augustana’s educational quality seem to have borne fruit.  Between 2009 and 2012, our seniors’ Academic Challenge score jumped from 62.6 to 64.3 – a statistically significant increase.  Moreover, the difference between our mean score and the average Academic Challenge score of the 32 similar institutions that participated in the Wabash National Study (61.0) was statistically significant – suggesting that something we are doing during the fourth year distinguishes the academic quality that we provide from that of those institutions.  For my own information and confidence in this conclusion, I also looked at the 2012 NSSE annual report just to see if these Wabash Study numbers differed in any meaningful way from the much larger sample of institutions that participated in NSSE.  Again, our Academic Challenge scores placed us above the NSSE average of similar liberal arts institutions (62.5) and well above the overall NSSE average (58.4).

All of this evidence seems to point toward a familiar and heartening – if not downright exciting – conclusion.  Our efforts to improve the educational quality of an Augustana experience are working (or, as the famous line goes, “I love it when a plan comes together!” . . . yes, I just quoted Hannibal Smith from the 1980s TV show “The A-Team” in a blog about institutional research.  I’m fired up – deal with it.).  The academic challenge our students experience in their fourth year appears to have increased.  And while we don’t have comparative data on the degree to which this effort has increased our students’ learning outcome gains (because we don’t have identical pretest-posttest outcomes data from 2009), it is clear from the Wabash National Study data that our 2012 Wabash Study participants repeatedly made larger learning outcome gains than students at the 32 similar institutions participating in the same study.

Later this year we will receive the full Wabash Study dataset that will allow us to examine the responses to each individual question in this scale.  I am looking forward to digging deeper into that data.  But for the time being, I think we deserve to take a moment and congratulate ourselves as a community of educators dedicated to the success of our students.  Although we continually hear critics of higher education lament that institutions refuse to collect the kind of data necessary to meaningfully assess themselves, or that faculty perpetually resist making the kind of changes that might substantively improve an institution’s educational quality, we now have multiple sources of evidence demonstrating that, while we might not be beyond reproach, our efforts to improve the Augustana education are genuinely succeeding.

Are we there yet?  No.  Will we ever be there?  Of course not.  But are we genuinely walking the walk of an institution committed to its students and its educational mission?  Absolutely.

Make it a good day,

Mark

 

62.6 to 64.3