Ch-Ch-Ch-Ch-Changes

Most of us have heard the old light bulb joke . . . .

Question: “How many faculty does it take to change a light bulb?”

Answer: “Change?!?!”

Even if that quip sparks something between a snicker and a harrumph (depending on your point of view and sense of humor), the snark underlying it should really be applied to all of higher education. Most higher education institutions’ response to stagnant or slipping retention numbers makes for a telling example of this phenomenon. After decades of shifting student demographics, the dominant narrative about student persistence continues to emphasize the degree to which the students who leave are in some way not smart enough, not mature enough, or not adaptable enough to acclimate to the rarefied air of the college campus. In short, the prevailing opinion is that students need to change to fit in, regardless of the cultural distance between their lives prior to college and the embedded environment at their college. So although the demographic makeup of college students has been changing for a long time, most institutions have done little more than add small spaces or programs at the margins while leaving the historically homogeneous dominant culture of the campus intact.

Until the early part of the last decade, Augustana could probably have gotten away with that approach. For example, looking over the proportions of students who were not white throughout the 1990s, the percentages hovered between five and eight percent. But in the early 2000s that proportion began to climb substantially. By the fall of 2014, the proportion of non-white students had reached 20% of the total student population.

The scope of this change really jumps out if we look at two numbers across a roughly 25-year span. In the fall of 1990, there was a total of 134 non-white students at Augustana (scattered throughout an overall student population of 2,253). By comparison, in the fall of 2014 there were 146 non-white students in the freshman class alone and a total of 489 non-white students among an overall student population of 2,473.

Although this change is substantial, it is only one of several ways in which Augustana’s student demographics have shifted dramatically. In the last ten years, the number of students who qualify for a Pell Grant has almost doubled (from 355 to 606) and the proportion of freshmen with unmet financial need has jumped almost twenty percentage points (from 39.6% to 56.6%). At the same time, the proportion of Lutheran students has dropped by more than half since 1990 (from 31.7% to 12.8%) while the proportion of students with no religious affiliation has almost doubled (from 9.2% to 17.2%). Add to these changes a growing LGBT population (a number we didn’t even track until a few years ago), and the multi-dimensional scope of change in our student demographic makes previously narrow definitions of diversity – especially those that limit their focus to the color of one’s skin – surprisingly insufficient. Furthermore, the implications of this explosion of difference suggest that merely revising our assumptions, or even adding more layers of assumptions, about the backstory of our students will almost certainly leave us short. Things are changing in too many ways simultaneously for us to merely come up with a new “normal.” Even if we were to come up with a new background template for the typical Augustana student, we would almost certainly be wrong more often than we are right.

Instead, the extended scope of this change and the increased prominence of this tapestry require that we revisit an old but useful adage. We must genuinely know our students. That doesn’t mean just knowing their names, their high school, and their academic ability. We must know their backstory: the multi-layered context through which they will make meaning of this educational experience. It is the nuance of each individual context that will define the lens through which each student sees us and the way that they hear what we say. Knowing this context and knowing how this context might shape our students’ first impressions will make a world of difference in helping all of us – student, educator, and institution – adapt together to ensure that every student succeeds.

Make it a good day,

Mark

Welcome back to a smorgasbord of ambiguity!

Every summer I get lonely.  Don’t get me wrong, I love the people I work with in Academic Affairs and in Founders Hall . . . probably more than they love me sometimes.  But the campus just doesn’t feel right unless there is a certain level of manageable chaos, the ebb and flow of folks scurrying between buildings, and a little bit of nervous anticipation in the air.  Believe it or not, I genuinely missed our student who sat in the trees and sang out across the quad all last year!  Where are you, Ellis?!

For those of you who are new to Augustana, I write this column/blog every week to try to drop a little dose of positive restlessness into the campus ether.  I first read the phrase “positive restlessness” in the seminal work by George Kuh, Jillian Kinzie, John Schuh, and Liz Whitt titled Student Success in College. This 2005 book describes the common threads the authors found among 20 colleges and universities that, no matter the profile of students they served or the amount of money squirreled away in their endowment portfolio, consistently outperformed similar institutions in retention and graduation rates.

More important than anything else, the authors found that the culture on each of these campuses seemed energized by a perpetual drive to improve. No matter if it was a massive undertaking or a tiny little tweak, the faculty, staff, and students at these schools seemed almost hungry to get just a little bit better at who they were and how they did what they did every day.  This doesn’t mean that the folks on these campuses were some cultish consortium of maniacal change agents or evangelical sloganeers. But over and over it seemed that the culture at each of the schools featured in this study coalesced around a drive to do the best that they could with the resources that they had and to never let themselves rest on their laurels for too long.

What continues to strike me about this attribute is the degree to which it requires an optimistic willingness to wade into the unknown. If we were to wait until we figured out the failsafe answer to every conundrum, none of us would be where we are now and Augustana would have almost certainly gone under a long time ago.  Especially when it comes to educating, there are no perfect pedagogies or guaranteed solutions. Instead, the best we can do is continually triangulate new information with our own experience to cultivate learning conditions that are best suited for our students. In essence, we are perpetually focused on the process in order to increase the likelihood that we can most effectively influence the product.

The goal of this blog is to present little bits of information that might combine with your expertise to fuel a sense of positive restlessness on our campus.  Sometimes I point out something that we seem to be doing well.  Other times I’ll highlight something that we might improve.  Either way, I’ll try to present this information in a way that points us forward with an optimism that we can always make Augustana just a little bit better.

By a lot of different measures, we are a pretty darn good school.  And we have a healthy list of examples of ways in which we have embodied positive restlessness on this campus (if you doubt me, read the accreditation documents that we will be submitting to the Higher Learning Commission later this fall).  We certainly aren’t perfect, but frankly chasing perfection would be a fool’s errand because perfection is a static concept – and maintaining an effective learning environment across an entire college campus is by definition a perpetually evolving endeavor.

So I raise my coffee mug to all of you and to the deliciously ambiguous future that this academic year holds.  Into the unknown we stride together.

Make it a good day!

Mark

 

… and Warm Fuzzy beats Cranky Skeptic by a nose!

It’s the last week of our spring horse race. No, I’m not referring to the Preakness (although that horse race was run this weekend and the most poignant reason yet to run a spell-checker won again). I’m referring to the horse race that we all feel at the end of the year, thundering around the final turn (some great horse race calls here or Spike Jones’ epic spoof here) to finish classes, deal with students, turn in grades, and send our graduates off to the next phase of their lives – all so that we can get out to our own summer pastures.  In the midst of trying to slog through all of this end-of-the-term slop (not unlike the muddy track at the Preakness on Saturday), it’s easy to let our cranky side get the best of us.

So in honor of the end-of-the-year horse race, those wonderfully quirky horse names, and the warm fuzzy that we could all use right about now, I thought it would be a perfect time to share some data fresh from our 2015 senior survey that is worth smiling about – maybe even worth a solid pat on the back.

Here are three years’ worth of results from three senior survey questions that, if I were forced to cut the survey down to a handful of items, would be among the questions I’d keep. Read ’em and smile!

  • I felt a strong sense of belonging on campus. (response options scored from 1-5: strongly disagree, disagree, neutral, agree, or strongly agree)
    • 2013 – 72.1% agree or strongly agree
    • 2014 – 66.8% agree or strongly agree
    • 2015 – 75.4% agree or strongly agree!
  • I am certain that my post-graduate plans are a good fit for who I am right now and where I want my life to go. (response options scored from 1-5: strongly disagree, disagree, neutral, agree, or strongly agree)
    • 2013 – 75.5% agree or strongly agree
    • 2014 – 76.7% agree or strongly agree
    • 2015 – 81.2% agree or strongly agree!
  • If you could relive your college decision, would you choose Augustana again? (response options scored from 1-5: definitely no, probably no, not sure, probably yes, and definitely yes)
    • 2013 – 80.6% probably yes or definitely yes
    • 2014 – 72.3% probably yes or definitely yes
    • 2015 – 83.0% probably yes or definitely yes!

When three items are all moving in the same positive direction over time, I think we can put aside our wonky skeptical stuff for a few minutes and enjoy it.  That’s right – kick back, relax for a moment, and smile a big toothy grin.

You worked hard to make Augustana a better place for our students this year.  It just might have paid off.  Now let yourself enjoy it – you deserve it.

Make it a good day (and a great summer),

Mark

So after the first year, can we tell if CORE is making a difference?

Now that we are a little over a year into putting Augustana 2020 in motion, we’ve discovered that assessing the implementation process is deceptively difficult. The problem isn’t that the final metrics to which the plan aspires are too complicated to measure or even too lofty to achieve. Those are goals that are fairly simple to assess – we either hit our marks or we don’t. Instead, the challenge at present lies in devising an assessment framework that tracks implementation, not the end results. Although Augustana 2020 is a relatively short document, in actuality it lays out a complex, multi-layered plan that requires a series of building blocks to be constructed separately, fused together, and calibrated precisely before we can legitimately expect to meet our goals for retention and graduation rates, job acquisition and graduate school acceptance rates, or improved preparation for post-graduate success. Assessing the implementation, especially at such an early point in the process, by using the final metrics to judge our progress would be like judging a car manufacturer’s increased production speed right after the company had added a faster motor to one of the assembly lines. Of course, without having retrofitted or changed out all of the other assembly stages to adapt to this new motor, by itself such a change would inevitably turn production into a disaster.

Put simply, judging any given snapshot of our current state of implementation against the fullness of our intended final product doesn’t really help us build a better mousetrap; it just tells us what we already know (“It’s not done yet!”). During the process of implementation, the focus of assessment is much more useful if it identifies and highlights intermediate measures that give us a more exacting sense of whether we are moving in the right direction. In addition, assessing the process should tell us if the pieces we are putting in place will work together as designed or if we have to make additional adjustments to ensure the whole system works as it should. This means narrowing our focus to the impact of individual elements on specific student behaviors, testing the fit between pieces that have to work together, and tracking the staying power of experiences that are intended to permanently impact our students’ trajectories.

With all of that said, I thought that it would be fitting to try out this assessment approach on arguably the most prominent element of Augustana 2020 – CORE. Now that CORE is finishing its first year at the physical center of our campus, it seems reasonable to ask whether we have any indicators in place that could assess whether this initiative is bearing the kind of early fruit we had hoped. Obviously, since CORE is designed to function as a part of a four-year plan of student development and preparation, it would be foolhardy to judge CORE’s ultimate effectiveness on some of the Augustana 2020 metrics until at least four years have passed. However, we should look to see if there are indications that CORE’s early impact triangulates with the student behaviors or attitudes necessary for improved post-graduate success. This is the kind of data that would be immediately useful to CORE and the entire college. If indicators suggest that we are moving in the right direction, then we can move forward with greater confidence. If the indicators suggest that things aren’t working as we’d hoped, then we can make adjustments before too many other things are locked into place.

In order to find data that suggests impact, we need more than just the numbers of students who have visited CORE this year (even though it is clear that student traffic in the CORE office and at the many CORE events has been impressive). To be fair, these participation patterns could simply be an outgrowth of CORE’s new location at the center of campus (“You’ve got candy, I was just walking by, why not stop in?”). To give us a sense of CORE’s impact, we need to find data where we have comparable before-and-after numbers. At this early juncture, we can’t look at our recent graduate survey data for employment rates six months after graduation since our most recent data comes from students who graduated last spring – before CORE opened.

Yet we may have a few data points that shine some light on CORE’s impact during its first year. To be sure, these data points shouldn’t be interpreted as hard “proof.” Instead, I suggest that they are indicators of directionality and, when put in the presence of other data (be they usage numbers or the preponderance of anecdotes), we can start to lean toward some conclusions about CORE’s impact in its first year.

The first data point we can explore is a comparison of the number of seniors who have already accepted a job offer at the time they complete the senior survey. Certainly the steadily improving economy, Augustana’s existing efforts to encourage students to begin their post-graduate planning earlier, and the unique attributes of this cohort of students could also influence this particular data point. However, if we were to see a noticeable jump in this number, it would be difficult to argue that CORE should get no credit for this increase.

The second data point we could explore would be the proportion of seniors who said they were recommended to CORE or the CEC by other students and faculty. This seems a potentially indicative data point based on the assumption that neither students nor faculty would recommend CORE more often if the reputation and results of CORE’s services were no different from the reputation and results of similar services provided by the CEC in prior years. To add context, we can also look at the proportion of seniors who said that no one recommended CORE or the CEC to them.

These data points all come from the three most recent administrations of the senior survey (including this year’s edition, to which 560 of 580 eligible seniors have already responded). The 2013 and 2014 numbers are prior to the introduction of CORE, and the 2015 number is after CORE’s first year. I’ve also calculated a proportion that includes all students whose immediate plan after graduation is to work full-time in order to account for the differences in the size of the graduating cohorts.

Seniors with jobs accepted when completing the senior survey –

  • 2013 – 104 of a possible 277 (37.5%)
  • 2014 – 117 of a possible 338 (34.6%)
  • 2015 – 145 of a possible 321 (45.2%)

Proportion of seniors indicating they were recommended to CORE or the CEC by other students –

  • 2013 – 26.9%
  • 2014 – 24.0%
  • 2015 – 33.2%

Proportion of seniors indicating they were recommended to CORE or the CEC by faculty in their major or faculty outside their major, respectively –

  • 2013 – 47.0% and 18.8%
  • 2014 – 48.1% and 20.6%
  • 2015 – 54.6% and 26.0%

Proportion of seniors indicating that no one recommended CORE or the CEC to them –

  • 2013 – 18.0%
  • 2014 – 18.9%
  • 2015 – 14.4%
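
If you want to re-run the arithmetic behind the first list above, here’s a minimal sketch (Python). The counts are exactly the ones reported above; “possible” means seniors whose immediate post-graduation plan was full-time work:

```python
# Proportion of seniors who had accepted a job by the time they took
# the senior survey, among those planning to work full-time.
# Counts come straight from the post; nothing here is new data.
jobs_accepted = {
    2013: (104, 277),  # (accepted, seniors planning full-time work)
    2014: (117, 338),
    2015: (145, 321),
}

for year, (accepted, possible) in sorted(jobs_accepted.items()):
    print(f"{year}: {accepted} of {possible} = {accepted / possible:.1%}")
# 2013: 104 of 277 = 37.5%
# 2014: 117 of 338 = 34.6%
# 2015: 145 of 321 = 45.2%
```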

Taken together, these data points seem to suggest that CORE is making a positive impact on campus.  By no means do these data points imply that CORE should be ultimately judged as a success, a failure, or anything in between at this point. However, this data certainly suggests that CORE is on the right track and may well be making a real difference in the lives of our students.

If you’re not sure what CORE does or how they do it, the best (and probably only) way to get a good answer to that question is to go there yourself, talk to the folks who work there, and see for yourself.  If you’re nice to them, they might even give you some candy!

Make it a good day,

Mark

What about faculty retention?

Last week my colleague in the institutional research office, Kimberly Dyer, suggested that although we talk about student retention all the time, it’s reasonable to argue that faculty retention may also be an important metric worth tracking. Since turnover and longevity are well-documented indicators of organizational health, it certainly makes sense for us to delve into our employee data and see what we find.

From my perspective, this question also presents an opportunity to spell out the critical importance of context in making sense of any institutional data point. In the same way that we want our students to develop the ability to withhold judgment while evaluating a claim, we help ourselves in all sorts of ways by knowing how to place institutional metrics in their proper context before concluding that everything is “just peachy,” or that “the sky is falling,” or that, more realistically, we are somewhere in between those two extremes.

Although it would be interesting to look at employee retention across all the different positions that Augustana employees hold, the variation across these positions makes it pretty hard to address the implications of all those differences in a single blog post. So today I’ll focus on faculty retention primarily because, since faculty work is so closely tied to the traditional academic calendar, we can apply an already familiar framework for understanding retention (i.e., students being retained from one fall to the next) to this discussion.

Making sense of faculty retention numbers requires an understanding of two contextual dimensions. The first involves knowing something about the range of circumstances that might influence a proportion of faculty to leave their teaching positions at Augustana. Every year there are faculty who retire and faculty who move into administrative roles (just as there are individuals who give up their administrative roles to return to teaching). In addition, there are numerous term-limited visiting and fellowship positions that are designed to turn over. There are also the cases of faculty who leave because they are not awarded tenure (although, if we’re being honest with ourselves, we know that in some of these cases this decision may not be entirely because of deficiencies exhibited by the individual faculty member). Obviously, if 10% of the faculty leave in a given year it would be silly to assume that all of those individuals left because Augustana’s work environment drove them away. To make more insightful sense of a faculty retention data point, it’s critical to understand the proportion of those individuals whose departure is attributable to flaws, weaknesses, or dysfunctions in our community climate versus the subset of faculty departures that result from the normal and healthy movement of faculty within the institution (or within higher education generally) and/or within the life course.

The second contextual dimension requires some sense of what should be considered “normal.” Since it is probably not reasonable to expect an organization to have no turnover, the next question becomes: What do similar institutions experience in faculty retention and turnover?  Without this information, we are left with the real possibility that our biases, loyalties, and aspirations will coerce us into setting expectations far above what is reasonable. Comparable data helps us check our biases at the door.

So after all of that . . . what do our faculty retention numbers look like? To come up with some numbers, we first removed all of the visiting and fellowship positions from this analysis in order to avoid counting folks whom we expect to leave. Instead, we focused our analysis on tenured and tenure-track faculty.

Without accounting for any of the faculty who moved into an administrative post or faculty who retired, our retention rate of tenured and tenure-track faculty has been 91% in each of the last three years.  When you exclude retirements and internal movement, those proportions jump to 96%, 95%, and 94% respectively. In terms of actual people (with about 150 tenured/tenure-track faculty each year), this translates into about 6 people each year. This group of people would include faculty who aren’t awarded tenure as well as those who leave for any other reason.
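
To make the rate-to-headcount translation concrete, here’s a back-of-the-envelope sketch (Python) using the approximate figures from the paragraph above:

```python
# Rough translation of a retention rate into a departure headcount.
# ~150 tenured/tenure-track faculty is the approximate figure cited above.
faculty = 150
retention = 0.96  # adjusted rate, excluding retirements and internal moves

departures = faculty * (1 - retention)
print(f"{retention:.0%} retention of {faculty} faculty ≈ {departures:.0f} departures")
# 96% retention of 150 faculty ≈ 6 departures
```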

The one obstacle to fully placing these numbers in context is that we don’t have any real way of establishing comparable numbers from similar institutions. Maybe most institutions like us would give a lot of money for a 95% faculty retention rate. Or, maybe none of them have lost a single faculty member in the last ten years. All we know is that the number of Augustana tenured or tenure-track faculty departing each year is relatively small. In the end, even if we begrudgingly accept faculty retention as the roughest of proxies for the quality of our organizational climate, these numbers seem to suggest that we have maintained a reasonably healthy faculty climate at Augustana in the last few years.

Of course, in these cases there may well be entirely understandable reasons for each departure that have nothing to do with our working environment. At the same time it’s always worth asking, no matter how small the number of people who choose not to come back, if there are things we can do to improve the quality of our work environment. Certainly there are things that we can improve that might never become so influential as to drive someone to leave. With the almost-completed Augustana College Employee Engagement study, we are on our way to identifying some of those issues. But at least on one measure of organizational quality that seems a reasonable, albeit rough, metric, we might actually be doing pretty well.

Make it a good day,

Mark

 

Riding the waves of within-year retention

I was talking with the Faculty Council recently about this year’s term-to-term retention rates when one council member suggested that I should share these numbers with the campus community.  Of course, this was a very good idea – and something that I should have done several weeks ago. So, with apologies to everyone who cares about retention (AKA everyone), here we go.

In the table below, I’ve listed the fall-to-winter term and fall-to-spring term retention rates for each class as well as the four-year averages for these data points in order to give some of these numbers context.

Class       Fall-to-Winter Term Retention       Fall-to-Spring Term Retention
            4 Yr. Avg.    This Year             4 Yr. Avg.    This Year
1st Year    96.5%         95.7%                 92.9%         93.4%
2nd Year    97.9%         98.3%                 95.5%         95.4%
3rd Year    98.3%         97.1%                 97.9%         96.7%
4th Year    98.3%         97.4%                 93.6%         93.3%

There are a couple of things that jump off the page immediately when trying to take in all of these numbers at once. First, breaking retention down to this level of detail can make it pretty overwhelming. It is easy to get a little vertigo staring at all the different percentages, wondering how in the world anyone decides which ones are good or bad or somewhere in between.

Second, the numbers – as well as the differences between any particular number and its corresponding four-year average – bounce around a bit. For example, although the first year students’ fall-to-winter retention rate was slightly below the four-year average, their fall-to-spring retention rate exceeds the four-year norm. Conversely, while the second year students’ fall-to-winter retention rate was higher than the four-year average, their fall-to-spring retention rate ensures that we don’t get a big head.

Third, it’s not necessarily true that a given year’s retention rate below the four-year average is uniformly a bad thing. For example, over the last several years we’ve been watching the number of seniors who finish a term early inch upward. It seemed inevitable that this would happen at some point with the increasing number of college and AP credits that incoming students bring to Augustana. And as the cost of college has jumped, we probably shouldn’t be surprised at all if a few more students want to avoid that 12th term of tuition by graduating after the winter term. I get that fewer students = less tuition = budget reductions = more stress. But if our mission is to educate, and if a student has completed all that we have asked him or her to do, then I’m not sure we can be all that disappointed that they don’t stay for the spring term – especially since we haven’t designed the broader Augustana experience to culminate in any unique way during the spring of the senior year. This is not a criticism one way or another; rather I only point to this example to demonstrate how complicated this retention conversation can be.

In the end, making accurate sense of any particular within-year retention number requires a black belt in withholding judgment, a hefty dose of context, and a battle-tested nervous system. Retention data is sort of like the “check engine” light in your car. When it lights up it might mean that the only thing that doesn’t work is the fuse that controls the “check engine” light. Or it might mean that something serious is going wrong under the hood and you could be in big trouble if you don’t take your car to a mechanic today. Either way, you don’t panic just because the light comes on. At the same time, you don’t shrug it off. You take a deeper look at what you are doing and try to figure out if there is anything you could do better.

Make it a good day.

Mark

 

Don’t look now, but the wheels of improvement are already in motion

430 responses.  Wow.

About 75% of Augustana’s full-time employees responded to the Augustana College Employee Survey over the last three weeks. Moreover, we got a great response from each segment of Augustana employees – faculty, staff, and administrators. I have to admit, after doing almost everything I could to encourage responses short of marching around campus in a sandwich board and chicken costume, I am thrilled. I would have been genuinely happy with 350 responses.

So … Congratulations! This means that the average response to each item is almost certain to closely reflect the perception of the entire employee population. As a result, we can be confident that whatever issues emerge from this data are not mere artifacts of the numbers we happen to collect. In addition, the quality of this data set will allow us to pursue all sorts of interesting analyses of various smaller segments of our employee population, further improving the potential for this study to help us legitimately improve the environment in which we all work.
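
For the statistically curious, here’s a minimal sketch (Python) of why a 430-of-roughly-575 sample is so solid. The population size is an assumption backed out of the ~75% response rate, and this treats the respondents as a simple random sample, which real surveys never quite are:

```python
import math

# Margin of error for a proportion, with a finite population correction.
N = 575   # assumed full-time employee population (430 responses ≈ 75%)
n = 430   # survey responses received
p = 0.5   # worst-case proportion (maximizes the margin of error)
z = 1.96  # critical value for 95% confidence

fpc = math.sqrt((N - n) / (N - 1))          # finite population correction
moe = z * math.sqrt(p * (1 - p) / n) * fpc  # half-width of the 95% CI
print(f"margin of error: ±{moe:.1%}")       # about ±2.4 percentage points
```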

Of course, this also means that if your earnestly held belief about a prevailing attitude among Augustana employees is contradicted by the findings of this study, you are going to be faced with a gnarly dilemma. Either you’ll have to accept the strong likelihood that your opinion has turned out not to be so, or you’ll have to present compelling evidence that refutes these findings. I suppose you could choose to double down on your belief, facts be damned, full speed ahead. But the reality of a 75% response rate means that, like it or not, the findings from this survey are pretty solid. And just so you don’t think that I’m trying to be some sort of righteous researcher reveling in my own rectitude (that line sounds great if you roll your R’s), I’ve already had to eat crow on one issue where the data makes it pretty clear that I was dead wrong. (Yes, it tastes about like what you’d think.)

Today (Monday, April 20th, 2015) we start collecting data for the second half of our employee engagement project. Later today I’ll send out an email with an invitation to participate in the Gallup Employee Engagement Survey (otherwise known as the Q12 if you want to sound hip and “in the know” around other geeked out quant researchers). While our first survey was designed internally so that we could home in on some important questions specific to Augustana College, the second survey, the Q12, gives us some comparison data that can function as a sort of grounding point to more realistically assess ourselves. Moreover, we will be able to get data from Gallup that we can use to compare ourselves to other educational organizations, giving us an even better sense of how we might realistically improve. The Gallup Q12 Survey is built on several decades of in-depth research on employee engagement. Some of the questions might strike you as unusual at first, but know that a virtual ocean of analysis has gone into developing the questions that compose this survey.

And in order to ensure that we don’t unintentionally bias the responses to the Q12, we won’t publicly release the results of our own Augustana Employee Survey until the Gallup data has been collected.  Although the questions in the two surveys are not identical, there is enough overlap that we need to be careful. Besides, this will give me a few weeks to process all of the data and turn it into something that will be a lot easier to read. As much as my inner quant geek would love it, I suspect that you don’t want me to send you a massive Excel spreadsheet and call it good!

You will receive an email soon with a link to participate in the Q12. You’ll hear about this survey from me more than a few times in the next three weeks. Just like the first half of this project, your participation in the Q12 matters immensely.

430 responses to our first survey makes a giant statement about how much we value making Augustana a great place to work and a great community to join. It also means that this community made the collective commitment to improve – even if you did not individually complete the first survey. Whether you like it or not, the improvement train has left the station and we’re all on it.

Make it a good day,

Mark

How many responses did you get? Is that good?

As most of you know by now, the last half of the spring term sometimes feels like a downhill sprint. Except in this case you’re less concerned about how fast you’re going and more worried about whether you’ll get to the finish line without face-planting on the pavement.

Well, it’s no different in the IR Office.  At the moment, we have four large-scale surveys going at once (the recent graduate survey, the senior survey, the freshman survey, and the employee survey), we’ve just finished sending a year’s worth of reports to the Department of Education, and we’re preparing to send all of the necessary data to the arbiter of all things arbitrary, U.S. News College Rankings. That is in addition to all of the individual requests for data gathering and reporting and administrative work that we do every week.

So in the midst of all of this stuff, I wanted to thank everyone who responded to our employee survey as well as everyone who has encouraged others to participate. After last week’s post, a few of you asked how many responses we’ve received so far and how many we need. Those are good questions, but as is my tendency (some might say “my compulsion”) the answer is more complicated than you’d probably prefer.

In essence, we need as many responses as we can get, from as many different types of employees as possible. But in terms of an actual number, defining “how many responses is enough” can get pretty wonky with formulas and unfamiliar symbols. So I shoot for 60% of an overall population. That means, since Augustana has roughly 500 full-time employees, we would cross that threshold with 300 employee survey responses.

However, that magic 60% applies to any situation where we are looking at the degree to which a set of responses to a particular item can be confidently applied to the overall population. What if we want to look at responses from a certain subgroup of employees (e.g., female faculty)?  In that case, we need to have responses from 60% of the female faculty, something that isn’t necessarily a certainty just because we have 300 out of 500 total responses.
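
To see why clearing the overall threshold doesn’t guarantee subgroup coverage, here’s a minimal sketch (Python); the group names and counts are hypothetical, purely for illustration:

```python
# Check each group's response rate against the 60% rule of thumb.
# All names and counts below are made up for illustration.
THRESHOLD = 0.60

groups = {
    "all employees":  (310, 500),  # (responses, group size)
    "female faculty": (24, 45),
    "staff":          (150, 230),
}

for name, (responses, size) in groups.items():
    rate = responses / size
    status = "enough" if rate >= THRESHOLD else "not enough"
    print(f"{name}: {responses}/{size} = {rate:.0%} ({status})")
# "all employees" clears 60% while "female faculty" still falls short.
```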

This is why I am constantly hounding everyone about our surveys in order to get as many responses as possible: we don’t know all of the subgroups that we might want to analyze when we start collecting data; those possibilities arise during the analysis. And once we find out that we don’t have enough responses to dig into something that looks particularly important, we are flat out of luck.

So this week, I’m asking you to do me a favor.  Ask one person who you don’t necessarily talk to every day if they’ve taken the survey. If they haven’t, encourage them to do it. It might end up making a big difference.

Make it a good day,

Mark

A casual and incomplete FAQ for our current employee survey

Even though this is my fifth year at Augustana, the concept of Muesday still throws me for a loop. Maybe this is because I don’t have to think about it much, counting beans in my little office all day every day like I do. Conversely, most faculty I know talk about it as if it’s the most normal concept in the world, no matter if they’ve taught at Augustana for a couple of years or a couple of decades. And even though I think I’ve developed a failsafe cover to hide my ignorance (toss my head back, laugh, lean in while I bat the air in front of my face, say emphatically, “of course, what was I thinking!” while rolling my eyes), it’s an annual reminder for me that the concepts each of us take for granted aren’t always so obvious to everyone else.

I’ve been reminded of this reality again as I’ve been inviting everyone to fill out the current Augustana College Employee Survey.  More than a few people have expressed concerns about anonymity and confidentiality.  A few have even floated impressive conspiracy theories of NSA-caliber data scrubbing.  So before I have to run off to my weekly administrator neural network reprogramming and empathy reduction session, I thought that I’d try to answer the anonymity and confidentiality questions in a little more detail. (Yes, I’m kidding. The administrator neural network reprogramming and empathy reduction sessions are every OTHER week and don’t meet this week because it’s MUESDAY!)

When I promise anonymity to everyone who responds to the Augustana College Employee Survey, that means that I don’t ask for your name or other information that directly identifies you. It also means that the software doesn’t collect your Augustana user ID or the IP address of the computer that you used to complete the survey. In order to do this, I turn off a setting in the Google Forms software that would normally add this information to the dataset.

Turning this feature off also means that the survey is publicly accessible – a potential downside to be sure. So it is technically possible that some of the 340+ survey responses I’ve received aren’t actually coming from Augustana employees. But that would mean that somebody somewhere else has acquired the web address of the survey and has spent their days and nights repeatedly filling the survey out over and over with just enough variation of answer choices to avoid suspicion.  Yeah, I doubt it.

Some folks have pointed out that there are enough demographic questions that there might be a way to identify some respondents. This is technically true: if someone had access to both the college’s employee database and the current employee survey dataset, one could probably figure out a way to be pretty sure about the identity of some of the respondents, particularly if one were to triangulate several demographic characteristics (e.g., race and age data) to pick out subgroups of employees that have only a few members. Of course, the only person on campus who has access to both of these datasets is, well, me. If you think that this is a likely explanation for how I spend my time … I guess I sort of doubt that you are even reading this post. Nonetheless, to be clear – I’m not trying to figure out what you said in your survey. And I’m not taking that information and slipping it under someone else’s door so that they can hire henchmen to come to your office and hide your keys. It’s not that I don’t care.  I’m just too busy.

All joking aside, this survey does ask some questions that can easily be perceived as risky to answer. So, if you are concerned about anonymity but want to respond to the survey, just leave blank any demographic question that cuts too close to the quick.  That way you don’t have to worry about having your anonymity violated. I think we’d rather be able to stir your opinion into the mix even if it might not get included in more complex analysis.

Confidentiality is a little different from anonymity. There are numerous student surveys where we promise confidentiality but not anonymity. We often ask students for their ID number so that we can merge the data they provide with prior institutional data, letting us take a longer view of our students’ four years at Augie and look for patterns across the entirety of their college experience. Confidentiality specifically refers to how we will share any of our survey findings. When I promise confidentiality, I am promising that I won’t share the data in any way that might link your set of responses to you. Instead, all data findings will be shared as averages of groups, whether that be the entire group of respondents or small subgroups of respondents.

This does again raise the question that some have asked about protecting the anonymity and confidentiality of those who are members of sparsely populated subgroups. When I promise confidentiality, I have to also consider the possibility that presenting data in all of the ways that it can be sliced and diced could lead to violating someone’s confidentiality. To allay this concern, I am ensuring confidentiality by simply not sharing any results in a way that might allow folks to reasonably infer any individual’s responses. I will not share any average responses to questions where the number of respondents in that particular subgroup is fewer than five. This makes it much less likely that anyone could determine the nature of someone’s individual responses based on the average responses from any particular subgroup of respondents. So, for example, if we have fewer than five respondents in the category of employees who have worked here between six and ten years, then we won’t share any results for any question by the number of years employees have worked at Augustana.
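
As a sketch of how that suppression rule plays out in practice (Python; the group labels and response values are hypothetical, not actual survey results):

```python
# Small-cell suppression: only report a subgroup average when the
# subgroup has at least five respondents.
MIN_CELL = 5

def report(label, responses):
    """Return the subgroup mean, or a suppression note when n < MIN_CELL."""
    n = len(responses)
    if n < MIN_CELL:
        return f"{label}: suppressed (n = {n} < {MIN_CELL})"
    return f"{label}: mean = {sum(responses) / n:.2f} (n = {n})"

print(report("0-5 years at Augustana", [4, 5, 3, 4, 5, 4, 2]))
print(report("6-10 years at Augustana", [5, 4, 3]))  # too few; suppressed
```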

Just like the anonymity question above, if you are worried that your confidentiality will be violated, don’t provide answers to those specific questions.

Even though we have already received many responses to this survey, we still need many more because the more that we have, the more likely it will be that we can look at subgroups of responses and analyze this data without violating anonymity and confidentiality.

Getting better as an organization is hard work. At its core, it requires that we all put something into it.  Completing this survey is a big first step. I hope you’ll all give it a shot.

Make it a good day,

Mark

 

The race to get old started yesterday. Hurry up!

A little over a week ago the Wall Street Journal published a short piece entitled, “Today’s Anxious Freshmen Declare Majors Far Faster Than Their Elders: Weak job market and high debt loads prompt broad shift away from intellectual exploration.” They cited data from their own small but random survey of colleges and universities suggesting that more and more freshmen declare their majors earlier. While the article and those interviewed for it speculated about a variety of factors that might be driving this phenomenon, the conclusion seemed pretty clear: college is now much less about discovering yourself first and finding a career later and much more about locking into a track for a career.

I thought it would be interesting to see if our own data reflected a similar trend. We were able to examine data over a similar time period, exploring the differences between students who entered Augustana as freshmen in the fall of 2007 and students who entered Augustana as freshmen in the fall of 2013. In addition, I thought it would be interesting to expand on the Wall Street Journal analysis since they aren’t clear about when the institutional data they presented was collected (in the fall of the first year? in the spring of the first year? at the beginning of the second year?). So we compared the two freshmen cohorts noted above in three ways. First, what proportion of the class indicated that they were undecided on their major when they applied to Augustana? Second, what proportion of those undecided students had declared a major by the beginning of their second year? And third, what proportion of the entire freshman class had declared a major by the beginning of the second year?

Our Augustana results seem to parallel the findings reported by the Wall Street Journal. During the application process, 16% (111 of 713) of the 2007 first-year cohort indicated that they were undecided about their major. During the 2013 cohort’s application process, only 11% (70 of 627) selected “undecided” when asked about their intended major. Interestingly, the proportion of these initially undecided students who had chosen a major by the beginning of their second year did not change appreciably between the fall of 2008 and the fall of 2014. Of the undecided majors from the 2007 cohort, 68% (63 of 92 – the remaining 19 did not persist to the second year) had still not selected a major one year later.  From the 2013 cohort, 69% (40 of 58 – the remaining 12 did not return to Augustana) of the initially undecided remained undeclared.
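
Because the denominators in that last comparison shrink as students leave, the arithmetic is easy to misread; here’s a minimal sketch (Python) using only the counts reported above:

```python
# Share of initially undecided students still undeclared a year later.
# Denominators exclude students who did not persist to the second year.
cohorts = {
    2007: {"undecided": 111, "left": 19, "still_undeclared": 63},
    2013: {"undecided": 70,  "left": 12, "still_undeclared": 40},
}

for year, c in cohorts.items():
    persisters = c["undecided"] - c["left"]
    share = c["still_undeclared"] / persisters
    print(f"{year} cohort: {c['still_undeclared']} of {persisters} "
          f"= {share:.0%} still undeclared")
# 2007 cohort: 63 of 92 = 68% still undeclared
# 2013 cohort: 40 of 58 = 69% still undeclared
```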

The biggest difference between the two cohorts can be found in the proportion of students who had declared a major by the beginning of the second year. Remember, the position taken by the Wall Street Journal article was that students take less time for intellectual pursuits and narrow their focus on a major earlier than in previous years. At Augustana, it appears that we are seeing a similar phenomenon.  While 54% of the 2007 first-year cohort had not yet declared their major by the beginning of the second year, only 36% of the 2013 cohort were still undeclared by the beginning of the second year.

So . . . is this a bad thing?

Honestly, I’m not sure.  In the end, I don’t know that we will have much success telling students that they are wrong to respond to external pressures of a tight job market and high student debt by choosing their major earlier. That kind of approach is likely to come across as tone-deaf to some very real concerns. It seems to me that this data re-emphasizes the importance of timely and substantive conversations between students and all of us who impact their education (faculty, administrators, work supervisors, residence life staff, student life staff, and fellow students) that push students to develop themselves even as they are preparing for life after college. Personal and intellectual development and career preparation ought to be a “both/and” enterprise.

If we can do that, our students are likely to grow and change in just the ways that we hoped they would.

Make it a good day,

Mark