Busy is as busy does . . .

Hey Folks,

This is the time of the term when everyone conjures up whatever powers they have left to slog through finals, grade furiously, and put the term out of its misery.  Or, if you have a slightly more optimistic view of life (and I hope you do), you are overcome with a surge of pride in your students for all they have learned, all they have endured, and all they have become over ten short weeks.  See, that wasn’t so hard now, was it?

To be honest, I’m not inclined to say much this week only because I don’t think many of you have the time to read my blathering about some little data point that has me all atwitter.  And aside from that somewhat uncomfortable image, the last thing I want this blog to become is long, myopic, and just too much.

So I’ll throw this out into the cybertron and let you do what you want with it.  I’ve been privileged to be involved with a number of senior inquiry and service-learning projects this term.  I’ve been very impressed and even proud of the work that I’ve seen these students produce.  They’ve thought carefully about their research, wrestled with tough problems, dealt with mishaps and unpredictability, and throughout have remained honest, genuine, and intent on doing their best work.  Was it all perfect?  Of course not.  Was it supposed to be?  No.  But did I see growth that should make a college proud?  Damn straight.

Even though I am constantly talking about ways that we might improve, it is important to remind ourselves that we often do very good work.  And we deserve the chance to step back from time to time and soak it all in.  You put your heart into the work of making young people better.  And in many cases you help students realize a little bit more of who they aspire to become – even when they don’t fully know who that is or why it might be important.

So – grade like a banshee.  Then relax like a champion.  You deserve it.

Make it a good day,

Mark

Talking, albeit eloquently, out of both sides of our mouths

Many of my insecurities emerge from a very basic fear of being wrong.  Worse still, my brain takes it one step further, playing this fear out through the infamously squeamish dream in which I am giving a public presentation somewhere only to discover in the middle of it that my pants lie in a heap around my ankles.  But in my dream, instead of acknowledging my “problem,” buckling up, and soldiering on, I inexplicably decide that if I just pretend not to notice anything unusual, then no one in the audience will notice either.  Let’s just say that this approach doesn’t work out so well.

It’s pretty hard to miss how ridiculous this level of cognitive contortionism sounds.  Yet this kind of foolishness isn’t the exclusive province of socially awkward bloggers like me.  In the world of higher education we sometimes hold obviously contradictory positions in plain view, trumpeting head-scratching non sequiturs with a straight face.  Although this exercise might convince many, including ourselves, that we are holding ourselves accountable to our many stakeholders, we actually make it harder to meaningfully improve because we don’t test the underlying assumptions that set the stage for these moments of cognitive dissonance.  So I’d like to wrestle with one of these “conundrums” this week: the ubiquitous practice of benchmarking in the context of a collective uncertainty about the quality of higher education – admitting full well that I may be the one who ends up pinned to the mat crying “uncle.”

It’s hard to find a self-respecting college these days that hasn’t already embedded the phrase “peer and aspirant groups” deep into its lexicon of administrator-speak.  This phrase refers to the practice of benchmarking – a process to support internal assessment and strategic planning that was transplanted from the world of business several decades ago.  Benchmarking uses two groups of other institutions to assess one’s own success and growth.  Institutions start by choosing a set of metrics to identify two groups of colleges: a set of schools that are largely similar at present (peers) and a set of schools that represent a higher tier of colleges for which they might strive (aspirants). The institution then uses these two groups as yardsticks to assess its efforts toward:

  1. improved efficiency (i.e., outperforming similarly situated peers on a given metric), or
  2. increased effectiveness (i.e., equaling or surpassing a marker already attained by colleges at the higher tier to which the institution aspires).

Sometimes this practice is useful, especially in setting goals for statistics like retention rates, graduation rates, or a variety of operational measures.  However, sometimes this exercise can unintentionally devolve into a practice of gaming, in which comparisons with the identified peer group too easily shine a favorable light on the home institution, while comparisons with the aspirant group are too often interpreted as evidence of how much the institution has accomplished in spite of its limitations.  Nonetheless, this practice seems to be largely accepted as a legitimate way of quantifying quality.  So in the end, our “go-to” way of demonstrating value and a commitment to quality is inescapably tethered to how we compare ourselves to other colleges.

At first, this seems like an entirely reasonable way to assess quality.  But it depends on one fundamental assumption: the idea that, on average, colleges are pretty good at what they do.  Unfortunately, the last decade of research on the relative effectiveness of higher education suggests that, at the very least, the educational quality of colleges and universities is uneven, or at worst, that the entire endeavor is a fantastically profitable house of cards.

No matter which position one takes, it seems extraordinarily difficult to simultaneously assert that the quality of any given institution is somewhere between unknown and dicey, while at the same time using a group of institutions – most of which we know very little about beyond some cursory, outer layer statistics – as a basis for determining one’s own value.  It’s sort of like the sixth grade boy who justifies his messy room by suggesting that it’s cleaner than all of his friends’ rooms.

My point is not to suggest that benchmarking is never useful or that higher education is not in need of improvement.  Rather, I think that we have to be careful about how we choose to measure our success.  I think we need to be much more willing to step forward and spell out what we think success should look like, regardless of what other institutions are doing or not doing.  In my mind, this means starting by selecting a set of intended outcomes, defining clearly what success will look like, and then building the rest of what we do in a purposeful way around achieving those outcomes.  Not only does this give us a clear direction that can be simply described to people within and beyond our own colleges, but it also gives us all the pieces necessary to build a vibrant feedback loop to assess and improve our efforts and our progress.

I fully understand the allure of “best practices” – the idea that we can do anything well simply by figuring out who has already done it well and then copying what they do.  But I’ve often seen the best of best practices quickly turn into worst practices when plucked out of one setting and dropped wholesale into a different institutional culture.  Maybe we’d be better off paying less attention to what everyone else does and concentrating instead on designing a learning environment that starts with the end in mind and uses all that we already know about college student development, effective teaching, and how people learn.  It might look a lot different than the way that we do it now.  Or it might not look all that different, despite being substantially more effective.  I don’t know for sure.  But it’s got to be more effective than talking, albeit eloquently, out of both sides of our mouths.

Make it a good day,

Mark


Who are the students who said that no one recommended the CEC to them?

Last week I wrote about the way that our seniors’ responses to the question “Who recommended the Community Engagement Center to you?” might reflect the values that we communicate through our actions even if they aren’t necessarily the values that we believe we have embraced.  At the end of my post I promised to dig deeper into our senior survey to better understand the students who said that no one recommended the Community Engagement Center to them.  During the past several days my students and I have been peeling the data back in all kinds of ways.  Based on prior findings on students’ experiences with major advising and its connection to post-graduate planning, we thought that we might be able to identify some pattern in the data that would give us some big answers.  So we laid out a couple of hypotheses to test for the students who said no one recommended the CEC to them.  We thought:

  • These students would be more likely to intend to go to graduate school
  • These students would be more likely to major in a humanities discipline
  • These students would be less involved in co-curricular activities
  • These students would be generally less engaged in their college experience

Here is what we found.

First, these students weren’t more likely to be headed to graduate school.  This hypothesis was based on an earlier finding that students who intended to go to grad school were more likely to work with a professor to guide them through the application process, while students planning to get a full-time job would be referred to Career Services.  But our students were distributed across the post-graduate plan options of grad school and work just like everyone else.  So this first hypothesis was a total bust.

Genius IR Shop : 0  —  Data : 1

Second, these students were not significantly more likely to major in humanities disciplines.  This hypothesis evolved from some earlier conversations with students that suggested less of a natural connection between the career center and the more “pure” liberal arts disciplines.  In the end, while some of the humanities disciplines did seem to appear slightly more often than most pre-professional degrees, there were plenty of students from the natural and physical sciences who also said no one recommended the CEC to them.  So even though there was an initial glimmer of possibility, the reality is that this second hypothesis was also a flop.

(Aspiring to be but clearly not yet) Genius IR Shop : 0  —  Data : 2

Third, we couldn’t find much in our data to support our assertion that these students were less involved in co-curricular activities.  Our originating hypothesis was based on the idea that students who are less social might not end up in situations where the CEC would be recommended as often.  Although these students found slightly fewer student groups that matched their interests, they were still involved in at least one student organization or club as often as other students.  Despite looking at this data through the friendliest lens, we just couldn’t say that this group of students’ responses was a function of their lack of co-curricular involvement.

(Nothing but a bumbling shadow of a) Genius IR Shop : 0  —  Data : 3

At this point in the story, you ought to suspect some stress on my part.  It’s not all that much fun to be wrong repeatedly.  Furthermore, our last hypothesis about a general passivity is qualitatively more difficult to test than simply looking at differences across one particular question.  Nonetheless, my minions and I soldiered on.  We looked across all of the questions on the senior survey, identifying significant differences and looking for trends.  Thankfully, we found a host of evidence to support our last hunch.

We found that the students who said no one recommended the CEC to them were less plugged in to their college experience across the board.  Their responses to every one of the advising questions were significantly lower, their responses to many of the co-curricular experiences questions were significantly lower, and their responses to a number of curricular experience questions both in the major and across the curriculum were significantly lower.
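As a rough illustration of the kind of group comparison described above, here is a minimal two-proportion z test in Python.  Everything in it is invented for the sketch – the counts, the sample sizes, and the question being compared are hypothetical, not our actual survey data:

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for a difference between two proportions.
    |z| > 1.96 is conventionally significant at the .05 level."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: favorable responses to an advising question among
# students who did vs. did not have the CEC recommended to them
z = two_proportion_z(150, 300, 40, 120)  # 50% vs. 33%
print(round(z, 2))  # → 3.1
```

With |z| ≈ 3.1, well past the 1.96 cutoff, a gap of that size would count as statistically significant at the .05 level – the same standard we applied, question by question, across the senior survey.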

(Salvaging the crumbling remains of my) Genius IR Shop : 1  —  Data : 3

What jumps out to me as a result of this exercise is the importance of our informal educational efforts.  There will always be a subset of students who simply, magically do the things we hope they would do, take the initiative to ask the next question, and get themselves ahead of the curve simply because they are the cream of our crop.  However, there will always be a subset of students who stumble out of the gate, drift passively into the fog, and avoid choices simply because they are . . . human.  Because we have many cream of the crop types, it’s all too easy to miss those who suffer from being, well, normal.  So to me, this is why we must take the initiative to ask students if they’ve done the things that might seem completely obvious to us, like recommending to them that they should check out the services at the CEC early in their college career – and tell them exactly why this can matter in the broader scheme of their life’s journey.  If we want all of our students – no matter if they are already perfectly formed adults or if they are bumbling, stumbling, grumbling prepubescents masquerading as undergraduates on the cusp of adulthood – to wring every developmental drop out of their college learning experience, then we have to take on a proactive role to ensure that no one gets left out in the cold, especially those who are more susceptible to floating off with the current du jour.

Remember, this study isn’t about who did and did not use the CEC.  The question we examined asks “who recommended the CEC to you.”  We asked the question this way specifically to give us feedback on the nature of the experience we are delivering to our students – not just to find out what our students did.  And as it turns out, the degree to which we are proactive educators may be one of the most crucial ways in which we might purposefully guide our more passive students.  Not rocket science?  Maybe.  Worth remembering as we bustle through our own madcap world?  Absolutely.

Make it a good day,

Mark

Recommending Students to the Community Engagement Center

Sometimes I worry that I tend to look at our student data through an overly quantitative lens.  I’ll look for significant predictors of specific outcomes or statistically significant differences between two groups.  And as trained, I instinctively take the steps necessary to avoid the most common statistical stumbling blocks, such as claiming significance where there is none or mistaking correlation for causation.  But there are times when this propensity to immediately dive deep into the data means that I miss a critical point that sits in plain view, screaming at the big-nosed, bearded face looming over it, “Hey, you idiot!  I’m right here!”

One such moment came recently while I was revisiting the simple distribution of seniors’ responses to the question, “Who recommended the CEC (Community Engagement Center) to you?”  Students could select as many options as might apply in their case: faculty within my major(s), faculty outside my major(s), my major adviser, my first year adviser, residential life staff, student activities staff, my parents, another student, other administrators, and finally, no one recommended the CEC to me.

As I stared at the percentages under each response option, I began to think that this question might be the type that holds within it an array of discoveries.  First, the distribution of responses appeared to reflect a set of values that we communicate to students about 1) the role of the CEC on campus and 2) the way in which we see our educational efforts as a process of preparing students for life after Augustana.  Second, since the CEC often functions as a student gateway to all sorts of other important educational experiences, I began to wonder if students who indicate that no one recommended the CEC to them might also score lower on a host of other experiences that either might follow from an initial interaction with the CEC or might suggest a broader degree of disengagement.

So here is the question followed by the distribution of students’ responses:

Who recommended the CEC (Community Engagement Center) to you? Check as many as might apply.

  • Faculty within my major(s) – 41.5%
  • My major adviser – 28.1%
  • No one recommended the CEC to me – 23.4%
  • Another student – 21.4%
  • Faculty outside my major(s) – 17%
  • Other administrator – 14%
  • My first year adviser – 11.4%
  • My parents – 9.6%
  • Student Activities staff – 5.2%
  • Residential Life staff – 1.6%

These numbers alone tell us something pretty interesting.  Clearly, recommendations to the CEC tend to come out of students’ academic experience in their major.  First, this suggests that these recommendations probably come later in one’s college career – junior or senior year (sophomore year at the earliest).  Further, these recommendations rarely come from the co-curricular side of the student experience.  Thus, it appears that in general we conceive of the role of the CEC as either a) a means of resolving an absence of post-graduate career purpose (students in their later years who still don’t seem to know what they want to do after college, or students in the midst of searching for a career plan “B”), or b) a support service to help students bundle the totality of their college experience in preparation for the job or grad school search.  Either way, the role we see for the CEC seems more retroactive than proactive.  It doesn’t appear that we have generally thought of the CEC as a student’s compass with which they might plot out – from the moment they arrive on campus – their college experience in a way that allows them to move forward with intentionality.  Nor do we appear to have thought much about linking our students’ co-curricular experiences – one of Augustana’s true, albeit often under-appreciated strengths – with the role of the CEC.  All of this doesn’t seem to comport with our belief that a liberal arts college experience is holistic, developmental, and fully integrated; one that starts, from the very beginning, with the end in mind, and one that believes the whole must be greater than the sum of the parts.

Now there may be lots of lengthy explanations for this particular distribution of responses; some of them might even be entirely legitimate.  But it doesn’t change the nature of the values that we appear to be expressing – or not expressing – as portrayed through student-reported experiences.  In addition, 23.4% of our seniors indicated that no one recommended the CEC to them.  Given the array of services that originate out of the CEC, I’d suggest that we would like that number to be much lower than effectively one-quarter of a graduating class.

Admittedly, there were some interesting anomalies in the data and caveats that we should consider.  A few students indicated that no one recommended the CEC to them AND indicated that another student recommended the CEC to them.  And it was during this cohort of students’ career at Augustana that the CVR (Center for Vocational Reflection) merged with a variety of other services including Career Services to create the CEC – making it possible that some students might not have considered their earlier recommendations to the CVR when responding to this question.  But even in the presence of these caveats, we should be willing to ask ourselves whether our students’ experience mirrors the values that we purport to hold.

The other aspect of this particular question that I find interesting is the degree to which the difference in responses to this question (no one recommended the CEC to me vs. someone recommended the CEC to me) might mask statistically significant differences on many other questions in the senior survey.  Now I’m not claiming that there is a direct relationship between this question and all of the others on which student responses also differed.  However, it seems to me highly possible that, like many other situations in life where one unique opportunity correlates with or begets a series of other opportunities that ultimately separates a person from the pack, interaction with the CEC may indeed open up pathways and ways of thinking about the college experience in the same way that color changes the fundamental nature of black and white film.

It turns out that students who said no one recommended the CEC to them differed significantly (in a statistical sense) on many items on the senior survey that involve the advising experience, the broader curricular experience, and the co-curricular experience.  Next week I’ll talk more about what we might learn from this array of differences.

Make it a good day,

Mark