Wait . . . why focus on life after college?

In the last few years several prominent liberal arts colleges have made post-graduate success a central measure of the institution's educational quality (for example, see such plans or programs at St. Olaf College and The College of Wooster). Despite what some old school moon-howlers might have you believe, this move isn't driven by a rejection of the liberal arts or an administrative coup d'état. (If we'd given up on the liberal arts, we'd have all gone for higher paying gigs at some fly-by-night for-profit a long time ago, and – with all due love and respect for my administrative compatriots – we aren't nearly smart enough to pull off a decent coup d'état.)

Rather, what most of us have come to realize is that in order for the liberal arts to thrive into the next century, we have to reframe, refocus, and refine the way that we operationalize the liberal arts in the context of our current social, cultural, and economic conditions. Just like the approach that drove the successful emergence of small liberal arts colleges during the 18th and 19th centuries, we are again faced with the need to adapt our commitment to developing creative and innovative problem-solvers through interdisciplinarity and the synthesis of great ideas.

In the next three posts, I’d like to explain in more depth what I mean by reframing, refocusing, and refining the way that we operationalize the liberal arts in a 21st century context.  But for now, I’d like to highlight one data point that I think underscores a need to strengthen our focus on preparing all students equally for life after college.

In the 2013 administration of our recent graduate survey (alumni surveyed nine months after they've graduated from Augustana), we asked our alums about the degree to which they thought Augustana prepared them to succeed in their current endeavors. Since we had already organized the survey to route those who had chosen to go to graduate school and those who had pursued immediate employment to separate subsets of questions, they answered two different versions of this question, each worded to more specifically get at the relationship between their Augustana education and their current path.

The difference in the proportion of alums who felt well prepared, depending upon whether they went to grad school or went to work, seemed large enough to consider further. Among those who went to grad school, 78% said that they were "fairly well" or "very well" prepared for their grad program. By contrast, among those who went to full-time employment, only 65% said that they were "fairly well" or "very well" prepared for their first job. (The other response options were "somewhat," "a little," and "not at all.")
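
If you're curious how one might gauge whether a gap like that (78% vs. 65%) is larger than the noise we'd expect from a survey, here is a minimal sketch of a two-proportion z-test. The group sizes below are hypothetical placeholders, since I haven't reported the actual respondent counts here; swap in the real numbers before reading anything into the result.

```python
from math import sqrt

# Hypothetical respondent counts -- the post reports only the percentages.
n_grad, p_grad = 150, 0.78   # alums in grad school: share "fairly/very well" prepared
n_work, p_work = 250, 0.65   # alums in full-time jobs: share "fairly/very well" prepared

# Pooled two-proportion z-test for the difference between the two shares
p_pool = (p_grad * n_grad + p_work * n_work) / (n_grad + n_work)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_grad + 1 / n_work))
z = (p_grad - p_work) / se

print(f"difference = {p_grad - p_work:.2f}, z = {z:.2f}")
# |z| greater than about 1.96 corresponds to p < .05 under the usual normal approximation.
```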

Why might this be?  In digging further into the data, it appears that some initial suppositions don’t always hold true.  Pre-professional majors don’t always feel well prepared for their subsequent path (whether it be work or grad school) and more traditional liberal arts majors don’t always feel less well-prepared for their subsequent path.  We didn’t find a strong correlation between a student’s final GPA and their sense of preparation. Moreover, both groups of respondents (grad students and full-time employees) were scattered across a range of programs or professions, so it didn’t appear that any specific undergraduate programs were driving the results one way or the other.

Interestingly, it appears that the students who got a preview of what life would be like during the next phase of their own post-graduate path were the ones who felt best prepared by their Augustana experience.  Alums currently employed in a job who also participated in an internship while at Augie tended to feel better prepared than those who did not do an internship.  Similarly, the alums currently in grad school who also participated in a research project either with faculty or on their own tended to feel better prepared than those who did not have any research experience.

In general, this data point suggests that we might need to improve the degree to which we prepare students who go directly into the workforce after college. In some ways, it doesn't seem particularly surprising that an institution largely governed by faculty with Ph.D.s from elite research universities would be better at preparing students to succeed in grad school than in full-time employment. This might simply be a case where we have to put a more concerted effort into preparing the students who aim for a full-time job right out of college, precisely because our own frame of reference would likely advantage graduate school preparation.

This finding also seems to provide some insight into the kinds of advice that we ought to give students based on their post-graduate goals. Students intending to go into the workforce might be better suited for internships, while students with plans for grad school might be ideal candidates for an extended research experience. And while some of these experiences might be credit-bearing, in many cases they are outside the scope of the curriculum – another reason why it is important for advising to be conceived as a means of developing students holistically instead of merely as a process of selecting courses.

In the end, I don't know that this finding demands a wealth of professional development for faculty – although if done right such assistance is not a bad thing.  Instead, resolving this gap may require little more than recognizing the extent of our own experiences, adjusting for our biases, and explicitly connecting students with the right resources that uniquely fit with their post-graduate aspirations.

Welcome back to campus!  It’s lonely around here without you.

Make it a good day,

Mark

The Holiday Wish List for a Measurement Geek

Sincere apologies to anyone who tried to find a new post on my blog yesterday. Apparently our server went "walk-about" over the weekend and our IT folks have been working day and night to salvage everything that was no longer operational.  I think that we are in the clear today, so I'll try to put this post up a day late.

______________________________________________________________________

This is the week where I can’t help but overhear all the talk of the holiday gifts that people are getting for their spouses, partners, kids, friends, or in-laws.  And it struck me that there aren’t nearly enough suggestions for measurement folks who need to just own their geekdom and go big with it.  So here are a few ideas, discoveries, and possibilities.

  • Statistics ties.  Any formula, pie chart, or dumb stats pun on a tie.  Because nothing bludgeons humor to death better than a stupid stats pun.
  • The children’s book Magnus Maximus, a Marvelous Measurer.  It’s a pretty fun book with wonderful illustrations.  And it’s never too early to stereotype your profession.
  • The world’s largest slide rule.  Of course, it’s located in Texas.
  • The complete DVD set of the TV show NUMB3RS. This show managed to tease my people with the hope that someday complex math skills could really save a life. And yet, to this day I’ve never been in a public venue where someone suddenly yelled frantically, “Is there a statistician in the house!?”
  • A Digicus. They were made in the late 70s and early 80s by the electronics company Sharp. Apparently many Japanese were suspicious of the digital calculator when it was first introduced, so the Digicus was created to allow people to check their calculator results against an abacus. And you thought higher ed types were skeptical of change???
  • And last but not least, anything by the band Big Data. Yes, there is a band called Big Data. They describe themselves as a “paranoid electronic music project from the internet.”  Okey dokey.

Make it a good holiday break,

Mark

For the want of a response, the data was crap

Any time I hear someone use data from one of the new freshman, senior, or recent graduate surveys to advocate for a particular idea, I can’t help but smile a little.  It is deeply gratifying to see faculty and administrators comfortably use our data to evaluate new policy, programming, and strategic direction ideas.  Moreover, we can all point to a growing list of data-driven decisions that we know have directly improved student learning.

So it might seem odd, but that smile slips away almost as quickly as it appears. Because underneath this pervasive use of data lies a deep trust in the veracity of those numbers. And the quality of our data depends almost entirely upon the participation of 18-22 year-olds who are . . . . let’s just say “still developing.”  Data quality is like milk – it can turn on you overnight. If the students begin to think that survey questions don’t really apply to them or they start to suspect that the results aren’t valued by the college, they’ll breeze through the questions without giving them much thought or blow off the survey entirely. If that happens on a grand scale . . . . I shudder to think about it.  So you could say that I was “mildly concerned” as I organized fall IDEA course feedback forms for processing a few weeks ago and noticed several where the only bubbles colored in were “fives.”  A few minutes later I found several where the only darkened bubbles were “ones.”

Fortunately, a larger sampling of students' IDEA forms put my mind at ease.  I found that on most forms the distribution of darkened circles varied and, as best as I could tell, students' responses to the individual questions seemed to reflect at least a minimal effort to provide truthful responses.  However, this momentary heart attack got me wondering: to what degree might students' approach to our course feedback process impact the quality of the data that we get?  This is how I ended up in front of Augustana's student government (SGA) earlier this week talking about our course feedback process, the importance of good data, the reality of students' perceptions and experiences with these forms, and ways that we might convince more students to take this process seriously.

During this conversation, I learned three things that I hope you'll take to heart.  First, our students really come alive when they feel they are active participants in making Augustana the best place it can be.  However, they start to slip into the role of passive bystanders when they don't know the "why" behind processes in which they are expected to be key contributors.  When they become bystanders, they are much less likely to invest their own emotional energy in providing accurate data.  Many of the students honestly didn't think that the IDEA data they provided on the student form was used very often – if ever. If the data doesn't really matter anyway, so their thinking goes, the effort that they put into providing it doesn't matter all that much either.

Second, students often felt that not all of the questions about how much progress they made on specific objectives applied in all classes equally.  As I explained to them how the IDEA data analysis worked and how the information that faculty received was designed to connect the objectives of the course with the students’ sense of what they learned, I could almost hear the light bulbs popping on over their heads.  They were accustomed to satisfaction-type surveys in which an ideal class would elicit a high score on every survey question.  When they realized that they were expected to give lower scores to questions that didn’t fit the course (and that this data would be useful as well), their concern about the applicability of the form and all of the accompanying frustrations disappeared.
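
To make that concrete, here is a toy sketch of the general idea – decidedly not the IDEA Center's actual scoring algorithm, just an illustration of why a low rating on an objective that doesn't fit the course isn't a problem. Only the objectives the instructor selects as relevant feed the summary score; everything in the example (the objective names, weights, and ratings) is hypothetical.

```python
# Toy illustration (not IDEA's actual algorithm): a course-level "progress on
# relevant objectives" score that only counts objectives the instructor
# selected as relevant, so low ratings on irrelevant objectives don't hurt.

# Instructor's weighting of each objective: 2 = essential, 1 = important, 0 = not relevant
objective_weights = {"factual knowledge": 2, "analytic skill": 1, "creative work": 0}

# Mean student self-reported progress on each objective (1-5 scale, hypothetical)
student_progress = {"factual knowledge": 4.3, "analytic skill": 3.9, "creative work": 2.1}

weighted = [w * student_progress[obj] for obj, w in objective_weights.items() if w > 0]
weights = [w for w in objective_weights.values() if w > 0]

score = sum(weighted) / sum(weights)
print(f"progress on relevant objectives: {score:.2f} out of 5")
# The 2.1 on "creative work" is ignored because the instructor marked it as not relevant.
```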

Third, even though we – faculty, staff, and administrators – know exactly what we mean when we talk about learning outcomes, our students still don't really know that their success in launching their life after college is not just a function of their major and all the stuff they've listed on their resume.  On numerous occasions, students expressed confusion about the learning objectives because they didn't understand how they applied to the content of the course.  Although they may have seen the lists of skills that employers and graduate schools look for, it seems that our students think these are skills that are largely set in stone long before they get to college, and that college is mostly about learning content knowledge and building a network of friends and "connections."  So when they see learning objectives on the IDEA forms, unless they have been clued in to understand that these are skills that the course is designed to develop, they are likely to be confused by the very idea of learning objectives above and beyond content knowledge.

Although SGA and I plan to work together to help students better understand the value of the course feedback process and its impact on the quality of their own college experience, we – faculty, staff, and administrators – need to do a much better job of making sure that our students understand the IDEA course feedback process.  From the beginning of the course, students need to know that they will be learning more than content.  They need to know exactly what the learning goals are for the course. Students need to know that faculty want to know how much their students learned and what worked best in each class to fuel that learning, and that satisfaction doesn't always equate to learning.  And students need to know how faculty have used course feedback data in the past to alter or adapt their classes.  If you demonstrate to your students how this data benefits the quality of their learning experience, I think they will be much more willing to genuinely invest in providing you with good data.

Successfully creating an evidence-based culture of perpetual improvement that results in a better college requires faculty, staff, and administrators to take great care with the sources of our most important data.  I hope you will take just a few minutes to help students understand the course feedback process.  Because in the end, not only will they benefit from it, but so will you.

Make it a good day,

Mark

Could a focus on learning outcomes unwittingly sacrifice process for product?

A central tenet of the learning outcomes movement is that higher education institutions must articulate a specific set of skills, traits, and/or dispositions that all of its students will learn before graduation. Then, through legitimate means of measurement, institutions must assess and publicize the degree to which its students make gains on each of these outcomes. Although many institutions have yet to implement this concept fully (especially regarding the thorough assessment of institutional outcomes), this idea is more than just a suggestion. Each of the regional accrediting bodies now requires institutions to identify specific learning outcomes and demonstrate evidence of outcomes assessment as a standard of practice.

This approach to educational design seems at the very least reasonable. All students, regardless of major, need a certain set of skills and aptitudes (things like critical thinking, collaborative leadership, intercultural competence) to succeed in life as they take on additional professional responsibilities, embark (by choice or by circumstance) on a new career, or address a daunting civic or personal challenge. In light of the educational mission our institutions espouse, committing ourselves to a set of learning outcomes for all students seems like what we should have been doing all along.

Yet too often the outcomes that institutions select to represent the full scope of their educational mission, and the way that those institutions choose to assess gains on those outcomes, unwittingly limit their ability to fulfill the mission they espouse. For when institutions narrow their educational vision to a discrete set of skills and dispositions that can be presented, performed, or produced at the end of an undergraduate assembly line, they often do so at the expense of their own broader vision that would cultivate in students a self-sustaining approach to learning. What we measure dictates the focus of our efforts to improve. As such, it's easy to imagine a scenario in which the educational structure that currently produces majors and minors in content areas is simply replaced by one that produces majors and minors in some newly chosen learning outcomes. Instead of redesigning the college learning experience to alter the lifetime trajectory of an individual, we allow the whole to be nothing more than the sum of the parts – because all we have done is swap one collection of parts for another. Although there may be value in establishing and implementing a threshold of competence for a bachelor's degree (for which a major serves a legitimate purpose), limiting ourselves to this framework fails to account for the deeply held belief that a college experience should approach learning as a process – one that is cumulative, iterative, multi-dimensional, and, most importantly, self-sustaining long beyond graduation.

The disconnect between our conception of a college education as a process and our tendency to track learning as a finite set of productions (outcomes) is particularly apparent in the way that we assess our students' development as life-long learners. Typically, we measure this construct with a pre-test and a post-test that tracks learning gains between the years of 18 and 22 – hardly a lifetime (the fact that a few institutions gather data from alumni five and ten years after graduation doesn't invalidate the larger point). Under these conditions, trying to claim empirically that (1) an individual has developed and maintained a perpetual interest in learning throughout their life, and that (2) this life-long approach is directly attributable to one's undergraduate education, probably borders on the delusional. The complexity of life even under the most mundane of circumstances makes such a hypothesis deeply suspect. Yet we all know of students who experienced college as a process through which they found a direction that excited them and a momentum that carried them down a purposeful path that extended far beyond commencement.

I am by no means suggesting that institutions should abandon assessing learning gains on a given set of outcomes. On the contrary, we should expect no less of ourselves than substantial growth in all of our students as a result of our efforts. Designed appropriately, a well-organized sequence of outcomes assessment snapshots can provide information vital to tracking student learning over time and potentially increasing institutional effectiveness. However, because the very act of learning occurs (as the seminal developmental psychologist Lev Vygotsky would describe it) in a state of perpetual social interaction, taking stock of the degree to which we foster a robust learning process is at least as important as taking snapshots of learning outcomes if we hope to gather information that helps us improve.

If you think that assessing learning outcomes effectively is difficult, then assessing the quality of the learning process ought to send chills down even the most skilled assessment coordinator’s spine. Defining and measuring the nature of process requires a very different conception of assessment – and for that matter a substantially more complex understanding of learning outcomes. Instead of merely measuring what is already in the rearview mirror (i.e., whatever has already been acquired), assessing the college experience as a process requires a look at the road ahead, emphasizing the connection between what has already occurred and what is yet to come. In other words, assessment of the learning that results from a given experience would include the degree to which a student is prepared or “primed” to make the most of a future learning experience (either one that is intentionally designed to follow immediately, or one that is likely to occur somewhere down the road). Ultimately, this approach would substantially improve our ability to determine the degree to which we are preparing students to approach life in a way that is thoughtful, pro-actively adaptable, and even nimble in the face of both unforeseen opportunity and sudden disappointment.

Of course, this idea runs counter to the way that we typically organize our students’ postsecondary educational experience. For if we are going to track the degree to which a given experience “primes” students for subsequent experiences – especially subsequent experiences that occur during college – then the educational experience can’t be so loosely constructed that the number of potential variations in the ordering of different students’ experiences virtually equals the number of students enrolled at our institution. This doesn’t mean that we return to the days in which every student took the same courses at the same time in the same order, but it does require an increased level of collective commitment to the intentional design of the student experience, a commitment to student-centered learning that will likely come at the expense of an individual instructor’s or administrator’s preference for which courses they teach or programs they lead and when they might be offered.

The other serious challenge is the act of operationalizing a concept of assessment that attempts to directly measure an individual’s preparation to make the most of a subsequent educational experience. But if we want to demonstrate the degree to which a college experience is more than just a collection of gains on disparate outcomes – whether these outcomes are somehow connected or entirely independent of each other – then we have to expand our approach to include process as well as product.  Only then can we actually demonstrate that the whole is greater than the sum of the parts, that in fact the educational process is the glue that fuses those disparate parts into a greater – and qualitatively distinct – whole.

Make it a good day,

Mark

Athletes, Enrollment, and Retention

It’s becoming more and more clear that the way we have thought about retention in the past is just too simplistic. Too often we use terms like “levers” or “buttons” in suggesting that if we could only identify the right thing to change, then retention would improve. However, when we don’t take the time to fully match our metaphor to the complexity of our circumstances, we run the real risk of putting in a lot of effort for very little improvement. For example, if we like the idea of one or more “levers” that we think we can move to systematically impact our retention rate, our metaphor can’t assume that the levers under our control are independent from each other. As we all know, the educational endeavor in which we are involved is much too complex. For our metaphor to be accurate (and therefore useful in identifying a course of action that has the best chance of producing positive results), we have to understand that each lever over which we have control is welded to other levers. In essence, moving one lever will automatically re-position others that also affect the long-term health of the college.

One example of this complexity became more apparent recently as we were examining our retention data among athletes.  Over the years we've found that typical first-to-second year retention rates among students who self-report as athletes are higher than our college average, and that four-, five-, and six-year graduation rates don't differ between athletes and non-athletes. However, in digging a little deeper we found that about 45% of the students who left during the 2012 school year (a subset of all the students who leave sometime between their first and second fall terms) started that academic year as athletes, a much higher proportion than the overall percentage of students who identify as athletes at the end of the year (about 30%). Unlike prior retention analyses where we used student self-reports of athletic status, for this analysis we looked at all of the students who were listed on every sports team's initial roster – including all the students who quit their sport before the end of the season and therefore didn't report themselves as athletes on the end-of-the-year survey.
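
To see why that 45% figure jumped out at me, here is a back-of-the-envelope sketch. All of the numbers below are hypothetical placeholders, and – as noted above – the 45% and 30% figures come from different definitions of "athlete," so treat this strictly as an illustration of the arithmetic, not as our actual attrition rates.

```python
# Hypothetical numbers to illustrate why athletes making up 45% of leavers but
# only ~30% of students implies a higher attrition rate for athletes.
cohort = 600                      # hypothetical entering class size
overall_attrition = 0.10          # hypothetical first-to-second-year attrition rate
share_athletes = 0.30             # share of students who are athletes
share_of_leavers_athletes = 0.45  # share of leavers who started the year as athletes

leavers = cohort * overall_attrition
athlete_leavers = leavers * share_of_leavers_athletes
athletes = cohort * share_athletes

athlete_rate = athlete_leavers / athletes
non_athlete_rate = (leavers - athlete_leavers) / (cohort - athletes)

print(f"athlete attrition ~ {athlete_rate:.1%}, non-athlete attrition ~ {non_athlete_rate:.1%}")
# With these placeholders: roughly 15% vs. 8% -- nearly double the rate.
```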

At first, one might think that this is a problem for athletics to solve (stereotypes of the hard-nosed dictator/coach chasing off less capable athletes might come to mind). However, further exploration exposes the degree to which our levers are welded together. You’ll forgive me if I borrow from my decade of experience in college athletics here to make my point.

It is no secret that our investment in athletics is, at least in part, based on the reality that athletic opportunity is a potent enrollment draw.  Our coaches play a significant role in encouraging prospective students to come to Augustana, both by initiating recruiting relationships and by offering opportunities to those students who inquire. This is clearly evident in the size of many of our sports' initial rosters, especially among men's sports. However, in the same way that the student-faculty ratio matters in creating a high touch, personalized college experience, the athlete-coach ratio matters too.  Large rosters can make it more difficult for a coach to connect with each player. And especially among younger athletes who may have less opportunity to compete due to the presence of older, more skilled players, this can exacerbate feelings of uncertainty and self-doubt that sometimes produce a decision to leave the team – and even the college. In the end, the way that we are using one lever (athletics) to meet enrollment goals may be increasing the likelihood of attrition among a certain subset of students.

Please understand that I am not advocating that we change anything.  Instead, given the number of sports we offer and the way that we currently organize our athletics programs, I am simply pointing out an example in which a lever we use quite effectively to meet one goal (enrollment) might well be creating an obstacle that limits our ability to meet another goal (retention).  I suppose one could argue that we should consider offering an additional sport or two so that athletics could still recruit the same overall number of students while reducing the average roster size of the individual sports. However, that depends on whether the increased costs of an additional sport (salaries, facilities, operating funds) would be offset by a potential increase in retention of students who came to Augustana with the intention of playing a sport. Obviously this is a pretty sticky question without a clear answer.

Again, my point here is only to highlight a trade-off – one that might be entirely legitimate – where we meet one set of goals in a way that potentially increases the difficulty of meeting another set of goals. Optimizing our retention rate is about finding our sweet spot. It's not just about moving individual levers. That is what makes it so incredibly challenging – especially when we are trying to squeeze the last drops of optimization out of something that we already do comparatively well.

Make it a good day.  And enjoy the holiday.

Mark

The Fallacy of Matching Majors with Careers

It seems that most of the talk in recent months about the ROI (return on investment) of a college degree from a given institution has been focused on the degree to which new graduates from that institution can get well-paying jobs related to their major.  For liberal arts colleges and those of us who believe in the importance of a well-rounded education, the whole idea of assuming an inherent connection between major choice and career seems problematic.  Not only are there plenty of majors that don’t have a natural correlate on the job market (e.g., philosophy majors come to mind), but we are also regularly bombarded with the claims that individuals in today’s world will hold multiple jobs in multiple professions over the course of their working careers. Thus it seems odd to suggest that a college’s effectiveness could be pinned to the proportion of graduates who have landed jobs in their field within six months of graduation.

One data point from our survey of recent graduates seems to highlight this conundrum. Nine months after a class of seniors graduates, we ask them to complete a survey that asks a variety of questions about their current status, the degree to which their Augustana experience helped prepare them for their present circumstance, and the degree to which they believe that they are on the right long-term path.

One of the questions we asked our 2012 graduates last spring (about nine months after they had received their BA degrees from Augustana) was:

“Have your long-term professional goals changed since you graduated from Augie?”

The distribution of responses was revealing.

  • Not at all – 48%
  • A little – 21%
  • Somewhat – 20%
  • Substantial – 4%
  • Completely – 3%

In other words, fewer than 50% of the 2012 graduating class considered themselves on the exact same long-term path that they were on when they walked across the stage to collect their diplomas.  In addition, over a quarter of the respondents said that their long-term goals had changed “somewhat,” “substantially,” or “completely.”

I believe the result of this single question holds critical implications for our efforts to best prepare our students to succeed after college.  First of all, this finding supports what we already know to be true – many of our students are going to change their long-term goals during their first several years after graduation. This is what happens to young people during their first foray into the world of working adulthood. We would be foolish to tie ourselves too tightly to a data point that doesn't allow for these natural developments in the life of a young adult.

Second, rather than mere job or graduate school placement, we would be smart to begin thinking about our students’ post-graduate success in terms of direction and momentum. Our students need to develop a clear sense of direction in order to decide what the best “next step” is for them. In addition, our job is to help them know when to take that “next step,” whether it be getting into the right graduate school or finding the right job or taking advantage of a once-in-a-lifetime opportunity that will better position them to move in the direction they have chosen for themselves. If we can do that, then no matter what happens to our students in the years after they graduate, they will be better able to succeed in the face of life’s inevitable challenges.

In concert with a sense of direction, our students need momentum.  This momentum should be self-perpetuating, cultivated by the right mix of motivations to handle setbacks and success. More importantly, it needs to be strong enough to thrive in the midst of a change in direction. This means that we develop their ability to be autonomous while holding themselves to high standards.  It means that they know how to be strategic in staying true to themselves and their goals no matter the distractions that might appear.

This doesn’t mean that we shouldn’t care about our students’ success in applying to graduate school or entry-level jobs in a given profession. On the contrary, we absolutely should care about statistics like these – especially if they support a student’s chosen direction and momentum.  But we should remember that a successful life isn’t etched in stone upon graduation from college.  And we should have the courage to track our students’ life trajectory in a way that doesn’t limit both us and them.

Make it a good day,

Mark

How does student learning happen?

Since it's finals week, I'll be quick.  However, I hope you'll take some time to think about this little tidbit below as our strategic planning conversations examine how we are going to make sure that every student develops the ability to integrate ideas to solve complex problems.

I saw George Kuh give a talk on Saturday afternoon in which he showed the following cartoon.  Even though the whole audience found it funny, the point he was trying to make about the degree to which we often fail to ensure that students learn what we say we teach them was dead serious.

We claim that a liberal arts education teaches students how to integrate disparate ideas from a wide range of disciplines and contexts to solve complex 21st century problems.  At the same time, however, the experiences we require are specific to individual disciplines or topics while the truly integrative experiences remain optional add-ons . . . if they exist at all outside of the major.

So the question I’d ask you to think about is this:  How do we know that every student participates in a rigorously designed activity that explicitly develops the ability to integrate knowledge from multiple fields of study to solve substantive, complex problems? And how could we design a college experience where we could demonstrate that every student participated in such an activity?

Make it a good day.  And have a great fall break.

Mark

Week 10 + Halloween + Slicing Data = Disengaged Zombie Students!

I suspect that the confluence of Week 10 and Halloween brings out a little crazy in each of us.  So I thought I’d share a brief response that I prepared for a recent media request regarding the potential existence of one underserved student population on our campus.

From our senior survey data, we find that students who self-report as Zombies also report statistically significantly lower levels of engagement across a wide range of important student experiences. These differences include lower levels of participation in class discussion despite higher satisfaction with faculty feedback.

Zombie students also report lower levels of co-curricular influence on understanding how one relates to others. Further qualitative study suggests a broad lack of self-awareness.

In addition, Zombie students indicate that they have fewer serious conversations with students who differ by race, ethnicity, socioeconomic status, or social values.  Instead, Zombie students seem to congregate together and rarely reach out of their comfort zone.

Interestingly, our first-to-second year retention rate of student zombies is 100%, despite the high number of PUGS and CARE reports.  Yet our six year graduation rate is 0%. While some have expressed concern over this dismal data point, a few administrators who are closely involved in managing the graduation ceremony have suggested that the graduation ceremony is long enough already without having Zombie students shuffling aimlessly across the stage to get their diploma.

Interestingly, Zombie students report an increased level of one-on-one student/faculty interaction outside of class.  We find no evidence to suggest that this correlates in any way with the substantial drop in the number of part-time and adjunct faculty from last year (108) to this year (52).

Happy Halloween and have a wonderful Week 10.

Make it a good day,

Mark

Does our educational community lose something when seniors live off campus?

I’ve yet to find an Augustana senior who wishes they lived on campus.  In fact, the seniors I’ve talked to seem almost relieved to finally stretch their wings and move into the surrounding neighborhoods, even though they often say they had hoped to find a cheaper or nicer place nearby.  As far as I can tell, seniors have lived off campus at least since the 1970s, and this practice is so embedded into our culture that the very name of our junior students’ housing – Transitional Living Areas (TLAs) – announces our desire to prepare seniors to live on their own.

As our strategic planning discussions have coalesced around designing and implementing a purposefully integrated, comprehensive Augustana learning experience, I've been thinking about the real challenge of creating a plan that allows us to balance the individualized needs of each student with the core elements of a genuine community.  Although this might not appear all that difficult at first, efforts to achieve goals for individuals or certain subgroups of students can sometimes run at cross-purposes with maintaining a community culture optimal for student learning.  Several years ago we found an interesting example of such unintended consequences when we discovered that our efforts to encourage students to join multiple campus organizations (knowing that such behavior often enhances social integration and ultimately influences retention) were likely, albeit unintentionally, limiting the chances for conversations between students from substantially different backgrounds or demographic groups (thus undermining our efforts to increase students' intercultural competence).

With all of this in mind, I was struck by one data point from last year's seniors about the impact of our fourth-year residential status. The question asked our graduating seniors, "How often did you participate in on-campus events during your senior year?"  Responses ranged as follows:

  • less than when I lived on campus (200 – 39.9%)
  • about the same as when I lived on campus (279 – 55.7%)
  • more than when I lived on campus (22 – 4.4%)

So how does this relate to the aforementioned tension between encouraging individual development and fostering an ideal educational community?

First of all, when we talk about Augustana College, we almost uniformly talk about the educational and developmental benefits of a four-year residential experience.  I suspect that when we talk in these terms, we imagine that this distinguishing characteristic plays an influential role at both the level of the individual and the community.  At the individual level it presents itself in the form of leadership positions and the responsibility of being the senior class.  At the communal level it presents itself through those same channels but in terms of the influence of those leaders on younger students and the atmosphere and legacy that a senior class can create – one that can permeate an entire campus.  While this can play out at both levels through formal channels and during formally organized events, the broader impacts are likely more pervasive through informal rituals and signaling (to use a term familiar to social psychologists and anthropologists).

However, if our seniors are living off campus in their last year, it seems like this could, at the very least, limit the educational potential and influence of the fourth-year students on the rest of the student community.  Based on the substantial proportion of seniors who indicated that they participated in fewer campus events than when they lived on campus, and taking into account our other data that clearly shows a high level of involvement among our students overall, I'd suggest that we might have set up a situation where we have maintained the educational opportunities that contribute to individual development among our seniors, but we may be missing out on some of the benefits to a residential educational community that our senior class might provide if they lived on campus.

There are lots of reasons to suggest that we should be cautious in drawing too many conclusions from this particular data point.  Many of our seniors may be busy with off-campus internships, graduate school applications, or other involvements that emerge as they begin to prepare for life after college.  They could also be hosting off-campus parties that have varied effects – both good and bad – on our campus community.  And given the long history of seniors living off campus, I'll bet that there is a certain set of beliefs or mythologies about one's senior year that is deeply embedded into the student culture.

Yet as we endeavor to create an integrated learning experience that is truly comprehensive and clearly distinctive in terms of preparing students for lives of financial independence, unintended discoveries, and a legacy of success, I hope we are willing to seriously consider all of the possible design elements that might make such an educational experience and environment possible.  And I hope that we are brave enough to keep a balance between the necessary elements of the culture we hope to foster and the developmental needs of our individual students.

Make it a good day,

Mark

In Search of the Mysterious Muddler

On several recent occasions I have heard it said that about 25% of our students aren’t involved in anything on campus.  I am always intrigued by the way that some assertions or beliefs evolve into facts on a college campus, and this number seemed ripe for investigating.   Researchers into human behavior have found this phenomenon repeatedly and suggest that, because we want to believe our own intuition to be true, we tend to perk up at data points or anecdotes that support our beliefs.  We’ve all fallen prey to this temptation at least once – at least I have.  So I thought it might be worth testing this claim just to see if it holds up under the glare of our actual survey data.

First – to be fair, this claim isn't totally crazy.  I can think of a particular data point that clearly nods in the direction of the 25% uninvolved claim.  For a few years, we've tracked the proportion of seniors who don't use their Augie Choice money, and – although the number is steadily declining – over the last few years an average of about 25% have foregone those funds.  Others have suggested that every year we have a group of somewhere between 600 and 800 students (henceforth called "the muddlers") who aren't involved in anything co-curricular: athletics, music groups, or student clubs and organizations.  More ominously, some have suggested that there is a sub-population of students who are only involved in Greek organizations and that these students help to create an environment that isn't conducive to our efforts to make Augustana a rigorous learning experience. (All of that is a wordy euphemism for "these lazy bums party too much.")

Although the question of what should count as true involvement is a legitimate one, the question of simple participation is an empirical question that we can test.  So we looked at two sets of data – our 2013 senior survey data and our 2013 freshman survey data – to see what proportion of students report not being involved in anything co-curricular. No athletics, no music, and no student clubs or organizations.  Then we added the question of Greek membership just to see if the aforementioned contingent of deadbeats really does exist in numbers large enough to foment demonstrable mayhem. (Another wordy euphemism for "be loud and break stuff.")

Well, I've got bad news for the muddlers.  Your numbers aren't looking so hot.  Of the students who graduated last spring, only 17 out of 495 said that they didn't participate in anything (athletics, music, student groups, or Greeks).  When we took the Greek question out of the equation we only gained a handful of additional students, ultimately finding that only about 5% (23/495) of our graduating seniors said that they didn't participate in athletics, music, or some student group.

But what about the freshmen?  After all, the seniors are the ones who have stayed for four years.  If involvement is the magic ingredient for retention that some think it is, then we should expect this proportion to be quite a bit bigger in the freshman class.

Alas, though our muddler group appears a little bigger in the first year, it sure doesn't approach the 25% narrative.  After eliminating freshmen who participated in athletics, music, a student group, or a Greek organization, we were left with only 15 out of 263 first-year students who responded to our survey.  When we left out Greek membership, we only gained 4 students, increasing the number to 19 out of 263 (7%).  Now it's fair to suggest that there is a limitation to this data in that we got responses from only about 45% of the freshman class.  However, even after calculating the confidence intervals (the "+/-") in order to generalize with 95% confidence to the entire freshman class, we still end up with a proportion of students not involved in anything co-curricular of somewhere between 4 and 9 percent.
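
For those who like to see the arithmetic, here is a rough sketch of that confidence interval calculation. The freshman class size of about 580 is a hypothetical figure implied by the roughly 45% response rate, and the exact bounds shift a bit depending on the method (normal approximation vs. Wilson) and the true class size.

```python
from math import sqrt

# 19 of 263 freshman respondents reported no co-curricular involvement.
x, n = 19, 263
N = 580   # hypothetical freshman class size implied by a ~45% response rate

p = x / n
se = sqrt(p * (1 - p) / n)
fpc = sqrt((N - n) / (N - 1))   # finite population correction
margin = 1.96 * se * fpc        # 95% margin of error

print(f"point estimate = {p:.1%}, 95% CI ~ ({p - margin:.1%}, {p + margin:.1%})")
# With these assumptions: roughly 5% to 10% -- in the same ballpark as the
# 4-to-9-percent range described above.
```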

There are two other possible considerations regarding the muddler mystery.  One possibility is that there are indeed more muddlers than we know about, because non-participants would also be more likely to skip the freshman survey.  On the other hand – as some of our faculty have observed – it's possible that our muddlers are also the students who study more seriously: just the kind of students faculty often dream of teaching.

My reason for writing this post is NOT to suggest that we don't have some students who need to be more involved in something outside of their classes.  We certainly have those students, and if it is almost 10% of our freshman class (as the upper bound of the confidence interval suggests), then we clearly have work to do.  Rather, it seems to me that this is another reason to think more carefully about the nature of involvement's impact on students.  Because it appears that the students who depart after the first year are not merely uninvolved recluses (again, the limitations of the sample require that I suggest caution in jumping to too certain a conclusion).  It seems to me that this evidence is another reason to think about involvement as a means to other outcomes that are central to our educational mission instead of an end in and of itself.

Make it a good day,

Mark