Look mom, it’s a blog!

Hi everybody,

Yes, it’s true.  What was once a simple column has now turned into a blog.

What difference will it make?  None.  This column will focus on the same topics that it has explored in the past.  Sometimes I’ll talk about an interesting finding from our student data, sometimes I’ll test a claim that has been made publicly, and sometimes I’ll muse about the various tensions that arise when one seriously commits to striving for perpetual improvement.

Yes, I’ll continue to be snarky from time to time.  But now, you can call me on it in the comments section and point out my flaws, my unsubstantiated leaps, or my bad grammar for all to see.  Of course, you can also throw me a bone every once in a while and tell me what you liked or what made you stop and think for a second or two.

Mostly, I hope you’ll add your perspective and make this blog a conversation dedicated to thinking about our work and making change for the better.

So here it is . . . Delicious Ambiguity.  Stay hungry, my friends.

Make it a good day,

Mark

Smile! It’s the end of the academic year (almost!)

At this point in the term, there isn’t a lot of time for deep, contemplative thought.  Instead, it strikes me that a good laugh is the best source of that little extra fuel to get through the last week of the academic year.  So I thought I’d supply a little higher ed humor.  Here are links to some of the best spoof news stories about higher education in the past couple of years.  If nothing else, they’ll give you one more way to procrastinate grading!

 

Bard College Named Nation’s No. 1 Dinner Party School

 

New College Graduates To Be Cryogenically Frozen Until Job Market Improves

 

Area Man First In His Family To Coast Through College

 

There are so many more, but time is of the essence.  See you next fall!

 

Make it a good day!

 

Mark

Does a double major learn more?

One of the arguments raised repeatedly throughout the calendar discussion was the importance we place on multiple majors.  While there were numerous rationales in support of double majors, one of them was that increased access to gaining a double major reflects our commitment to a fundamental principle of liberal arts education and the emphasis we place on becoming more well-rounded intellectually, culturally, and personally.

 

Although this argument sounds wonderful, I heard little data to support the core claim that a double major is somehow preferable to a single major or a major and a minor.  This might well be so in terms of employability and flexibility in an uncertain job market.  But do students who double major make larger gains on the educational outcomes of a liberal arts education than those who do not?  Does earning a double major somehow produce greater broad-based learning gains?

 

I examined the Wabash National Study data from the 2006 cohort, restricting my analysis to students at the eleven small liberal arts colleges in that cohort.  I didn’t investigate whether certain combinations of majors were more advantageous than others, primarily because I didn’t hear anyone seriously advocate for one combination over another.  (There does seem to be a second claim floating around that truly interdisciplinary double majors are somehow better than intra-disciplinary ones – an assertion we can test if this first analysis holds water.)

 

The table below shows nine educational and developmental outcomes of a liberal arts education and whether being a double major correlates with a larger gain between the first year and the fourth year.

 

Double Major Status Had No Impact: Critical Thinking, Moral Reasoning, Attitude toward Literacy, Civic Engagement, Academic Motivation, Leadership, Psychological Well Being

Double Major Status Had An Impact: Intellectual Curiosity, Intercultural Maturity

Based on these findings, it initially appears that double majoring provides some educational benefit, impacting two of the nine outcomes.  However, the size of the effect on intellectual curiosity and intercultural maturity is actually quite small.  Furthermore, in both cases the impact of being a double major vanishes once I introduce student experiences – diverse interactions (in the test of intercultural maturity) and integrative learning experiences (in the test of intellectual curiosity) – into the equations.
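For readers curious about the mechanics, the “vanishing effect” described above is a classic mediation pattern: an apparent effect of a status variable disappears once you control for the experience that actually drives the outcome.  Here is a toy simulation in Python – the variable names and effect sizes are invented for illustration, not taken from the Wabash data:

```python
import numpy as np

# Toy simulation (NOT the Wabash data): the apparent "double major effect"
# on an outcome gain disappears once a mediating experience is controlled for.
rng = np.random.default_rng(0)
n = 2000

double_major = rng.binomial(1, 0.3, n).astype(float)
# Hypothetical mediator: double majors happen to have more diverse interactions.
diverse_interaction = 0.8 * double_major + rng.normal(0, 1, n)
# The outcome gain is driven by the experience, not the credential itself.
gain = 0.5 * diverse_interaction + rng.normal(0, 1, n)

def ols(y, *predictors):
    """Least-squares fit with an intercept; returns the slope coefficients."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

naive = ols(gain, double_major)[0]                          # looks meaningful
adjusted = ols(gain, double_major, diverse_interaction)[0]  # shrinks toward zero

print(f"naive effect: {naive:.2f}, after controlling for experience: {adjusted:.2f}")
```

The naive regression credits the double major; the adjusted one correctly credits the experience.  That is the same logic behind adding diverse interactions and integrative learning experiences to the equations above.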

 

Based on this evidence, it’s hard to make the case that double majoring – by itself – is necessarily significantly beneficial in the context of learning outcomes.  Again, this doesn’t mean that it couldn’t be beneficial in the very important context of job acquisition.  But it appears that this cow’s sacred status may require a bit more scrutiny before we summarily celebrate our embrace of the double major.

 

Make it a good day!

 

Mark

From “what we have” to “what we do with it”

We probably all have a good example of a time when we decided to make a change – maybe drastic, maybe minimal – only to realize later the full ramifications of that change (“Yikes! Now I remember why I grew a beard.”).  This is the problem with change – our own little lives aren’t as discretely organized as we’d like to think, and there are always unintended consequences and surprise effects.

 

When Augustana decided to move from measuring itself based on the quality of what we have (incoming student profile, endowment, number of faculty, etc.) to assessing our effectiveness based on what we do (student learning and development, educational improvement and efficiency, etc.), I don’t think we fully realized the ramifications of this shift.  Although there are numerous ways in which this shift is impacting our work, I’d like to talk specifically about the implications of this shift in terms of institutional data collection and reporting.

 

First, let’s clarify two terms.  When I say “outcomes” I mean the learning that results from educating.  When I say “experiences” I mean the experiences that students have during the course of their college career.  They could be described simply by participation in a particular activity (e.g., being a philosophy major) or more ambiguously, as in the quality of a student’s interaction with faculty.  Either way, the idea is – and has always been – that student experiences should lead to gains on educational outcomes.

 

I remember an early meeting during my first few months at Augustana College where one senior administrator turned to me and said, “We need outcomes.  What have you got?”  At many institutions, the answer would be something like, “I’ll get back to you in four years,” because that is how long it takes to gather dependable data.  Surveying students at any given point only tells you where they are at that point – it doesn’t tell you how much they’ve changed as a result of our efforts.  Although we have some outcome data from several studies that we happened to join, we still have to gather outcome data on everything we need to measure – and that will take time.

 

But the other problem is one of design.  Ideally, you choose what you want to measure, and then you start measuring it.  In our case, although we have measured some outcomes, we don’t have measures on other outcomes that are equally important.  And there isn’t a strong organizing framework for what we have measured, what we have not, and why.  This is why we are having the conversation about identifying college-wide outcomes.  The results of that conversation will tell us exactly what to measure.

 

The second issue is in some ways almost more important for our own purposes.  We need to know what we should do to improve student learning – not just whether our students are learning (or not).  As we should know by now, learning doesn’t happen by magic.  There are specific experiences that accelerate learning, and certain experiences that grind it to a halt.  Once we’ve identified the outcomes that define Augustana, then we can track the experiences that precede them.  It is amazing how many times we have found that, despite the substantial amount of data we have on our students, the precise data on a specific experience is nowhere to be found because we never knew we were going to need it.  This is the primary reason for the changes I made in the senior survey this year.

 

This move from measuring what we have to assessing what we do is not a simple one and it doesn’t happen overnight.  And that is just the data collection side of the shop.  Just wait until I start talking about what we do with the data once we get it! (Cue evil laughter soundtrack!)

 

Make it a good day!

 

Mark

student learning as I see it

At a recent faculty forum, discussion of the curricular realignment proposal turned to the question of student learning.  As different people weighed in, it struck me that, even though many of us have been using the term “student learning” for years, some of us may have different concepts in mind.  So I thought it would be a good idea, since I think I say the phrase “student learning” at least once every hour, to explain what I mean and what I think most assessment folks mean when we say “student learning.”

 

Traditionally, “student learning” was a phrase that defined itself – it referred to what students learned.  However, the intent of college teaching was primarily to transmit content and disciplinary knowledge – the stuff that we normally think of when we think of an expert in a field or a Jeopardy champion.  So the measure of student learning was the amount of content that a student could regurgitate – both in the short term and the long term.

 

Fortunately or unfortunately, the world in which we live has completely changed since the era in which American colleges and universities hit their stride.  Today, every time you use your smart phone to get directions, look up a word, or find some other byte of arcane data, it becomes painfully clear that memorizing all of that information yourself would be sort of pointless and maybe even a little silly.  Today, the set of tools necessary to succeed in life and contribute to society goes far beyond the content itself.  Now, it’s what you can do with the content.  Can you negotiate circumstances to solve difficult problems?  Can you manage an organization in the midst of uncertainty?  Can you put together previously unrelated concepts to create totally new ideas?  Can you identify the weakness in an argument and how that weakness might be turned to your advantage?

 

It has become increasingly apparent that colleges and universities need to develop in their students the set of skills needed to answer “yes” to those questions.  So when people like me use the phrase “student learning,” we are referring to the development of the skill sets necessary to make magic out of content knowledge.  That has powerful implications for the way we envision a general education or major curriculum.  It also holds powerful implications for how we think about integrating traditional classroom and out-of-class experiences in order to firmly develop those skills in students.

 

I would encourage all of us to reflect on what we think we mean when we say “student learning.”  First, let’s make sure we are all referring to the same thing when we talk about it.  Second, let’s move away from emphasizing content acquisition as the primary reflection of our educational effectiveness.  Yes, content is necessary, but it’s no longer sufficient.  Yes, content is foundational to substantive student learning, but very few people look at a completed functioning house and say, “Wow, what an amazing foundation.”  I’m just sayin’ . . .

 

Make it a good day!

 

Mark

The law of diminishing returns

Welcome back from the short holiday weekend.  I hope you got your fill of celebratory dinner and dessert and, most importantly, put the rest of your work life away to spend quality time with family and friends.

 

A lot of the discussion in my office recently has been about data gathering through surveys.  After all, it’s nearing the end of the academic year and there are many who sincerely want to know if our students experienced Augustana College as we hoped, whether they learned what we intended them to learn, and if any one piece of the myriad of moving parts that make up a college experience has slipped in some way to require a readjustment.

 

In the process of administering one such survey – the Wabash National Study of Liberal Arts Education – we’ve seen an almost perfect example of the law of diminishing returns, which says that, all else being equal, each additional unit of effort yields a smaller return than the one before.  As many seniors are conducting senior inquiry projects that involve surveys, I thought it might be of some interest to share our experience gathering data for the Wabash National Study so far, talk about what it means for gathering survey data on campus, and propose some suggestions for folks planning to collect data in the future from students, faculty, staff, or alumni.

 

As you have likely seen in some format or another, I’ve been pumping the Wabash National Study to students, faculty, and staff over the last few months because of its potential to provide key guidance on a host of questions regarding our efforts to improve student learning.  We also were able to acquire $25 gift cards as rewards for those who participate in one of our data collection events.  I’ve listed below the participation rates for each of the four data collection dates.

 

Date of Data Collection    Number of Participants
Mon, March 12              78
Mon, March 26              35
Thurs, March 29            18
Mon, April 2               10

 

With only slight variation, participation dropped by half with each subsequent data collection date.  This occurred despite repeated promotion, coverage in the Observer, additional promotion solicited from faculty and staff, and a consistently healthy incentive for those who participated.
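The halving pattern is easy to verify from the counts themselves.  A quick back-of-the-envelope check in Python, using the participation numbers above:

```python
# Participation counts from the four data collection dates, in order.
counts = [78, 35, 18, 10]

# Ratio of each session's turnout to the previous session's turnout.
ratios = [b / a for a, b in zip(counts, counts[1:])]

# Overall per-session retention rate: the geometric mean of those ratios,
# i.e. the constant factor that would take 78 down to 10 in three steps.
overall = (counts[-1] / counts[0]) ** (1 / (len(counts) - 1))

print([round(r, 2) for r in ratios], round(overall, 2))  # retention near one half
```

Each ratio lands between 0.4 and 0.6, and the overall per-session retention comes out almost exactly one half – a textbook geometric decay.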

 

It’s one thing to hear cautionary tales about this pattern – it’s another to see it so clearly play out right in front of you.  In our case, we are going to continue to host several more data collections during the month of April, but will shift from holding them at night to holding them in the middle of the morning during the convocation time.  I hope you’ll help promote these events to your seniors as you see them announced.

 

So I would strongly encourage those of you who are gathering data yourselves or guiding students in their senior inquiry projects: come up with multiple ways to gather your data, and expect that no matter what you do, participation will slip as you continue to promote your survey.  This means you really have one shot to get it right, and everything you can do to incentivize initial participation is worth the effort in the long run.

 

Make it a good day!

 

Mark

Moving from satisfaction to experiences – a new senior survey

One of the “exciting” parts of my job is building surveys.  I’ve worked with many of you over the past two years to construct new surveys to answer all sorts of questions.  On the one hand, it’s a pretty interesting challenge to navigate all of the issues inherent in designing what amounts to a real life “research study.”  At the same time, it can be an exhausting project because there are so many things you just can’t be sure of until you field test the survey a few times and find all of the unanticipated flaws.  But in the end, if we get good data from the new survey and learn things we didn’t know before that help us do what we do just a little bit better, it’s a pretty satisfying feeling.

As many of you already know, Augustana College has been engaged in a major change over the last several years in terms of how we assess ourselves.  Instead of determining our quality as an institution based on what we have (student incoming profile, endowment amount, etc.), we are trying to shift to determining our quality based on what we do with what we have.  Amazingly, this places us in a very different position than many higher education institutions.  Unfortunately, it also means that there aren’t many examples on which we might model our efforts.

One of the implications of this shift involves the set of institutional data points we collect.  Although many of the numbers we have traditionally gathered continue to be important, the measure of ourselves that we are hoping to capture is what we do with those traditional numbers.  And while we have long maintained pretty robust ways of obtaining the numbers you would see in our traditional dashboard, our mechanisms for gathering data that would help us assess what we do with what we have are not yet robust enough.

So over the last few months, I have been working with the Assessment for Improvement Committee and my student assistants to build a new senior survey.  While the older version had served its purpose well over more than a decade, it was ready for an update, if not an overhaul.

The first thing we’ve done is move from a survey of satisfaction to a survey of experiences.  Satisfaction can sometimes give you a vague sense of customer happiness, but it often falls flat in trying to figure out how to make a change – not to mention the fact that good educating can produce customer dissatisfaction if that customer had unrealistic expectations or didn’t participate in their half of the educational relationship.

The second thing we’ve done is build the senior survey around the educational and developmental outcomes of the entire college.  If our goal is to develop students holistically, then our inquiry needs to be comprehensive.

Finally, the third thing we’ve done is “walk back” our thinking from the outcomes of various aspects of the college to the way that students would experience our efforts to produce those outcomes.  So, for example, if the outcome is intercultural competence, then the question we ask is how often students had serious conversations with people who differed from them in race/ethnicity, culture, social values, or political beliefs.  We know this is a good question to ask because a host of previous research shows that the degree to which students engage in these experiences predicts their growth in intercultural competence.
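In code terms, this “walk back” amounts to a mapping from each college-wide outcome to the experience items known to predict it.  A minimal sketch – the outcome names and question wording below are illustrative placeholders, not the actual survey text:

```python
# Hypothetical sketch of the outcome-to-experience design. Each key is a
# college-wide outcome; each value lists survey items about the experiences
# that prior research links to growth on that outcome. Wording is invented.
outcome_to_experience_items = {
    "intercultural competence": [
        "How often did you have serious conversations with people who differed "
        "from you in race/ethnicity, culture, social values, or political beliefs?",
    ],
    "critical thinking": [
        "How often did your coursework require you to weigh competing claims "
        "and judge the strength of the evidence behind them?",
    ],
}

# Each survey section can then be generated from the mapping.
for outcome, items in outcome_to_experience_items.items():
    print(f"{outcome}: {len(items)} item(s)")
```

The design choice is that every question earns its place by pointing back at an outcome, rather than at satisfaction in the abstract.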

If you want to see the new senior survey, please don’t hesitate to ask.  I am always interested in your feedback.  In the meantime . . .

 

Make it a good day!

 

Mark

Sorry – I’m busy collecting data!

This is a potentially massive week for Augustana College.  We are hosting two data collections for the Wabash National Study.  The first one is tonight – Monday, March 26th – from 6-8 PM in Olin Auditorium.  The second is Thursday, March 29th – from 6-8 in Hanson Science 102.

 

PLEASE PLEASE PLEASE go out of your way to encourage any senior you know to come to one of those two dates.  We still have about 300 $25 gift cards to the Augie bookstore to give away to the seniors who show up.

 

Frankly, I’ve got nothing else to say at the moment.  Put more honestly, I’ve got no time to write anything right now – I’m doing everything I can to increase our participation rates so that all of you will have data we can use over the next several years.

 

Yes, I really am “all about you.”

 

Make it a great day – make it a Wabash National Study day!

 

Mark

Look what happens when you use your data to improve!

Even though I know you have plenty of things to think and fret about these days, with the start of a new term and the little matter of a proposed calendar and curriculum revision, I hope you are enjoying the weather and finding ways to keep your students motivated despite it!

 

With that said, I hope you’ve also had a chance to look through your IDEA course reports from the winter term and your packets of student forms.  Although many of you have attended one of the “interpreting the IDEA reports” sessions over the last year or so, I know that some of you continue to have questions.  I’m glad to sit down with you any time and answer any questions you might have.

 

I would like to share some of my observations after seeing almost every report over the last two terms.  My hope is that these observations are helpful, not only as you think about using your reports to inform your course design for future terms, but also in considering whether the switch to the IDEA Center process has helped Augustana College improve our teaching and student learning.

 

First, it appears to me as if the average PRO score (Progress on Relevant Objectives) went up between fall and winter terms.  There are a number of potential explanations for this – the types of courses offered, student acclimation to college (within the year as well as for first-year students), and general attrition of those least able to succeed at Augustana.  But it struck me that there are also some reasons why we might expect learning (as represented by the PRO score) to decrease in the winter term – most notably the big break in the middle of the term and its impact on students’ motivation to restart the academic engine or remember what they had learned prior to the holiday break.  So I don’t think it’s out of bounds to suggest that the increase in the overall PRO score is worth noting.

 

Second, it appears that many faculty members reduced the number of learning objectives they selected for their individual courses.  I would argue that this is probably a good thing in the vast majority of cases, because I interpret the number of objectives selected as an indication of focus rather than an indication of learning.  In other words, as I’ve noted to some of you, in many cases your students reported substantial learning on objectives that you did not select.  In fact, it wasn’t uncommon at all to find that faculty who selected fewer objectives could have selected additional objectives and the PRO score would have remained the same or even gone up.  The choice to select fewer objectives and focus on them set the conditions for the “spill over” learning that was then evident on your reports.

 

Conversely, for faculty who initially selected many objectives, the reports suggested that the diffusion effect I have mentioned repeatedly held true more often than not.  Folks who selected many objectives often found that, although some of those objectives played out as intended, there were enough objectives on which students reported lower average learning that the average PRO score suffered as a result.  The drop in the average number of objectives selected suggests to me that more faculty have engaged in exactly the kind of purposeful thinking about course design and course outcomes that the adoption of this instrument was intended to produce.  Some of you might argue that this is only evidence of “gaming the system.”  I would argue that if “gaming the system” sets better conditions for learning, then you can call it “manipulating,” “negotiating,” or “peppermint bon bon” for all I care.
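The arithmetic behind the diffusion effect is just averaging.  Here is a toy example – the numbers are invented, and the actual IDEA PRO score involves converted and weighted ratings, so treat this purely as a sketch of the averaging logic:

```python
# Invented student-reported progress ratings (1-5 scale) for one course.
# The real IDEA PRO computation uses converted/weighted scores; this sketch
# only illustrates why selecting extra, weaker objectives drags the average down.
progress = {"core_1": 4.3, "core_2": 4.1, "peripheral_1": 3.2, "peripheral_2": 3.0}

focused = ["core_1", "core_2"]   # instructor selects only the core objectives
broad = list(progress)           # instructor selects everything

def pro(selected):
    """Simple average of reported progress on the selected objectives."""
    return sum(progress[o] for o in selected) / len(selected)

print(round(pro(focused), 2), round(pro(broad), 2))
```

The focused selection averages the strong ratings; the broad selection dilutes them with the weaker peripheral objectives, so the score drops even though the students learned exactly the same amount.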

 

With all of the uncertainty and ambiguity that goes with the work that we do – especially when it comes to trying to make decisions about the future of Augustana College – I think it is useful to look at a decision the faculty made last year and assess its impact.  In the case of the decision to switch to the IDEA Center system, I think that there is preliminary evidence to suggest that this switch is helping us improve the conditions for optimal student learning.  Whether or not it actually directly impacts student learning – I think that is a question for another Delicious Ambiguity Column that I will write more than a few years from now.

 

Make it a great day,

 

Mark

What do we know from our prior Wabash National Study data?

I am going to cut to the chase here – tonight is the first opportunity for seniors to participate in the final phase of data collection for the Wabash National Study of Liberal Arts Education.  It all starts at 6 PM in Hanson 102.  Please encourage your senior students to participate.  And remember – tell them that the first 400 participants get a $25 gift card to the Augie bookstore.

 

Instead of telling you why I think the Wabash National Study might be so valuable to Augustana College, I thought I’d show you.  Over the course of this year, I’ve written 21 columns, almost all of them trying to help us think about ways we can use our institutional data to improve what we do.  Nine of those columns examine data that is part of the Wabash National Study.  Just in case you’ve forgotten, I’ve listed them below and provided links to the full columns.

 

 

And these columns are only a minuscule sampling of the kinds of questions that could be answered using this dataset.  Moreover, if we can get enough seniors to participate, we could answer these same questions – and many others – within the context of each major.  This is the kind of data that would be gold for anyone thinking about how to make their major experience the best it can possibly be.

 

I hope this demonstrates a little bit of why I hope you will help promote this study and encourage your students to participate.  If you have any questions about it, please don’t hesitate to email me.

 

Make it a great day,

 

Mark