The law of diminishing returns

Welcome back from the short holiday weekend.  I hope you got your fill of celebratory dinner and dessert and, most importantly, put the rest of your work life away to spend quality time with family and friends.

 

A lot of the discussion in my office recently has been about data gathering through surveys.  After all, it’s nearing the end of the academic year and there are many who sincerely want to know if our students experienced Augustana College as we hoped, whether they learned what we intended them to learn, and whether any one of the myriad moving parts that make up a college experience has slipped in some way that requires readjustment.

 

In the process of administering one such survey – the Wabash National Study of Liberal Arts Education – we’ve seen an almost perfect example of the law of diminishing returns, which says that, all else being equal, each additional unit of effort yields a smaller return than the one before.  As many seniors are conducting senior inquiry projects that involve surveys, I thought it might be of some interest to share our experience gathering data for the Wabash National Study so far, talk about what it means for gathering survey data on campus, and propose some suggestions for folks planning to collect data from students, faculty, staff, or alumni in the future.

 

As you have likely seen in some format or another, I’ve been pumping the Wabash National Study to students, faculty, and staff over the last few months because of its potential to provide key guidance on a host of questions regarding our efforts to improve student learning.  We also were able to acquire $25 gift cards as rewards for those who participate in one of our data collection events.  I’ve listed below the participation rates for each of the four data collection dates.

 

Date of Data Collection    Number of Participants
Mon, March 12              78
Mon, March 26              35
Thurs, March 29            18
Mon, April 2               10

 

With only slight variation, participation dropped by half with each subsequent data collection date.  This occurred despite repeated promotion, coverage in the Observer, additional encouragement solicited from faculty and staff, and a consistently healthy incentive for those who participated.
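The halving claim can be checked directly from the counts above – a quick sketch:

```python
# Participation counts from the four Wabash data collection dates
counts = [78, 35, 18, 10]

# Ratio of each date's turnout to the previous date's turnout
ratios = [b / a for a, b in zip(counts, counts[1:])]
print([round(r, 2) for r in ratios])  # each ratio hovers near 0.5
```

Every successive ratio falls between 0.45 and 0.56 – remarkably close to an exact halving each time.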

 

It’s one thing to hear cautionary tales about this pattern – it’s another to see it so clearly play out right in front of you.  In our case, we are going to continue to host several more data collections during the month of April, but will shift from holding them at night to holding them in the middle of the morning during the convocation time.  I hope you’ll help promote these events to your seniors as you see them announced.

 

So I would strongly encourage those of you who are gathering data yourselves or guiding students in their senior inquiry projects: come up with multiple ways to gather your data, and expect that no matter what you do, participation will slip as you continue to promote your survey.  This means that you really have one shot to get it right, and everything you can do to incentivize initial participation is worth the effort in the long run.

 

Make it a good day!

 

Mark

Sorry – I’m busy collecting data!

This is a potentially massive week for Augustana College.  We are hosting two data collections for the Wabash National Study.  The first one is tonight – Monday, March 26th – from 6-8 PM in Olin Auditorium.  The second is Thursday, March 29th – from 6-8 PM in Hanson Science 102.

 

PLEASE PLEASE PLEASE go out of your way to encourage any senior you know to come to one of those two dates.  We still have about 300 of the $25 gift cards to the Augie bookstore to give away to the seniors who show up.

 

Frankly, I’ve got nothing else to say at the moment.  Put more honestly, I’ve got no time to write anything right now – I’m doing everything I can to increase our participation rates so that all of you have data we can use over the next several years.

 

Yes, I really am “all about you.”

 

Make it a great day – Make it a Wabash National Study day!

 

Mark

Look what happens when you use your data to improve!

Even though I know you have plenty of things to think and fret about these days, what with the start of a new term and the little matter of a proposed calendar and curriculum revision, I hope you are enjoying the weather and finding ways to keep your students motivated despite it!

 

With that said, I hope you’ve also had a chance to look through your IDEA course reports from the winter term and your packets of student forms.  Although many of you have attended one of the “interpreting the IDEA reports” sessions over the last year or so, I know that some of you continue to have questions.  I’m glad to sit down with you any time and answer any questions you might have.

 

I would like to share some of my observations after seeing almost every report over the last two terms.  My hope is that these observations are helpful, not only as you might be thinking about using your reports to inform your course design for future terms, but also in considering whether or not the switch to the IDEA Center process has been helpful for Augustana College in helping us to improve our teaching and student learning.

 

First, it appears to me as if the average PRO score (Progress on Relevant Objectives) went up between fall and winter terms.  There are a number of potential explanations for this – the types of courses offered, student acclimation to college (within the year as well as for first-year students), and general attrition among those least able to succeed at Augustana.  But it struck me that there are also some reasons why we might expect learning (as represented by the PRO score) to decrease in the winter term – most notably the big break in the middle of the term and its impact on students’ motivation to restart the academic engine or remember what they had learned prior to the holiday break.  So I don’t think it’s completely out of bounds to suggest that the increase in the overall PRO score is worth noting.

 

Second, it appears that many faculty members reduced the number of learning outcomes they selected for their individual courses.  I would argue that this is probably a good thing in the vast majority of cases.  First, I interpret the number of objectives selected as an indication of focus rather than an indication of learning.  In other words, as I’ve noted to some of you, in many cases your students reported substantial learning on objectives that you did not select.  In fact, it wasn’t uncommon at all to find that faculty who selected fewer objectives could have selected additional objectives and the PRO score would have remained the same or even gone up.  The choice to select fewer objectives and focus on them set the conditions for the “spill-over” learning that was then evident on your reports.

 

Conversely, for faculty who initially selected many outcomes, the results suggested that the diffusion effect I have mentioned repeatedly held true more often than not.  Folks who initially selected many objectives often found that, although some of the objectives played out as they had intended, there were enough objectives on which students reported lower average learning that the average PRO score suffered as a result.  In my mind, the drop in the average number of objectives selected suggests that more faculty have engaged in exactly the kind of purposeful thinking about course design and course outcomes that the adoption of this instrument was intended to produce.  Some of you might argue that this is only evidence of “gaming the system.”  I would argue that if “gaming the system” sets better conditions for learning, then you can call it “manipulating,” “negotiating,” or “peppermint bon bon” for all I care.

 

With all of the uncertainty and ambiguity that goes with the work that we do – especially when it comes to trying to make decisions about the future of Augustana College – I think it is useful to look at a decision the faculty made last year and assess its impact.  In the case of the decision to switch to the IDEA Center system, I think that there is preliminary evidence to suggest that this switch is helping us improve the conditions for optimal student learning.  Whether or not it actually directly impacts student learning – I think that is a question for another Delicious Ambiguity Column that I will write more than a few years from now.

 

Make it a great day,

 

Mark

What do we know from our prior Wabash National Study data?

I am going to cut to the chase here – tonight is the first opportunity for senior students to participate in the final phase of data collection for the Wabash National Study of Liberal Arts Education.  It all starts at 6 PM in Hanson 102.  Please encourage your senior students to participate.  And remember – tell them that the first 400 participants get a $25 gift card to the Augie bookstore.

 

Instead of telling you why I think the Wabash National Study might be so valuable to Augustana College, I thought I’d show you.  Over the course of this year, I’ve written 21 columns, almost all of them trying to help us think about ways we can use our institutional data to improve what we do.  Nine of these columns examine data that is part of the Wabash National Study.  Just in case you’ve forgotten, I’ve listed them below and provided links to the full columns.

 

 

And these columns are only a minuscule sampling of the kinds of questions that could be answered using this dataset.  Moreover, if we can get enough seniors to participate, we could answer these same questions – and many others – within the context of each major.  This is the kind of data that would be gold for anyone thinking about how to make their major experience the best it possibly can be.

 

I hope this demonstrates a little bit of why I hope you will help promote this study and encourage your students to participate.  If you have any questions about it, please don’t hesitate to email me.

 

Make it a great day,

 

Mark

Teaching, learning, and sleep

It’s that time of the term again – lots to do and not nearly enough time to do it.  Especially for students, at this time of the term the amount of time needed to meet academic and co-curricular obligations thunders past critical mass like a semi-truck blowing by a hitchhiker. Pretty soon basic health and hygiene behaviors get pushed to the side and our kids are riding a rollercoaster of super-sized energy drinks, junk food, and far too little sleep.

 

One of the outcomes that the Wabash National Study allows us to track is health behaviors.  This set of variables includes measures of exercising, binge drinking, smoking, and sleep deprivation.  Since the end of the term is often a time when students look like they are groggily stumbling toward the finish line, I thought we’d examine students’ reports of sleep deprivation over the first year and see if anything faculty and staff do might impact it one way or the other.

 

Sleep issues are deceptively complicated because there are lots of reasons why someone might not get enough sleep.  It might be too much homework all at once.  Or it might be stress about something completely unrelated to school.  Since we don’t have the breadth of variables in the Wabash data to get at all of the potentially influential stress related issues, I tried to focus this analysis on the factors that might shape students’ allocation of time and thus influence the frequency of feeling sleep-deprived.

 

First of all, we found that the average number of times per week that students felt sleep-deprived increased from the beginning to the end of the first year – an increase that proved to be statistically significant.  Now by itself, that isn’t much of a surprise – and many of you might say that this is as it should be.  So the next question is:  What factors are uniquely influencing this change?

 

(I’m glad to send you the full list of variables we examined and the output file if anyone is interested – Regression Modeling Geeks Unite!)

 

After accounting for basic demographic characteristics and pre-college behaviors, we found that both the number of hours students reported studying per week and the number of hours students spent in co-curricular activities predicted an increase in sleep deprivation.  However, after adding Greek membership into the mix, the impact of co-curricular involvement evaporated and was replaced by a similarly sized impact of Greek affiliation.
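That “evaporating” effect is a textbook confounding pattern: when two predictors are correlated, the one added later can absorb the other’s apparent influence.  Here is a minimal simulated sketch of the phenomenon – all variable names, effect sizes, and data below are hypothetical illustrations, not our actual Wabash data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical world: Greek members log more co-curricular hours, and
# Greek membership (not co-curricular time itself) drives sleep deprivation.
greek = rng.binomial(1, 0.3, n).astype(float)
cocurricular = 5 + 4 * greek + rng.normal(0, 2, n)
study = rng.normal(15, 4, n)
sleep_dep = 0.05 * study + 0.8 * greek + rng.normal(0, 1, n)

def ols(y, *xs):
    """Least-squares fit; returns coefficients for intercept + predictors."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Model 1: without Greek membership, co-curricular hours look influential.
b1 = ols(sleep_dep, study, cocurricular)
# Model 2: adding Greek membership absorbs the co-curricular "effect."
b2 = ols(sleep_dep, study, cocurricular, greek)

print(f"co-curricular coefficient without greek: {b1[2]:.3f}")
print(f"co-curricular coefficient with greek:    {b2[2]:.3f}")
print(f"greek coefficient:                       {b2[3]:.3f}")
```

In the simulation, the co-curricular coefficient is clearly positive until the confounder enters the model, at which point it collapses toward zero – exactly the shape of the pattern we observed.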

 

While that finding is interesting in its own right, I wanted to know more.  Is there anything about the way that we interact with students that might also impact this increase in sleep deprivation?  Interestingly, we found evidence that faculty teaching behaviors might mitigate this apparent increase.  As our students’ reports of experiencing instructional organization and clarity increased, the increase in sleep deprivation during the freshman year was REDUCED.  In other words, the degree to which students report faculty are clear and organized in teaching their courses appears to influence healthier sleeping behaviors in our students.  Moreover, I tested this analysis with the full Wabash data set (about 3000 students from 19 schools) and again, the impact of instructional clarity and organization was significant in reducing the increase in sleep deprivation over the first year.

 

I’m not sure I’m ready to suggest a direct causal relationship – but I think it’s worth considering the legitimate possibility that the way we teach and organize our courses might indeed play an important role in fostering a positive learning environment beyond the academic sphere.

 

zzzzzzzzz . . . (make it a good day . . . shhhh),

 

Mark

The educational benefits of reflection

If there were a magic potion that turned glum, unkempt, “I dare you to learn me some teachin’” students into captivated, self-directed, and perpetually inquisitive knowledge hounds, we’d all want to know about it, right?  Of course, student development doesn’t quite work that way.  And yet, there are specific pedagogical exercises that seem to be pretty influential in our first-year students’ growth – for those students lucky enough to encounter them.

 

One such exercise is reflective learning.  Although we often think of reflection as something that might be found in a journal assignment (or a mirror), it can happen in lots of settings and formats.  And while some criticize reflection as little more than rationalized navel gazing, there is plenty of evidence to suggest that reflection – when facilitated well – can be a powerful learning tool.  So I decided to see if reflective learning had any impact on the educational development of our first-year students who participated in the Wabash National Study in 2008.  After all, since many of the “high-impact experiences” we often talk about (e.g., study abroad, internships) are rarely accessible to freshmen, we need to know the kinds of learning experiences that can make the first year of college more than “just a year of waiting to get to the good stuff.”

 

The Wabash National Study accounted for reflective learning by combining three questions.  They asked, “During the current school year,

 

  • how often did you examine the strengths and weaknesses of your own views on a topic or issue?”
  • how often did you try to better understand someone else’s views by imagining how an issue looks from his or her perspective?”
  • how often did you learn something that changed the way you understand an issue or concept?”

 

Available responses included 1=never, 2=sometimes, 3=often, and 4=very often.

 

It turns out that the frequency of reflective learning reported by students at the end of their first year significantly influenced increases in Attitudes toward Literacy, Intellectual Curiosity, Intercultural Competence, Psychological Well Being, Socially Responsible Leadership, and Civic Engagement.  These increases continued to be true even after accounting for differences in incoming ACT score, sex, gender, socio-economic status, instructional clarity and organization, integrative learning, and higher order thinking.

 

This finding is even more interesting because the average scores on each of these outcomes didn’t change during the first year.  In other words, while there were enough students who either regressed, increased, or stayed the same on each of these outcomes to keep the overall averages static, the students who made gains on these outcomes seem to have (at least) one thing in common – increased reflective learning experiences.

Coincidentally (ok, not really), on Wednesday of this week (1/25) at 4 PM, Kristin Douglas, Rebecca Cook, and Stephanie Fuhr will host a presentation in the Treadway Library about the ways that Biology has successfully infused reflection into the major.  They’ll talk about the challenges and successes they have seen and hopefully give you some ideas of ways that reflective learning might work in your course or major.  In addition, Ryan White, Director of the Center for Vocational Reflection, is offering a one-time stipend to help faculty integrate reflection into their courses.  If I weren’t on an airplane on Wednesday, I’d be there.

 

I hope you’ll attend and consider finding ways to infuse this “magic potion” into your teaching.  Maybe it’s not really an instant elixir – think of it as a time-release capsule.

 

 

Make it a good day,

 

Mark

Graduating our lower income students

We knew it was coming – despite hoping against hope that we might have avoided winter this year. But even as some of us were shoveling out and bundling up, the warm couple of days last week had already turned my thoughts to spring and all that comes with the end of the academic year.  Of course, this inevitably brings up the topic of graduation – a primary measure of our success as an institution.

 

For many years, colleges have tracked graduation rates – the proportion of students from a given incoming cohort that actually graduate from that college.  Although the national conversation about graduation rates generally references 6-year rates, for most private liberal arts colleges the 4-year graduation rate matters most because 1) the curriculum is explicitly set up to graduate students in four years, and 2) the cost of tuition at private colleges makes finishing in four years particularly preferable to students and their families.  In more recent years, many institutions have figured out that the overall graduation rate isn’t really as important as the graduation rates of student subgroups that are more likely to struggle and/or withdraw from college.

 

As the cost of higher education has increased, many have worried about the effect of this trend on college access for students from lower socio-economic status backgrounds.  But another question is also important – for the students from lower socio-economic backgrounds who acquire access to higher education, do they graduate at the same rate as students from higher socio-economic backgrounds?

 

Although the answer is probably a complicated one, we are able to examine graduation rates across federal financial aid categories and find out if there are systematic differences for students entering Augustana College.  Although socio-economic status (SES) is a complex issue, federal financial aid can roughly approximate three categories of students.  The most privileged students would be those who don’t qualify for any federal financial aid.  The students with some need qualify for a subsidized Stafford Loan, but no grant aid.  And the students for whom paying for college is the biggest challenge qualify for a Pell Grant.  Based on these categories, we can test the graduation rates for each group.

 

The most recent cohort of students to finish four years at Augustana entered in the fall of 2007.  The 4-year graduation rate for these students across these three SES groups is portrayed below.

 

Students with neither Stafford nor Pell    79%
Students receiving a Stafford Loan         76%
Students receiving a Pell Grant            62%
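For anyone wanting to test whether a gap like the one between the Pell group and the no-aid group is larger than chance would produce, a two-proportion z-test is the standard tool.  A minimal sketch – note that the group sizes below are hypothetical placeholders, not our actual cohort counts:

```python
from math import sqrt

def two_prop_ztest(p1, n1, p2, n2):
    """Two-proportion z-statistic using the pooled success rate."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 79% vs. 62% graduation rates; group sizes are hypothetical
z = two_prop_ztest(0.79, 300, 0.62, 100)
print(f"z = {z:.2f}")  # well beyond the 1.96 threshold for p < .05
```

Even with these modest assumed group sizes, a 17-point gap clears conventional significance thresholds comfortably.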

 

Clearly, something is going on for the students who received a Pell Grant that differs from those who did not.  But what?  Maybe they initially thought they could cobble together the money to come to Augustana, but then found out they just couldn’t make it work.  Maybe they decided they weren’t getting enough out of the Augustana experience to merit the costs – especially in the context of their financial situation and the economic collapse of 2008.  Or maybe the issue wasn’t so much about money as it was about a sense of belonging on a campus where a much larger proportion of students don’t come from such economically disadvantaged backgrounds.  Or maybe it was a combination of factors.

 

I don’t begin to know the answers to these questions.  But I think this data suggests that we had better find out.

 

Make it a good day,

 

Mark

Building on our advising success

A week or so ago, I was talking with one faculty member about the information that we now receive as a result of the IDEA student ratings of instruction reports.  During that conversation, our focus kept drifting toward the recommendations for improvement.  Although this wasn’t necessarily a bad thing, we began to notice together how evidence of success can be just as important.  Not only can it help confirm that our efforts are bearing some fruit, but it can also remind us to continue to “play to our strengths.”

With this in mind, I’d like to highlight some findings about our students’ experience with advising that I believe add to our rationale for considering ways in which we might further improve our students’ advising experience.

Every year we ask our graduating seniors about their satisfaction with advising overall and with advising in the primary major on a scale of 1 to 5.  Below are the average scores from last spring (2011).

 

                    Average Score    Standard Deviation
Overall Advising    4.102            1.176
Major Advising      3.701            1.198

 

It turns out that there are two interesting tidbits in this data.  First, the difference between satisfaction with overall advising and major advising is statistically significant – meaning that the difference between the two average scores is not attributable to chance.  Second, the difference in the standard deviations (roughly, the average gap between each student’s response and the overall average response) suggests that there is more variability of experience in major advising than in overall advising.
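For those curious how a significance claim like this gets checked, here is a sketch of Welch’s t-test computed from summary statistics alone.  The means and standard deviations come from the table above; the sample sizes are hypothetical placeholders, not our actual number of senior respondents:

```python
from math import sqrt

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t-statistic from two groups' summary statistics."""
    se = sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (m1 - m2) / se

# Overall advising vs. major advising; n = 400 per group is hypothetical
t = welch_t(4.102, 1.176, 400, 3.701, 1.198, 400)
print(f"t = {t:.2f}")
```

With a few hundred respondents per question, a 0.4-point gap on a 5-point scale is far too large to write off as noise.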

At this point you might be thinking, “Mark, that is a strange interpretation of playing to our strengths!”  To which I say – hold on for just a second.  Remember that our response scale of 1–5 defines “4” as satisfied . . . which means that, on average, both groups are relatively satisfied.  And if you compare these numbers to our NSSE data on advising, it turns out that our students’ responses are well above the NSSE average – both for freshmen and for seniors.

In the context of these two data points, I am most interested in asking whether we might have other data suggesting a relative strength in advising that we could expand upon to both improve our students’ average major advising score AND tighten the variability across that experience.

I think we might have just such a data point in another section of the NSSE survey.  Students are asked earlier in the survey how often they talk about career plans with a faculty member or advisor (1=never, 2=sometimes, 3=often, 4=very often).

 

             Augustana    Comparable Liberal Arts Colleges
Freshmen     2.42***      2.23
Seniors      2.92***      2.63

 

It appears that our faculty and staff advisors are already talking with advisees about their career plans substantially more often than advisors and faculty at comparable liberal arts colleges.  Since we know from our self-study of advising that this effort makes a substantial difference in the degree to which our students feel certain about their post-graduate plans, it appears to me that this is something we are doing very well and could build upon to strengthen our students’ advising experience in the major.

Have a wonderful week and a great holiday break.

Make it a good day.

Mark

One way to look at our students’ spiritual development

Last week Eboo Patel, founder of Interfaith Youth Core, spoke to many of us at either convocation or in a series of other meetings about the importance of embracing an inclusive tradition of faith – no matter the faith tradition we each might choose to follow.  His comments and questions spurred some intriguing conversation that got me wondering about the degree to which our students develop a more nuanced notion of their own spirituality during their time at Augustana so that they might be aware enough to make such a choice within their own faith tradition.

Before examining our data to see what we might have that begins to address this question, we have to accept a nagging ambiguity (and this time, it’s not all that delicious).  The term “spirituality” isn’t so easily defined.  Instead, it tends to mean different things to different people.  For some, it’s inextricably tied to religious faith, maybe even dogma.  For others, it simply applies to an acceptance of things beyond our current understanding.  For most, it’s somewhere in between.

This makes the life of a number cruncher a little messy.  On the one hand, it turns out that we have two interesting data points on this question of spirituality.  The NSSE survey asks students:

1)     During the current school year, about how often have you participated in activities to enhance your spirituality (worship, meditation, prayer, etc.)?  The response options are 1=never, 2=sometimes, 3=often, 4=very often

 

Freshmen – 2.01; Seniors – 2.01

(both responses are significantly lower than comparable liberal arts colleges)

 

2)     To what extent has your experience at this institution helped you develop a deepened sense of spirituality?

The response options are 1=very little, 2=some, 3=quite a bit, 4=very much

 

Freshmen – 2.27; Seniors – 1.99

(both responses are significantly lower than comparable liberal arts colleges)

 

On the other hand, both questions focus on the word “spirituality,” suggesting that student responses could differ based upon their conceptualization of this term.  Nonetheless, while we might not have a precise finding from the perspective of a social scientist, we definitely have something that – from the perspective of creating optimal learning conditions and assessing student growth – begs for further inquiry.

The responses to these two questions are quite interesting to me.  In my mind they triangulate with the larger narrative we hold that sees our students as strivers.  They tend to be focused on getting a job or getting into graduate school and involving themselves in every possible activity that will help them achieve their goal.  Yet, this breakneck pace can all too often occur at the expense of our responsibility as educators to develop the whole person.

If we want our students to embrace an inclusive perspective on their own faith tradition – be it Christian, Muslim, Jewish, Hindu, Buddhist, Mormon, or Humanist – while also embracing a commitment to social justice (rather than an apathetic descent into rationalized relativism), then I suggest that we would do well to dig deeper into the following three questions.

  • Why are our students relatively less engaged in their own spiritual development than students at comparable liberal arts colleges (however students choose to define spirituality)?
  • Why do our students think that their experience at Augustana has contributed relatively less to the development of their own sense of spirituality than students at comparable liberal arts colleges?
  • Are the answers to these first two questions related?

 

Make it a good day.

Mark

One course just won’t do it

From time to time, Augustana lets me out of my little cave so that I can attend a conference related to higher education research or assessment of student learning outcomes.   A few weeks ago, a paper was presented at the Association for the Study of Higher Education (ASHE) that I found fascinating and particularly germane to many of the conversations we have at Augustana about the effects of particular curricular emphases on broader student learning outcomes.

This particular paper examined the influence of required diversity courses on students’ inclination toward civic engagement.  At many institutions the general education curriculum is organized around a series of categories from which students choose one or two courses to meet the institution’s requirements.  This paper hypothesized that perhaps one course on diversity issues was not enough to produce substantive, lasting learning.  The authors examined data from about 500 students, gathered at the beginning of the first year and at the end of the fourth year.  The authors also had access to student transcripts, which allowed them to identify which courses the students took to fulfill their general education requirements.

Students in this study had two options in fulfilling the diversity requirement.  They could take a domestic diversity course or a global diversity course.  In some cases, students took both – especially since some courses within the diversity category also fulfilled other requirements necessary for graduation.  Thus, the researchers could test the effect of taking one domestic diversity course, one global diversity course, or both courses on students’ gains in attitudes toward civic engagement.

The study found that the only students who made substantive gains in inclination toward civic engagement were those who took both the domestic and global diversity courses.  Conversely, taking only one course – whether focused on domestic or global diversity – had no unique effect on attitudinal gains.

The takeaway from this paper, and the discussion that followed, homed in on our tendency to think that substantive learning can be accomplished by a single course – a “check the box” approach.  Of course, as we think about designing a new curriculum these findings might be useful to consider.  More broadly, however, I would suggest that this paper reinforces the idea that substantive learning is a function of a series of related experiences rather than any one experience.  We are the ones who can help our students engage in related experiences and help point out those connections.

Make it a good day.

Mark