Apparently, two blogs are better than one!

Some of you already know, but please keep your eye out for TuitionFit (tuitionfit.org), the new platform that allows prospective students and parents to solve the college price transparency problem.  How, you might ask?  Well, check out the site and subscribe so you can be the first to know when it goes live.

And while you’re at it, check out the TuitionFit blog, including the latest post:

The #1 Reason Why Where You Go To College Doesn’t Really Matter

Make it a good day,

Mark

When presumptions about going to college while working a job collide

The results of a recent large-scale study of college students found that, on average, college students spend more time during college working paid jobs than they spend going to class and studying (see one of many news reports about these findings here).  Depending on the news outlet, reports of these findings are followed by one of two reactions:

  1. These findings are further proof that cost of college is so high that students have to work most of the time just to afford it.  Tuition is too damn high . . . blah blah blah . . .
  2. These findings are further proof that college’s academic requirements have gone horribly soft.  Back in my day . . . blah blah blah . . .

For the sake of argument, let’s say that both points are true.  I think there is a third point to be made that might be more important than all the rest.  The narrative about college graduates that we keep hearing argues that colleges don’t teach enough of the skills required to succeed in the world of work (have a look at one such news story here).  But if college students are spending more of their time working paid jobs than going to class and studying, then maybe the alleged skills gap (some folks make at least a partially reasonable argument that the whole claim is crap, like this opinion piece here) shouldn’t be laid at the feet of the colleges at all.

Maybe those who hire college students for all those paying jobs ought to shoulder some of the blame.  This is especially true if the majority of working college students are employed in the retail, restaurant, or hospitality sectors (a reasonable supposition, I think); in that case, those students are actually working for much larger corporations that are certainly hiring many of those same college graduates.

It seems that maybe the employers who blame colleges for a perceived skills gap ought to take a look in the collective mirror.  And the pundits who use these findings to drive home a pre-determined agenda that college is supposed to produce young adults perfectly ready for everything that the world of work might throw at them . . . you might reconsider your premise.

I caused a bit of a ruckus

Last week I wrote a piece for Inside Higher Ed titled “Dear College Presidents.”  They gave it the headline “A Modest Proposal on Rankings” . . . which also works.  Given the wave of responses I have received since the piece was published, I thought I’d point out two things.

  • I’m not suggesting that schools fudge data.  I’m suggesting that they fabricate fantastically in a manner worthy of the absurdity that college presidents have caused as a result of their unwillingness to draw a line in the sand.
  • The piece is pretty clearly sarcasm in the context of farce.  Try reading more carefully.
A simple college pricing question

    If you could see the prices that every other college charged a student similar to yours (i.e., students with similar academic accomplishments and similar financial need), and all you had to do to see that information was share an anonymized copy of your own child’s financial aid award letter, would you do it?

(If you have a child who has already gone to college, please try to answer this question retrospectively. If you have a child who won’t go to college for a few years yet, please try to speculate.)

    Please post your answer and anything you’d like to add as a comment below.

    Thanks!

    Reborn, Rebuilt, and Rebounding!

    Metamorphosis is a good thing, right?  Let’s say yes, and . . . !  Delicious Ambiguity spent eight years focused on a specific community in a specific context at a specific college.  Now, I’m taking this blog out on its own, into the wild, and off on an adventure.

    Wanna come along?  I hoped that you would!

    Make it a good day,

    Mark

    P.S. – For those of you who might know me from the blog I wrote at Augustana College from August 2011 through April 2018, I’ve reposted my unofficial “Au revoir” post below.  I had posted it on the original Delicious Ambiguity site to say goodbye to all of the wonderful people who read my blog regularly, but Augustana pulled my last post down after a few days.

    So here it is again in its entirety:

    It’s not goodbye; it’s yes, and . . .

    Hi Everyone,

    Goodbyes are always hard to write.  On the one hand there’s so much that I want to say, but on the other hand there isn’t that much more to add.  On June 5th, Augustana eliminated the position of Director of Institutional Research and Assessment.  I spent eight years thrilled to help foster a culture of, as George Kuh called it, positive restlessness.  Even though many of you had already embraced this approach to your work long before I arrived, I’d like to think that I helped breathe just a little bit of extra life into that part of the Augustana culture. I was continually humbled by how many of you took what little bit I could offer and turned it into something stunning, and beautiful, and breathtaking to watch. So, in a way, this is exactly what should happen to a role that focused on infusing a value into a culture. Once that value has clearly taken root, it’s time to give the responsibility to keep it alive to the people who live it.

    So now it’s up to you to carry on the spirit of getting just a little bit better at what you do the next time that you do it.  I know that you will.

    I love you all.

    Make it a GREAT day,

    Mark

    Revisiting the Value of Early Feedback

    It is becoming increasingly apparent in the world of quantitative research that producing a single statistically significant finding shouldn’t carry too much weight (whether or not the value of a single statistically significant finding should have ever been allowed to rise to such a level of deference in the first place is a thoroughly valid discussion for a more technical blog than mine here or here).  In recent years, scholars in medicine, psychology, and economics (among others) have increasingly failed in their attempts to reproduce the statistically significant findings of an earlier study, creating what has been labeled “the replication crisis” across a host of disciplines and splattering egg on the faces of many well-known scholars.

    So in the interest of making sure that our own studies of Augustana student data don’t fall prey to such an embarrassing fate (although I love a vaudevillian cracked egg on a buffoon’s head as much as the next guy), I thought it would be worth digging into the archives to rerun a prior Delicious Ambiguity analysis and see if the findings can be replicated when applied to a different dataset.

    In February 2014, I posted some analysis under the provocative heading, “What if early feedback made your students work harder?”  Looking back, I’m a little embarrassed by the causal language that I used in the headline (my apologies to the “correlation ≠ causation” gods, but a humble blogger needs some clickbait!).  We had introduced a new item into our first-year survey that asked students to indicate their level of agreement (or disagreement) with the statement, “I had access to my grades and other feedback early enough in the term to adjust my study habits or seek help as necessary.”  The response set included the usual suspects: five options ranging from strongly disagree to strongly agree.

While we found this item to significantly predict (in a statistical sense) several important aspects of positive student interactions with faculty, the primary focus of the February 2014 post turned to another potentially important finding. Even after accounting for several important pre-college demographic traits (race, sex, economic background, and pre-college academic performance) and dispositions (academic habits, academic confidence, and persistence and grit), the degree to which students agreed that they had access to grades and other feedback early in the term significantly predicted students’ responses to this item: “How often did you work harder than you have in the past in order to meet your instructor’s expectations?”  In essence, it appeared that students who felt that they got more substantive feedback early in the term also tended to work harder to meet their instructor’s expectations more often.

    Replication is risky business. Although the technical details that need to be reconstructed can make for a dizzying free-fall into the minutiae, committing to reproduce a prior study and report the results publicly sort of feels like playing Russian Roulette with my integrity.  Nonetheless, into the breach rides . . . me and my trusty data-wrangling steed.

Although it would have been nice if none of the elements of the analysis had changed, that turned out not to be the case – albeit for good reason.  We tend to review the usefulness of various survey items every couple of years just to make sure that we aren’t wasting everyone’s time by asking questions that really don’t tell us anything.  This turned out to be a possibility with the original wording of the item we were predicting (what stats nerds would call the dependent variable). When we put the statement, “How often did you work harder than you have in the past in order to meet your instructor’s expectations?” under the microscope, we saw what appeared to be some pockets of noise (stats nerd parlance for unexplainable chaos) across the array of responses. Upon further evaluation, we decided that maybe the wording of the question was a little soft.  After all, what college freshman would plausibly say “never” or “rarely” in response?  I think it’s safe to assume that most students would expect college to make them work harder than they had in the past (i.e., in high school) to meet the college instructor’s expectations. If we were a college where students regularly found the curricular work easier than what they experienced in high school . . . we’d have much bigger problems.

Since the purpose of this item was to act as a reasonable proxy for an intrinsically driven effort to learn, in 2016 we altered the wording of this item to, “How often did you push yourself to work harder on an assignment even though the extra effort wouldn’t necessarily improve your grade?” and added it to the end-of-the-first-year survey.  Although this wording has proven to increase the validity of the item (subsequent analyses suggest that we’ve reduced some of the previous noise in the data), it’s important to note at the outset that this change in wording and relocation of the item to the end of the year alters the degree to which we can precisely reproduce our previous study.  On the other hand, if the degree to which students get early feedback (an item that is asked at the end of the fall term) significantly predicts the degree to which students push themselves to work harder on their homework regardless of their grade (now asked at the end of the spring term) in the current replication study, it strikes me that this finding might be even more important than the finding from the 2014 study.

    Thankfully, all of the other variables in the 2016-17 data remained the same as the 2014-15 first-year data. So . . . what did we find?

I’ve provided the vital statistics in the table below.  In a word – bingo!  Even after taking into account sex, race/ethnicity, socioeconomic status (i.e., Pell grant status), pre-college academic performance (i.e., ACT score), academic habits, academic confidence, and persistence and grit, the degree to which students receive early feedback appears to significantly predict the frequency of pushing oneself to work harder on an assignment regardless of whether or not the extra effort might improve one’s grade.

Variable                             Standardized coefficient   Standard error   P-value
Sex (female = 1)                      0.022                      0.143            0.738
Race/ethnicity (white = 1)            0.001                      0.169            0.089
Socioeconomic status (Pell = 1)      -0.088                      0.149            0.161
Pre-college academic performance     -0.048                      0.010            0.455
Academic habits scale                 0.317 ***                  0.149            0.000
Academic confidence scale            -0.065                      0.167            0.374
Persistence and grit scale            0.215 **                   0.165            0.010
Received early feedback               0.182 **                   0.056            0.005

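For the fellow stats nerds who want to see the shape of this kind of model rather than just the results table, here is a minimal sketch in Python. To be clear, this is an illustration and not our actual analysis code: the file name and column names are hypothetical placeholders, and z-scoring every variable before fitting is just one common way to produce standardized coefficients.

```python
# Illustrative sketch of an OLS model like the one summarized above:
# predicting the "pushed myself to work harder regardless of grade" item
# from early-feedback agreement, controlling for demographics and dispositions.
# All file and column names below are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("first_year_survey_2016_17.csv")  # hypothetical merged survey file

outcome = "worked_harder_regardless_of_grade"
predictors = [
    "female",               # sex (female = 1)
    "white",                # race/ethnicity (white = 1)
    "pell",                 # socioeconomic status (Pell = 1)
    "act_score",            # pre-college academic performance
    "academic_habits",      # scale score
    "academic_confidence",  # scale score
    "persistence_grit",     # scale score
    "early_feedback",       # agreement with the early-feedback item (fall term)
]

# z-score everything so the coefficients read as standardized coefficients
model_df = df[[outcome] + predictors].dropna()
z = (model_df - model_df.mean()) / model_df.std()

fit = sm.OLS(z[outcome], sm.add_constant(z[predictors])).fit()
print(fit.summary())  # check the coefficient and p-value on early_feedback
```
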
    It is particularly intriguing to me that the statistically significant effect of receiving early feedback in the fall term appears when the outcome item is asked at the end of the spring term – a full six months later. Furthermore, it seems important that receiving early feedback produces a unique effect even in the presence of measures of academic habits (e.g., establishing a plan before starting a paper, starting homework early, etc.) and persistence and grit (e.g., continuing toward a goal despite experiencing disappointment, sticking with a plan to reach a goal over a longer period of time, etc.), both of which produce unique effects of their own.

    The implications of these findings seem pretty important. In essence, no matter the student’s pre-college academic performance, the degree of positive academic habits, or the depth of persistence and grit when arriving at Augustana, receiving early feedback in the fall term appears to improve a student’s likelihood of working harder on their schoolwork no matter how that effort might impact their grade.

Whew!  I guess my integrity survives for another day.  More importantly (MUCH more importantly), now that the 2014-15 finding has been replicated with 2016-17 data, it seems even more clear that creating ways to provide early feedback to students, so that they can recalibrate their study habits as necessary, is a critical element of effective course design.

    Make it a good day,

    Mark

    Data, Analysis, ACTION (now the camera’s bright lights shine on you!)

A couple of weeks ago, the Assessment for Improvement Committee (AIC) and Institutional Research and Assessment (IR&A) hosted the third of three Friday Conversations focused on improving our students’ cognitive sophistication. Unless you’ve been living under a rock (or a pile of semester transition documents!), you know by now that one of the primary functions of AIC and IR&A is to foster an organizational culture of perpetual improvement. To that end, we run an ongoing cycle of data collection, analysis, and communication about the relationships between student learning and the student experience to shine a light on the ways in which we can improve what we do as a college.

    The cycle that culminated this year (the entire process takes 5-6 years) focused on the category of learning outcomes we have called “cognitive sophistication.” In particular, we explored data gathered from the cohort of students who entered Augustana in the fall of 2013 and graduated in the spring of 2017 to examine the development of our students’ inclination to, and interest in, thinking about complex or complicated issues or ideas. Just in case you need to catch yourself up, have a quick look at the three previous posts about this process:

    1. Does our Students’ Interest in Complex Thinking Change over Four Years
    2. What Experiences Improve our Student’s Inclination Toward Complex Thinking
    3. Doing Something with What We Now Know

In the fall term, we presented what we had found about the nature of our students’ growth and collected your suggestions about student experiences and characteristics that might influence this growth. In the winter term, we presented the results of testing your suggestions to identify the student experiences that appear to be statistically significant predictors (i.e., particularly influential experiences) of our students’ growth. By contrast, during the spring term Friday Conversation, AIC and IR&A change things up a bit and turn the session over to whoever shows up. Because if we – meaning the Augustana community – are going to convert our findings into demonstrable improvements, then we – meaning AIC and IR&A – need to hand these findings over to you and let you shape the way that we translate evidence into improvement.

    If you clicked on the third post linked above, you didn’t find the results of the third Friday Conversation, but rather a plug and a plea for attendees. Fortunately, a healthy number of faculty and staff showed up ready to put their brains to work. Folks broke into three groups and narrowed a range of ideas into one or two practical ways that the college could put our findings to use. So without further ado, here are the focal points of the conversation from the last Friday Conversation.

    Learning in Context

The first set of findings from our data suggested that when students engage in hands-on or experiential learning experiences, their inclination toward complex thinking seems to increase. This may be because experiencing learning in real-world or hands-on settings inevitably adds a context that often complicates what might have seemed simpler when discussed in the sanitary safety of a classroom. Research on experiential learning also suggests that, as students get accustomed to learning or applying prior learning in these real-world settings, they find this learning more interesting and sometimes even invigorating.

Even though Augustana offers all sorts of hands-on learning experiences (e.g., internships, research with faculty, community involvement, etc.), it seems that the distribution of these opportunities across majors is uneven. As a result, students in some programs have a much higher chance of gaining access to these kinds of experiences than other students. The faculty and staff focused on this topic considered policy or practice ideas that could bring more of these kinds of opportunities to programs where they have not traditionally thrived. At the same time, the faculty and staff who joined this part of the conversation emphasized the need to offer professional development in order to help faculty in these programs imagine or craft an expanded range of hands-on learning opportunities, especially in disciplines where faculty research tends to be a solo endeavor or where the nature of that research ranges far beyond an undergraduate’s scope of understanding.

    Integrative Advising

    This discussion focused on the “integrative” part of integrative advising. Our findings suggested that the more students engage in the integrative aspects of advising conversations (i.e. when faculty or staff prod students to weave together the variety of things they’ve done in college – AKA that long list at the bottom of the email signature – into a coherent narrative), the more they tend to develop an inclination toward complex thinking. This may be because asking students to turn their own raw data (after all, a list of disparate activities is very much like a set of raw data) into a story requires them to engage in complex thinking about uncertainty from two directions: 1) what themes are already present throughout my various activities that could form the basis of a compelling narrative and 2) given where I want to end up after college, how should I alter my list of activities to better prepare for success in that setting?

Participants in this discussion homed in on three ideas that are either already in development or could be introduced. First, they talked about the existing FYI proposal that includes a portfolio. This portfolio might be an especially good way to get first-year students to map out their college experience with the end (i.e. who they want to be when they receive their diploma) explicitly in mind. Second, the participants talked about the need for a way to continue this way of thinking beyond the first-year portfolio and landed on a common assignment within the Reasoned Examination of Faith course (formerly Christian Traditions) that would focus on vocation-seeking and purpose. Third, they identified a continuing need for faculty development that would help individuals apply holistic/integrative advising practices no matter the advising context.

    Interdisciplinary Discussions

    The third group of faculty and staff tackled the challenge of increasing student participation in interdisciplinary discussions. It shouldn’t be much of a surprise by now that the experiences that we found to predict greater gains in cognitive sophistication were those that required students to apply one set of perspectives or ideas within a different, and often more tangible, context or framework. Augustana already offers several avenues for these kinds of conversations (e.g., Salon, Symposium Day, etc.), and there is a certain subset of students who continually participate with enthusiasm. But increasing student participation in these events means focusing on the subset of students who don’t jump at these opportunities. One possibility included finding ways for students to attend conferences in the region when they aren’t presenting research. Another possibility included fostering more interdisciplinary student groups. A third intriguing idea involved the conversations about a Creativity Center on campus and the idea that this initiative might be an ideal vehicle to bring together students from disciplines that might not normally intersect.

    Now comes the hardest part of this process. There isn’t a lot of reason to collect student learning data and identify the experiences that shape that learning if we don’t do anything with what we find out.  AIC and IR&A will continue to encourage the campus to plug these findings into policy, program, or curricular design. But we need you to take these findings and discussion points and champion them within your own work.

    When you (notice the “when” rather than “if”?) have implemented something cool and creative, can you send me an email and tell me about it?  I’ll be sure to share it with the rest of the college and celebrate your work!

    Make it a good day,

    Mark

    An educational idea that evidence doesn’t support

    Good morning,

    Over the last week or so, the IR office has been prepping the various large-scale surveys that we send out annually to first-year and senior students. After a couple of years of administering the same survey instruments, it’s tempting to just “plug and play” without thinking much about whether the questions we’ve been asking are actually supported by the evidence we have gathered previously or are even still relevant at all.

Although there are good reasons to maintain consistency in survey questions over time, it is also true that we ought to change survey questions if they no longer match what we are trying to do or what we know to be true.  Because we are human, we can get ourselves caught rationalizing something that we think ought to be so at exactly the time when we ought to do something else.  It isn’t uncommon for us to believe something will always be so either because it seemed so at one time (and maybe even was so at one time) or because, having appeared to be so in one instance, it seems like it ought to be so in every other situation or context.

    Last week, I read an article in the Atlantic about one such educational “best practice” that subsequent research seems to have debunked. It’s not a very long article, but what it describes might be important for some as many of us are designing and redesigning classes for the new semester calendar.

    The Myth of ‘Learning Styles’

    A popular theory that some people learn better visually or aurally keeps getting debunked.

    Hmmmmm  . . . . .

    Make it a good day,

    Mark

    What good are those Starfish flags anyway?

    Now that we’ve been using the Starfish tool for a couple of years to foster a network of early alerts and real-time guidance for our students, I suppose it makes sense to dig into this data and see if there are any nifty nuggets of knowledge worth knowing. Kristin Douglas (the veritable Poseidon of our local Starfish armada) and I have started combing through this data to look for useful insights. Although there is a lot more combing to be done (no balding jokes, please), I thought I’d share just a few things that seem like they might matter.

Starfish is an online tool that allows us to provide something close to real-time feedback (positive, negative, or informational) to students. In addition, this same information goes to faculty and staff who work closely with that student, in an effort to provide early feedback that influences future behavior. Positive feedback should beget more of the same behavior. Negative feedback hopefully spurs the student to do something differently.

In general, there are two ways to raise a Starfish flag for a student. The first is pretty simple: you see something worth noting to a student, you raise a flag. These flags can come from anyone who works with students and has access to Starfish. The second is through one of two surveys that are sent to faculty during the academic term. This data is particularly interesting because it is tied to performance in a specific class and, therefore, can be connected to the final grade the student received in that class. The data I’m going to share today comes from these surveys.

We send a Starfish survey to faculty twice per term.  The first goes out in week 3 and asks faculty to raise flags on any student who has inspired one (or more) of four different concerns:
    • Not engaged in class
    • Unprepared for class
    • Missing/late assignments
    • Attendance concern

The second Starfish survey goes out in week 6 and asks faculty to raise flags that address two potential concerns:
    • Performing at a D level
    • Performing at an F level

We now have a dataset of almost six thousand flags from winter 2015/16 through winter 2017/18. Do any of these flags appear to suggest a greater likelihood of success or failure? Given that we are starting with the end in mind, let’s first look at the flags that come from the week 6 academic concerns survey.

There are 1,947 flags raised for performing at a D level and 940 flags raised for performing at an F level. What proportion of those students (represented by a single flag each) ultimately earned a passing grade in the class in which a flag was raised?

The proportion that finished with a C or higher final grade
    • Performing at a D level (1059 out of 1947)   –   54%
    • Performing at an F level (232 out of 940)    –    25%

At first glance, these findings aren’t much of a surprise. Performing at an F level is pretty hard to recover from with only three weeks left in a term. At the same time, over half of the students receiving the “D” flag finished that course with a C grade or higher. This information seems useful for those advising conversations where you need to have a frank discussion with a student about what it will take to salvage a course or drop it late in the term.

The second set of flags comes from the third week of the term and represents behaviors instead of performance. Are any of these raised flags – not engaged in class (278), unprepared for class (747), missing/late assignments (1126), and attendance concern (904) – more or less indicative of final performance?

    The proportion that finished with a C or higher final grade
    • Not engaged in class (202/278)       –        73%
    • Unprepared for class (454/747)        –        61%
    • Missing/late assignments (571/1126)   –    51%
    • Attendance concern (387/904)        –        43%

It appears that these four flags vary considerably in how strongly they correlate with the final grade. Attendance concern flags appear to be the most indicative of future trouble, while appearing unengaged in class seems relatively salvageable.

Without knowing exactly what happened after these flags were raised, it’s hard to know what (if anything) might have spurred a change in the behavior of those students who earned a final grade of C or higher. However, at the very least these findings add support to the old adage about just showing up.
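
If you want to tinker with this kind of flag data yourself, here is a minimal sketch of the pass-rate calculation behind percentages like the ones above. It is illustrative only: the file name, column names, and the simplified C-or-better grade rule are placeholder assumptions, not the actual Starfish export or our grade data.

```python
# Minimal, illustrative sketch of the pass-rate-by-flag-type calculation above.
# Assumes a hypothetical CSV with one row per raised flag, containing the flag
# type and the final grade the student earned in the flagged course.
import pandas as pd

flags = pd.read_csv("starfish_survey_flags.csv")  # hypothetical file and columns

# Treat any final grade of C or better as "finished with a C or higher"
# (plus/minus grades would need their own handling in the real data).
flags["c_or_higher"] = flags["final_grade"].isin({"A", "B", "C"})

summary = (
    flags.groupby("flag_type")["c_or_higher"]
         .agg(flag_count="size", pct_c_or_higher="mean")
         .sort_values("pct_c_or_higher")
)
summary["pct_c_or_higher"] = (summary["pct_c_or_higher"] * 100).round(0)
print(summary)  # attendance-concern flags should show the lowest percentage
```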

What does this data suggest to you?

Make it a good day,

Mark

    You need a laugh today? Let’s go!

Even we emotionally stunted numbers-nerds get bogged down by rainy April days. So I thought I’d share a quick video lesson on predictive analytics that I hope will make you smile (while educating in the most high-minded fashion, of course!).

    Predict This!

    Of course, if you’re in the kind of curmudgeonly mood that is way beyond laughter, try out this Mountain Goats song and belt out the chorus at the top of your lungs.

    Make it a good day,

    Mark