So after the first year, can we tell if CORE is making a difference?

Now that we are a little over a year into putting Augustana 2020 in motion, we’ve discovered that assessing the implementation process is deceptively difficult. The problem isn’t that the final metrics to which the plan aspires are too complicated to measure or too lofty to achieve. Those goals are fairly simple to assess – we either hit our marks or we don’t. Instead, the challenge at present lies in devising an assessment framework that tracks implementation, not end results. Although Augustana 2020 is a relatively short document, it actually lays out a complex, multi-layered plan that requires a series of building blocks to be constructed separately, fused together, and calibrated precisely before we can legitimately expect to meet our goals for retention and graduation rates, job acquisition and graduate school acceptance rates, or improved preparation for post-graduate success. Judging the implementation at such an early point by the final metrics would be like judging a car manufacturer’s overall production speed right after the company installed a faster motor on one of its assembly lines. Until all of the other assembly stages have been retrofitted to work with that new motor, such a change by itself would be more likely to throw production into chaos than to speed it up.

Put simply, judging any given snapshot of our current state of implementation against the fullness of our intended final product doesn’t really help us build a better mousetrap; it just tells us what we already know (“It’s not done yet!”). During implementation, assessment is much more useful if it identifies intermediate measures that give us a more exacting sense of whether we are moving in the right direction. In addition, assessing the process should tell us whether the pieces we are putting in place will work together as designed or whether we have to make additional adjustments to ensure the whole system works as it should. This means narrowing our focus to the impact of individual elements on specific student behaviors, testing the fit between pieces that have to work together, and tracking the staying power of experiences that are intended to permanently alter our students’ trajectories.

With all of that said, I thought it would be fitting to try out this assessment approach on arguably the most prominent element of Augustana 2020 – CORE. Now that CORE is finishing its first year at the physical center of our campus, it seems reasonable to ask whether we have any indicators in place that could tell us if this initiative is bearing the kind of early fruit we had hoped for. Obviously, since CORE is designed to function as part of a four-year plan of student development and preparation, it would be foolhardy to judge CORE’s ultimate effectiveness against the Augustana 2020 metrics until at least four years have passed. However, we should look for indications that CORE’s early impact triangulates with the student behaviors and attitudes necessary for improved post-graduate success. This is the kind of data that would be immediately useful to CORE and the entire college. If the indicators suggest that we are moving in the right direction, then we can move forward with greater confidence. If they suggest that things aren’t working as we’d hoped, then we can make adjustments before too many other pieces are locked into place.

In order to find data that suggests impact, we need more than just the numbers of students who have visited CORE this year (even though it is clear that student traffic in the CORE office and at the many CORE events has been impressive). To be fair, these participation patterns could simply be an outgrowth of CORE’s new location at the center of campus (“You’ve got candy, I was just walking by, why not stop in?”). To give us a sense of CORE’s impact, we need to find data where we have comparable before-and-after numbers. At this early juncture, we can’t look at our recent graduate survey data for employment rates six months after graduation since our most recent data comes from students who graduated last spring – before CORE opened.

Yet we may have a few data points that shine some light on CORE’s impact during its first year. To be sure, these data points shouldn’t be interpreted as hard “proof.” Instead, I suggest that they are indicators of directionality and, when considered alongside other evidence (be it usage numbers or the preponderance of anecdotes), they can let us start to lean toward some conclusions about CORE’s impact in its first year.

The first data point we can explore is a year-to-year comparison of the number of seniors who had already accepted a job offer at the time they completed the senior survey. Certainly the steadily improving economy, Augustana’s existing efforts to encourage students to begin their post-graduate planning earlier, and the unique attributes of this cohort of students could also influence this particular data point. However, if we were to see a noticeable jump in this number, it would be difficult to argue that CORE deserves no credit for the increase.

The second data point we could explore is the proportion of seniors who said they were recommended to CORE or the CEC by other students or by faculty. This seems a potentially indicative data point based on the assumption that neither students nor faculty would recommend CORE more often if the reputation and results of CORE’s services were no different from the reputation and results of the similar services provided by the CEC in prior years. To add context, we can also look at the proportion of seniors who said that no one recommended CORE or the CEC to them.

These data points all come from the three most recent administrations of the senior survey (including this year’s edition, for which we already have 560 of 580 eligible respondents). The 2013 and 2014 numbers predate the introduction of CORE, and the 2015 numbers follow CORE’s first year. For the job-acceptance comparison, I’ve also expressed each year’s count as a proportion of all students whose immediate post-graduation plan is to work full-time, in order to account for differences in the size of the graduating cohorts.

Seniors with jobs accepted when completing the senior survey –

  • 2013 – 104 of a possible 277 (37.5%)
  • 2014 – 117 of a possible 338 (34.6%)
  • 2015 – 145 of a possible 321 (45.2%)
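
Since these are raw counts drawn from cohorts of different sizes, it is worth asking whether the 2015 jump is bigger than ordinary year-to-year noise. Below is a minimal sketch of that check in Python, using a two-proportion z-test from the statsmodels package; the counts come from the list above, and treating each year’s respondents as independent random samples is an assumption, not a given.

```python
# Two-proportion z-test: did the job-acceptance rate among seniors planning
# full-time work really rise from 2014 to 2015?
# Counts are taken from the senior-survey numbers listed above.
# Assumption: each year's respondents behave like independent random samples.
from statsmodels.stats.proportion import proportions_ztest

accepted = [145, 117]  # seniors with a job accepted: 2015, 2014
planning = [321, 338]  # seniors planning to work full-time: 2015, 2014

z_stat, p_value = proportions_ztest(count=accepted, nobs=planning)
print(f"2015 rate: {accepted[0] / planning[0]:.1%}")  # 45.2%
print(f"2014 rate: {accepted[1] / planning[1]:.1%}")  # 34.6%
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```

A small p-value here would support the “noticeable jump” reading, though it still can’t separate CORE’s contribution from the improving economy or the particular character of this cohort.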

Proportion of seniors indicating they were recommended to CORE or the CEC by other students –

  • 2013 – 26.9%
  • 2014 – 24.0%
  • 2015 – 33.2%

Proportion of seniors indicating they were recommended to CORE or the CEC by faculty in their major or faculty outside their major, respectively –

  • 2013 – 47.0% and 18.8%
  • 2014 – 48.1% and 20.6%
  • 2015 – 54.6% and 26.0%

Proportion of seniors indicating that no one recommended CORE or the CEC to them –

  • 2013 – 18.0%
  • 2014 – 18.9%
  • 2015 – 14.4%
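
The recommendation percentages lend themselves to the same kind of directional check. One wrinkle: this post only gives the respondent count for 2015 (560), so the sketch below fills in hypothetical 2013 and 2014 counts purely for illustration; with the real counts, a simple chi-square test would tell us whether the shift is more than noise.

```python
# Chi-square test: does the share of seniors saying "no one recommended
# CORE/CEC to me" differ across years?
# NOTE: only the 2015 respondent count (560) appears in this post; the
# 2013 and 2014 counts below are HYPOTHETICAL placeholders.
from scipy.stats import chi2_contingency

n = {"2013": 500, "2014": 520, "2015": 560}              # 2013/2014 assumed
p_none = {"2013": 0.180, "2014": 0.189, "2015": 0.144}   # from the list above

# Build a 2 x 3 contingency table: row 1 = "no one recommended",
# row 2 = everyone else.
no_one = [round(n[y] * p_none[y]) for y in n]
others = [n[y] - k for y, k in zip(n, no_one)]

chi2, p_value, dof, _ = chi2_contingency([no_one, others])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```

The same approach would work for any of the recommendation measures above, once the per-year respondent counts are plugged in.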

Taken together, these data points seem to suggest that CORE is making a positive impact on campus. By no means do they imply that CORE should be judged a success, a failure, or anything in between at this point. But they do suggest that CORE is on the right track and may well be making a real difference in the lives of our students.

If you’re not sure what CORE does or how they do it, the best (and probably only) way to get a good answer is to go there, talk to the folks who work there, and see for yourself. If you’re nice to them, they might even give you some candy!

Make it a good day,

Mark
