Over the past few weeks I have focused on a relatively detailed reporting of USAO's NSSE 2013 results. I mentioned a number of scales, subscales, and dimensions, and I flooded the blog entries with bar graphs to help you visualize how we compared to other groups. What I did not do was spend much time discussing what all these data points, means comparisons, and statistical tests actually mean. This is what I will attempt to tackle in this last NSSE post for the year (unless, of course, I get an unforeseen deluge of requests for more).

In order to make sense of the comparisons featured in previous posts, it is important to know what each comparison group consists of. I thought I had addressed this already, but apparently I confused this series with the report I prepared for my boss and his executives. So here is a brief description of each of our comparison groups:

Oklahoma: All post-secondary educational institutions in Oklahoma participating in NSSE 2013 are included in this group. The mean (average) reported for this comparison group is computed by NSSE from student responses from these schools. The names of the specific schools included in this category are reported below:

Carnegie Class: All participating post-secondary educational institutions sharing a Carnegie classification with USAO are included in this group. The names of the specific schools in this category are reported below:

NSSE 2013: Includes all post-secondary educational institutions participating in NSSE in 2013. You can think of this as a type of national average, but keep in mind that not every institution participates in NSSE, and of those that do, not all participate every year. NSSE reports a total of 567 participating schools; the list is too large to reproduce here, so I won't try. Thankfully, you can follow this link to NSSE 2013 Participating Institutions.
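For readers curious about what a "means comparison" actually involves, here is a minimal sketch in Python. It compares the average ratings of two groups using Welch's t statistic, which is one common way (not necessarily the exact method NSSE uses) to judge whether a difference in group means is larger than chance would suggest. The rating numbers below are entirely hypothetical and are not real NSSE data.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for comparing two independent group means.

    A larger absolute value means the gap between the two averages is
    big relative to how noisy the samples are; values beyond roughly
    +/-2 are conventionally treated as statistically significant.
    """
    n_a, n_b = len(sample_a), len(sample_b)
    var_a, var_b = variance(sample_a), variance(sample_b)
    # Standard error of the difference between the two means
    se = math.sqrt(var_a / n_a + var_b / n_b)
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical satisfaction ratings on a 1-7 scale (illustrative only)
group_one = [6, 5, 7, 6, 5, 6, 7, 5]
group_two = [5, 4, 6, 5, 5, 4, 6, 5]
t = welch_t(group_one, group_two)
```

In this toy example the first group's average (5.875) exceeds the second's (5.0), and the t statistic of about 2.2 suggests the gap is unlikely to be pure sampling noise. That is essentially the reasoning behind every "significantly higher/lower" claim in the earlier posts.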
Insight 1

The first meaningful conclusion we can draw from these results is that USAO students report being about as satisfied as students at other institutions on a wide range of survey items. This might not seem impressive, but it is important to put it into the proper context. Compared to our Carnegie class, USAO tends to be significantly less expensive for all concerned. It would not surprise me at all if students at those other institutions had access to the types of services that large tuition bills can purchase. The fact that USAO is comparable to these institutions in many comparisons is a victory in itself.

Insight 2

The second conclusion I took away from this report is that USAO students report significantly more positive evaluations on several measures included in NSSE. For instance, our first-year students report more positive evaluations of the quality of interactions than every one of our comparison groups. Similarly, USAO seniors report more positive evaluations of student-faculty interactions than any comparison group. All told, I think this speaks very well of how we treat students at USAO; our commitment to our students is reflected in these numbers.

Insight 3

The one area of concern revealed by the NSSE 2013 results is seniors' evaluations on the Quantitative Reasoning scale. If you recall from my post on academic challenge at USAO, this scale refers to the extent to which students are asked to draw conclusions from numerical information, or otherwise incorporate mathematical analysis into their curriculum. Overall, it seems that students in each of our comparison groups reported engaging in this type of activity more often than their USAO counterparts. This is certainly an interesting issue and one that USAO faculty and administration are sure to address in a systematic manner. In fact, when I hear about any proposed actions as a result of these findings, I will do my best to report them here.