Understanding NSSE Results Reporting
You may not be too familiar with how the National Survey of Student Engagement (NSSE) reports the results of its survey. That would make you like most people, myself included before I started telling people about USAO’s results. So, before getting into the main subject of today’s post, allow me to share a few conventions that provide important context for the numbers that follow.
Tip #1: Each NSSE sub-scale is made up of different components: sub-sub-scales, if you will. Since that is pretty hard to say (and to type), I will refer to these as dimensions, though in practice we might use other terms for them. Check out the NSSE Engagement Indicators page for background on each of these dimensions. The Academic Challenge sub-scale, for example, comprises:
- Higher Order Learning
- Reflective and Integrative Learning
- Learning Strategies
- Quantitative Reasoning
Tip #2: Each dimension has a range of 0 to 60. To understand each score we have to not only compare the figures to each other, but also consider the overall range. We should ask ourselves how far away from the maximum (or minimum) each score is, or how far away from the midpoint. These scores correspond to the response choices students were given, such that:
- Never = 0
- Sometimes = 20
- Often = 40
- Very Often = 60
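To make the scale concrete, here is a simplified sketch of how response choices might map onto the 0-60 dimension scale. This is an illustrative assumption on my part, not NSSE's exact scoring algorithm (which involves its own rescaling and handling of missing responses):

```python
# Simplified sketch, not NSSE's official scoring procedure.
# Each response choice maps to a point on the 0-60 scale.
RESPONSE_SCORES = {"Never": 0, "Sometimes": 20, "Often": 40, "Very Often": 60}

def dimension_score(responses):
    """Average a student's item responses onto the 0-60 scale."""
    scores = [RESPONSE_SCORES[r] for r in responses]
    return sum(scores) / len(scores)

# A student answering mostly "Often" lands near 40 on the scale.
print(dimension_score(["Often", "Often", "Very Often", "Sometimes"]))  # 40.0
```

The takeaway is simply that a dimension score near 40 means the typical response was "Often," and a score near 20 means "Sometimes."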
Tip #3: In the current series, each dimension score is compared to the same score produced by other schools. In higher education, this comparison is called benchmarking (a similar approach exists in the private sector). When benchmarking, USAO typically looks to a few different groups. The first of these is other Oklahoma public universities. We also occasionally benchmark against other schools with our Carnegie classification; in this case, publicly controlled liberal arts universities. The final comparison group in this analysis is the overall NSSE score, made up of every institution that participated this year.
Of course, this is not the only way of understanding these numbers. It can be valuable to see how USAO scores have changed over the years. In fact, I hope to publish that very analysis to this blog at a later date.
Now that we have all that out of the way, we can finally talk about USAO’s results in Academic Challenge. Survey questions in this part of the survey revolve around the idea that some institutions promote deeper learning than others through the extent to which they challenge their students. Institutions might do this through challenging class work, creative group projects, and even rigorous testing.
According to the structure used by NSSE, the academic challenge sub-scale can be understood through four (4) constituent dimensions, each of which reflects an important aspect of academic challenge overall. These are:
- Higher Order Learning
- Reflective and Integrative Learning
- Learning Strategies
- Quantitative Reasoning
Figure 1 displays the results of the NSSE 2013 administration for the higher order learning dimension. The first thing you probably notice is that all groups on this graph are about the same. In fact, statistical analysis failed to reveal any statistically significant differences (this is going to happen a lot during this series) between groups. Another way of thinking about this is that each of these scores is functionally equivalent. So, all groups in this comparison often offer students the opportunity for higher-order learning. This pattern holds true for both student classifications, first year students and seniors.
We see much the same pattern of results for reflective and integrative learning. Here all comparison groups fall around 40 on the scale for first year students and around 35 for seniors. No statistically significant differences are reported across comparison groups. However, the approximately five (5) point difference between first year students and seniors is large enough that it could be significant. I can only estimate here, since the standard NSSE reports do not include comparisons across student classification (at least not that I have seen). For the purposes of this series I will “eyeball” any difference of at least five points as potentially significant.
All this means that across comparison groups, students report being offered opportunities for reflective and integrative learning a bit less than often. Further, it is possible that seniors as a whole report fewer opportunities for this type of academic challenge than first year students.
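The "eyeball" rule I described above can be sketched in a few lines. The five-point threshold is my own informal heuristic, not an NSSE convention, so treat any flag it raises as a prompt for closer analysis rather than a verdict:

```python
# Informal heuristic from this series: flag a gap of five or more points
# between two group means as potentially significant. This threshold is
# the author's eyeball rule, not a statistical test.
EYEBALL_THRESHOLD = 5.0

def potentially_significant(mean_a, mean_b):
    """Return True when two group means differ by at least the threshold."""
    return abs(mean_a - mean_b) >= EYEBALL_THRESHOLD

# First year students around 40, seniors around 35: flagged as worth a look.
print(potentially_significant(40.0, 35.0))  # True
print(potentially_significant(40.0, 38.0))  # False
```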
Moving on to opportunities for engaging learning strategies, figures 6 and 7 again fail to provide any evidence of differences across comparison groups. No matter which institution students find themselves at, they report this type of academic challenge as occurring often. Additionally, it looks like first year students and seniors are not reporting this experience at different rates.
The final dimension of academic challenge measured by the 2013 administration of NSSE is quantitative reasoning. Examining scores for first year students we see no evidence of differences across groups. All institutions apparently provide students with about the same academic experience in this regard. I feel I should note that for this dimension all scores fall below the scale midpoint, meaning that student responses were closer to the "sometimes" end of the measured continuum than for any of the previous dimensions.
When we consider seniors in Figure 8 we finally see our first statistically significant difference (yay!). Unfortunately, results indicate that USAO students report fewer opportunities for quantitative reasoning than either our Carnegie or Oklahoma peers (shucks!). In fact, looking at the raw means, we probably just missed having a statistically significantly lower score than NSSE overall as well.