Dear Members of the Academic Community,

I am writing in response to concerns about a discrepancy in the computed average ratings of Student Opinion Survey (SOS) results prior to the implementation of the online system in the spring term of 2009, as reported by a GC faculty member. She wrote, “… for the years preceding the implementation of electronic
student evaluations, the summary information from Student Opinion
Surveys available on line does not match the summary information
provided on paper by Institutional Research back during those years. The
raw data (number of respondents who strongly agreed, agreed, etc) is the
same, but the CRN averages reported are vastly different. This means
that the current, on-line system, and the older system used to produce
the official paper reports up to and including spring 2008 performed
their calculations differently …”
The University Senate brought this issue to my attention and asked me to look into the matter. I have completed that review, and what follows is an explanation of the problem and its solution:
According to the Office of Institutional Research, the hardcopy data referred to above was generated by the SPSS analysis system and distributed to individual faculty and to their department chairs. As the quotation indicates, this data was correct.

The online data for previous terms (those prior to spring 2009, which were incorrect) was generated by a SQL script after the SPSS file had been converted to an Excel spreadsheet and loaded into an Oracle data table in the Banner Information System. The raw data is correct, but the calculation of averages in the SQL script was not: the averages were computed over the total number of responses, including missing responses. This had the effect of adding a “0” to the sum of responses for each non-response and thus lowering the average. The error occurred only when a student submitted a paper form (which added a “1” to the divisor in the formula for the average) but failed to mark one of the five choices, effectively adding a “0” to the sum of responses.
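For illustration: if eight of ten returned forms rated an item “agree” (a 4) and two forms left the item blank, the flawed divisor of ten gives 32 / 10 = 3.2 rather than the correct 32 / 8 = 4.0. The sketch below shows the kind of error described, in simplified SQL; the table and column names are assumptions made for the example, not the actual Banner structures.

    -- Flawed calculation: every returned form is counted in the divisor,
    -- so a blank answer effectively contributes a “0” to the sum.
    SELECT crn, SUM(NVL(response, 0)) / COUNT(*) AS avg_rating
      FROM sos_responses
     GROUP BY crn;

    -- Corrected calculation: blank answers (stored as NULL) are excluded
    -- from both the sum and the divisor, since AVG ignores NULLs.
    SELECT crn, AVG(response) AS avg_rating
      FROM sos_responses
     GROUP BY crn;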
The calculations for previous terms (prior to spring 2009) were created at some point in the past, and when the online system was implemented in spring 2009 the posted results were not checked; it was assumed that the calculations were correct and that the output forms were simply being made available from a different access point. The online system avoids all of this because the data is stored in a single table as it is collected, and that table is the only source for all output.

When the error was discovered, the Office of Institutional Research, in conjunction with the Information Technology Data Management staff, immediately began a process to identify and correct the problem. As of Tuesday, the 12th of October, the calculation procedures had been corrected and the raw data verified by Data Management staff. The output for terms prior to spring 2009 has been reviewed by the Office of Institutional Research and is now reported as correct; the calculations are the same as those for the terms covered by the online process.
I appreciate being made aware of the problem…and am grateful for the work of the professionals in IR and IT who found a quick solution.
Sandra Jordan