Academic Quality and Partnerships

Course Evaluation Surveys

This page includes general information and frequently asked questions about the Course Evaluation Surveys.

One of the outcomes of the Student Engagement Working Group was to establish an end-of-year Course Evaluation Survey. Its purpose was:

  • To improve response rates and coverage of feedback at a module level, reducing the number of questionnaires for students to complete, and providing an appropriate incentive.
  • To gain new insights into the course-level experience of students.

In 2015/16, MEQs ran as normal in term 1. In term 2, MEQs did not run for any level 4 or 5 modules; instead, students in their first and second years were invited to take part in a pilot of the Course Evaluation Survey developed by the Student Engagement Working Group. The population of students invited to take part in the survey was 6,545.

The survey was open from 3 May until 13 June. A total of 1,618 students completed the survey, giving a 25% response rate. Students who completed the survey have since received the incentive of £5 printing credit, ready for the next academic year.

Survey Questions

The complete set of questions, along with any notes to students and the possible responses, can be found on the CES Questions webpage.

Please note that most multiple-choice questions had possible responses ranging from 1 to 4, meaning there is no "middle" or "on-the-fence" response.

Reports

Reports will be uploaded to the CES Reports webpages by School. Please use the link in the navigation menu to find the webpage for your School reports.

The following reports are produced based on the Course Evaluation Survey:

Institutional Summary Report

This report includes summary data for ALL students, including satisfaction based on demographic data. The institutional summary report can be considered at university committee level.

The institutional summary report can be found on the Institutional Summary webpage (university login needed).

School Summary Reports

Two School reports will be produced, split by department: one for course responses and one for module responses.

These reports include summary data for all students on courses owned by the School and on modules owned by the School. Summary tables will be broken down by department and will include overall course satisfaction, as well as a summary of module satisfaction for all modules owned by the department.

These reports are equivalent to the "Executive Summary" provided as part of the current MEQs.

Please note: the satisfaction scores for courses and modules are different, and should be interpreted differently.

The course satisfaction score ranges from 1 (Very satisfied) to 4 (Not at all satisfied).

The module satisfaction score ranges from 1 (Not at all satisfied) to 4 (Very satisfied).

A satisfaction score of 1 for a course indicates very satisfied students, whereas for modules this would indicate not at all satisfied students. 
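Because the two scales run in opposite directions, scores on one cannot be compared directly with scores on the other. As a minimal sketch, one way to put a course score on the same footing as a module score is to reflect it about the midpoint of the 1-4 range (this conversion is an illustration only, not something the reports themselves perform; the function name is an assumption):

```python
def course_to_module_scale(course_score: float) -> float:
    """Illustrative conversion: map a course satisfaction score
    (1 = Very satisfied, 4 = Not at all satisfied) onto the module
    scale (1 = Not at all satisfied, 4 = Very satisfied) by
    reflecting it about the midpoint of the 1-4 range."""
    return 5 - course_score
```

For example, a course score of 1 (very satisfied) corresponds to 4 on the module scale; the direction of the scale flips but the meaning is preserved.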

Course-Level Reports

These reports, which are split by year of study, include statistical data on all course-level questions, including the free text comments provided on course satisfaction.

These reports will be provided to the owning School via Curriculum and Assessment Officers/School Administrators, who will be asked to check the free text comments before distributing to the relevant convenors, in line with any School vetting policy which operates for the existing Module Evaluation Questionnaires (see "Does vetting of comments take place?").

Module-Level Reports

These reports include statistical data on the module-level satisfaction of students and the free text comments provided on the best/worst aspects of the module.

These reports will be provided to the owning School via Curriculum and Assessment Officers/School Administrators, who will be asked to check the free text comments before distributing to the relevant convenors, in line with any School vetting policy which operates for the existing Module Evaluation Questionnaires (see "Does vetting of comments take place?").

Frequently Asked Questions

Why is the number of responses different in the summary and statistical data?

In the summary information at the start of the reports, the total number of responses is shown, i.e. the total number of eligible students who completed the survey.

While the course satisfaction questions were mandatory, the remaining questions were optional for students to complete. As such, there may be some students who have completed the survey but have not provided responses for particular questions.

The statistical information presented for each question may be affected by invalid or blank responses. In these cases, the response has been discounted from the statistical summary, and this may be noted below the response data table.

Why is there a responses threshold disclaimer at the top of my report?

When presenting the data, we have chosen a standard response threshold in order to be satisfied that the responses are representative of the cohort. This threshold is a minimum of 4 responses, or 50% of the eligible cohort size (whichever is greater). A disclaimer will be shown on all reports which have not met the response threshold. This does not mean that the data cannot be used, but it should be treated with caution: a low number of responses relative to the cohort size means that the views of the majority of students are not necessarily evidenced in the responses.

For cohorts smaller than 4 (i.e. where the number of eligible students for that course or module is 3 or fewer), responses will not be reported separately, and in such cases a report will not be published. This is to ensure students' anonymity, as the numerical and free-text responses could potentially be identified with the individual students who gave them. We are obliged to protect the anonymity of student responses.
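The two rules above can be sketched as a small function. This is an illustration only: the function name and the exact rounding of the 50% figure are assumptions, not part of the official reporting process.

```python
import math

def report_status(cohort_size: int, responses: int) -> str:
    """Illustrative classification of a CES report under the rules
    described above: cohorts of 3 or fewer students are not reported,
    and reports with fewer than max(4, 50% of cohort) responses carry
    a disclaimer."""
    if cohort_size <= 3:
        # Too few students to preserve anonymity: no report is published.
        return "not published"
    # The 50% figure is rounded up here; the exact rounding is an assumption.
    threshold = max(4, math.ceil(0.5 * cohort_size))
    if responses < threshold:
        return "published with disclaimer"
    return "published"
```

For example, a module with 10 eligible students would need at least 5 responses for its report to appear without a disclaimer.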

From the MEQ info page:

Achieving a representative response rate is very important to have confidence in the results of [module] evaluation. What is considered to be a representative response rate varies with the overall size of the cohort (the larger the cohort, the lower the response rate required). As a general benchmark, however, the University considers that more than half the students on a module should complete the evaluation for results to be representative of the cohort. Please note that there is no lower limit on publication in relation to the percentage response rate. If the figure is lower than 50% for a module, then Conveners may choose to flag this in the comments section. If a module does not achieve this response rate the results should be treated with caution.

Why can't I see a report for....?

If there are some reports which appear to be missing, it is possible that:

  • the number of responses received from that cohort (by course and year, or by module) was too low to be reportable
  • the cohort size was too small to be reportable

Please see the section "Why is there a responses threshold disclaimer at the top of my report?" for more information on response thresholds and anonymity.

If you are sure that a report should be available but has not been provided, please get in touch (see "Where can I find more information?" below).

Does vetting of comments take place?

As with the existing Module Evaluation Questionnaires, the vetting of free text comments is built into the reporting process. 

The draft versions of course- and module-level reports will be uploaded to a repository for each School, along with a spreadsheet containing the anonymised response data. The link to this repository will be sent to School Administrators and Curriculum and Assessment Officers in advance of the publish date.

Schools will be asked to check the free text comments and carry out comment vetting in line with their existing School policy. The spreadsheet containing all anonymised response data for each course/module will include all free text comments, which can be edited and emailed to Matthew Tiernan. A revised report will then be uploaded to the repository, and staff will be emailed confirmation of this change.

Schools may choose *not* to vet the free-text comments, in which case the draft reports can be circulated to staff.

Comment vetting must be completed prior to the final publish date of 16 August 2016.

What about feedback to students?

As the survey is not delivered through Sussex Direct, there is no facility on the university webpages to display CES results to students or for staff to respond to students. 

It is recommended that Schools develop a strategy for responding to students, such as:

  • an email to all students in the cohort
  • a post on Study Direct
  • offering the opportunity to speak to students during office hours

The reports will be considered as part of the Annual Course Review process, which should include student representation. Further guidance on this will be provided before the Annual Course Review process begins later this year.

Are withdrawn students reported?

Students on PWD or TWD at the census date (when the data was uploaded to Qualtrics) were not included in the data and were not invited to complete the survey. However, any students who subsequently withdrew from their course could have completed the survey.

Withdrawn students (PWD/TWD) who have responded to the survey are included in the reports as usual, i.e. their responses are not reported separately at module- or course-level.

This comes down to an issue of anonymity, as there are likely to be very small numbers of students withdrawing from a course/module, and therefore their responses could be distinctly identified and linked to these students.

A report on *all* withdrawn students' responses can be prepared at institutional level (note: 31 PWD students were surveyed in 2015/16).

Are joint courses reported?

Yes. For any joint honours course, where more than one department is attached to the course, the course report can be supplied to each department for consideration (upon request; please email Matthew Tiernan).

The course will only appear in summary reports for the major/owning department, rather than both departments.

What statistical data is provided?

All statistical data (counts of quantitative responses, average and median responses) are calculated using *valid responses only* (see "Why is the number of responses different in the summary and statistical data?" above).

The numerical and statistical data provided is as follows:

Summary data

  • The cohort size (the number of students surveyed in the cohort)
  • The number of responses (the number of students completing, or partially completing, the survey)
  • The response rate (presented as a percentage of the cohort size)
  • The threshold for this cohort (see "Why is there a responses threshold disclaimer at the top of my report?")

Response data (per question)

  • The percentage of valid responses per multiple choice option
  • The total number of valid responses (for each question)
  • The arithmetic mean of the valid responses (the sum of all valid responses divided by the total number of valid responses)
  • The median of the valid responses (the midpoint in the ordered distribution of all valid responses)
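As an illustration of the response-data calculations listed above, the following sketch assumes valid responses are whole numbers from 1 to 4 and that blank or out-of-range entries are discarded before calculation. The function name and input format are assumptions, not part of the reporting system.

```python
from statistics import mean, median

def question_stats(raw_responses):
    """Illustrative per-question statistics, assuming responses are
    whole numbers from 1 to 4 (invalid or blank entries are discarded,
    as described in the FAQ above)."""
    valid = [r for r in raw_responses if isinstance(r, int) and 1 <= r <= 4]
    if not valid:
        return None  # no valid responses to report for this question
    return {
        "valid_count": len(valid),                          # total valid responses
        "percent_per_option": {
            option: 100 * valid.count(option) / len(valid)  # % per multiple-choice option
            for option in (1, 2, 3, 4)
        },
        "mean": mean(valid),      # sum of valid responses / number of valid responses
        "median": median(valid),  # midpoint of the ordered valid responses
    }
```

For example, the responses [1, 2, 2, 3] give 4 valid responses, a mean of 2, and a median of 2; a blank or out-of-range entry in the same list would simply be excluded.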

Where can I find more information?

The student news article: http://www.sussex.ac.uk/students/newsandevents/?id=35664

For questions on the Course Evaluation Surveys, please contact the Enhancement Team in ADQE, Clare Wolstenholme and Matthew Tiernan.