Course evaluation - for teachers

Key figures of the evaluation report

The evaluation you receive by e-mail is generated in a standardised way using the evasys evaluation software.

For closed questions with response scales, it contains a graphic illustration (see Fig. 1):

The histogram enables you to better assess the distribution of answers for each question.

At TUDa, scales with 5 response categories are usually used; these form the basis of the histogram. The percentage frequencies shown above the bars refer to the distribution of the students' valid answers.

Next to the histogram you will find the following key figures:

  • The number of valid responses (n)
  • The average value of the answers given (mw ≙ arithmetic mean)
  • The standard deviation (s) of the answers.
    Note: The larger the standard deviation, the more heterogeneous the students' responses; the students then view the aspect in question very differently. To get a complete picture, be sure to also look at the graphic illustration, which shows the percentage frequencies of the individual scale points.
  • The number of abstentions (E)
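To make the key figures concrete, here is a minimal sketch (not evasys code, and using invented example answers) of how n, mw, s, E and the percentage frequencies per scale point could be computed for one question on a 5-point scale:

```python
import statistics

# Hypothetical raw answers for one question; None marks an abstention.
answers = [1, 2, 2, 3, 2, None, 4, 2, None, 1]

valid = [a for a in answers if a is not None]

n = len(valid)               # number of valid responses (n)
mw = statistics.mean(valid)  # arithmetic mean (mw)
s = statistics.stdev(valid)  # standard deviation (s)
E = len(answers) - n         # number of abstentions (E)

# Percentage frequencies per scale point, as shown above the histogram bars.
freq = {k: 100 * valid.count(k) / n for k in range(1, 6)}

print(n, round(mw, 2), round(s, 2), E)  # 8 2.12 0.99 2
print(freq)
```

A large s (heterogeneous answers) would show up here as frequencies spread across many scale points rather than concentrated on one or two.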

In addition to the evaluation of the individual questions in the first section of the evaluation, there is another section in which the results are presented as a profile line. The profile line represents the mean value per question. The number of valid answers (n) and the mean value as a number (mw) are shown next to the profile line.

On request, we can add further comparison profile lines for you in addition to your own profile line. You can find out more about this further down this page under “Further options for classifying the results”.

Disadvantages of using the arithmetic mean

The arithmetic mean is sensitive to outliers. Particularly in small courses – and thus a small number of responses in the course evaluation – “extreme” responses can distort the arithmetic mean.

Therefore, be sure to pay attention to the standard deviation and the graphical illustration that shows the distribution of the values!

A representation of the distribution of the responses that is more robust against outliers would be the median, together with corresponding percentiles to describe the dispersion around it.

The median divides the answers given to an item exactly in the middle: half of the values lie at or below the median and half at or above it.
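The effect described above can be illustrated with a small sketch (invented ratings for a hypothetical small course): a single extreme answer pulls the mean noticeably, while the median is unaffected:

```python
import statistics

# Five students rate an item on a 1-5 scale; one answer is "extreme".
ratings = [1, 1, 2, 1, 5]

mean = statistics.mean(ratings)      # pulled towards the outlier
median = statistics.median(ratings)  # unaffected by the single extreme value

print(mean, median)  # 2.0 1
```

Without the outlier, the mean of the remaining four answers would be 1.25, much closer to the median.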

For the evaluation reports from evasys, reporting the arithmetic mean has proven useful because it is familiar to a larger number of teachers. The histogram shown in the report also provides a good overview of the distribution of the answers, so that misinterpretations due to outliers are unlikely.

Overall, the students' evaluations of the teachers' commitment and teaching competence are mostly positive. We rarely see mean values below 2.5. If you do find such values, it is worth taking a closer look and, if possible, searching the open answers for further indications of difficulties.

If your expectations for certain aspects differ greatly from the actual results, it is helpful to understand the reasons.

Here, dialogue with the students about the results is useful. It can be worthwhile to talk about individual aspects, to ask for possible reasons for critical assessments, or to explain your own assessment of the framework within which changes are realistically possible. Please also note our instructions on how to feed back results to students.

An exchange shows students that their feedback is received and taken seriously.

Please also note that there are questions where a mean value of 3 is the optimum (see Fig. 1).

If you have little or no experience with evaluating courses, a profile line comparison can be helpful. For this purpose, it is possible to compare the mean values of your own course with values from previous semesters or from similar courses of the same course format. This can help you get a sense of how individual aspects are assessed and identify opportunities for improvement.
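A profile line comparison boils down to placing per-question means side by side. A minimal sketch, with invented question labels and ratings for two hypothetical courses:

```python
import statistics

# Hypothetical questions and 1-5 ratings for your own course and a
# comparison course (e.g. a previous semester of the same format).
questions = ["Structure", "Pace", "Materials"]
own = [[1, 2, 1, 2], [3, 2, 3], [2, 2, 1]]
comparison = [[2, 2, 3], [3, 3, 2], [1, 2, 2]]

# Per-question means: these are the points on each profile line.
own_profile = [round(statistics.mean(a), 2) for a in own]
cmp_profile = [round(statistics.mean(b), 2) for b in comparison]

for q, a, b in zip(questions, own_profile, cmp_profile):
    print(f"{q:10s} own mw={a:.2f}  comparison mw={b:.2f}  diff={a - b:+.2f}")
```

Questions where the two profile lines diverge noticeably are the natural starting points for a closer look.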

One danger, however, is that a comparison with thematically unsuitable courses, or with courses of a very different size, will give a distorted picture. The course evaluation team therefore needs your support in selecting comparison courses whose content is a suitable match.

Even if you have already taught a course on the same topic several times, a profile line comparison can help you identify the strengths and weaknesses of your own course. And if, for example, you have tried out a new teaching method or made conceptual changes to your course, their effect can show up in the profile line comparison.