New education method or tool
Open Access

Immediate feedback of lectures by smartphones

Ingrid Peroz[1], Klaus Donandt[2]

Institution: 1. Charité - Universitätsmedizin Berlin, Department of Prosthodontics, Gerodontology and Craniomandibular Disorders, 2. Krankenhaus Waldfriede, Hand Surgery, Upper Limb and Foot Surgery, Department of Orthopaedics and Traumatology
Corresponding Author: Prof Ingrid Peroz ([email protected])
Categories: Teaching and Learning, Curriculum Evaluation/Quality Assurance/Accreditation, Undergraduate/Graduate
Published Date: 12/12/2019

Abstract

Introduction:

Lectures are an essential format in the dental curriculum. Immediate feedback at the end of a lecture could motivate students and lecturers to improve. The feasibility and acceptance of smartphone-based feedback (SBF) were evaluated.

 

Methods:

A Google Form with six statements and a rating scale from total agreement to strong disagreement was linked via QR codes. Thirty-eight dental students in the 4th clinical semester evaluated eight lectures. Their answers were submitted immediately and could be discussed. This feedback was compared with the online evaluation at the end of the semester. Four lecturers and the participating students answered an evaluation questionnaire about the SBF.

 

Results:

One student had technical problems. Acceptance was excellent among students and good among lecturers. Both lecturers and students could imagine evaluating all lectures by SBF. Students were more motivated than lecturers to discuss the feedback, and more motivated to use the SBF than the end-of-semester online evaluation.

 

Discussion:

The feedback statements should be revised in cooperation with students. The feedback takes about 5 minutes, so it could be possible to integrate SBF into selected lectures. Discussing the evaluation may be uncomfortable for lecturers. SBF gives individualized feedback and could improve the quality of education.

 

Conclusions:

A no-cost, immediate SBF is feasible to integrate into lectures.

 

Keywords: Evaluation; feedback; dentistry; lectures; lecturers; teacher

Introduction

Usually, educational courses or lectures are evaluated by students for feedback and quality assurance. The reliability of students' evaluations has been demonstrated by studies in which individual courses and lectures received direct feedback (Hasekura, Fukushima and Hiraide, 1990; Kuhnigk et al., 2011; Stillman et al., 1983).

 

At our university, this feedback is offered centrally by an online questionnaire at the end of each semester. This allows an anonymous response and an overall evaluation. However, both the delayed timing and the general character of this evaluation preclude individualized feedback, which could improve the education, and there is no possibility for an interactive discussion. Especially when a lecture course is given by several lecturers, the individual lecturer does not know to what extent the evaluation is focused on his or her lecture (Bode et al., 2015). Furthermore, it is doubtful whether students can remember all lectures from the past half-year. Looking into the overall evaluations at our university, it must be admitted that the number of questions is high, answering is time-consuming, and so far only a few students take part in this voluntary evaluation (mostly < 10 out of about 40 students per semester). Thus the online evaluation lacks representativeness. Nevertheless, it is of great importance to include students in the evaluation process and to ensure that lecturers are willing to revise their lectures (Malik and Malik, 2012).

 

Repeated evaluations of the same lectures given by the same lecturers have shown that immediate student feedback is a powerful technique that can produce positive changes leading toward course improvement (Sterz et al., 2016; Stillman et al., 1983; Stokke et al., 2019).

 

An individual evaluation is also important for the lecturer, who may have to demonstrate didactic qualification in the course of an application for a professorship or an educational award.

 

Besides immediate evaluation (Stillman et al., 1983), audience response systems have been used to evaluate lectures (Bode et al., 2015). They require special infrastructure and lead to a significantly more positive evaluation than the overall end-of-semester evaluation (Bode et al., 2015).

 

The aim of the present study was therefore to use an immediate, no-cost smartphone-based feedback (SBF) at the end of lectures and to evaluate its feasibility and acceptance among students and lecturers.

Methods

Eight lectures by four lecturers of the prosthodontics department were evaluated in the 2015 summer semester by 38 students of the 4th clinical semester. A Google Form was created with six statements and a rating scale from total agreement (1) to strong disagreement (6), following the German school grades (Table 1).

Table 1: Google Form with six statements and a rating scale from 1 (total agreement) to 6 (strong disagreement) according to the German school grades.

Evaluation of lectures (rating scale: 1 = totally agree to 6 = strongly disagree)

  • The lecturer encourages active cooperation of the students.
  • The lecturer was prepared and on time.
  • The knowledge for the given objectives was clearly presented.
  • I understood the presented topic.
  • The lecturer used examples which were useful for understanding the topic.
  • The content of the lesson was appropriate for the time frame.

The link to this Google Form was encoded in a QR code, which could be scanned with the students' private smartphones. All students and lecturers were informed about the feedback and asked to install a free QR-scanner application on their smartphones. Each lecture ended with a QR code that could be scanned directly from the projection screen and was linked to the Google Form. As soon as all students had entered their ratings, the evaluation was displayed on the screen and could be used for discussion.
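Google Forms collects responses in a linked spreadsheet that can be exported as CSV, which makes it straightforward to recompute the per-statement averages reported later in Table 4. A minimal sketch of this step; the column names and sample ratings below are illustrative only, not the actual export:

```python
import csv
import io
from statistics import mean

# Hypothetical CSV export of the Google Form responses: one row per
# student, one column per statement, ratings on the 1 (totally agree)
# to 6 (strongly disagree) scale.
sample_csv = """Prepared,Clear,Examples
1,2,1
1,1,2
2,2,1
"""

def statement_averages(csv_text):
    """Mean rating per statement, rounded to two decimals as in Table 4."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return {col: round(mean(int(row[col]) for row in rows), 2)
            for col in rows[0]}

print(statement_averages(sample_csv))
```

The same aggregation is what Google Forms shows automatically in its response summary; the sketch only makes the calculation explicit.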

 

After the 8th lecture, the students and the lecturers additionally received a paper-based questionnaire to evaluate feasibility and acceptance (Tables 2 and 3).

Table 2: Questionnaire for the students to evaluate the feasibility and acceptance of the immediate SBF, with a rating scale from 1 (total agreement) to 6 (strong disagreement) according to the German school grades.

Questionnaire for students (rating scale: 1 = totally agree to 6 = strongly disagree)

  • This evaluation motivates me to take part.
  • I would prefer a paper-based evaluation.
  • I would evaluate in the same way if I could use the same evaluation in Blackboard directly after each lesson.
  • I would evaluate in the same way if I could use the same short evaluation in Blackboard at home.
  • I had technical problems with the evaluation which I could not solve.
  • Seeing the results of the evaluation is interesting for me.
  • I think it is good that the lecturers discuss with us if their evaluation was poor.
  • I have no problem discussing critical points with the lecturer.
  • I could imagine evaluating all lectures during the semester in this way.
  • I am afraid the evaluation is not anonymous.
  • I have additional comments:

Table 3: Questionnaire for the lecturers to evaluate the feasibility and acceptance of the immediate feedback by smartphones, with a rating scale from 1 (total agreement) to 6 (strong disagreement) according to the German school grades.

Questionnaire for lecturers (rating scale: 1 = totally agree to 6 = strongly disagree)

  • I felt sufficiently informed about the evaluation.
  • I felt fairly rated by the evaluation.
  • I discussed evaluations worse than 2 with the students.
  • If discussed: this discussion helped me to understand the students' problems and to optimize my lectures.
  • I prepared better for the lectures than I would have done without this evaluation.
  • I support such direct feedback in as many lectures as possible.
  • I dislike such individual evaluations of lectures.
  • I have additional comments.

The results of the immediate feedback were compared to the regular online evaluation of all lectures and courses, which is performed at the end of each semester by the educational deanery.

Results

Between 17 and 31 students took part in each SBF. As participation was voluntary and anonymous, the gender ratio cannot be given. Only one student had a technical problem, as he had no QR-scanner application.

 

Feedback for the lectures

The average ratings for all lectures were below two, indicating high agreement with the statements. This characterizes lecturers who were well prepared, on time, presented their lectures clearly with appropriate content, used examples for better understanding, and motivated cooperation, so that the students could understand the topics (Table 4).

 

Looking at the individual lectures, there is some variation from one lecture to the next, but the average feedback for a single lecture was never worse than 2.12.

Table 4: Evaluation of the eight lectures by the students' SBF.

Lecture   Encourages    Prepared,   Clearly     Understood   Used       Time    N
          cooperation   on time     presented   the topic    examples   frame   (students)
1         1.65          1.04        1.58        2.08         1.46       1.50    28
2         2.08          1.16        1.88        2.08         1.40       1.64    26
3         1.60          1.03        1.43        1.37         1.50       1.37    31
4         1.97          1.10        1.45        1.61         1.61       1.23    31
5         2.12          1.36        1.48        1.60         1.52       1.28    25
6         1.62          1.05        1.48        1.90         1.57       1.19    21
7         1.35          1.22        1.43        1.39         1.22       1.35    23
8         1.29          1.29        1.76        1.76         1.59       1.24    17
Average   1.71          1.16        1.56        1.72         1.48       1.35    25

 

Evaluation of feasibility and acceptance by students

The students felt motivated to take part in the evaluation. They had no technical problems, were interested in the results, and were motivated to discuss even poor grades with the lecturer. They did not doubt anonymity and did not prefer a paper version, but were unsure whether they would evaluate in the same way if they could do the evaluation via the learning management system "Blackboard" directly after the lecture or at home. They could imagine evaluating all lectures during the semester in this way. Some felt the need for further comments (Figure 1).

 

Figure 1: Evaluation of the acceptance and feasibility of the immediate feedback by smartphone by the students.

 

Evaluation of feasibility and acceptance by lecturers

The lecturers felt fairly graded, accepted this form of feedback, and could imagine all lectures being evaluated in this way. They disagreed that discussing poor results with the students was useful. They did not agree that they had prepared better than they would have without SBF. They felt no need for further comments, and they did not feel fully informed about the SBF (Figure 2).

 

Figure 2: Evaluation of the immediate feedback by smartphone by the lecturers.

 

Comparison with the online evaluation at the end of the semester

The voluntary online evaluation of all dental lectures and courses at the end of the semester, which is provided centrally by the educational deanery, was answered by only 8 students. It uses a rating scale from total agreement (1) to strong disagreement (5). The overall satisfaction with the prosthodontics lecture course was 1.5. Exactly the same average of all ratings was calculated for the SBF.

Discussion

The digital and technical infrastructure needed to perform a smartphone or online evaluation can be presumed to exist (Thorell et al., 2015).

 

The study shows that it is possible to use a no-cost SBF to give lecturers immediate feedback. Whereas the students were motivated to take part and to discuss the results, even giving up their anonymity, the lecturers were less willing to discuss the evaluation with the students.

 

Stokke et al. administered an online questionnaire about 445 lectures of a medical degree programme (Stokke et al., 2019). The questionnaire was answered by three representatives of each semester and sent to a study coordinator, who evaluated the free-text responses and prepared an anonymous qualitative thematic analysis. This could be discussed with the lecturers. Furthermore, a semi-structured group interview with four student representatives for different modules was performed. This direct feedback was useful for quality improvement of the lectures and proved a useful tool for dialogue and collaboration. The low willingness of the lecturers in the present study may be caused by the direct confrontation in front of the whole semester, whereas the lecturers in the study of Stokke et al. had a neutral setting and non-personalized feedback for the discussion (Stokke et al., 2019).

 

The students felt that not all aspects had been covered by the feedback statements, whereas the lecturers saw no need for further comments. The feedback questionnaire or statements should be elaborated in cooperation with the students and the lecturers, to find a focus of interest for both sides. Other possibilities could be discussed, such as emotional adjectives instead of statements (Belch and Law, 2018) or targeting factors that motivate students to follow the lecture, such as interactivity, fun/engagement, or practical importance (Jen et al., 2016). This underlines the statement of Malik and Malik that it is of great importance to integrate students into the evaluation process (Malik and Malik, 2012).

 

The comparison with the online evaluation at the end of the semester revealed a very low participation of students (23%), whereas the SBF was used by 45% to 82% (average: 66%) of the semester. The statements of the online questionnaire were not comparable with the feedback statements. Nevertheless, the overall satisfaction in both evaluation methods was the same (online evaluation: 1.5 = SBF). A similar study was performed by Bode et al., who compared immediate feedback by an audience-response system with the online evaluation conducted at the end of the semester. The audience-response system yielded a slightly better assessment (Bode et al., 2015).
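The SBF participation rates quoted above follow directly from the semester size of 38 students and the per-lecture participant counts in Table 4; a minimal arithmetic check:

```python
# 17 to 31 of the 38 enrolled students took part per lecture,
# with an average of 25 participants (Table 4).
CLASS_SIZE = 38

def participation(n):
    """Participation as a whole-number percentage of the semester."""
    return round(100 * n / CLASS_SIZE)

# Minimum, maximum, and average participation in the SBF.
print(participation(17), participation(31), participation(25))
```

This reproduces the 45% to 82% range and the 66% average, assuming rates were rounded to whole percentages.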

Conclusions

A no-cost, immediate SBF is feasible to integrate into lectures. It gives individualized feedback and could improve the quality of education. The feedback questions should be elaborated together with the students, and the lecturers should be willing to perform the feedback, to discuss it, and to integrate it into the revision of their lectures.

Take Home Messages

  • No-cost, immediate SBF is feasible.
  • It takes at least 5 minutes to perform.
  • Students are more motivated by an immediate feedback than by an evaluation at the end of the semester.
  • Students should be integrated into the elaboration of the feedback questionnaire.

Notes On Contributors

Prof. Dr. Ingrid Peroz is currently responsible for the clinical dental education in the field of Prosthodontics, Gerodontology and Craniomandibular Disorders. She received the Excellence in Dental Education Award of the Association for Dental Education in Europe in 2015.

 

Dr. Klaus Donandt was responsible for the preclinical, dental education and studied additionally medicine at the Charité – Universitätsmedizin Berlin. He is currently working in the Department of Orthopaedics and Traumatology in the Waldfriede Hospital in Berlin.

Acknowledgements

We thank the lecturers and students for their willingness to take part in this study.

 

Images: The source of figures 1 and 2 is the corresponding author.

Bibliography/References

Belch, K. and Law, S. (2018) 'Emotions as student feedback', Clin Teach, 15(6), pp. 483-487. https://doi.org/10.1111/tct.12749

Bode, S. F., Straub, C., Giesler, M., Biller, S., et al. (2015) 'Audience-response systems for evaluation of pediatric lectures - comparison with a classic end-of-term online-based evaluation', GMS Z Med Ausbild, 32(2), Doc18.

Hasekura, H., Fukushima, H. and Hiraide, K. (1990) 'Evaluation of lectures on medical ethics through students' drawings', Med Educ, 24(1), pp. 42-45. https://doi.org/10.1111/j.1365-2923.1990.tb02436.x

Jen, A., Webb, E. M., Ahearn, B. and Naeger, D. M. (2016) 'Lecture Evaluations by Medical Students: Concepts That Correlate With Scores', J Am Coll Radiol, 13(1), pp. 72-76. https://doi.org/10.1016/j.jacr.2015.06.025

Kuhnigk, O., Weidtmann, K., Anders, S., Huneke, B., et al. (2011) 'Lectures based on cardinal symptoms in undergraduate medicine - effects of evaluation-based interventions on teaching large groups', GMS Z Med Ausbild, 28(1), Doc15.

Malik, A. S. and Malik, R. H. (2012) 'Twelve tips for effective lecturing in a PBL curriculum', Med Teach, 34(3), pp. 198-204. https://doi.org/10.3109/0142159X.2011.588741

Sterz, J., Hofer, S. H., Bender, B., Janko, M., et al. (2016) 'The effect of written standardized feedback on the structure and quality of surgical lectures: A prospective cohort study', BMC Med Educ, 16(1), 292. https://doi.org/10.1186/s12909-016-0806-y

Stillman, P. L., Gillers, M. A., Heins, M., Nicholson, G., et al. (1983) 'Effect of immediate student evaluations on a multi-instructor course', J Med Educ, 58(3), pp. 172-178.

Stokke, S., Solberg, I. L., Grottum, P., Lundin, K. E. A., et al. (2019) 'A system for students' evaluation of lectures in the medical programme in Oslo', Tidsskr Nor Laegeforen, 139(11). https://doi.org/10.4045/tidsskr.19.0027

Thorell, M., Fridorff-Jens, P. K., Lassen, P., Lange, T., et al. (2015) 'Transforming students into digital academics: a challenge at both the individual and the institutional level', BMC Med Educ, 15, 48. https://doi.org/10.1186/s12909-015-0330-5

Appendices

None.

Declarations

There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0" (https://creativecommons.org/licenses/by-sa/4.0/)

Ethics Statement

Ethical approval was not sought. Nevertheless, the authors declare that they followed the Declaration of Helsinki. The study is an evaluation of an educational activity. Routinely, the evaluation is conducted online by the deanery at the end of each semester. For a pilot study of spontaneous feedback, a shortened additional evaluation was done at the end of 8 lectures by smartphones using QR codes. The responses were collected anonymously, without knowing who answered the questions. The data could not be traced to identify the responders, so no personal data of the participants could be taken. Privacy and confidentiality were maintained during the whole evaluation. Participation in the smartphone evaluation was voluntary for both students and lecturers.

External Funding

This article has not had any External Funding

Reviews


Keith Wilson - (03/01/2020)
The authors of this paper are exploring the use of smartphones in gathering more immediate feedback from students on lectures. They highlight challenges that many universities face in getting meaningful and representative feedback from students in order to improve future iterations.

Peroz and Donandt highlight the ease with which one can use Google Forms and QR codes following lectures to remove barriers to getting effective feedback. The method by which students engage in giving feedback is simple and clear. The authors admit that they did not seek ethics approval beforehand but did discuss how they followed the Declaration of Helsinki. The results section was concise and clear.

The discussion section could benefit from further elaboration of some of the results and limitations in my opinion. It would be interesting to know if the challenges the authors highlight will be addressed in future iterations of this tool at their institution. I think that further discussion of the limitations of using Likert-type scales in conveying feedback would have been helpful: it may explain why lecturers did not generally find that they were better prepared with the feedback. A hybrid solution combining Likert-type questions and actionable comments might be worth exploring – indeed the students felt that not all aspects had been covered by the feedback statements. It would be interesting to know some of the content of the discussions (e.g. were specific suggestions given for improvement?) when discussion did take place.

Many thanks go to the authors for sharing this very accessible method that could be picked up by any university. We all try to give students timely feedback; this study gives faculty another tool to do so.
P Ravi Shankar - (15/12/2019)
This manuscript could be seen as an extension of the use of audience response systems toward providing immediate feedback about lectures. The description is clear and the system seems easy to use. The major benefit could be the immediacy and relevance of the feedback: the lecture has just been completed and feedback can be provided. The drawback is that students may not be completely candid about providing negative feedback. The lecturers may also feel uncomfortable about this, as the authors have mentioned.
The end of semester feedback provides an overview of the performance of the lecturer during the semester and has its own value. It also addresses other areas in addition to the lectures. With current technology using these systems entails no extra cost. The article will be of interest to all medical educators.