Description of a new education method or tool
Open Access

Using an online student response system, Socrative, to facilitate active learning of Physiology by first year graduate entry to medicine students: a feasibility study

Mark G. Rae[1], Dervla O'Malley[2]

Institution: 1. University College Cork, 2. University College Cork
Corresponding Author: Dr Mark G. Rae m.rae@ucc.ie
Categories: Research in Medical Education

Abstract

Technologies such as audience response units (‘clickers’) have been used to facilitate greater student engagement within a variety of educational settings, but numerous technical issues have limited their more widespread use. More recently, flexible, cloud-based student response systems (SRSs), which are designed for use with students' own mobile devices and overcome most of the limitations of clicker systems, have become widely available. However, the suitability of such systems for use in accelerated degree programmes such as graduate entry to medicine (GEM) has yet to be assessed. We therefore utilised Socrative, a freely available SRS, in the physiology component of a first year GEM module to ascertain, a) its ease of deployment, b) its popularity with students and, c) whether students felt it improved their learning. There were no technical problems using Socrative. Further, 93% of respondents to an attitudinal survey strongly agreed or agreed that they favoured using Socrative in the classroom, and 92% that it had improved their learning. Thus, our data strongly indicate that the use of SRSs such as Socrative would be highly valued even by the time-pressured, relatively mature students enrolled on accelerated professional courses such as GEM.

Keywords: graduate entry to medicine, Socrative, attendance, pre-clinical physiology, active learning

Introduction

Prior to the advent of Web-based teaching tools around 2011, educators who wished to gather real-time, in-class information on student understanding or opinion could either simply ask students to raise their hands in response to a question or utilise a computerised audience response (‘clicker’) system. The benefit to the student of the latter over the former, more traditional, technique is that, as responses are generally anonymous, students are more likely to, a) answer questions in the first place and, b) answer them honestly, without fear of answering incorrectly or of supporting an unpopular opinion (Cain and Robinson 2008, Lantz 2010, Liu and Taylor 2013).

Hardware-based clicker systems typically consist of three elements: presentation software such as PowerPoint, receiver hardware, and response devices (the clickers themselves). They have been used fairly widely within higher education for a variety of purposes and have generally been popular with students. Most likely because they encourage students to become active learners through knowledge application and/or cooperative learning, they have also provided demonstrable improvements in student engagement, learning and exam performance (Cain and Robinson 2008, Caldwell 2007, Keough 2012, Lantz 2010, Liu and Taylor 2013, Michael 2006, Vicens 2013). However, in spite of the general popularity and educational benefits of these systems, the cost of the hardware and software required to run them, together with the technical issues associated with their use (e.g. the time required to set up and dismantle the system for each lecture, testing/registering of clickers, handing out and collecting clickers, non-functional clickers, insufficient numbers of clickers, etc.) (Barnett 2006, Caldwell 2007, Hoffman and Goodwin 2006, Keough 2012, Liu and Taylor 2013), has deterred many potential users, ourselves included, from utilising them routinely in classes.

Recently, however, a variety of cloud-based online student response systems (SRSs) have emerged (e.g. commercial offerings such as Go Soapbox (www.gosoapbox.com), Poll Everywhere (www.polleverywhere.com) and Learning Catalytics (https://learningcatalytics.com), as well as free systems such as Quiz Socket (www.quizsocket.com), Kahoot (https://getkahoot.com) and Socrative (http://socrative.com)) which, at face value at least, promise to circumvent many of the aforementioned drawbacks of the hardware-based systems. The primary advantage of these SRSs over the older ‘clicker’ systems is that they can be accessed anywhere by students using their own mobile devices, provided that there is an adequate WiFi signal. Furthermore, the requirement for students to purchase, or be given, their own ‘clicker’ is obviated by students’ near-ubiquitous ownership of smartphones, tablets or laptops.

Of the SRSs mentioned above, the free, cloud-based system Socrative currently appears to be the most widely adopted in higher education, with several recent studies from diverse educational areas all describing very positive experiences of its use (Awedh et al. 2015, Coca and Ciesielkiewicz 2014, Dervan 2014, Kaya and Balta 2016, Liu and Taylor 2013, Wash 2014). All highlighted the fact that, in addition to providing a very user-friendly interface with which to set up quizzes, it also offers several question formats and provides immediate, downloadable feedback. Equally importantly, as it is entirely Web-based, the Socrative site can be viewed in any browser without the need to install a specialised app.

For these reasons, Socrative seemed to us to be an attractive replacement for the clicker audience response system we had used previously. However, before fully endorsing this new SRS and proposing its use by other lecturers, we wanted to run an initial feasibility study to determine how students would react to it. For example, although today’s students have a preference for, and a desire to utilise, technology during the learning process (so-called ‘Digital Natives’; Prensky 2010), we wanted to know whether they might perceive Socrative as merely a gimmick without any sound pedagogical benefit. As Dervan (2014) observed, educators using new technology for teaching “can be perceived to be more focused on technology than with teaching”, to the annoyance and distraction of students, particularly if technical issues overshadow the technology. We also wanted to ensure that Socrative could be employed in a relatively seamless manner within the classroom, such that vital teaching/learning time was not wasted either setting up or troubleshooting, both of which can be troubling for students (Stuart et al. 2004). Finally, as Knight and Wood (2005) noted, many students are still incorrectly of the opinion that meaningful learning can only take place whilst the lecturer is talking, and that time spent on other classroom activities, such as interactive quizzes and discussions, is both distracting and a waste of time. This latter point is a particular concern for us in teaching students enrolled upon the graduate entry to medicine (GEM) program here at University College Cork (UCC). Entry to the GEM program requires a minimum of a second class honours, grade one (2H1, or equivalent) result in a first honours undergraduate degree (from any discipline), and the program lasts only four years, rather than the five years of traditional ‘direct-entry’ undergraduate medical (DEM) degrees (Duggan et al. 2014). This is primarily accomplished by compressing the pre-clinical teaching on GEM programs into just over one year, rather than two, which means that GEM students must learn the same amount of material as their DEM counterparts in effectively half the time and must therefore manage their study time very efficiently. As such, they do not appreciate, and may not attend, teaching activities which they perceive to be of little value or relevance to them.

In the present study therefore, we sought to answer the following research questions:

  1. What were the attitudes of the first year GEM students towards Socrative as a teaching/learning tool?
  2. Did attitudes differ across gender?
  3. Did attitudes differ depending on educational background (i.e. whether or not the students’ first degree was in a so-called ‘biomedical’ versus a ‘non-biomedical’ discipline)?

Method

The study was undertaken at University College Cork (UCC), Ireland as part of the Physiology component of a fourteen week pre-clinical module, GM1002 (Fundamentals of Medicine II), which also comprises Anatomy, Microbiology, Pathology and Pharmacology components (N.B. to date, the author (M.G.R.) is the only lecturer to have used Socrative during lectures and/or practical sessions on the GEM program and therefore the students had not had any prior exposure to this SRS).

For this study, we prepared eight quizzes in total using the Socrative website. Questions were of a single best answer format, in which a question stem could be answered by selecting one of up to five possible answers, only one of which was correct. The quizzes comprised between four and seven questions, each focussing on different elements of respiratory physiology and acid-base balance covered in six lectures and four practical sessions within this module. The questions used for the study were a mixture of pure recall and application type questions, consistent with the format and difficulty of those used in the students’ formal examinations. As the point of the exercises was to encourage student participation and discussion without ‘exam pressure’, no summative marks were awarded for performance. Indeed, to further encourage participation, students were free to log in to the Socrative site anonymously if they so wished. The quizzes were usually deployed towards the end of each lecture and/or practical class and related primarily to material that had been taught in that session. On two occasions, however, it was felt that there would be insufficient time to run the Socrative quizzes, and so they were held at the start of the next available teaching session, be it a lecture or a laboratory practical.

At the end of GM1002 in March 2015, upon completion of the six lectures and four practical sessions included in the study, students were asked to complete an anonymous online quantitative and qualitative survey containing multiple Likert-style questions (five-point balanced scale: strongly agree, agree, neither agree nor disagree, disagree, strongly disagree), as well as several free-response questions, relating to their Socrative use. Participants provided their consent to being surveyed when they accessed and completed the survey.

Evaluation and Analyses

In order to avoid unintentional bias affecting analysis of respondents’ data by the lecturer delivering the taught material and Socrative exercises in class (MGR), responses to the attitudinal survey were gathered, analysed and, in the case of the qualitative responses, categorised by one of the authors (DoM), who was not involved in the teaching, examining or organisation of the GEM programme. Further, DoM had no contact with any of the students enrolled on the programme. From the data captured by the attitudinal survey we obtained basic demographic information (e.g. age, sex, nationality, educational background) and self-reported attendance. Where relevant, we were able to pair this with the information provided in the four Likert-type questions relating to students’ perceptions of Socrative and of the use of questions in lectures. Responses to the Likert-type questions were further sub-categorised into, a) male versus female and, b) biomedical educational background (all students, male and female) versus non-biomedical educational background (all students, male and female). Thus, in the manuscript, data derived from the Likert-style questions are displayed both as percentage values in histograms and as means, where numerical values have been attributed to the Likert responses (e.g. strongly agree = 5, strongly disagree = 1). In this way we were able to make statistical comparisons between values for the different sub-categories of students in the study using, unless otherwise stated, Student’s unpaired t-test (assuming equal variance between the groups). P < 0.05 was considered significant.
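As an illustration of this analysis, the short sketch below shows how Likert responses can be converted to the 1 to 5 numerical scale, summarised as a mean with its standard error, and compared between two subgroups with an unpaired, equal-variance Student's t-test at the P < 0.05 threshold. It is a minimal example written in Python using the SciPy library; the response lists are hypothetical placeholders, not the study's actual data or analysis code.

# Minimal sketch of the Likert analysis described above.
# The response lists below are hypothetical placeholders, not the study data.
import statistics
from scipy import stats

LIKERT_SCORES = {
    "strongly disagree": 1,
    "disagree": 2,
    "neither agree nor disagree": 3,
    "agree": 4,
    "strongly agree": 5,
}

def encode(responses):
    """Convert text Likert responses to their 1-5 numerical values."""
    return [LIKERT_SCORES[r.lower()] for r in responses]

def summarise(scores):
    """Return the mean and standard error of the mean (SEM) for a subgroup."""
    return statistics.mean(scores), statistics.stdev(scores) / len(scores) ** 0.5

# Hypothetical responses from two subgroups to a single survey statement.
group_a = encode(["strongly agree", "agree", "agree", "strongly agree",
                  "neither agree nor disagree", "agree"])
group_b = encode(["agree", "strongly agree", "agree", "agree", "disagree", "agree"])

# Unpaired (independent-samples) Student's t-test assuming equal variance.
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=True)

for name, scores in (("group A", group_a), ("group B", group_b)):
    mean, sem = summarise(scores)
    print(f"{name}: mean = {mean:.2f} +/- {sem:.2f}")
print(f"t = {t_stat:.2f}, P = {p_value:.3f}, significant = {p_value < 0.05}")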

Results

Student demographics

For the current study 65 out of 70 (93%) eligible students in GEM year 1 completed the survey towards the end of March 2016.

Table 1 presents demographic data for the respondents and illustrates a roughly even split between males and females (with no significant difference in their ages) and between those coming from a European Union (EU) state (mainly Ireland) and those from a non-EU state (mainly Canada). There were, however, significantly fewer entrants to the GEM program whose first degree was classified (by the students themselves) as ‘non-biomedical’ (e.g. law, music, English, etc.) relative to those coming from a ‘biomedical’ undergraduate degree or discipline (e.g. Biochemistry, Physiology, biological research, etc.). Of the 65 students who responded to the survey, a minimum of 64 answered all of the questions referred to in this study. Attendance at the six lectures and four practical sessions which were the subject of the current study was excellent, with 85.9% (55 students) attending all six lectures and the remaining nine (one student did not answer this question) attending at least five.

 

Student perceptions of Socrative

Students were overwhelmingly positive about their experience of using Socrative, as measured both quantitatively and qualitatively. For example, in response to the statement, ‘I liked using Socrative to answer questions during lectures/practicals’, 95.3% (61/64 students; 1 unanswered) either strongly agreed or agreed (mean = 4.55 ± 0.07). When these responses were broken down by sex and/or educational background there were no statistically significant differences between groups, illustrating the almost universal popularity of the quizzes with the class (see figure 1A).

Although this result seemed to be a ringing endorsement of Socrative by the students, we wanted to ensure that they were also able to express their opinion on in-class questioning in general, in case their positive view of Socrative had been swayed by the novelty of this new tool. However, responses to the statement, ‘I would prefer NOT to have to answer questions at all during lectures’, supported the previous result in that 92.2% (59/64 students, 1 unanswered) disagreed or strongly disagreed with the statement (mean = 1.57 ± 0.08), with none agreeing or strongly agreeing. It is interesting to note, however, that the percentage of students who disagreed or strongly disagreed with the statement was in excess of 90% for all subcategories of students described above except for female non-biomedical students, where it fell to 75% (mean = 2 ± 0.27, n=8). This was significantly different from the all-male (mean = 1.5 ± 0.1, n=35, P < 0.05), female biomedical (mean = 1.5 ± 0.13, n=33, P < 0.05) and male biomedical (mean = 1.5 ± 0.11, n=27, P < 0.05) subgroups. However, given the relatively low number of female non-biomedical students (eight), in reality this simply reflects the fact that two of those students selected the ‘neither agree nor disagree’ option for that particular question (see figure 1B).

In an attempt, again, to ensure that any positive student sentiment towards Socrative was for pedagogical reasons rather than the novelty of new technology (Blood and Neel 2008), we sought to ascertain whether students felt that using Socrative had actually helped them to learn, which was ultimately one of the main purposes of the exercise. In response to the statement, ‘Using interactive software like Socrative improves the overall learning experience’, 92.2% (60/65 students) strongly agreed or agreed, with none disagreeing or strongly disagreeing (mean = 4.49 ± 0.08). When the responses were broken down into their various subcategories, it was notable that all sixteen of the non-biomedical students strongly agreed or agreed with the statement (mean = 4.5 ± 0.12, n=16), whereas female biomedical students polled the lowest percentage strongly agreeing or agreeing with the statement (86.4%; 19/22 students; mean = 4.5 ± 0.16, n=22). Statistically, however, there were no significant differences between these or any other subgroups (see figure 2A).

One final Likert-style question in the survey asked students to respond to the statement, ‘I would have preferred Dr Rae just telling us the answers to questions rather than using Socrative’, in order to ensure that we were obtaining student impressions of Socrative itself rather than simply, for example, their opinion of gaining access to exam-style questions. Overall, 60/65 students (92.2%) disagreed or strongly disagreed with the statement, with only one (female biomedical) student agreeing with it (mean = 1.77 ± 0.08). Similar percentage values were found across all subcategories, although the largest percentage of students who neither agreed nor disagreed with the statement was found among the male biomedical students (11.1%, 3 students; mean = 1.7 ± 0.13, n=27; see figure 2B).

 

Student Qualitative Responses Regarding Socrative

In the free essay component of the survey, students were invited to provide comments about their use, and impressions, of Socrative in lectures/practicals.

Fifty-nine of the 65 respondents supplied comments, only one of which was unrelated to the use of Socrative and was therefore excluded from the analysis.

54/58 (93.1%) comments were designated as being positive about Socrative; representative examples included:

Very much liked the use of Socrative - would like to see it employed in all classes.

I think it's good because it challenges me to actively think about the concept I just digested in the lecture and is still fresh in my mind.

I thoroughly like the use of this software in lectures. Learning from examples (e.g. questions) is one of the ways that I am able to retain the information that has been taught. I perform better if I've had some experience answering questions rather than just listening to a lecture.

Socrative is very useful as it makes you apply your knowledge on the spot. Discussion of the answers is also useful for learning the lecture material.

A further 19/58 (32.8%) comments related to how the anonymous nature of the quizzes encouraged students to answer questions in class, for example:

Being anonymous allows students to answer without repercussion [sic], allowing some students who are unsure of their answers to take risks and learn from their mistakes

The anonymity of answering with socrative definitely makes it much easier to answer questions.

I think the anonymous nature is perfect and does encourage you to answer all the questions.

Eighteen students (31%) also explicitly stated that they wished Socrative would be utilised by other disciplines/lecturers on the GEM course.

Finally, but importantly, four students (6.9%) specifically referred to the fact that they enjoyed the collaborative nature of Socrative exercises.

Socrative was a very good way to help establish concepts we just went over in lecture, but also to discuss with neighboring [sic] classmates and guage [sic] my level of understanding relative to the class.

I took away a lot of info just from doing the socrative questions and being able to discuss why answers were correct and incorrect. I also liked working out the answers with peers in class.

Although no comments were designated as being explicitly negative about Socrative per se, two students (3.4%) felt that the time provided for the class to answer each question was too long (approximately 60 seconds per question, or longer if the author felt the question was more challenging than usual or required a calculation). Another three students (5.2%) stated that, depending on the room in which the quizzes took place, poor internet connectivity could hinder both their ability to log in to Socrative and the logging of their answers by the website.

Discussion

The growth of accelerated graduate entry programs within Ireland and the UK, and the attendant time pressures placed on students enrolled upon such courses, is increasingly challenging traditional approaches to teaching and learning. This is particularly true with regard to the role of traditional, passive, didactic lectures which, in spite of the large body of evidence indicating that they are a relatively ineffective way of maximising student learning and understanding (Bransford et al. 2000, Frederick 1986, Stuart and Rutherford 1978), are still the primary mode of ‘information transfer’ utilised by most institutions of higher education. Indeed, their function becomes even less clear when one considers that students can now easily access a wealth of useful, and often better, educational information on the internet and, increasingly, can also view recorded lectures (Cardall et al. 2008, Chandra 2011, Dupagne et al. 2009, McNulty et al. 2009, Schreiber et al. 2010, Soong et al. 2006, Traphagan et al. 2010, Vaccani et al. 2016, Young 2008). With these thoughts in mind, many educators are now utilising their relatively limited face-to-face time with students in a much more student-focussed manner in order to maximise learning and understanding. This is primarily achieved by encouraging active student participation within lectures, for example by having students apply new knowledge or participate in discussions, which has well-described and proven pedagogical benefits to student learning and exam performance over and above passive listening to entirely didactic lectures (see inter alia Beichner and Saul 2003, Crouch and Mazur 2001, Deslauriers et al. 2011, Gannod et al. 2008, Hake 1998, Knight and Wood 2005, Smith et al. 2009).

However, a particular difficulty for any educator interested in delivering ‘active’ learning lectures lies in identifying a learning activity or technique that both fits the particular demands of the course(s) upon which they teach and obtains sufficient student ‘buy-in’ for it to work. For example, previous attempts by us to ‘flip the classroom’ (Bergmann and Sams 2012, Betihavas et al. 2016, Crouch and Mazur 2001, Gillispie 2016, Morgan et al. 2015, O'Flaherty and Phillips 2015, Pierce and Fox 2012, Sharma et al. 2015, Street et al. 2014, Tolks et al. 2016), whereby GEM students were asked to study material before attending class so that scheduled lecture time could be utilised more fruitfully for questions and discussion, were not particularly popular with these students. The two main reasons given for their dislike of this teaching method were that, a) they did not have enough time to prepare in advance for lectures and, b) they still wanted ‘normal’ lectures (in spite of the aforementioned pedagogical imperfections of the latter).

Therefore, given these particular constraints, we had to devise a strategy which would allow us to apply the principles of active learning within lectures to increase student participation, but without the students perceiving either that they were receiving ‘extra’ work or that the format of their lectures had changed significantly. To this end, we hypothesised that utilisation of the cloud-based SRS, Socrative, in lectures would provide just such a solution. Further, it had the added attraction to the students of being a novel, hi-tech tool accessible from their own mobile devices, which we felt would strongly appeal to this generation’s innate desire to utilise technology during the learning process (Prensky 2010).

The results presented herein appear to bear out our initial hypotheses and concur with previous studies using this SRS (Awedh, Mueen, Zafar and Manzoor 2015, Coca and Ciesielkiewicz 2014, Dervan 2014, Kaya and Balta 2016, Liu and Taylor 2013, Wash 2014). For instance, the utilisation of Socrative did indeed prove to be a very simple way of increasing student participation within lectures which, importantly, the students themselves viewed very positively. From the educator’s standpoint, its deployment did not significantly reduce the amount of course content which could be covered in each session. Furthermore, student approval was almost universal across gender and educational background subgroupings, albeit with a small, but significant, decrease in rating by the female non-biomedical subgroup. Equally importantly, the vast majority of students felt that it improved their learning, and therefore clearly did not view its use as a simple ‘gimmick’ employed by the lecturer for their amusement. Indeed, nearly a third of the class explicitly stated that they would like to see it employed by other disciplines teaching on the UCC GEM program.

Although we were unable to test this perception of improved learning empirically, numerous studies support a connection between increased student interactivity and learning outcomes (Cain and Robinson 2008, Caldwell 2007, Keough 2012, Lantz 2010, Liu and Taylor 2013, Michael 2006, Vicens 2013).

One additional benefit of using Socrative, also observed by Awedh et al. (2015), was its facilitation of in-class discussion. Significantly, in-class discussion, in and of itself, has been shown to improve student learning (Knight and Wood 2005, Liu and Taylor 2013, Smith, Wood, Adams, Wieman, Knight, Guild and Su 2009). This was an interesting observation for us, as we had originally expected each student to log in and answer individually (although the lecturer (MGR) did not make this mandatory). However, it very soon became clear that not all students were logging in to Socrative; instead, small ad hoc discussion groups formed organically, with only one member of each group actually logged in. On some occasions there were as few as 22 respondents to a Socrative quiz, even though it was clear that virtually the whole class was fully engaged with the exercise. Because in-class discussion also frequently took place after the correct answer had been revealed, particularly if the class had collectively answered the question poorly, a useful two-way learning experience was opened up (Bates et al. 2006, Burnstein and Lederman 2001, Liu and Taylor 2013). Thus, the lecturer (MGR) could work through the problem with the students so that they understood why a particular option was correct, while also identifying areas where the initial explanation had not been clear, thereby enabling improvements to future teaching.

Impediments to using Socrative

One of the main reservations about using SRS tools such as Socrative is that course content would have to be sacrificed as a logical consequence of devoting more time to questions and discussions. A counter-argument is that students benefit more from an in-depth understanding of underlying concepts, and from learning to solve problems, than from rote learning of facts. Although in a broad educational sense this is undoubtedly correct, for certain professional courses, such as pre-clinical medicine, elements of the prescribed curriculum cannot be omitted simply because there has not been enough time to teach them. The suggestion that the responsibility for learning extra material outside of class should lie with students (Bates, Howie and Murphy 2006, Knight and Wood 2005) is not an unreasonable one, but it has to be gauged against the students’ existing workload for their other classes. For example, our own attempt at semi-flipped classroom teaching on an accelerated GEM course, whereby students were expected to watch specified material before each class and then be questioned on, or discuss, it in class, largely failed because many students, already struggling just to keep up to date with the content of their daily lectures, resented the imposition of additional ‘homework’. The response of over a third of the class was simply to stop attending Physiology lectures as, without having done the preparatory work, they felt left out of the interactive sessions.

However, as described herein, using Socrative to interrogate students’ knowledge of specific concepts, or their ability to apply those concepts, with a small number of specifically targeted questions can be achieved with relatively minor tweaks to pre-existing lectures. For example, with regard to my own lectures, I found myself eliminating superfluous information that had been retained simply because the lectures already filled their allocated fifty-minute slot.

Student wear-out

One other factor that might affect student positivity towards systems such as Socrative is student ‘wear-out’; that is, students might simply become bored of this type of SRS once the initial novelty wears off. However, Wang (2015), in the only study that we are aware of to have examined this potential problem, albeit with the game-based SRS Kahoot, found no fall-off in ‘student engagement, motivation or learning’ even after using it continuously for five months. As the lack of student wear-out with that system was attributed to the competitive nature of Kahoot, however, it remains to be seen whether Socrative will prove as resistant to student habituation.

Limitations of the study

There are three principal limitations to our current study. Firstly, the data were obtained from one relatively small cohort of students, within a single taught module, attending a particular medical school. Thus, in the absence of further studies, it is difficult to extrapolate our findings to other GEM programmes. Secondly, as this study was carried out principally to determine the feasibility of using cloud-based SRSs within the GEM classroom, we did not select a randomised control group to experience the same classes without the Socrative quizzes. However, prior to the module in which Socrative was employed, the same students surveyed for the current study had completed another fourteen week ‘basic’ sciences module (GM1001) in which the Physiology content was delivered in a traditional didactic format, without the use of Socrative. Indeed, most students on the GEM program entered the course having completed undergraduate degrees in which didactic lectures are the norm. Additionally, prior to commencing the present study, none of the students in the class, when asked, indicated that they had previously even heard of Socrative. Thus, students’ extensive experience of lectures without Socrative effectively enabled them to act as their own controls (i.e. the vast majority clearly indicated that they preferred lectures in which Socrative was, rather than was not, used).

Finally, although students perceived that using Socrative improved their understanding of the taught material, we did not empirically assess whether this actually translated into an improvement in examination performance. Although such an assessment is a future study goal, it is likely that we would continue to use Socrative as a teaching tool almost irrespective of the outcome (unless it significantly negatively affected examination scores), such was its popularity and success in engaging students within lectures.

Future plans

Given the success of this initial pilot study, we plan to utilise Socrative in all future Physiology lectures on the GEM course, possibly with some minor alterations in its delivery. For example, given the reported attention spans of only seven minutes (Baker et al. 2011), or even shorter (Bunce et al. 2010), of the current generation of learners, a useful approach would be to break the lecture itself up into ‘digestible blocks’, each separated by a question or two relating to the section just completed (Liu and Taylor 2013), as an aid to consolidating that material.

Equally, it may be instructive to run Socrative quizzes at the start of lectures, either to assess students’ understanding of the material to be presented so that the lecture can be tailored accordingly, or to test retention of material from previous teaching sessions. In addition, as described by Liu and Taylor (2013), Socrative could also be used to gather immediate student feedback on the teaching session itself, such that subsequent lectures could be revised or modified to reflect the feedback received.

Finally, as discussed above, we wish to assess whether the deployment of Socrative within classes actually results in a measurable improvement in grades.

Conclusions

The evidence presented here strongly supports the use of SRSs such as Socrative as an aid to enhancing perceived student learning outcomes even in accelerated graduate entry programs such as GEM. Whereas lecturers might have previously been concerned that the use of such technology would be perceived as gimmicky or a waste of time by the more mature students enrolled upon these types of professional courses, who will also have generally come from previous degrees where traditional course structures prevail (Knight and Wood 2005), our study suggests that, in contrast, these students wholeheartedly embraced this increased classroom interactivity. However, as discussed by others, in order to be effective, SRSs like Socrative and, more specifically, the questions that are utilised to engage students, need to be designed appropriately such that the technology always supports, rather than distracts from, pedagogy (Gray and Steer 2012, Liu and Taylor 2013, Watson 2001).

Take Home Messages

1) Little is known about how graduate entry to medicine students perceive the use of cloud-based student response systems like Socrative in the classroom.

2) The integration of Socrative into lectures in a pre-clinical physiology module was overwhelmingly popular with students.

3) Students indicated that Socrative use improved their understanding of the taught material, even though this was not empirically tested here.

Notes On Contributors

Mark G. Rae is a lecturer/principal investigator in the Department of Physiology at UCC and holds a Ph.D. in Physiology and a Master's degree in Teaching and Learning in Higher Education.

Dervla O’Malley is a lecturer/principal investigator in the Department of Physiology at UCC and holds a Ph.D. in Neuroscience and a Diploma in Teaching and Learning in Higher Education.

Acknowledgements

The authors wish to thank and acknowledge the UCC GEM students who completed surveys for this study, without whom we would not have been provided with such useful and interesting data.

Bibliography/References

Awedh M, Mueen A, Zafar B, Manzoor U. 2014. Using Socrative and Smartphones for the support of collaborative learning. International Journal on Integrating Technology in Education (IJITE) 3(4):17-24.

https://doi.org/10.5121/ijite.2014.3402  

Baker R, Matulich E, Papp R. 2011. Teach Me In The Way I Learn: Education And The Internet Generation. Journal of College Teaching & Learning (TLC). 4(4).

Barnett J. 2006. Implementation of personal response units in very large lecture classes: Student perceptions. Australasian Journal of Educational Technology. 22(4):474-494.

https://doi.org/10.14742/ajet.1281  

Bates SP, Howie K, Murphy ASJ. 2006. The use of electronic voting systems in large group lectures: challenges and opportunities. New Directions in the Teaching of Physical Sciences. 2(1):8.  

Beichner RJ, Saul JM. 2003. Introduction to the SCALE-UP (student-centred activities for large enrollment undergraduate programs) project. Proceedings of the International School of Physics "Enrico Fermi," Varenna, Italy.  

Bergmann J, Sams A. 2012. Flip Your Classroom. Reach Every Student in Every Class Every Day Washington, DC: International Society for Technology in Education.  

Betihavas V, Bridgman H, Kornhaber R, Cross M. 2016. The evidence for 'flipping out': A systematic review of the flipped classroom in nursing education. Nurse Education Today. 38:15-21.

https://doi.org/10.1016/j.nedt.2015.12.010  

Blood E, Neel R. 2008. Using student response systems in lecture-based instruction: Does it change student engagement and learning? Journal of Technology and Teacher Education. 16(3):375.  

Bransford JD, Brown AL, Cocking RR. 2000. How People Learn: Brain, Mind, Experience, and School. Washington, D.C.: National Academies.

Bunce DM, Flens EA, Neiles KY. 2010. How long can students pay attention in class? A study of student attention decline using clickers. Journal of Chemical Education. 87(12):1438-1443.

https://doi.org/10.1021/ed100409p   

Burnstein RA, Lederman LM. 2001. Using wireless keypads in lecture classes. The Physics Teacher. 39(1):8-11.

https://doi.org/10.1119/1.1343420   

Cain J, Robinson E. 2008. A primer on audience response systems: current applications and future considerations. American Journal of Pharmaceutical Education. 72(4):77.

https://doi.org/10.5688/aj720477   

Caldwell JE. 2007. Clickers in the large classroom: current research and best-practice tips. CBE Life Sciences Education. 6(1):9-20.

https://doi.org/10.1187/cbe.06-12-0205   

Cardall S, Krupat E, Ulrich M. 2008. Live lecture versus video-recorded lecture: are students voting with their feet? Academic Medicine. 83(12):1174-1178.

https://doi.org/10.1097/ACM.0b013e31818c6902   

Chandra S. 2011. Experiences in personal lecture video capture. IEEE Transactions on Learning Technologies. 4(3):261-274.

Coca DM, Ciesielkiewicz M. 2014. The Use of Smartphones in Primary Education: Improving Arithmetic Calculations in Pre-service Primary Teacher Training. US-China Education Review. 403.

Crouch CH, Mazur E. 2001. Peer instruction: Ten years of experience and results. American Journal of Physics. 69(9):970-977.

https://doi.org/10.1119/1.1374249   

Dervan P. 2014. Increasing in-class student engagement using Socrative (an online Student Response System). AISHE-J: The All Ireland Journal of Teaching and Learning in Higher Education. 6(3):1801-1813.   

Deslauriers L, Schelew E, Wieman C. 2011. Improved learning in a large-enrollment physics class. Science. 332(6031):862-864.

https://doi.org/10.1126/science.1201783  

Duggan EM, O'Tuathaigh CMP, Horgan M, O'Flynn S. 2014. Enhanced research assessment performance in graduate vs. undergraduate-entry medical students: implications for recruitment into academic medicine. QJM: An International Journal of Medicine. 107(9):735-741.

https://doi.org/10.1093/qjmed/hcu064   

Dupagne M, Millette DM, Grinfeder K. 2009. Effectiveness of video podcast use as a revision tool. Journalism and Mass Communication Educator. 64(1):54-70.

https://doi.org/10.1177/107769580906400105  

Frederick PJ. 1986. The lively lecture—8 variations. College Teaching. 34:43-50.

https://doi.org/10.1080/87567555.1986.9926766   

Gannod GC, Burge JE, Helmick MT. 2008. Using the inverted classroom to teach software engineering. Proceedings of the 30th International Conference on Software Engineering. ACM; 777-786.

https://doi.org/10.1145/1368088.1368198   

Gillispie V. 2016. Using the Flipped Classroom to Bridge the Gap to Generation Y. The Ochsner Journal. 16(1):32-36.   

Gray K, Steer DN. 2012. Personal response systems and learning: It is the pedagogy that matters, not the technology. Journal of College Science Teaching. 41(5):80-88.   

Hake RR. 1998. Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics. 66(1):64-74.

https://doi.org/10.1119/1.18809   

Hoffman C, Goodwin S. 2006. A clicker for your thoughts: Technology for active learning. New Library World. 107(9/10):422-433.

https://doi.org/10.1108/03074800610702606   

Kaya A, Balta N. 2016. Taking Advantages of Technologies: Using the Socrative in English Language Teaching Classes. International Journal of Social Sciences & Educational Studies. 4.

Keough SM. 2012. Clickers in the Classroom: A Review and a Replication. Journal of Management Education. 36(6):822-847.

https://doi.org/10.1177/1052562912454808   

Knight JK, Wood WB. 2005. Teaching more by lecturing less. Cell Biology Education. 4(4):298-310.

https://doi.org/10.1187/05-06-0082   

Lantz ME. 2010. The use of 'clickers' in the classroom: Teaching innovation or merely an amusing novelty? Computers in Human Behavior. 26(4):556-561.

https://doi.org/10.1016/j.chb.2010.02.014   

Liu DY, Taylor CE. 2013. Engaging students in large lectures of introductory biology and molecular biology service courses using student response systems. Proceedings of The Australian Conference on Science and Mathematics Education (formerly UniServe Science Conference); 154-162.

McNulty JA, Hoyt A, Gruener G, Chandrasekhar A, Espiritu B, Price R, Naheedy R. 2009. An analysis of lecture video utilization in undergraduate medical education: associations with performance in the courses. BMC Medical Education. 9(1):6.

https://doi.org/10.1186/1472-6920-9-6   

Michael J. 2006. Where's the evidence that active learning works? Advances in Physiology Education. 30(4):159-167.

https://doi.org/10.1152/advan.00053.2006   

Morgan H, McLean K, Chapman C, Fitzgerald J, Yousuf A, Hammoud M. 2015. The flipped classroom for medical students. The Clinical Teacher. 12(3):155-160.

https://doi.org/10.1111/tct.12328   

O'Flaherty J, Phillips C. 2015. The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education. 25:85-95.

https://doi.org/10.1016/j.iheduc.2015.02.002   

Pierce R, Fox J. 2012. Vodcasts and active-learning exercises in a "flipped classroom" model of a renal pharmacotherapy module. American Journal of Pharmaceutical Education. 76(10):196.

https://doi.org/10.5688/ajpe7610196   

Prensky MR. 2010. Teaching digital natives: Partnering for real learning: 1st Ed. Thousand Oaks, CA: Corwin Press.   

Schreiber BE, Fukuta J, Gordon F. 2010. Live lecture versus video podcast in undergraduate medical education: A randomised controlled trial. BMC Medical Education. 10(1):68.

https://doi.org/10.1186/1472-6920-10-68  

Sharma N, Lau CS, Doherty I, Harbutt D. 2015. How we flipped the medical classroom. Medical Teacher. 37(4):327-330.

https://doi.org/10.3109/0142159X.2014.923821   

Smith MK, Wood WB, Adams WK, Wieman C, Knight JK, Guild N, Su TT. 2009. Why peer discussion improves student performance on in-class concept questions. Science. 323(5910):122-124.

https://doi.org/10.1126/science.1165919   

Soong SKA, Chan LK, Cheers C, Hu C. 2006. Impact of video recorded lectures among students. Proceedings of the 23rd annual ascilite conference: Who's learning? 789-793.

Street SE, Gilliland KO, McNeil C, Royal K. 2014. The Flipped Classroom Improved Medical Student Performance and Satisfaction in a Pre-clinical Physiology Course. Medical Science Educator. 25(1):35-43.

https://doi.org/10.1007/s40670-014-0092-4   

Stuart J, Rutherford RJ. 1978. Medical student concentration during lectures. Lancet (London, England). 312(8088):514-516.

https://doi.org/10.1016/S0140-6736(78)92233-X  

Stuart SAJ, Brown MI, Draper SW. 2004. Using an electronic voting system in logic lectures: one practitioner's application. Journal of Computer Assisted Learning. 20(2):95-102.

https://doi.org/10.1111/j.1365-2729.2004.00075.x   

Tolks D, Schafer C, Raupach T, Kruse L, Sarikas A, Gerhardt-Szep S, Kllauer G, Lemos M, Fischer MR, Eichner B, et al. 2016. An Introduction to the Inverted/Flipped Classroom Model in Education and Advanced Training in Medicine and in the Healthcare Professions. GMS Journal for Medical Education. 33(3):Doc46.   

Traphagan T, Kucsera JV, Kishi K. 2010. Impact of class lecture webcasting on attendance and learning. Educational Technology Research and Development. 58(1):19-37.

https://doi.org/10.1007/s11423-009-9128-7   

Vaccani J-P, Javidnia H, Humphrey-Murto S. 2016. The effectiveness of webcast compared to live lectures as a teaching tool in medical school. Medical Teacher. 38(1):59-63.

https://doi.org/10.3109/0142159X.2014.970990   

Vicens Q. 2013. Building students' knowledge one click at a time. Tidsskriftet Læring og Medier (LOM). 6(10).

Wang AI. 2015. The wear out effect of a game-based student response system. Computers & Education. 82:217-227.

https://doi.org/10.1016/j.compedu.2014.11.004   

Wash PD. 2014. Taking advantage of mobile devices: Using Socrative in the classroom. Journal of Teaching and Learning with Technology. 3(1):99-101.

https://doi.org/10.14434/jotlt.v3n1.5016  

Watson DM. 2001. Pedagogy before technology: Re-thinking the relationship between ICT and teaching. Education and Information Technologies. 6(4):251-266.

https://doi.org/10.1023/A:1012976702296   

Young JR. 2008. The lectures are recorded, so why go to class? Chronicle of Higher Education. 54(36):A1.

Appendices

There are no conflicts of interest.


Reviews

Veena Rodrigues - (19/05/2017)
This paper provides 3 take home messages:
1) Little is known about how graduate entry to medicine students perceive the use of cloud-based student response systems like Socrative in the classroom.
- There is little discussion of why graduate entry medical students would be expected to be any different from other medical students in the use of SRS. The sample studied is too small to allow for exploration of effect modification through stratification of age or gender but the authors make no attempt to discuss the findings in this respect. Results are presented simply as counts and % but with such a small sample size, how reliable are these?
2) The integration of Socrative into lectures in a pre-clinical physiology module was overwhelmingly popular with students.
Acceptability to students is important but this is assessing the intervention at a very basic level (Kirkpatrick level 1 - reaction).
3) Students indicated that Socrative use improved their understanding of the taught material even though that was not empirically tested here
This is a very bold conclusion to make based on the fact that learning gain (recall or application) was not tested in the study. Self-reporting of 'understanding' is not ideal.
The authors mention resistance of students to the semi-flipped lecture - change is unpopular among any students, so a degree of persistence is required to let the new pedagogical model bed down. They have to study the material anyway, so why not be more efficient and use the contact time more productively for deeper learning?
Finally, the authors need to consider that it is the ensuring of student participation and engagement that improves learning in large group teaching rather than the medium used to do this (whether clickers or Socrative or Kahoot or buzz groups or pair and share) .
Parichad Apidechakul - (25/01/2017)
This paper is based on the idea of “reflective thinking”, in order to assess the students’ point of view. However, the free log-in to the website is not an easy means of upholding personal identity. Besides, the authors should be concerned about web-based manipulation, since conflict of interest is inevitable.
Richard Hays - (11/01/2017)
While reporting an interesting development, this paper lacks the methodological rigour to claim any evidence of success. I am not sure why graduate entry and other pathway students should be any different, so the title is confusing. Any new approach, particularly one involving technology, attracts interest and engages learners, at least for a while, so the described impact is easily explained by a keen lecturer being an early adopter of relatively new interactive technology. There really cannot be a control group or blinding, so time will tell just how sustainable the interest is.
THOMAS PUTHIAPARAMPIL - (11/01/2017)
Although this was a pilot study, the results are highly impressive. I agree that the use of clickers can be cumbersome and expensive. Socrative or similar online facilities will be very useful for getting immediate feedback from students. Such activities will definitely keep the students attentive, thereby stimulating teaching and learning for both the students and the lecturers. Well done.
Gary D. Rogers - (09/01/2017)
This was a nice exploratory study of a new technology to improve the interactivity of large group teaching in a graduate-entry medical program. Although the outcome measures were relatively low-level, they were sufficient to demonstrate a ringing endorsement by the students and to warrant further utilisation and study of the technique. Despite the obvious pedagogical drawbacks of the large group as a learning methodology, economic realities mean that it is likely to be with us for a long time to come and it is incumbent upon us to find ways to improve its effectiveness. I agree with the authors that stronger evidence of effectiveness is now required, and qualitative methodologies may provide the vehicle for achieving this in a situation where confounders bedevil attempts at quantitative investigation. Our own experience of using smart-phone-based response technologies has been terrible (in common with the attempt to use them at last year's AMEE conference) because of the requirement for wifi of sufficiently high bandwidth in the auditorium for them to work. We have gone back to hardware clickers!
Trevor Gibbs - (09/01/2017)
Although I have little doubt that there is improved activity as a result of audience-response systems being used in the correct manner, and that we need to be able to "measure " the true effect of this active learning on the student, I am still unsure if this paper addresses or satisfies those explicit needs.
I would be very surprised if the results were any different from those shown, given the generation of technologically advanced students that we are dealing with, and I doubt very much whether the results would be different in a non-GEM cohort. Although the study does support the opinion that the system used is well appreciated by the students, and that they feel it has improved their learning and understanding, should we not be looking more at the real effect on the ability to drive student learning from a more superficial to a deeper level, and at whether such use of TEL actually improves learning to the level that we are striving for?
I appreciate the fact that the authors address the need for a more extensive study, but I do not at present feel that they can be as dogmatic in their conclusions.
I would also appreciate greater clarity of results, in the appropriate section, particularly given that the authors have very clearly outlined their research questions.