Personal view or opinion piece
Open Access

Learning Analytics in Medical Education

Poh Sun Goh[1]

Institution: 1. National University of Singapore
Corresponding Author: Dr Poh Sun Goh ([email protected])
Categories: Medical Education (General), Teaching and Learning, Technology
Published Date: 04/04/2017


The purpose of this article is to propose a framework to think about the use of learning analytics in medical education. This MedEdPublish article is presented in three sections: 1) a single topic summary illustration; 2) a short “elevator pitch”; and 3) a more “traditional” article format piece (in four main parts). It is hoped that this presentation format will give readers the option to review this topic visually, to review the “essence” of the topic in the form of a brief “corridor chat” or “elevator pitch”, or to read a more expansive quasi-traditional article (with four main complementary parts which deliberately have overlapping content, intended to stimulate thinking on this topic, and to provoke and form the starting point for further discussion on the MedEdPublish platform).

Keywords: Data analytics; eLearning; Technology enhanced learning



Section 1: Topic summary illustration

Section 2: The “elevator pitch”

“What gets measured gets managed”

- Peter Drucker


“When you can measure what you are speaking about and express it in numbers, you know something about it, but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind”

- Lord Kelvin


We can better manage (iteratively improve, share and build on) what is visible, what we can see – directly, through data, and through data dashboards, visual data maps and illustrations. Data and observations, big data and small or rich data, quantitative and qualitative research (mixed methods research) give us insights as educators into our teaching practice, its effectiveness and impact. Just as we blend the best features of traditional and online/eLearning/Technology enhanced learning in our teaching practice, we can “blend” and take advantage of “big data” or online data analytics, which, when added to traditional classroom observations and measures or indicators of learning effectiveness and impact, can give us a more complete, comprehensive and rounded picture of individual and group learning. Big data, or data on online behaviour and interactions with our web-based and mobile teaching and learning platforms, gives information on who our students are, where they are from, what they interact with, and for how long; what they spend most time on, or little to no time on – essentially what they value and find useful (students spend time on what they find useful).
Small data, or rich data, is often obtained by traditional means: direct individual student observation and interaction, group interaction and observation, and evaluation of the outputs and artefacts of the learning process and learning outcomes (notes, questions, discussion points, summary presentations, answers to oral questions posed, submitted assignments, worksheets and portfolio items), which may be in traditional verbal, print or digital document-multimedia format, as well as online-mobile forum and platform posts and interactions (online discussion boards, and mobile and social media chat threads used for professional learning). All of these can give us data, information and insights into the learning process, and the outcomes of learning.
We can move beyond traditional assessment (formative and summative, both classroom and workplace based) toward making the learning process, and the outcomes of learning, visible. Let’s not only show what we teach with and assess on, but go further, to make the learning process, as well as the learning and training outcomes, visible.

Section 3: Abstract and Introduction

The purpose of this article is to propose a framework to think about the use of learning analytics in medical education. This article will examine this topic iteratively in four sections: 1) by posing a series of questions; 2) followed by a reflection on practice, and a personal reflections section; 3) then examining this topic from a student behaviour and instructor observation viewpoint; 4) followed by a look at possible indicators that an educator might use (including data analytics). Key observations include the importance of “big data” and “small data”; the usefulness of quantitative and qualitative data, used in a mixed methods research and evaluation paradigm; the idea of a longitudinal learning and training process; the close analogy between observations and data obtained from “traditional” teaching, learning and practice settings and their online-mobile learning equivalents; and the need to observe not just engagement and activity but, more importantly, indices of actual learning: maturation and deepening of cognitive development and thinking, applied problem solving ability, skill development, and development of emotional and empathetic sensitivity.

Part 1

How we spend our time, and our bank balance, shows what we value. Sustained, significant time and effort consciously applied to a program of learning and training has a transformative effect on who we are, and who we become. How do we go beyond “marking” class attendance and online page views; attention paid in class (questions asked in class or online – both quantity and quality); online page “click-throughs”; and review (by a teacher or observer) of the quantity and quality of notes a student takes in class, or soon after class, in both hardcopy and digital formats; toward indicators and measures of learning, and of the ability to apply and transfer classroom learning to the workplace setting? How do we go beyond “spot quizzes”, both in class and online (or within an eTextbook, online video, or eCourseware), toward more accurate measures and indicators of comprehension, the ability to integrate knowledge, ever increasing skill levels, and higher levels of emotional development and empathy? How can the move to workplace training and apprenticeships, as well as programs that alternate classroom and workplace (skill, implying application) training, take advantage of classroom time to integrate learning and stimulate reflection, and of workplace settings to practise knowledge application and build skill? How can we best use pre-class preparation and pre-reading; in-class interactive and reflective-integrative activities, in both traditional and online class settings; and after-class note reviews, assignments, and projects to facilitate and reinforce learning?
How do we take advantage of different one-off teaching-learning-training encounters (lectures, presentations, workshops, symposia, panel discussions), combined with longitudinal, programmatic learning and training programs, to increase knowledge, understanding, and insight; build skills, both basic and higher level; and increase our “feel” for, and emotional-empathetic understanding of, a topic? What are the signs, indicators, and evidence of higher levels of professional development, of increasing competency and proficiency, and of expert level performance and mastery of a topic or field?

Part 2

Learning analytics (and adaptive learning) has been highlighted in the 2016 Horizon Report as an important development in educational technology, with a time to adoption of one year or less. The purpose of this article is to propose a framework to think about learning analytics, or “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (1st International Conference on Learning Analytics and Knowledge). We can examine this topic from the point of view of an instructor observing the learning activities and outcomes of learning for one student, and for a cohort, group or class of students. Throughout this process, an underlying theme is an examination of the value and impact of a blended online and face-to-face educational process, with the following used as case studies: a recent faculty development presentation (for around 20 faculty of a single university school, including the Dean); a symposium (APMEC main conference symposium, for around 150 participants); a workshop (APMEC pre-conference workshop, for around 35 participants); a postgraduate small group tutorial (for around 10 radiology residents); an undergraduate small group tutorial (for around 24 students); and an undergraduate large group lecture (for 300 students).
These case studies illustrate and review the teaching and engagement practice of a single teacher and academic practitioner at the tertiary and professional level (faculty of medicine, topic of clinical radiology; and faculty development for health professions educators), with blogs + Padlet (over 5 years), and more recently Instagram + WhatsApp (over the last year), as the teaching and engagement platforms; and with “off the shelf”, free to use Google Analytics, SlideShare analytics and embedded SurveyMonkey results, together with documented (anonymised) discussion and session output on Padlet and WhatsApp, as indicators and evidence of student engagement and learning. An attempt will be made to examine the insights one might reasonably glean from this data on online activity, and how this can be supplemented and complemented by more traditional classroom and assignment observations, and outcome data.

A reflection on my personal experience with the Maastricht MHPE program over two years (as a participant-student, and now as local Singapore faculty for the MHPE-S program), run as a blended offline and online program, is also instructive. To faculty, student attendance, punctuality, class and module contribution (quantity, frequency, and quality), and ongoing development and maturity as an educational practitioner and scholar are visible both in (live) class and online. In class, this is seen through class participation (quantity, frequency and quality), and specifically through assessment and evaluation of the 3 to 4 individual and group assignments submitted per module. Online, it is seen by examining the frequency and duration of log-ins to the learning management system (which incorporates not only a class management system, but also a discussion and interaction platform, and module instructional content); participation in the online discussion forums; the degree and quality of Q and A with tutors and faculty by email; and the quality of the submitted module assignments and Master Thesis, including use of recommended module learning resources as well as additional resources found and cited by the student. After graduation, it is seen by evaluating the quantity and quality of educational, professional, and academic output.

Part 3

Traditional metrics of student engagement and learning

Let's start this process by examining what an instructor might observe, or want to observe, if we had access to, and were able to observe, the whole continuum of a single student's engagement with a single large group teaching session, or small group tutorial-discussion. The following would be useful:

1) Time spent, and how time was spent (including material reviewed, for how long, and to what extent; discussion time; and problem solving, including what types of problems, assignments and scenarios);

2) How material reviewed was used in practice (through arguments, citations and referencing);

3) Evidence of increasing interest in, and engagement with, material (from additional reading and online searching for new material not on recommended reading-reference lists, including innovative search strategies, going beyond initial topic parameters to adjacent and unexpected but related areas, and use of analogies); and

4) Evidence of longitudinal increase in knowledge, insight, practice, practical application and transfer, and innovative new use or application.

Online equivalent (of traditional metrics of student engagement and learning)

What analogies might a teacher or instructor draw from the "live" classroom setting? How might these translate into online metrics and data? What might one look for, and look at? We could look at:

1) Time spent on a webpage or on online content;

2) Follow-through from online viewing onto linked and cited online content;

3) Same-session note-taking and knowledge building, as seen in concept maps and in the quality of notes and online note-reflections;

4) Use of online content to support assignments, and in case-based scenario discussions and problem solving;

5) Quantity and quality of online discussions, including the degree and depth of participation in online discussion platforms and forums (the number, type and nature of questions asked by students, and the quality of "live" answers to questions and points raised by either the instructor or other student participants);

6) The type, degree and depth of synchronous online search for additional resources during, and in parallel with, "on-site" and "off-site" individual and group based learning activities;

7) Performance on formative and summative assignments, problems and projects;

8) Portfolio construction and items; and

9) "Workplace" and realistic scenario performance, in knowledge, skill and attitudinal-affective domains.
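
As a minimal sketch of how two of these online metrics might be derived, the following uses an invented clickstream log (the student identifiers, page names and log format are hypothetical, not any platform's actual export): time on page approximated from gaps between a student's events, and follow-through from one piece of content to linked content.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical clickstream events: (student_id, page, ISO timestamp).
events = [
    ("s1", "lecture-notes", "2017-03-01T09:00:00"),
    ("s1", "cited-article", "2017-03-01T09:12:00"),
    ("s1", "quiz",          "2017-03-01T09:30:00"),
    ("s2", "lecture-notes", "2017-03-01T10:00:00"),
]

def time_on_page(events):
    """Approximate minutes on each page as the gap to the student's next event."""
    per_student = defaultdict(list)
    for student, page, ts in events:
        per_student[student].append((datetime.fromisoformat(ts), page))
    totals = defaultdict(float)
    for visits in per_student.values():
        visits.sort()
        for (t1, page), (t2, _) in zip(visits, visits[1:]):
            totals[page] += (t2 - t1).total_seconds() / 60
    return dict(totals)

def follow_through(events, source, target):
    """Fraction of students who viewed `source` and also viewed `target`."""
    viewed_source = {s for s, p, _ in events if p == source}
    followed = {s for s, p, _ in events if p == target and s in viewed_source}
    return len(followed) / len(viewed_source) if viewed_source else 0.0

print(time_on_page(events))   # minutes per page, for this toy log
print(follow_through(events, "lecture-notes", "cited-article"))
```

Note that the gap-to-next-event heuristic undercounts a student's final page visit (there is no "next" event), which is a known limitation of page-view analytics in general.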

Part 4

Online Data – Analytics (Quantitative Data, or Big Data), Informing “Who, Where, When and What”

At this juncture, it might be helpful to highlight how Social Media Analytics and Web (e.g. Google) Analytics can assist an educator, focusing on the features, functions and functionality of, and the differences between, both (note that each of these terms in the preceding sentence is hyperlinked to a separate webpage or article).

Small Data (Personal Observation, Focus Groups, Rich Data, Qualitative Data), Informing “Why, How” and “To What Extent”

The key idea is not only to have visibility of what is used often and found to be useful, but to evaluate why this is so, for an individual student or a student cohort, with the aim of providing more of this content or learning process-activity. Conversely, for content and learning processes that are "unpopular", which students spend little time on, do not use, and by implication do not find useful, the aim is to investigate why this is so, and to either replace or refine the material or learning process-activity.


How can data analytics inform our teaching practice - a framework (in quasi table form) for thinking about and evaluating the use of Web 2.0 technologies as an online teaching / engagement tool and platform. (TeL = Technology enhanced learning; eL = eLearning)


When is TeL or eL used? Where is it used?

For example: 1) Views - volume, timing (quantitative); 2) Reviews - qualitative and quantitative; and 3) Preference over other competing content.


Why is TeL or eL used? 

1) Access, convenience;

2) Accessibility;

3) Useful, fit for purpose;

4) Easy to use, intuitive;

5) Simple to use; and

6) Recommended by others


How is TeL or eL used?

1) Before, during or after class;

2) Before examination/assessment;

3) To look up something, on demand reference/learning, performance support.
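
The "before, during or after class" distinction above is one of the simplest analytics to compute. As a minimal sketch (with an invented class timetable and access log, not any platform's actual data), each access to course material can be labelled relative to a timetabled session:

```python
from collections import Counter
from datetime import datetime

# Hypothetical timetabled session for one class (times are invented).
CLASS_START = datetime(2017, 3, 1, 9, 0)
CLASS_END = datetime(2017, 3, 1, 10, 0)

def usage_phase(access_time):
    """Label one access to course material relative to the class session."""
    if access_time < CLASS_START:
        return "before class"
    if access_time <= CLASS_END:
        return "during class"
    return "after class"

# Hypothetical access log for the course material.
accesses = [
    datetime(2017, 3, 1, 8, 30),   # pre-reading
    datetime(2017, 3, 1, 9, 20),   # in-class lookup
    datetime(2017, 3, 1, 21, 0),   # evening review
]

counts = Counter(usage_phase(t) for t in accesses)
print(counts)  # how many accesses fell into each phase
```

A skew toward "before class" might suggest material used for preparation; a skew toward "after class" might suggest revision or on-demand reference use, which is the kind of "why" question the small data section below takes up.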


With whom? By whom?

What to use? For what purpose? Instructional objective?

1) Content;

2) Assessment;

3) Illustrate theory; and for

4) Mastery training.


Traditional, TeL or blended approaches?

1) When to use?

2) How to use?

3) When to blend?

4) How to blend?


Where is the value/what is the value; and impact of TeL? What is the evidence of this?

1) Views;

2) Reviews;

3) Recommendations;

4) Actual use - how, when, where, with whom, why;

5) Evaluation of integration and use of Knowledge and Skills learnt, and creation of new knowledge, insights and applications/skills.

Conclusion and Take Home Points

It is hoped that this article (with its multi-section/multi-part format) has provided useful ideas, and a framework with which to think about, and use, learning analytics – both “big data” and “small or rich data” – in medical education.

We can move beyond traditional assessment (formative and summative, both classroom and workplace based) toward making the learning process, and outcomes of learning visible.


Notes On Contributors

POH SUN GOH, MBBS, FRCR, FAMS, MHPE, FAMEE, is an Associate Professor and Senior Consultant Radiologist at the Yong Loo Lin School of Medicine, National University of Singapore, and National University Hospital, Singapore. He is a graduate of the Maastricht MHPE programme, a current member of the AMEE eLearning committee, and a Fellow of AMEE. 



References

Alhlou, F., Asif, S., & Fettman, E. (2016). Google Analytics Breakthrough: From Zero to Business Impact. Hoboken, NJ: Wiley.

Brown, M.G. (2013). Killer Analytics: Top 20 Metrics Missing From Your Balance Sheet. Hoboken, NJ: Wiley.

Christensen, C.M., Dillon, K., Hall, T., & Duncan, D.S. (2016). Competing Against Luck: The Story of Innovation and Customer Choice. New York, NY: Harper Business.

Goh, P.S. (2016). A proposal for a grading and ranking method as the first step toward developing a scoring system to measure the value and impact of viewership of online material in medical education - going beyond "clicks" and views toward learning. MedEdPublish, 5(3), Paper No: 62.

Goh, P.S. (2016). The value and impact of eLearning or Technology enhanced learning from one perspective of a Digital Scholar. MedEdPublish, 5(3), Paper No: 31.

Lindstrom, M. (2017). Small Data: The Tiny Clues That Uncover Huge Trends. Great Britain: John Murray Learning.

Long, P., & Siemens, G. (2011). Penetrating the Fog: Analytics in Learning and Education. Retrieved from

Mayer-Schonberger, V., & Cukier, K. (2017). Big Data: The Essential Guide to Work, Life and Learning in the Age of Insight. Great Britain: John Murray.


There are no conflicts of interest.
This has been published under Creative Commons "CC BY-SA 4.0".



Ken Masters - (02/04/2019)
There are many good ideas on Learning Analytics shared in the paper. My major problem is with finding and following the argument.

The paper opens with a very full and complex diagram. While I can see that there is flow within the diagram, it really does need detailed explanation. I was hoping that the more detailed portion of the paper would be (or give) this explanation, and, while it certainly contains elements from the diagram, I could not find the explanation.

The rest of the paper throws up interesting questions and identifies important elements that need to be discussed, but fails to grapple enough with them. The very last part of the paper does begin a framework, but in very broad outline only. It would have been useful if this framework could have been fleshed out with the questions and elements raised in the pieces above, and could have been mapped onto the diagram in the beginning of the paper.

The paper strikes me as something that would have emerged from (and/or has the seeding for) a fascinating brainstorming or workshop session, and has the beginning of an idea of a framework, but then stops short of delivering.

I do, however, like the use of the hyperlinks to more material, although, perhaps, the author over-does it a bit. Too much of a good thing, and the readers will ignore it.

In order to give a more balanced view, I read the other reviewer’s comments and the author’s response. My concerns appear to be a little similar to the first reviewer (although, interestingly, he does not share my confusion of the opening chart, so perhaps there is something to be said for the strange structure :-).

So, ultimately, in spite of my problems with the paper, I think it does make an interesting read for those who wish to have some idea of the complexities involved in learning analytics; I hope to see the result of the later work promised in the authors’ response to the other review.
Richard Hays - (08/04/2017)
I found this paper difficult to review. The title was attractive, but ultimately I was not sure what the message was for readers. The multiple format presentation is an interesting concept. I found sections 1 and 2 to be the most helpful, because I am a visual analogy person and I like concise explanations. I found section 3 to be a little vague and the structure to be more artificial than genuine. The repetition of the abstract (twice) was unnecessary. However, structure aside, I am still not sure what message was imparted. The topic appears to be mostly about technology enhanced learning, but I was uncertain how combining 'big data' and 'small rich data' can be done best. This topic may well be better suited to an interactive workshop, where I think the discussion and shared understanding achieved may be more useful and then produce a helpful written summary that reflects those discussions. Perhaps the best use of this paper is to provoke discussion in a workshop?