Using Student Engagement Data to Evaluate Online Activities and Resources

In this blog article, we share the progress of our research into student engagement with online learning, and into which virtual educational tools are preferred by staff and students at the University of Exeter.

When the COVID-19 pandemic hit in 2020, we, like the rest of the teaching world, had to rapidly adopt new online technologies and learning materials in order to teach online. Yet some of these have proved more popular and engaging than others. This is what we wanted to research: what do staff and students have to say about the online learning tools, and how are they engaging with them?

In January 2021 we were allocated funding by ALDinHE to research how students are engaging with online learning tools. Our initial plan was three-fold: staff focus groups, collection of “click data” from the University’s online digital learning platform, and a student questionnaire. In the end, however, the questionnaire was not sent out, as it was felt it would duplicate previous University student questionnaires.

Firstly, staff were invited by email to take part in focus groups, held over Microsoft Teams in March 2021. The staff who responded (approximately 25) were split into five focus groups and asked a selection of questions from a question bank, such as: “What digital tools have you used while teaching this year?” and “What teaching methods have you used for each of these?”. The purpose of asking academics these questions was to learn what they were using, what they preferred and what they felt their students were enjoying the most. These answers will be compared against the results taken from the “click data”.

[Illustration of an online focus group]

Secondly, we collected 3,100 students’ interactions, or “click data”, from 500 module pages on the Exeter Learning Environment (ELE), the University of Exeter’s digital learning platform. After collecting the “click data”, we held follow-up interviews with academics whose course pages had a large uptake of interactions, to discuss how they were keeping their students engaged.

A preliminary examination of the “click data” (over one million entries) showed that the online learning tools attracting the most clicks included forums and pre-recorded lectures that had been split into more manageable 15-minute videos. Some technologies, such as Flipgrid, had a higher uptake but were compulsory for the course, which may inflate their figures. Taking these factors into account, the analysis and interpretation of our results is ongoing.
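To give a flavour of what this analysis involves, here is a minimal sketch of how click data of this kind might be aggregated, written in Python with pandas. It is an illustration only: the file name and column names (student_id, tool_type) are hypothetical placeholders, not the actual ELE export format.

```python
# Minimal sketch of aggregating click data by tool type.
# Assumes a CSV export with hypothetical columns: student_id, tool_type.
import pandas as pd

clicks = pd.read_csv("ele_click_data.csv")  # hypothetical file name

# Count total clicks and distinct students per tool, so a tool clicked
# heavily by a few students can be told apart from one used by many.
engagement = (
    clicks.groupby("tool_type")
          .agg(total_clicks=("student_id", "size"),
               unique_students=("student_id", "nunique"))
          .sort_values("total_clicks", ascending=False)
)
print(engagement.head(10))
```

Comparing total clicks with unique students in this way helps separate genuinely popular tools from compulsory ones, such as Flipgrid, where usage is driven by course requirements rather than preference.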

As the analysis of the data is still in progress, we do not yet have a definitive answer to our question. However, we hope that in the future, educators and teaching bodies can look at the feedback on tools generated by our “click data” and interviews, and use those tools intentionally to increase engagement among their students. Combining the evidence from students’ “click data”, the focus group answers and the subsequent interviews with staff can show exactly what students are interacting with, and how this can enhance the teaching methods used for online pedagogy in the future.

These data can inform the design and evaluation of new tools being developed or considered for use by the University. We will work with relevant module leaders to implement this method and publish our findings, so that academics at other organisations can benefit from our research.

By Eleanor Sandison, Tomas Nicholas and Valentin Kozsla, Digital Learning Developers in the College of Engineering, Mathematics and Physical Sciences (CEMPS) at the University of Exeter.
