Monitoring student engagement via online teaching tools

12 July 2021
Maurice Kinsella and colleagues offer practical advice on using virtual learning environment (VLE) tools to monitor student engagement and focus support efforts where they are needed

When one thinks of teaching and learning in higher education, images of lecture halls, classrooms and labs come to mind: places where instructors manage the design and delivery of curriculum content, where students are motivated to deepen their knowledge, and where participation and progress can be encouraged and monitored.

While virtual learning environments (VLEs) were already an important component of university teaching, their critical value as an educational resource has been brought into sharp relief during the pandemic. Nevertheless, the reliance of universities on VLEs to keep classes running remotely and the concomitant lack of in-person contact present obstacles that are worth addressing.

One challenge is ensuring instructors remain mindful of students’ module engagement. Embedded in VLEs are monitoring tools that provide insights into student participation and performance, tools that instructors can integrate into their teaching and learning plans. These tools can optimise a VLE’s capacity to assess student engagement within each module, enabling instructors to tailor student support. Inter-module data can provide a clearer picture of students’ programme-wide engagement, helping reduce discrepancies in student monitoring arising from “module silos”, that is, treating each module as a separate, unconnected entity.

Student monitoring: opportunities and challenges

Responding to students’ needs and preferences first entails identifying what they are. VLE monitoring helps achieve this: it is a real-time feedback resource, providing insights that instructors and student-support professionals can use to recognise and enhance student engagement. VLEs can provide metrics across different domains (a simple sketch of how such metrics might be recorded follows this list):

Cognitively: information on how students respond to and internalise module content, for example, accessing lecture content or resources and providing critical feedback

Behaviourally: information on how students participate in and perform on their module, for example, adhering to attendance requirements and submitting assignments

Socially: information on how students interact with people within their module, for example, contributing to discussion boards and collaborative projects.
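To make these three domains concrete, here is a minimal sketch, in Python, of how such metrics might be recorded per student and per module. The field names and example values are purely illustrative assumptions, not metrics exposed by any particular VLE.

```python
from dataclasses import dataclass

@dataclass
class EngagementRecord:
    """Illustrative per-student, per-module engagement record.

    The field names are assumptions chosen for this example; the
    metrics actually available will depend on the VLE in use.
    """
    student_id: str
    module_code: str
    # Cognitive: interaction with module content
    lecture_views: int = 0
    resource_downloads: int = 0
    # Behavioural: participation and performance
    sessions_attended: int = 0
    assignments_submitted: int = 0
    # Social: interaction with peers and staff
    discussion_posts: int = 0
    group_contributions: int = 0

# Example: one student's record for a single (hypothetical) module
record = EngagementRecord(
    student_id="s001", module_code="MOD101",
    lecture_views=8, resource_downloads=12,
    sessions_attended=9, assignments_submitted=2,
    discussion_posts=3, group_contributions=1,
)
```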

One of the main challenges in monitoring VLE engagement is analysing the wealth of information and extracting data considered accurate and actionable. Tools and data within VLEs are generally geared towards facilitating virtual learning, and instructors can customise their modules’ structure, content, interface and assessments to meet diverse learning requirements. However, in terms of ascertaining overall student engagement, the questions that need to be asked are relatively simple:

Is this student engaged?

If not, how can they be given the support they need in order to increase and strengthen engagement?

How to measure engagement?

Ascertaining the quality of learning resources and materials and deciding which forms of student engagement deserve the most attention can be challenging. For example, data from discussion-board participation or engagement with secondary learning material offline may be difficult to quantify. So, when monitoring engagement to inform student-support decisions, we would recommend the following:

Baseline programme-level engagement reporting: Establishing a baseline of each student’s programme-level engagement allows instructors to identify discrepancies between that baseline and what module-level engagement data indicate when modules are examined one at a time. It also helps direct support interventions towards overall student engagement rather than targeted academic support for a specific module.

Module consistency: Having consistency in content levels across modules clarifies instructors’ expectations around student participation. Increased consistency brings an added benefit of increased visibility of fluctuations in student engagement, which could otherwise be masked by content variance.

Focus on peer-relative engagement, not absolute thresholds: Although cross-module consistency may simplify some aspects of monitoring, it may not always be achievable; different subject areas can lend themselves to substantial differences in both content and assessment. Basing the criteria for flagging engagement issues on peer-relative engagement rather than absolute thresholds allows effective monitoring across a wide variety of modules (a rough sketch of this approach, together with the programme-level baseline above, follows this list).

Regular student feedback: Ensure that feedback is sought from students regarding their expectations of programme engagement and how proactive they expect their instructors to be.
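As a rough sketch of how the baseline and peer-relative recommendations above could be applied in practice, the Python below assumes each student already has a single composite engagement score per module (for example, a weighted count of logins, views and submissions). The function names, scoring and z-score cut-off are illustrative assumptions rather than features of any particular VLE.

```python
import statistics
from collections import defaultdict

def flag_low_engagement(scores_by_module, z_cutoff=-1.5):
    """Flag students whose engagement is low relative to their module peers.

    `scores_by_module` maps module_code -> {student_id: engagement_score},
    where the score is whatever composite an instructor derives from VLE
    data. Peer-relative z-scores are used instead of an absolute threshold,
    so modules with very different content levels remain comparable.
    """
    flags = defaultdict(list)
    for module, scores in scores_by_module.items():
        mean = statistics.mean(scores.values())
        spread = statistics.pstdev(scores.values()) or 1.0  # avoid division by zero
        for student, score in scores.items():
            if (score - mean) / spread <= z_cutoff:
                flags[student].append(module)
    return dict(flags)

def programme_baseline(scores_by_module):
    """Average each student's peer-relative position across modules,
    giving a programme-level baseline rather than per-module silos."""
    totals, counts = defaultdict(float), defaultdict(int)
    for scores in scores_by_module.values():
        mean = statistics.mean(scores.values())
        spread = statistics.pstdev(scores.values()) or 1.0
        for student, score in scores.items():
            totals[student] += (score - mean) / spread
            counts[student] += 1
    return {student: totals[student] / counts[student] for student in totals}
```

The design choice in this sketch is that a student is only flagged when their engagement is low relative to their module peers, so a module with naturally sparse online activity does not automatically generate false alarms.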

How to focus student support efforts?

Turning our attention to the second question of how to best provide student support and facilitate successful intervention outcomes, we recommend the following approaches:

Information accessibility: Rather than expecting student-support staff to analyse module data and subject matter they may be unfamiliar with, provide dashboards and basic reports summarising students at risk of disengagement; this enables greater transparency (a minimal example of such a summary follows this list).

Interventions are “light touch”: Initiating light-touch interventions to ascertain potential issues can help avoid the need for a more top-down approach, which could leave the student feeling that support is disciplinary rather than pastoral.

Staff have autonomy and discretion in conducting interventions: Students flagged to student-support professionals through analytics data may already have been flagged via another mechanism. Giving support staff the autonomy to decide whether and how to follow up with students avoids the demotivation that can come from imposing a cookie-cutter approach and helps foster productive staff-student partnerships.

Regular student feedback: Gathering regular feedback on students’ support expectations, and on which services they perceive as most valuable, helps staff refine their support strategies and improve future learning outcomes.
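Continuing the same hypothetical sketch, the snippet below shows the kind of concise at-risk summary a dashboard or basic report might present to support staff; the cut-off, field names and output format are assumptions for illustration.

```python
def at_risk_summary(baseline, flags, baseline_cutoff=-1.0):
    """Build a short, readable summary of students who may need support.

    `baseline` and `flags` are the outputs of the functions sketched
    earlier; the cut-off and field names are illustrative only. The aim
    is to hand support staff a concise list rather than raw module data.
    """
    rows = []
    for student, score in sorted(baseline.items(), key=lambda item: item[1]):
        if score <= baseline_cutoff or student in flags:
            rows.append({
                "student": student,
                "programme_baseline": round(score, 2),
                "modules_flagged": flags.get(student, []),
            })
    return rows

# A support adviser might then see something like:
# [{'student': 's014', 'programme_baseline': -1.8, 'modules_flagged': ['MOD101', 'MOD204']}]
```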

Whether students are learning in person or online, VLEs empower them to actively participate in their learning and enable instructors and support staff to oversee students’ academic and personal development. In short, teaching staff should make the most of VLE monitoring tools to optimise their course design, delivery and student support.

Maurice Kinsella is a research assistant, John Wyatt is a project manager, and Niamh Nestor is a student adviser, all at University College Dublin.

Comment from Robin Gibson:
A really important discussion to be had. Too many virtual lectures have taken place with academics talking and students sitting with screens off and on mute! That isn't a great experience for the student or the lecturer! One great indicator of a student's engagement with a course is how often they are accessing learning resources. A proprietary platform can show both academics and the library what texts are accessed, how often, when, how many pages have been read and how many annotations have been made in a book. This data provides a very strong indication of engagement, an early warning of students who are at risk and a great indicator of outcomes!