Student-facing learning analytics dashboard for remote laboratories in engineering education
Authors
Reid, David P.
Abstract
Remote laboratories provide an opportunity to widen access to practical work and to further
integrate active learning into the Engineering curriculum. The main barriers to wider adoption
are reservations about their educational efficacy and a perceived lack of support for learners
studying outside of the traditional classroom or laboratory environment. This thesis investigates
both aspects by surveying student perceptions of remote laboratory practical work and
developing and evaluating a learning analytics technique for providing on-demand, automated
formative feedback.
Across three years of survey data, students report a positive educational experience with remote laboratories, including the perception that they offer opportunities for achieving engineering laboratory objectives comparable to those of traditional, in-person laboratory activities. Analysis demonstrates that the user interface has more impact on student perceptions of “predictable control” over remote laboratory experiments than previously recognised, and that increasing the interactivity of user interface components can improve engagement. Updates to the booking system (including cancellable and future bookings) are shown to remove frustrations around first-come, first-served access to remote laboratories.
A key implication of survey results is the need to develop additional tools to support learning
with remote laboratories. A promising source of feedback is the digital trace produced by remote
laboratories, but this must be analysed and presented in a meaningful way. Unfortunately, only
4.5% of recent learning analytics research describes any attempt to provide feedback to students.
Student-facing learning analytics (SFLA) dashboards are a means of addressing this gap, but
are insufficiently informed by educational research and lack rigorous evaluation in authentic
learning contexts, including during remote laboratory practical work.
This thesis presents an SFLA dashboard, designed using the established educational principles
of formative assessment, and evaluates its effect on learning outcomes during authentic
remote laboratory practical work. Feedback available via the SFLA dashboard is based upon
graphical visualisations of student actions performed during laboratory tasks and includes learning indicators, such as task completion. A learning analytics server is developed that enables the automated, on-demand analysis of student actions and provision of feedback. An asymmetric graph dissimilarity measure (TaskCompare) is developed to calculate appropriate learning indicators and is shown to be a more reliable measure of task completion than existing graph dissimilarity measures. TaskCompare distinguishes students who miss expected actions from those who perform additional actions, a capability absent from existing symmetrical graph
dissimilarity measures.
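The abstract does not give TaskCompare's formulation, so the following is only a deliberately simplified sketch of why an asymmetric measure is useful: comparing a student's action graph against an expected task graph in each direction separates missing actions from extra ones, which a single symmetric score would conflate. The action names and graphs below are hypothetical.

```python
def directed_dissimilarity(reference_edges, observed_edges):
    """Fraction of reference edges absent from the observed graph.

    Asymmetric by construction: d(expected, student) penalises missing
    actions, while d(student, expected) penalises extra actions.
    """
    reference_edges = set(reference_edges)
    observed_edges = set(observed_edges)
    if not reference_edges:
        return 0.0
    missing = reference_edges - observed_edges
    return len(missing) / len(reference_edges)

# Expected task: connect -> calibrate -> run -> log (hypothetical).
expected = {("connect", "calibrate"), ("calibrate", "run"), ("run", "log")}
# Student skips calibration and jumps straight to the run step.
student = {("connect", "run"), ("run", "log")}

missed = directed_dissimilarity(expected, student)  # expected actions not performed
extra = directed_dissimilarity(student, expected)   # actions outside the expected task
# missed and extra differ, which is exactly the information a
# symmetric measure would collapse into one number.
```

A symmetric measure (e.g. plain symmetric-difference size) would assign the same score to a student who skipped a step and one who added a spurious step; the two directed scores keep those cases apart.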
Educational parity was a key consideration in the research design. Rather than forcing
students into pre-determined control and treatment groups, which would have denied
(approximately) half of the participating students access to the SFLA dashboard, all students
were offered access. Propensity score matching (PSM) was then used to produce a matched
sample (i.e. post-hoc assignment to control and treatment groups) that accounted for possible
self-selection bias in dashboard use.
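The thesis does not specify which covariates or matching algorithm were used, but the general PSM procedure can be sketched as: estimate each student's probability of self-selecting into treatment from covariates, then pair each treated student with the untreated student whose score is closest. The covariates and data below are synthetic, and greedy 1:1 nearest-neighbour matching is only one of several matching strategies.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic covariates (stand-ins for e.g. prior attainment, activity
# level) and a treatment flag for self-selected dashboard use.
X = rng.normal(size=(200, 2))
treated = (X[:, 0] + rng.normal(scale=0.5, size=200)) > 0

# 1. Estimate propensity scores: P(treated | covariates).
scores = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching without replacement.
control_pool = list(np.flatnonzero(~treated))
matches = []
for t in np.flatnonzero(treated):
    if not control_pool:
        break
    c = min(control_pool, key=lambda j: abs(scores[t] - scores[j]))
    matches.append((t, c))
    control_pool.remove(c)
```

The matched pairs form the post-hoc control and treatment groups on which the outcome comparison is then run; checking covariate balance after matching would be the natural next step.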
Using survey data and a total of N = 235 graphs of student interaction with remote laboratories
from two different engineering courses for validation and evaluation, the results presented in
this thesis demonstrate that students who used the SFLA dashboard achieved a significantly
better task completion rate (nearly double) than those who did not. There was a significant
difference in TaskCompare score between the two groups (Mann-Whitney U = 453.5, p < 0.01,
Cliff’s δ = 0.43, large effect size) and this difference remained after accounting for self-selection
using PSM. Students also rated the usefulness of the SFLA dashboard for completing
laboratory work significantly above a neutral response (S = 21.0, p < 0.01).
These findings provide evidence that the SFLA dashboard is an effective means of enabling
formative assessment during remote laboratory activities.
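As a note on the reported statistics: Cliff's δ can be derived directly from the Mann-Whitney U statistic via δ = 2U/(n₁n₂) − 1, which is why the two are commonly reported together. A small sketch with made-up scores (not the thesis data):

```python
from scipy.stats import mannwhitneyu

# Hypothetical TaskCompare scores for two groups (lower = closer to
# the expected task graph). These are illustrative, not thesis data.
dashboard = [0.10, 0.15, 0.20, 0.25, 0.30]
no_dashboard = [0.35, 0.40, 0.45, 0.50, 0.55]

u, p = mannwhitneyu(dashboard, no_dashboard, alternative="two-sided")

# Cliff's delta from U: the probability that a dashboard score exceeds
# a non-dashboard score, minus the reverse. With lower scores meaning
# better completion, a negative delta here favours the dashboard group.
n1, n2 = len(dashboard), len(no_dashboard)
delta = 2 * u / (n1 * n2) - 1
```

With these fabricated groups every dashboard score is below every non-dashboard score, so U = 0 and δ = −1, the extreme of the scale; the thesis's reported δ = 0.43 corresponds to a far less extreme separation.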