
Learning analytics has been praised for its potential to improve teaching and learning, but can insights from virtual learning environments and other institutional systems genuinely support students, lecturers and educational managers in everyday practice? This piece examines the current evidence, implementation challenges and transferability limits, helping readers understand where learning analytics can make a real difference and where its promise may exceed its current impact.
Is Learning Analytics More Promise Than Practice?
Author: Prof John Traxler, UNESCO Chair, Commonwealth of Learning Chair and Academic Director of the Avallain Lab
St. Gallen, September 26, 2025 – Learning analytics has a long history and has been the subject of extensive research. It seems to have considerable potential, but what is it, and does it have any practical value?
The following account is based on the research literature and structured conversations with leading researchers, and it attempts to answer these questions.
What is Learning Analytics?
Learning analytics (LA) is, in broad terms, the notion that because students increasingly learn with digital technologies, and because these technologies can capture large amounts of data from large numbers of students, educators and education systems might use that data to become more effective or efficient.
According to some leading researchers, learning analytics is ‘the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs’ (Viberg, Hatakka, Bälter & Mavroudi, 2018), and ‘the analysis and representation of data about learners in order to improve learning’ (Clow, 2013).
As with much data that is freely and cheaply available, however, we must always remember: ‘Just because it’s meaningful, doesn’t mean you can measure it; just because you can measure it, doesn’t mean it’s meaningful!’ And if it is both meaningful and measurable, we should ask ourselves who benefits, and in what ways. Is it learners, perhaps through improved attitudes, improved subject knowledge, or even improved understanding of their own learning? Is it teachers and lecturers? Or is it educational managers and administrators, each with very different values, priorities and targets?
There is also this summary from another leading researcher, Professor Rebecca Ferguson of the UK Open University, giving the keynote at the Learning Analytics Summer Institute in Singapore in 2023: ‘…while we’ve carved a fantastic research domain for a large number of academics and a growing number of researchers globally, we have done less well at tackling improvement of the quality of learners’ lives by making the learning experience something that is less institutional, less course based, less focused on our system of education, and more focused on the experience of learners.’
So there are some doubts within the learning analytics research community.
How Does Learning Analytics Work?
OK, so how does learning analytics work? To start with the basics, there are two dominant techniques. Firstly, predictive modelling, ‘a mathematical model is developed, which produces estimates of likely outcomes, which are then used to inform interventions designed to improve those outcomes … estimating how likely it is that individual students will complete a course, and using those estimates to target support to students to improve the completion rate.’ (Clow, 2013:7).
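To make this first technique concrete before moving on to the second, here is a minimal sketch of predictive modelling in Python, in the spirit of Clow’s completion example. The data is synthetic and the features (weekly logins, forum posts and on-time submission rates) are hypothetical stand-ins for what a VLE might log; this is an illustration of the workflow, not any particular institution’s model.

```python
# A minimal sketch of predictive modelling for completion risk.
# All features and data here are synthetic and hypothetical,
# standing in for what a VLE might record about each student.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Hypothetical per-student VLE features: weekly logins, forum posts
# and the proportion of assignments submitted on time.
X = np.column_stack([
    rng.poisson(5, n),       # weekly logins
    rng.poisson(2, n),       # forum posts
    rng.uniform(0, 1, n),    # on-time submission rate
])

# Synthetic outcome: completion loosely driven by engagement.
logits = 0.3 * X[:, 0] + 0.5 * X[:, 1] + 3.0 * X[:, 2] - 4.0
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Estimated completion probabilities, used to target support at the
# students least likely to complete, as in Clow's description.
probs = model.predict_proba(X_test)[:, 1]
print(f"AUC on held-out students: {roc_auc_score(y_test, probs):.2f}")
print("Students flagged for support:", np.argsort(probs)[:10])
```

The workflow, rather than the specific model, is the point: fit on past cohorts, estimate completion probabilities for current students, and use the lowest estimates to decide where to target support.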
Secondly, social network analysis (SNA), ‘the analysis of the connections between people in a social context. Individual people are nodes, and the connections between them are ties or links. A social network diagram, or sociogram, can be drawn in an online forum; the nodes might be the individual participants, and the ties might indicate replies by one participant to another’s post … interpreted simply by eye (for example, you can see whether a network has lots of links, or whether there are lots of nodes with few links).’ (Clow, 2013:11).
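Similarly, and again purely as an illustration, here is a small sketch of social network analysis using the networkx library; the participants and reply pairs are invented rather than drawn from any real forum.

```python
# A minimal sketch of social network analysis on forum replies.
# The participants and reply pairs are invented for illustration.
import networkx as nx

# Each tie records one participant replying to another's post.
replies = [
    ("amira", "ben"), ("ben", "amira"), ("chen", "amira"),
    ("dana", "ben"), ("chen", "dana"), ("eli", "amira"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# Eyeball-style readings of the sociogram: who has many links,
# who has few, and how dense the network is overall.
print("Ties per participant:", dict(G.degree()))
print("Network density:", round(nx.density(G), 2))
print("Most replied-to participant:",
      max(G.in_degree(), key=lambda kv: kv[1])[0])
```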
In practice, this means that the data comes from the main academic digital workhorse, the virtual learning environment (VLE), also known as the learning management system (LMS), and therein lies the problem, which we will discuss later.
Investigating Learning Analytics
Typical research questions that academics have been tackling include whether learning analytics:
- improve learning outcomes,
- improve learning support and teaching,
- are taken up and used widely, including deployment at scale, and
- are used in an ethical way. (Viberg, Hatakka, Bälter & Mavroudi, 2018)
More recent systematic reviews have confirmed these trends. For example, Sghir, Adadi & Lahmer (2023) surveyed a decade of predictive learning analytics and concluded that although machine and deep learning approaches have become more sophisticated, they rarely translate into significant pedagogical impact. Likewise, a 2023 systematic review of learning analytics dashboards found that while dashboards are increasingly designed to support learning rather than just monitoring, their actual effects on student achievement, motivation and engagement remain limited (Kaliisa, Misiejuk, López-Pernas, Khalil, & Saqr, 2024). These findings echo the persistent ‘promise versus practice’ gap.
Typical answers, distilled from systematic reviews of the research literature, include:
‘The proposition with most evidence (35%) in LA is that LA improve learning support and teaching in higher education.
There is little evidence in terms of improving students’ learning outcomes. Only 9% (23 papers out of all the 252 reviewed studies) present evidence in this respect.
… there is even less evidence for the third proposition. In only 6% of the papers, LA are taken up and used widely. This suggests that LA research has so far been rather uncertain about this proposition.
… our results reveal that 18% of the research studies even mention ‘ethics’ or ‘privacy’ … This is a rather small number considering that LA research, at least its empirical strand, should seriously approach the relevant ethics.’ (Viberg, Hatakka, Bälter & Mavroudi, 2018)
And, unsurprisingly, ‘… there is considerable scope for improving the evidence base for learning analytics …’ (Ferguson & Clow, 2017).
Findings on Learning Analytics Outcomes
However, ‘the studies’ results that provide some evidence in improvements of learning outcomes focus mainly on three areas: i) knowledge acquisition, including improved assessment marks and better grades, ii) skill development and iii) cognitive gains.’ (Viberg et al., 2018)
These authors (Viberg et al., 2018: p.108) also found no evidence of affective gains, meaning learners coming to enjoy learning more, or of metacognitive gains, meaning learners becoming better at learning itself; the gains were limited to knowing more or understanding the subject better. More recent evidence supports this view: a systematic review of 38 empirical studies found that learning analytics dashboards showed at best small and inconsistent effects on student motivation, participation and achievement (Kaliisa, Misiejuk, López-Pernas, Khalil & Saqr, 2024). This underscores that despite ongoing technological advances, affective and metacognitive benefits remain elusive.
The Practical Potential of Learning Analytics
However, the point of this blog is to address the relevance of this research, without going needlessly into detail, and to ask whether learning analytics has something to offer routine academic practice across educational organisations and institutions. This means asking whether the data harvested in practice from a VLE or LMS can be of practical use. The details depend on context and concrete specifics, but in general a range of issues arises.
Firstly, students in their different universities, colleges or schools interact with a variety of other institutional systems, including:
- Plagiarism detection, attendance and access monitoring, library systems, CAA (computer-aided assessment), lecture capture, e-portfolios, student satisfaction surveys and student enrolment databases (courses, marks, etc, plus data on postcode, disability, gender, ethnicity, qualifications, etc.).
- Plus, search engines, external content (YouTube, websites, journals, Wikipedia, blogs, etc.) and external communities (TikTok, Instagram, Facebook, Quora, WhatsApp, X, etc.).
In order to get a complete picture of student activity, data would have to be harvested, cleaned and correlated from all these different sources, and permission would have to be obtained from each of the institutional data owners. If institutional IT systems were stable enough for long enough, this might, in theory, be possible, albeit prohibitively expensive.
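As a rough illustration of what ‘harvested, cleaned and correlated’ means, the sketch below joins hypothetical extracts from three systems on a shared student identifier using pandas. Every name and value is invented, and a real exercise would involve far more sources, formats, cleaning rules and permissions.

```python
# A rough sketch of correlating student records across systems.
# Every column and value here is hypothetical.
import pandas as pd

# Hypothetical extracts from three separate institutional systems.
vle = pd.DataFrame({"student_id": [1, 2, 3], "vle_minutes": [340, 0, 125]})
library = pd.DataFrame({"student_id": [1, 3], "loans": [4, 12]})
attendance = pd.DataFrame({"student_id": [1, 2, 3],
                           "attendance_pct": [92.0, 48.0, None]})

# Cleaning: each source needs its own rules for gaps and errors.
attendance["attendance_pct"] = attendance["attendance_pct"].fillna(
    attendance["attendance_pct"].mean()
)

# Correlating: outer joins keep students who appear in only some systems.
merged = (
    vle.merge(library, on="student_id", how="outer")
       .merge(attendance, on="student_id", how="outer")
)
merged["loans"] = merged["loans"].fillna(0)
print(merged)
```

Even this toy example forces decisions about students who are missing from some systems and about gaps within each record; at institutional scale, those decisions multiply across every source listed above.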
However, the fact that each institution has its own IT infrastructure, set-up and systems means that none of the work is transferable or generalisable; each institution would have to start from scratch. Recent case studies from UK higher education (Dixon, Howe & Richter, 2025) confirm this: although analytics can provide insights into teaching and assessment, challenges around data quality, integration and stakeholder trust often limit real-world adoption. In other words, the institutional ecosystems in which LA must operate are highly fragmented, and this lack of transferability continues to be one of the field’s most pressing barriers.
Secondly, academics would need to factor in face-to-face learning, formal and informal, in the hope that it, too, would complete the picture, balancing students with a preference for face-to-face with those with a preference for the digital. Even those with a preference for the digital may prefer to engage with institutional systems as little as possible, using their own devices and networks, learning from personal contact, social media, websites, search engines, podcasts and now AI chatbots.
Final Reflections
As a footnote, this account touches only briefly on the ethical dimensions (Misiejuk, Samuelsen, Kaliisa & Prinsloo, 2025). Yet recent scholarship increasingly emphasises that ethics cannot be treated as an afterthought. Studies have shown that fewer than half of published LA frameworks explicitly address privacy or ethics (Khalil, Prinsloo & Slade, 2022). Practical guidelines for institutions (Rets, Herodotou & Gillespie, 2023) stress the need for transparency, informed consent and giving learners agency over their data.
More critical perspectives highlight the risk that analytics reinforce inequities or institutional agendas over student wellbeing, calling for ‘responsible learning analytics’ (Khalil, Prinsloo & Slade, 2023). Others argue for idiographic approaches, analytics tailored to individuals rather than groups, to mitigate risks of bias and overgeneralisation (Misiejuk, Samuelsen, Kaliisa & Prinsloo, 2025). Together, these developments show that ethics is now central to the future of learning analytics practice.
So perhaps it is unsurprising that learning analytics has made little practical headway in the mainstream of formal education. These challenges suggest that while learning analytics holds promise, its routine application across educational institutions remains limited and requires careful, context-sensitive planning to realise its potential.
References
Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683-695.
Dixon, N., Howe, R., & Richter, U. (2025). Exploring learning analytics practices and their benefits through the lens of three case studies in UK higher education. Research in Learning Technology, 33, 3127.
Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK ’17) (pp. 56-65). New York, NY: ACM.
Kaliisa, R., Misiejuk, K., López-Pernas, S., Khalil, M., & Saqr, M. (2024, March). Have learning analytics dashboards lived up to the hype? A systematic review of impact on students’ achievement, motivation, participation and attitude. In Proceedings of the 14th Learning Analytics and Knowledge Conference (pp. 295-304).
Khalil, M., Prinsloo, P., & Slade, S. (2022, March). A comparison of learning analytics frameworks: A systematic review. In LAK22: 12th International Learning Analytics and Knowledge Conference (pp. 152-163).
Khalil, M., Prinsloo, P., & Slade, S. (2023). Fairness, trust, transparency, equity, and responsibility in learning analytics. Journal of Learning Analytics, 10(1), 1-7.
Misiejuk, K., Samuelsen, J., Kaliisa, R., & Prinsloo, P. (2025). Idiographic learning analytics: Mapping of the ethical issues. Learning and Individual Differences, 117, 102599.
Rets, I., Herodotou, C., & Gillespie, A. (2023). Six Practical Recommendations Enabling Ethical Use of Predictive Learning Analytics in Distance Education. Journal of Learning Analytics, 10(1), 149-167.
Sghir, N., Adadi, A., & Lahmer, M. (2023). Recent advances in predictive learning analytics: A decade systematic review (2012–2022). Education and Information Technologies, 28(7), 8299-8333.
Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89, 98-110.
About Avallain
At Avallain, we are on a mission to reshape the future of education through technology. We create customisable digital education solutions that empower educators and engage learners around the world. With a focus on accessibility and user-centred design, powered by AI and cutting-edge technology, we strive to make education engaging, effective and inclusive.
Find out more at avallain.com
About TeacherMatic
TeacherMatic, a part of the Avallain Group since 2024, is a ready-to-go AI toolkit for teachers that saves hours of lesson preparation by using scores of AI generators to create flexible lesson plans, worksheets, quizzes and more.
Find out more at teachermatic.com
Contact:
Daniel Seuling
VP Client Relations & Marketing
dseuling@avallain.com