The challenges of learning analytics and possible solutions

Having looked in previous blog posts at the definition of learning analytics (LA), its three primary uses (attainment, retention and satisfaction) and the many potential benefits of LA, this post addresses three key challenges in implementing and using LA in an organisation.  A future post will address how and when to use LA to support student learning.

Underlying LA is the assumption that organisations are collecting information on the many interactions that a student has with their provider.  Much of this will come from the virtual learning environment (VLE), which can collect a vast amount of data on the use of every aspect of the system, including content access, social media interactions and the results of both formative and summative assessment.  Other sources of information include library usage and attendance data for lectures and seminars, captured manually, by smart cards in lecture theatres or via the proxy of Wi-Fi-enabled location analysis.  The analytics in LA are the processes that make sense of all these data: identifying the activities and behaviours associated with different outcomes and making this available in a summarised and comprehensible form for staff and students to use.
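To make this aggregation step concrete, here is a minimal sketch, assuming a raw event log of (student, event type) pairs. All identifiers and event names are hypothetical, not drawn from any particular VLE; a real LA pipeline would of course also carry timestamps, weightings and far richer event detail.

```python
from collections import Counter, defaultdict

def summarise_events(events):
    """Roll raw VLE events up into per-student counts by event type.

    `events` is an iterable of (student_id, event_type) pairs, e.g.
    ("s001", "content_view"). Field names are illustrative only.
    """
    summary = defaultdict(Counter)
    for student_id, event_type in events:
        summary[student_id][event_type] += 1
    # Convert to plain dicts for a stable, comprehensible summary
    return {student: dict(counts) for student, counts in summary.items()}

events = [
    ("s001", "content_view"), ("s001", "forum_post"),
    ("s001", "content_view"), ("s002", "quiz_attempt"),
]
print(summarise_events(events)["s001"]["content_view"])  # → 2
```

A summary of this shape is what a predictive model or staff-facing dashboard would consume downstream.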

Learning analytics data supports informed decision-making

While there are definite benefits from LA, successful implementation and use are not without their challenges.

Challenge 1: Unit design.  As the VLE is usually the core system from which most LA data is sourced, the first challenge is that limited use of the VLE and its tools leads to the collection of less data, which in turn degrades the quality of the predictions generated by the LA engine.

Possible solution:  A key element in addressing this challenge is training staff and raising awareness of the benefits of collecting meaningful engagement data that can be used to improve the quality and usefulness of LA.  This is not a call to constrain students only to the material available in the VLE, as that would forgo the benefits offered by the myriad third-party applications such as PeerWise and Padlet and by learning resources such as videos, talks and reusable learning objects.  Instead, this is a question of curriculum design: giving thought to the interactions that students can make with the VLE that will generate valid and useful predictions of module and unit outcomes.  The answer will vary by discipline and subject.

A suggestion, which may not be relevant to all disciplines but will be for some, is the use of short, regular formative and summative assessments, perhaps using MCQs that can be marked automatically to reduce the marking burden on staff.  Such assessments are likely to identify whether students have grasped the required learning and understood the threshold concepts on which later parts of the module will build.  The early warning provided by these assessments enables action to be taken to build knowledge before it is too late.  This suggestion also addresses another challenge for LA: interaction with the VLE does not mean that students are learning.  This is certainly true, and it is expected that assessment will help to identify whether the desired learning outcomes are being achieved before any end-of-unit or end-of-module assessment takes place.
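A hedged sketch of the early-warning idea might look like the following. The data, the 0.5 cut-off and the function name are all illustrative; a real LA engine would calibrate any threshold against past cohorts rather than fix it by hand.

```python
def flag_at_risk(weekly_scores, threshold=0.5):
    """Flag students whose mean formative MCQ score so far (as a
    fraction of full marks) falls below `threshold`.

    The 0.5 cut-off is purely illustrative; a real threshold would be
    calibrated on outcomes from previous runs of the unit.
    """
    flags = {}
    for student, scores in weekly_scores.items():
        if scores:  # skip students with no attempts recorded yet
            mean = sum(scores) / len(scores)
            if mean < threshold:
                flags[student] = round(mean, 2)
    return flags

scores = {"s001": [0.8, 0.9, 0.7], "s002": [0.3, 0.4, 0.2]}
print(flag_at_risk(scores))  # s002 is flagged for follow-up
```

The point of such a flag is timing: it surfaces a struggling student while there is still time to rebuild understanding of threshold concepts, rather than after the end-of-module assessment.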

A second suggestion relates to social media tools and a review of whether discussions and group interactions can be brought within the boundaries of the VLE rather than using popular and accessible third-party solutions.  The attraction of always-on third-party tools is strong, but there is an argument for conducting such activities in the VLE, not least from a privacy and data protection perspective, but also so that staff can join the conversation where appropriate and use social media for the social, constructivist learning that authors such as Laurillard (1999) argue is so important.

Challenge 2: Co-ordination across the institution.  In the introduction, it was mentioned that data for inclusion in LA can be sourced from many locations.  The co-ordination challenge is managing this myriad of potentially useful information to generate a consistent whole that efficiently and effectively supports student learning.

Possible solution: Building on the first solution, which addressed the need to consider LA during individual unit design, the solution to the co-ordination challenge is to make LA central to the learning process across the organisation (Pardo et al. 2015; Persico and Pozzi 2015).  This can be achieved by defining the requirements and organisational policy for LA and supporting staff in the embedding, interpretation and use of LA within individual units.  Co-ordination encompasses the aspects of LA that are common across the organisation (Brooks et al. 2014) while recognising the discipline- and subject-specific characteristics of programmes and units (Winne and Baker 2013).  Policy, communication to staff and students, training and transparency are key elements in co-ordinating LA across the institution.

Challenge 3: Time.  For the LA engine to provide predictions, data must be available from past iterations of a unit.  Over time, and with more data to include, the predictive engine can become more accurate.  There is a double challenge here: the need for consistent and voluminous data suggests that the unit should remain unchanged while LA is in use, and it could take some years before accurate predictions are available.

Possible solution:  Evidence on how quickly LA begins to provide meaningful information is sparse and depends on factors such as how often a unit runs and how the unit is designed for LA (challenge 1).  A case study from Georgia Southern University (D2L 2018) suggested that several iterations were necessary: using “a baseline of data from five previous semesters … they ran an analysis on the current course. By the time students were about 10 weeks into the course, the Brightspace Student Success System was predicting final performance with 70% accuracy.”  LA will start to generate predictions after a single run of a unit, but the nature of the unit will determine how accurate and useful that information is.  In the interim, Hoel and Chen (2014) recommend identifying the low-hanging fruit, the quick wins that LA has to offer, and focusing on these aspects.

Many institutions promote the development and improvement of units based on student feedback, new technology, staff innovation, programmes such as the peer review of education practice, and institutional initiatives.  For the five years that I was unit leader on a project management unit, I made annual changes to its content and assessment in a continual attempt to make it more interesting, engaging, authentic and beneficial for students.  Changing elements such as discussion forums, formative assessment MCQs and the scope and range of content, and introducing flipped sessions, makes it difficult to form a year-on-year comparison of the unit, which diminishes the applicability of LA.  That said, my later changes included the introduction of weekly MCQs, which are more likely to be strong predictors of student success and so support the use of LA.  Perhaps the solution here is to develop a unit while being mindful of how the changes could inhibit or reinforce the use of LA.  In addition, VLEs such as Brightspace offer modules that give unit staff a view of how the current cohort is performing relative to other students in the same cohort.  This relative reference (Wise and Vytasek 2017) can inform staff interventions to help student learning.  Staff in roles such as academic advisor will also be able to view a student’s progress across the year and across different units, gaining a more holistic view than a unit tutor could, and thereby benefiting from the output of LA while its predictive accuracy develops over time.
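The relative-reference idea needs no historical data at all, which is why it is useful while predictive accuracy is still developing. As a minimal sketch (the cohort data and names are hypothetical, not taken from Brightspace or any real system):

```python
def cohort_percentile(cohort_scores, student):
    """Percentage of the cohort scoring at or below this student's mark:
    the kind of within-cohort relative reference a VLE dashboard might
    display to unit staff or academic advisors.
    """
    values = list(cohort_scores.values())
    at_or_below = sum(1 for v in values if v <= cohort_scores[student])
    return 100 * at_or_below / len(values)

cohort = {"s001": 40, "s002": 55, "s003": 70, "s004": 85}
print(cohort_percentile(cohort, "s002"))  # → 50.0
```

Because the comparison is against the current cohort only, it survives year-on-year changes to the unit that would otherwise invalidate historical baselines.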

This post has looked at three challenges to implementing and using LA in an institution: unit design, co-ordination across the institution, and time.  For each challenge, a number of possible solutions were presented that seek to mitigate it and enable staff to begin to realise the benefits of using LA.


Brooks, C., Greer, J. and Gutwin, C. (2014). The data-assisted approach to building intelligent technology enhanced learning environments. In J. A. Larusson and B. White (Eds.), Learning analytics: From research to practice (pp. 123–156). New York: Springer.
D2L (2018). Georgia Southern University: Shining a light on student success. Available at:
Hoel, T. and Chen, W. (2014). Learning Analytics Interoperability – looking for Low-Hanging Fruits. In Proceedings of the 22nd International Conference on Computers in Education. Japan: Asia-Pacific Society for Computers in Education.
Laurillard, D. (1999). A conversational framework for individual learning applied to the ‘learning organisation’ and the ‘learning society’. Systems Research and Behavioral Science, 16(2), 113.
Pardo, A., Ellis, R. A. and Calvo, R. A. (2015). Combining observational and experiential data to inform the redesign of learning activities. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 305–309). ACM.
Persico, D. and Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46(2), 230–248.
Winne, P. H. and Baker, R. S. (2013). The potentials of educational data mining for researching metacognition, motivation and self-regulated learning. Journal of Educational Data Mining, 5(1), 1–8.
Wise, A. F. and Vytasek, J. (2017). Learning analytics implementation design. In Handbook of Learning Analytics (1st ed.).
