ALT-C 2016 – Learner Analytics – the current state of play

This year’s ALT conference (‘Connect, Collaborate, Create’) identified learning analytics as one of the key themes for discussion. A wide variety of perspectives were shared on this topic in the parallel sessions on day 1 of the conference.

Learning analytics is clearly an evolving service management area for universities. As this year’s UCISA Technology Enhanced Learning Survey findings have shown (see associated blog post and YouTube summary of analytics developments), the establishment of institutional learning analytics services is still in its formative stages across the UK higher education sector. Only 20 institutions appear to have established learning analytics services that are used by students, and typically these services are being employed on a small scale, across 1%–4% of their courses. There appears to be a lot of scoping work going on, but a lack of clarity remains about the role that analytics services should play in supporting learning and teaching activities and the issues they are meant to address. This reinforces the findings of the Heads of e-Learning Forum Report (2015), conducted by Barbara Newland, Lindsey Martin and Neil Ringan, which highlighted the ‘limited levels of understanding’ that senior management have regarding the possible benefits and outcomes of implementing learning analytics.

Jim Emery and Sheila MacNeill reflected candidly on the Glasgow Caledonian experience in scoping out their own institutional service. Glasgow Caledonian is one of over 50 universities and colleges working with Jisc to develop their own institutional solutions and support the development of sector-wide resources. Discussions have focused on institutional culture, processes, people and technology. Of these areas, the technology review has thrown up some interesting challenges in defining relevant data sources (e.g. attendance monitoring through swipe cards; Google Analytics) and ensuring that these systems integrate in some way with the main institutional VLE, which serves as the principal data source on student learning. However, assuming that the technology ‘plumbing’ can be addressed, the key message from the Caledonian experience is that senior management leadership ‘buy-in’ is essential to any sustainable analytics initiative, ensuring strategic alignment between the ambitions of learning analytics and digital strategy initiatives across the institution. This means getting the organisational culture right, so that staff engage with data in an appropriate way through a code of practice, with due attention to the ethical issues bound up with data management.

Neil Witt’s presentation developed this theme in greater depth, focusing on the importance of organisational culture over technology in the institutional checklist for analytics that he has developed at the University of Plymouth. Whilst the technology challenges in establishing a single version of the truth for data and joining up data sources should not be underestimated, the key task is to ensure that information is handled in the right way. Neil spoke of establishing a culture of respect for information throughout an institution, supported through policy changes and staff training. This should involve clear communication to students on how analytics will be used to support learning interventions, with transparency over the data sources and applications that will be employed by staff.

The concern for transparency and appropriate ethical use of data was reiterated in Sharon Slade’s presentation on the UK Open University’s experience. Sharon noted the need for an institutional policy communicating to students in plain English how analytics will be used to inform learning support decisions, and highlighted the importance of student engagement in the review of analytics. Sharon remarked that OU students had not been keen on tracking services being employed without their direct involvement and that they wanted tailored services to help support their personal learning.

Finally, presenters covered familiar territory in commenting on the accuracy of data and the conclusions and appropriate actions that may be derived from them. As was noted in the plenary discussion following Cathy Gunn’s presentation at last year’s conference (see ALT-C 2015 blog post), we need to be wary of jumping to conclusions on student performance based on ‘false positives’ and corrupted data. Paul Prinsloo (UNISA, South Africa’s open learning university) reinforced this point in highlighting an associated risk: the collapsing of contextual meaning when joining together data from disparate sources and arriving at conclusions based on algorithmic decision-making. We may conclude that whilst it makes sense to establish milestones for reviewing student performance, and to consider predictive modelling across multiple data sources as a way of identifying support challenges, there has to be a point when (human) conversation takes over from automated decision-making in deciding on appropriate actions to support individual learners. As Neil Witt concluded, a response to a ‘red flag’ should not be an automatic process but the start of a conversation between the tutor and the student in question.
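To make this concrete, the kind of ‘red flag’ rule under discussion might look something like the following sketch. This is a hypothetical illustration only: the thresholds, field names and data sources are assumptions made for the purpose of the example, not any presenter’s or institution’s actual model. The point it is meant to capture is that a flag is a prompt for a tutor-student conversation, never an automated intervention.

```python
# Hypothetical sketch of a rule-based 'red flag' combining two data sources.
# All thresholds and field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class StudentActivity:
    student_id: str
    attendance_rate: float    # e.g. derived from swipe-card data, 0.0-1.0
    vle_logins_last_30d: int  # e.g. derived from VLE activity logs

def needs_review(activity: StudentActivity,
                 min_attendance: float = 0.6,
                 min_logins: int = 4) -> bool:
    """Return True if engagement falls below illustrative thresholds
    on more than one data source."""
    low_attendance = activity.attendance_rate < min_attendance
    low_vle_use = activity.vle_logins_last_30d < min_logins
    # Flag only when multiple indicators agree, to reduce false positives
    return low_attendance and low_vle_use

# A flag starts a conversation; it does not trigger an automatic action.
student = StudentActivity("s123", attendance_rate=0.45, vle_logins_last_30d=2)
if needs_review(student):
    print(f"Review suggested for {student.student_id}: tutor to open a conversation.")
```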
