This year’s ALT conference offered multiple tracks on learning technology and a vast choice of sessions to attend, but two key themes struck me as particularly prominent in discussions at the conference: institutional approaches to open learning provision, and a concern for learner analytics – the interpretation of big data to inform targeted support for students.
Open education was addressed in a number of ways, with the case in favour most powerfully made by Laura Czerniewicz in her keynote presentation. She called for greater policy attention from government and institutions to the social opportunities afforded by open learning, consistent with a global social equality agenda. In her estimation this requires a concerted effort from government and higher education institutions to reclaim the open education agenda from commercial providers. In terms of practical steps, the keynote called for a rethink on student literacies and support for learners to engage with online courses. Citing Helen Beetham’s work on digital capabilities, Laura acknowledged that digital skills are far shallower than we tend to think; as we know, open provision has largely benefited the educated – professionals and holders of a first degree with established learning skills – rather than the disadvantaged and younger learners who were originally envisaged as the beneficiaries of open education (Grainger, 2013). She framed the key challenge for the sector as providing verifiable accreditation for free courses, enabling learners to reap the benefits in terms of their educational development, social mobility and professional development.
It was refreshing to hear a critical perspective on existing OER provision across the sector from Rachel O’Connor (Open University). In her study of MOOCs she highlighted the hidden barriers that can prevent students from engaging effectively with learning materials, notably the use of disciplinary jargon and unexplained key concepts in course materials, which exclude learners who lack the necessary disciplinary training to begin with. We may speculate why these barriers to open learning exist – whether they are an unintended consequence of re-purposing existing teaching resources without considering the target audience and the entry-level knowledge and skills required to engage with the course, or whether they simply reflect the true agenda for MOOCs as showcases for accredited fee-paying courses and university research rather than authentic learning opportunities for all. Make up your own mind – but it is telling how much commercial organisations have invested in this domain in recent years, and how active marketing departments within universities have been in encouraging the development of open courses. In defence of MOOCs, a range of presentations cited the transformational impact they are now having on teaching staff (note the presentations by Leeds and Bath) – exposing staff to the potential of interactive content and lecture recordings, which they are in turn applying to their campus-based teaching, thereby enhancing the learning experience for their students. The trickle-down effects of academic engagement with MOOCs on blended learning and the campus-based student experience are a compelling message to share with the learning technology community, and one that we have heard before (CETIS, 2014), but the evidence base in support of it remains anecdotal at best.
Jonathon Worth’s keynote looked at open education from a different angle, exploring the perspective of students as participants and knowledge creators. The rush to adopt free and open technology has commonly been presented at previous ALT conferences, uncritically, as a liberating and necessary response to the constraining environment of managed learning spaces such as VLEs and the ‘walled gardens’ built around student learning. Jonathon’s presentation was an important corrective to this discourse, highlighting the risks of placing students within open spaces without their informed consent as to the consequences of exposing their personal data and content in the public domain. Highlighting the right to be forgotten and the entitlement to make mistakes as a necessary part of the learning process, the keynote offered a valuable cautionary note on the risks of social media for accredited learning. (See the University of York’s guidance to staff on the use of external IT services, which flags these risks and responsibilities to teaching staff.)
Turning our attention to learner analytics, there was a wide variety of perspectives shared at the conference. Rebecca Ferguson reported on the Open University’s ambitions to collect, analyse and report on the data of their learners and their contexts for the purpose of understanding and optimising learning. Citing Tim Renick (Georgia State University), Rebecca set out the ambition for learner analytics to support actionable interventions which may ultimately eliminate achievement gaps based on race, gender and social background. There is already well-established practice in the United States, with Purdue University operating a traffic light system to classify each individual student’s progress and to alert teaching staff to individuals who are going off track. The key, of course, is to guide staff on how to respond to this data, so that interventions are timely and supportive of students. This all assumes that teaching staff have the digital literacies to make sense of the data being presented to them in the first place – a point made by Cathy Gunn in her review of analytics projects funded by Ako Aotearoa in New Zealand. It also requires data to be stored correctly in the first place and coordinated across multiple learning systems – an infrastructure which may not naturally be in place in some institutions. A common message from this conference and others (see Shane Dawson’s keynote presentation at ascilite 2014) has been the need for institutions to invest in a cross-service team to coordinate the generation and presentation of data to academics, encompassing contributions from data wrangler intermediaries, as well as data managers, educationalists and statisticians to source, ‘clean’ and present data for educational experts to interpret and act upon.
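To make the traffic-light idea concrete, here is a minimal sketch of how such a classification might work. The input measures, thresholds and function name are my own illustrative assumptions, not a description of Purdue’s actual model:

```python
def traffic_light(logins_per_week, avg_grade):
    """Classify a student's progress as 'green', 'amber' or 'red'.

    The thresholds below are invented for illustration; a real system
    would calibrate them against historical cohort data rather than
    using fixed cut-offs.
    """
    if logins_per_week >= 3 and avg_grade >= 60:
        return "green"
    if logins_per_week >= 1 and avg_grade >= 40:
        return "amber"
    return "red"  # flag for tutor follow-up

# A student logging in twice a week with a 55% average:
print(traffic_light(2, 55))  # amber
```

The point the sketch makes is that the classification itself is trivial; the hard parts are choosing defensible thresholds, and – as the conference discussion stressed – guiding staff on what to do when a student turns amber or red.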
The impact of analytics depends first and foremost on the questions being asked and the data being collected. Who owns the analytics agenda within institutions? The answer is crucial in determining which outcomes are addressed. Rebecca Ferguson noted the tension between the interests of senior management, which may be directed more towards seeking out evidence of successful modules that enhance a university’s reputation, and the goals of teaching staff, who may be more interested in the learning that is taking place – specifically, how students are engaging with learning resources and participating in targeted learning activities in relation to predefined study goals.
Defining the threshold that triggers a supportive intervention for students is critical. There is of course an ethical issue at the heart of this strategy, in terms of how students are informed about the monitoring that is taking place. The presentation by Brockenhurst College on the use of predictive analytics to identify students before they arrive at the institution drew attention to this point. Using profiling data to identify and track ‘at risk’ students during their first 42 days may be crucial to the institution’s retention strategy, but it raises serious questions about the ethics of learner analytics and how informed consent is sought from students for this activity. On this theme, there was an interesting counter-perspective from the University of Hull, where students on a foundation course have been consulted directly by their own student representatives (not staff!) on the type of data visualisations they would like to receive to support their learning. Patrick Lynch reported how this crowdsourcing method had engaged students in a proper conversation about how analytics works and served as an effective way of stimulating discussions between staff and students about their learning. This is an interesting approach, although the scalability of the service is questionable and will no doubt depend on the agility of support staff to assess the value of student requests and provide the necessary support and follow-up to students.
Finally, there is of course the crucial consideration of the accuracy of the analytics being produced, particularly when they are used to predict or monitor student performance. The dangers of using Google Analytics, and the corruption of internal data by external input, were addressed in the plenary discussion following Cathy Gunn’s presentation. Attention should also be directed to the underlying assumption that log-in data from learning management systems or other tools gives an accurate reflection of a student’s learning activity – clearly it will not capture what takes place outside the ‘walled garden’. False positives in identifying ‘at risk’ students carry ethical challenges and may adversely affect students; great care needs to be exercised when using these data sources to reach judgements on students and their academic performance.
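The false-positive problem is worth spelling out, because it is counter-intuitive: when genuinely at-risk students are a minority, even a reasonably accurate predictor will flag many students wrongly. The numbers below are made up for illustration – a simple Bayes calculation, not data from any of the institutions mentioned:

```python
def flagged_precision(prevalence, sensitivity, specificity):
    """Probability that a student flagged 'at risk' genuinely is at risk.

    prevalence  : proportion of students who are genuinely at risk
    sensitivity : proportion of at-risk students the model flags
    specificity : proportion of not-at-risk students the model clears
    (All figures here are illustrative assumptions.)
    """
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# With 10% of students at risk, 80% sensitivity and 90% specificity,
# fewer than half of the flagged students are actually at risk:
print(round(flagged_precision(0.10, 0.80, 0.90), 2))  # 0.47
```

If interventions triggered by a flag carry any stigma or cost to the student, a flagging pool that is mostly false positives is an ethical problem as much as a statistical one – which is exactly the caution raised in the plenary discussion.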
Photo credit for featured image: Chris Bull www.chrisbullphotographer.com from ALT on Flickr https://www.flickr.com/photos/67220830@N02/21309515932/