Report from the ascilite 2014 ‘Rhetoric and Reality’ conference

ascilite 2014: ‘Rhetoric and Reality’ – critical perspectives on educational technology

What are the current concerns of the Australasian educational technology community in supporting and managing online learning? Looking beyond the rhetoric of technology-enhanced learning engagement and innovation, how are Australian and New Zealand (ANZ) higher education institutions actually faring in the implementation of their online learning policies and practices?

Courtesy of a UCISA travel/dissemination grant for the 2014 technology-enhanced learning (TEL) survey project, I was able to take the trip ‘down under’ to a cold and wet Dunedin, New Zealand to attend the ascilite annual conference (24th–26th November), presenting a paper on UK TEL trends whilst at the same time finding out a little about ANZ practice. Critical thinking appears to be alive and well in this engaging forum, with myth-busting presentations on a range of topical themes.

Learning analytics: vendors are constantly pushing the ‘game changing’ properties of analytics solutions to university senior managers, but to what extent does the technology in its current form match the requirements of higher education? Shane Dawson (University of South Australia), in a thought-provoking keynote, questioned the value that is being extracted from analytics software across the HE sector, viewing the technology as a blunt instrument in the way that it is currently being implemented.

Shane highlighted the tension between suppliers’ solution-driven approach and universities’ educational objectives for analytics. Vendor success is determined by the volume of data feeds presented to university administrators and teaching staff (echoes here of the number of course sites created within an institutional VLE as a success metric – a feature of VLE hype in its early days), but this is meaningless without some form of application to pedagogic interventions that change academic practice and enhance the student learning experience. To be truly effective, universities need to develop an enterprise approach to their implementation of learning analytics, drawing on combined support from faculty, administrative, IT and learning technology staff, as well as researchers (data wranglers) who can filter data and create outputs to address the ‘right’ questions on institutional learning and teaching issues, rather than simply work from the generic dashboard outputs that are currently being offered. Shane argued that a cultural shift through a combined services approach is needed to shape meaningful interventions in support of learning and teaching, which will have some impact on the learner experience – otherwise the analytics data will merely serve as an administrative tool with little added value to the institution. (Shane’s slides are available here.)

Cathy Gunn (University of Auckland) went further in cautioning against seeing analytics as offering a complete picture of student learning; looking beyond the hype, she remarked that analytics offers a layer of evidence on student learning which needs to be combined with qualitative measures and interpreted in a holistic way in order for it to be truly meaningful.

Linda Corrin (University of Melbourne) reported on the student perspective on learning analytics, based on pilot studies where dashboards of VLE activity (back-end data on log-in activity, quiz performance etc.) were shared with undergraduates and postgraduate research students to help them reflect on their own study methods and the way they were progressing across a course. Students were presented with dashboards of their personal performance and the course average at different milestones of the course. Postgraduates questioned the ethics of recording such data without their prior permission and refused to engage in a reflective discussion on their learning. Undergraduates were more accepting of the fact that data on their learning was being routinely captured, but interestingly the availability of the data and the key messages it conveyed to them about their online activities did not lead them to consider changing their study approach; rather, it led them to justify their current study approach or respond in a superficial way – ‘upping’ their log-in activity to perform better in the statistics relative to the course average – reflecting a gaming rather than a reflective study approach. Linda conceded, though, that dashboard statistics cannot capture a holistic picture of student self-study activity – i.e. what’s going on outside the VLE. Whilst useful as a stimulus for reflection on student activity and performance, the data from these pilots did not provide evidence to suggest that students actually change their study behaviour as a consequence of seeing statistics on their online performance and activity.
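As a purely illustrative sketch (the pilots’ actual tooling and data model are not described in the talk), the kind of comparison these dashboards surfaced – each student’s own VLE activity set against the course average – reduces to something very simple, which is partly why Linda’s caveat about what the numbers leave out matters:

```python
# Hypothetical sketch of the comparison shown in the pilot dashboards:
# a student's VLE log-in count set against the course average.
# (Illustrative only - field names and structure are assumptions.)

def dashboard_summary(logins: dict[str, int]) -> dict[str, dict]:
    """Compare each student's log-in count with the course average."""
    course_avg = sum(logins.values()) / len(logins)
    return {
        student: {
            "logins": count,
            "course_average": round(course_avg, 1),
            "above_average": count > course_avg,
        }
        for student, count in logins.items()
    }

summary = dashboard_summary({"s1": 42, "s2": 17, "s3": 31})
print(summary["s2"])  # s2 sits below the course average of 30.0
```

A metric this crude is trivially gameable – a student can lift their `above_average` flag simply by logging in more often, exactly the superficial response the undergraduates in the pilots exhibited.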

Flipping the classroom: Birgit Loch (Swinburne University) raised some important questions on blended learning implementation for mathematics, and highlighted the risks of unqualified acceptance of flipped learning design methods for some disciplines. Reflecting on a flipped design for a mathematics course that she delivered, she found that 22% of the class either did not engage with the pre-class activities or struggled to engage with them, which restricted their participation in the face-to-face activities on campus. She questioned the transferability of flipped design to disciplines such as mathematics, focusing on the maturity of the undergraduate students who are often targeted for this style of learning, yet may not have the requisite academic skills and self-discipline to engage effectively with the front-loading of conceptual learning through lecture recordings and pre-class activities. Birgit highlighted the ethical problem of pursuing a flipped design when there is a risk of leaving large numbers of students behind with these innovations. How should we respond to students who won’t or can’t engage – preferring traditional face-to-face lecturing methods to grasp the key concepts?

Jiangang Fei (University of Tasmania) addressed flipped design from the perspective of lab work, with students viewing lecture recordings before they engaged in classwork. A number of benefits were derived from this approach: students were better prepared for their time in class, and the instructor was able to manage teaching delivery time more effectively. However, Jiangang conceded that whilst these represented genuine improvements on the old course design, there was no evidence to suggest that the flipped design led students to engage in deeper learning or to improve their final exam performance.

Student use of technologies: Continuing the myth-busting theme, Maree Gosper (Macquarie University) reported on a longitudinal study of Australian students and their technology usage, comparing survey results from 2010 and 2013 as a way of validating technology adoption trends. (The full SEET report is available here.) The study reported on a combined set of survey responses from Macquarie University, University of Technology Sydney and University of Western Sydney students, and was inspired by other international studies on this theme such as the Educause ECAR report. Unsurprisingly, VLE usage remains at the heart of student practice, but there have been a number of key changes since 2010, most notably the increased use by students of social media (Facebook), web tools (Google Docs) and mobile apps to support their learning. Smartphone ownership is pretty much ubiquitous, and tablets are predicted to rise in the near future to become the learning device of choice on campus (University of Western Sydney has indeed handed out over 45,000 iPads to its students to promote its digital inclusion agenda). Peer-to-peer communication through texting and Facebook usage has also increased since 2010. The strongest technology demand from students is for lecture recordings and podcasts to be made available to them. The hype around MOOCs, though, appears to have had little bearing on the undergraduate experience: 77% of respondents had never heard of them and only 7% had actually enrolled on one. There are clearly parallels here with the UK sector, with Jisc/NUS research studies reporting similar findings on mobile usage and the demand for interactive content amongst UK students (as summarised in this blog post), whilst MOOC courses appear so far to have had little impact.

New developments across the sector

The conference also had a forward-looking agenda in considering national priorities for learning technology development over the short to medium term. In this respect it was good to see representatives from the New Zealand Ministry of Education present at the conference and active in promoting a national discussion on the integration and uptake of new education delivery models, inviting institutions to describe how they are tackling innovation. The National Centre for Tertiary Teaching Excellence (Ako Aotearoa) is helping to foster this sharing of practice, as well as providing national project funding for innovations in mobile learning (e.g. as a means to help vocational students capture and log placement activities and reflect on how they are applying their learning, or to help foster community-based learning outside the classroom). Another important initiative has focused on the development of a national set of e-learning guidelines to inform course design and delivery (http://www.elg.ac.nz). The guidelines are intended to help institutions benchmark their own e-learning practice, with key questions addressing the role and responsibilities of e-learning managers, organisational leaders and QA bodies (handy when preparing for an institutional audit!), as well as helping them to think about the provision they offer for the learner and the design and delivery responsibilities of the instructor. This is a useful resource (licensed under Creative Commons), with each perspective supported by an individual workbook to guide the reader through the quality issues under review.
Benchmarking is certainly alive and well across the Australasian sector, with 24 institutions from five countries attending an ACODE (Australasian Council on Open, Distance and e-Learning) event in June to share findings from institutional benchmarking reviews against the revised ACODE benchmarks, which have been redesigned from a pure distance learning focus to a broader technology-enhanced learning coverage and are also available for other institutions to use.

Things to think about for the future

There were numerous take-aways from this conference, but here are a few concluding thoughts on staff development that struck me as particularly important for future practice. In a number of sessions presenters critiqued the ‘deficit model’ of staff development, which predominates across the sector. Typically this focuses on identifying and addressing what teaching staff do not know in their digital practice through guides and workshops, rather than addressing the context in which technology may be employed to support instructional goals. Maria Northcote (Avondale College of Higher Education) was one of many speakers to address this theme, highlighting how the College has repositioned its staff development focus, moving from workshops to the embedding of showcase materials within its institutional VLE, enabling staff to ‘graze’ for ideas through themed case studies and then drill down into the detail if they want to pursue a particular approach. Showcase resources are exposed to staff via a top banner within the system, ensuring their visibility. This seems like a sensible approach to engaging academics through case-based illustrations rather than through functional ‘how to’ cookbooks, which fail to acknowledge the academic as an active agent in technology design. Food for thought indeed!

Posted on behalf of Dr. Richard Walker

 
