Following gov.uk’s guidance on performing both automated and manual testing for accessibility, the PDLT decided to work with Library and Archives to run a user research workshop with disabled students. We ran two half-day sessions with five students (four visually impaired and one dyspraxic/dyslexic). Although we wanted a wider range of abilities, this was our first time organising such a workshop, and getting in touch with a range of learners with various needs was trickier than I had expected. As it turned out, we learned a huge amount about a range of visual impairments and about how to improve what we do to help those using screen magnifiers or a range of different screen readers, taking both laptop and mobile experiences into consideration. View our ‘how to run a user research workshop’ guide [UoY login required] so you can repeat the process if you wish.
We tested two VLEs (Blackboard and Canvas) on desktop and mobile, as well as Blackboard’s web conferencing tool, Collaborate, looking both at the way users accessed and navigated each site and at how they used the resources on it. We were also able to see how a screen reader user would access alternative formats using Blackboard Ally. The Library tested our catalogue search, YorSearch, and our reading list tool, Leganto. We invited the student app team to test the new student app too. Below is a summary of what we’ve learned, including a link to Amy Eyre’s more in-depth reflections on Blackboard Mobile and Collaborate.
YorSearch and reading lists
These sessions really opened my eyes to the ways Leganto can present itself to users, even when they are using the same screen-reading software. There were subtle differences in the way the software behaved depending on the platform it was being used on, and those subtle differences had a big impact on users’ ability to use and understand the array of information in front of them.
Our students have a great knowledge of the software aids they use and of how particular data presentation styles affect their usability. This was particularly useful to us, giving us a solid focus on things we need to implement going forward.
Jess Bull
It was great to meet them all – they were all very friendly, showed great patience and a genuine will to help facilitate the shape of the products we use.
This was an incredible opportunity to spend time with some of our inspirational students. Even though we had carried out accessibility testing as part of the YorSearch project, there is no substitute for actually being with users as they make use of our systems. One of the most useful parts of the two days for me was realising that no two screen readers work in the same way, and that the screen readers used by our subjects could themselves be configured to work differently. I also learned that you need to tailor your questions to your subjects as some of the tasks I’d prepared in advance were not at all relevant to, for example, a vision-impaired student.
I was pleased at how YorSearch performed in general, but some obvious areas for improvement quickly became apparent – mainly in the way we’re integrating complementary applications into Primo. Our catalogue-enrichment product (Syndetics Unbound), for example, was practically undetectable to screen readers. That said, other products such as LibKey were very positively received by all of the subjects.
There were a number of fundamental issues with the reading list system (Leganto), which was disappointing, particularly given that it’s essentially based on the same platform as YorSearch. This presents us with an opportunity, however: we have a good relationship with the software supplier, and, as a member of the International Product Working Group for Primo, I intend to push for as many of the Primo accessibility features as possible to be implemented in Leganto.
Thanks very much for giving me the chance to participate!
Nathan Page and Jonathan Ayto:
It was a great experience to see (and discuss) real use of Canvas and its accessibility features. I had thought about the user experience a lot but of course that can’t really be done in a bubble.
The users were a pleasure to work with and we came away with valuable information on two complex accessibility issues that we’d been grappling with:
- We use ‘click and reveal’ elements on our Canvas VLE – how accessible were these to a screen reader?
- The transcripts we currently use for our online learning course videos – how understandable were they? Was the format desirable?
It turns out our click and reveal elements have to be reworked to make them accessible to screen readers, and we can improve the way we add slide descriptions on our video transcripts.
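As a rough sketch of what reworking those elements might involve (our own illustration, not the actual Canvas markup), a screen-reader-friendly “click and reveal” usually means a real button carrying `aria-expanded` and `aria-controls`, with the panel hidden from the accessibility tree until revealed:

```typescript
// Minimal sketch of an accessible disclosure ("click and reveal") pattern.
// Names and structure here are illustrative assumptions, not Canvas's API.

interface Disclosure {
  buttonHtml: string;
  panelHtml: string;
}

// Build the markup for one disclosure region; `id` must be unique on the page.
function buildDisclosure(id: string, label: string, content: string): Disclosure {
  const panelId = `${id}-panel`;
  return {
    // A native <button> is keyboard-focusable by default; aria-expanded lets a
    // screen reader announce whether the associated panel is currently open.
    buttonHtml: `<button aria-expanded="false" aria-controls="${panelId}">${label}</button>`,
    // The hidden attribute keeps the panel out of the accessibility tree
    // until the user chooses to reveal it.
    panelHtml: `<div id="${panelId}" hidden>${content}</div>`,
  };
}

// A click handler would then flip both states together, e.g.:
//   button.setAttribute("aria-expanded", String(!open));
//   panel.hidden = open;

const example = buildDisclosure("defn-1", "Show definition", "<p>Example content</p>");
```

The key point is that the control and its state are exposed to assistive technology, rather than relying on a visually styled div that a screen reader perceives as inert text.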
I found the experience to be very rewarding, and I feel that I have gleaned some useful insights that I can feed back to the product vendor (Panopto) and keep in mind when it comes to deploying lecture recording provision via the VLE.
The two key tests I undertook concerned the ease with which a user could find recordings via the VLE, and the experience of interacting with the Panopto Web Player.
One key shortfall I uncovered was the lack of alternative text instructions and ‘tab-able’ anchors to let a screen reader user make notes and bookmarks within Panopto. This means, unfortunately, that without prior knowledge or outside instruction, screen reader users are not given sufficient information through the interface alone to benefit from the full functionality.
I feel this feedback will be useful to Panopto, and I will pass it on to their product development team.
Simon Davis / Louise Morgan / Stevie Paterson:
There is no amount of reading which can match up to quality time with a user!
Reflecting on the session, I realised that what we think we know about accessibility becomes much richer when we are faced with the real, everyday challenges our users encounter.
Louise Morgan
I loved watching the students navigate the software and question our design decisions. In some cases we had nailed it (brilliant!), but in others we fell short. Falling short can have a direct impact on a student’s experience at university, and I feel passionate about making sure our software is fully inclusive. Simplicity and consistency seemed to be the key: if the buttons and icons were simple and consistent, students found they could memorise the layout and navigate around the app without a screen reader.
Another key learning was around zoom functionality in the app. Many apps don’t support zooming, which can force the user to fall back on the web browser version – less convenient and sometimes with limited functionality.
When looking at the new Check-in function in the app, we found some really basic changes that would make a big difference, mainly around consistency. In particular, we learned that the way we present class times is inconsistent and could become frustrating to the user over time. I also learned how the choice to present key information in headings or as images affects how quickly the user can navigate to the information they want. Being part of this session has been key to understanding our priorities for improvement.
Thank you for the opportunity to take part – it’s something I would recommend to anyone working on a user-facing project.
Blackboard mobile and Collaborate
Thanks again to the students that joined us for the user research sessions – I echo everyone else’s comments on how helpful and educational an experience it was.
My main focuses on the day were the Collaborate Ultra webinar platform and the mobile experience of the Yorkshare VLE (particularly via the mobile apps). I struggle to be concise, and we had so many useful findings that I ended up writing a full blog post about my experiences on the day; the below is just a summary.
See more: Full blog post – “E-Accessibility, the Mobile VLE and Collaborate Webinars”.
Students using accessible resources
Students are not always aware of how to make best use of accessible documents and our students with disabilities are no different. Although screen reader users are used to using headings to navigate a web page, they don’t expect to be able to do this for documents like PDFs or Word documents, probably because these have traditionally not been well-formatted. So as we start to make more accessible documents, we need to run a campaign to make students aware of how to use these to learn more effectively!
MathJax allows a screen magnifier user to zoom in on equations while retaining the high resolution they need. It also allows screen reader users to interrogate parts of an equation in a contextual way, although we will need to upskill our screen reader users to familiarise them with this way of accessing maths. We’re already encouraging academics to create more accessible resources that contain equations.
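To illustrate why this works (a made-up example, not one from the sessions): MathJax renders equations from their LaTeX source rather than as a flat image, so the structure of the expression survives rendering.

```latex
% The quadratic formula as MathJax would receive it. Because the fraction,
% the square root and the sub-expressions remain distinct nodes after
% rendering, a screen reader can step into each part in turn, and a
% magnifier user can zoom without the equation becoming a blurry bitmap.
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```

An equation pasted in as a screenshot, by contrast, offers a screen reader nothing beyond whatever alt text the author remembered to write.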
It was lovely to listen in on Friday as the two screen reader users compared their technology and the different ways they had found of doing things. It made me realise that I hadn’t factored any socialisation time into the session for everyone to get to know one another, and it was hard to break up the conversation and turn our attention to the work at hand. I would advise others to organise a get-together for the students before the actual user research workshop itself. In fact, a meeting for those wanting to test their software would help too – we all learned something from viewing Louise Morgan’s test plan before the day.
My key takeaways from working with the students are around making Google Slides embedded in the VLE more accessible, making students aware of how to use accessible documents, producing an introduction to the VLE for induction, and maybe putting on ‘how to use a screen reader’ classes for anyone interested. The latter will have to be run by one of our screen reader users!