Posted on behalf of Richard Walker
In this session David Swallow of the Department of Computer Science’s Human Computer Interaction research group at the University of York reported on the development of a protocol for assessing the accessibility of software applications. PROTEA (Protocol for Testing E-Accessibility) was designed to help institutions make an informed decision about the accessibility of applications as part of the procurement process, and can also be used to assess the usability of existing software solutions.
Why is a protocol necessary when there are existing guidelines and standards available? David answered this question by highlighting the limitations of the existing Web Content Accessibility Guidelines (WCAG), which most vendor systems claim to meet. WCAG was designed for web content only and is difficult to apply to complex systems such as enterprise-wide learning management systems, for which the guidelines are not precise enough. The answer is to test these more complex systems with real users, who can provide first-hand accounts of accessibility and usability challenges through a ‘thinking aloud’ protocol, capturing the challenges as users encounter them while tackling a predefined task. This forms the basis of the PROTEA approach, which David went on to describe.
PROTEA draws on handpicked test subjects: three blind, three partially sighted and three dyslexic. There is a clear rationale for the make-up of this user test group: David noted that blindness, partial sight and dyslexia between them cover 80% of the usability and accessibility challenges that any online system will face. Blind subjects test using the JAWS screen reader or Window-Eyes (Windows 2010 or above), while partially sighted subjects use SuperNOVA and ZoomText. For dyslexic users there is a diverse range of solutions available, but it is important to offer control of text size, typeface and background colour contrast, enabling changes of font and the use of different style sheets, as no single solution will address all requirements.
David was accompanied by Graham, an accessibility consultant and blind test subject, to demonstrate how the PROTEA approach works, showing how a user would work through a series of authentic, real-world tasks using the application being tested. The tester is recorded attempting the tasks and asked to provide a commentary on their actions using the ‘thinking aloud’ protocol, highlighting how they are faring and communicating any problems they encounter in completing the task. The test subject is asked to rate problems on a scale of 1 to 4, with 1 being a minor cosmetic problem and 4 being catastrophic, preventing the tester from proceeding and leaving them unable to complete the task at hand.
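As a rough illustration of how observations from such a session might be recorded, the sketch below models the 1–4 severity scale in Python. All names, the example problems and the intermediate labels (2 and 3) are hypothetical assumptions for illustration; the source only defines the endpoints of the scale.

```python
from dataclasses import dataclass

# PROTEA-style severity scale: 1 = minor cosmetic problem,
# 4 = catastrophic (prevents the tester completing the task).
# Labels for 2 and 3 are assumed for illustration only.
SEVERITY_LABELS = {
    1: "minor cosmetic",
    2: "moderate",
    3: "serious",
    4: "catastrophic",
}

@dataclass
class ObservedProblem:
    task: str          # the predefined real-world task being attempted
    description: str   # problem voiced during the think-aloud commentary
    severity: int      # 1-4, rated by the test subject

    def __post_init__(self):
        if self.severity not in SEVERITY_LABELS:
            raise ValueError("severity must be between 1 and 4")

def blockers(problems):
    """Return the catastrophic (severity 4) problems that block task completion."""
    return [p for p in problems if p.severity == 4]

# Hypothetical session log for one test subject.
session = [
    ObservedProblem("submit assignment", "unlabelled upload button", 3),
    ObservedProblem("submit assignment", "keyboard focus trapped in editor", 4),
    ObservedProblem("read feedback", "low-contrast link text", 1),
]

for p in blockers(session):
    print(f"{p.task}: {p.description} ({SEVERITY_LABELS[p.severity]})")
```

Separating out the severity-4 problems like this reflects the point of the scale: catastrophic issues are the ones that must be fed back to a vendor first, since they stop a user completing the task altogether.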
This is something that all institutions should take an interest in, not just for the procurement of new technologies but also for the review of existing provision, with a focus on anticipating and testing for accessibility and usability challenges rather than reacting to issues as they arise; a ‘wait and see’ philosophy runs counter to the spirit of the Equality Act 2010. David concluded by reporting on the work that the HCI group have conducted with the University’s E-Learning Development Team to review upgrades of the University’s Blackboard virtual learning environment. He reported on outcomes from last year’s testing of Blackboard v9.1 Service Pack 12. The preliminary results highlighted issues with the accessibility of Blackboard’s content editor for blind and partially sighted users employing screen-reading software such as JAWS (v. 15.0). These issues had not been reported by users, but the testing enabled the University to anticipate the accessibility challenges and communicate them to the vendor. This has led to improvements in the core product and some swift fixes to the Service Pack 12 release, as well as a longer-term commitment from the vendor to work with the University of York on usability testing.