Welcome to our December issue of the Accessibility Minute Newsletter! This newsletter is produced by the CU Boulder Digital Accessibility Office and covers one accessibility skill or topic per month. Please visit the DAO website for access to all past newsletters. As always, thank you for taking a minute (or two!) to read.

Assessing Software for Accessibility

We discussed talking points and questions to bring up with vendors in last month’s newsletter, “Talking to Vendors about Accessibility.” This month’s newsletter is part 2 of the series, covering what you can do to evaluate and choose accessible software once you have communicated with a vendor. We will discuss testing options ranging from basic manual testing you can do on your own to third-party testing options and resources provided by the CU Boulder Digital Accessibility Office.

Measuring Accessibility

There are different ways to measure the accessibility of a software product. At CU Boulder, when we measure accessibility, we consider both the experience of the user and whether the software meets globally accepted accessibility standards.

The global accessibility standard we use at CU Boulder is the most recent version of the Web Content Accessibility Guidelines (WCAG). WCAG outlines four main principles of accessible software: Perceivable, Operable, Understandable, and Robust.

  • Perceivable means that users can access digital information in a sensory modality that works best for them.
  • Operable means that users can navigate and interact with software using devices and assistive technology of their choosing.
  • Understandable means that software acts in predictable ways, helps users avoid errors, and clearly communicates content to users.
  • Robust means that software is compatible with a broad range of devices, browsers, operating systems, and assistive technologies.

While standards are essential to communicating about accessibility with vendors and across institutions, a more holistic understanding of the user’s experience is also critical to ensuring that software is actually usable by people with disabilities and users of assistive technology. We define usability of software by whether it is convenient to use, consistent, comfortable, and functional for the user. For example, software may be accessible when measured with WCAG, but require an assistive technology user to memorize numerous unfamiliar shortcuts in order to use it.

Assessing Accessibility

Once you know how accessibility can be measured in theory, it’s also important to understand how accessibility is actually assessed in practice. A person’s ability to accurately judge the accessibility of software heavily depends on their experience with accessibility, and it is not feasible for everyone who procures software on campus to have the level of knowledge necessary to perform a complete accessibility test.

We will present a few basic checks that someone without much accessibility expertise can perform accurately to get a general sense of a piece of software’s accessibility. Then we will provide recommendations about where to get more in-depth information once those tests have been completed.

Conduct Basic Accessibility Testing

To get an idea of how perceivable and operable software might be, you can do some basic manual testing yourself using only your keyboard. Navigate through the content by pressing the Tab key to see whether actionable items (such as links, buttons, and form fields) receive focus.

  • There should be a visual indicator that something has focus, such as a colored outline around the form field or button.
  • You should also assess whether these actionable items follow a logical reading order, meaning your focus doesn't skip around at random but follows the natural flow of reading through the page (the short console sketch below can help you track where focus lands).

These keyboard checks are only one limited way to assess accessibility. Still, if Tab navigation is difficult, confusing, or impossible, that indicates the rest of the software is likely to contain additional accessibility issues.
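
If you're comfortable opening your browser's developer tools, a few lines of script can make this Tab test easier to follow for web-based software. This is just a minimal sketch, assuming a standard browser console; it logs each element as it receives keyboard focus:

```typescript
// Paste into the browser's developer console while the software is open.
// Logs each element that receives keyboard focus as you press Tab, so you
// can compare the focus order against the visual order of the page.
document.addEventListener('focusin', (event) => {
  const el = event.target;
  if (el instanceof HTMLElement) {
    console.log(el.tagName.toLowerCase(), el.textContent?.trim().slice(0, 40));
  }
});
```

Press Tab through the page and compare the logged order against the visual layout; jumps or missing controls in the log point to the same focus-order problems described above.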

Another low-level manual test you can perform to get a sense of the software’s accessibility is to check the color contrast of text and important interface elements against their background. Testing color contrast is one way to assess how perceivable the software is. There are many tools that measure color contrast, but we recommend downloading the Colour Contrast Analyser (available for Windows and Mac). This tool is easy to use and lets you enter RGB or Hex color codes (preferred for accuracy) or pick colors on screen with an eyedropper if you don't know the codes.
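
If you're curious what these tools compute under the hood, the contrast ratio comes from a published WCAG formula. Below is a minimal TypeScript sketch of that calculation; the function names are our own, but the constants come straight from the WCAG definition of relative luminance:

```typescript
// A minimal sketch of the WCAG 2.x contrast-ratio formula.

type RGB = [number, number, number];

// Convert one sRGB channel (0-255) to its linearized value.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
}

// Relative luminance, weighting each channel by how the eye perceives it.
function luminance([r, g, b]: RGB): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio between two colors: 1:1 (identical) up to 21:1 (black on white).
function contrastRatio(a: RGB, b: RGB): number {
  const [lighter, darker] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: dark gray (#333333) text on a white background, about 12.6:1.
console.log(contrastRatio([51, 51, 51], [255, 255, 255]).toFixed(2));
```

WCAG requires a ratio of at least 4.5:1 for normal-size text and 3:1 for large text, the same thresholds tools like the Colour Contrast Analyser check against.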

If you have access to the software you want to test, another way to get an idea of its accessibility is to run an automated test using third-party tools or platform-specific automated testing tools. It is important to note that the results of an automated accessibility test should be interpreted with caution for several reasons. Automated testing can result in false positives or false negatives, and some aspects of accessibility must be evaluated by a human, so automated testing always provides an incomplete picture of the accessibility of the product.

Depending on the platform and the testing tool you use, the results may be difficult to understand without specific training or a good amount of background accessibility knowledge. Even if you don’t understand all of the results, one key thing to look for is how many accessibility errors the testing tool identifies. If you have questions about the results of an automated test, you can contact the Digital Accessibility Office at DigitalAccessibility@colorado.edu.

While not a perfect assessment tool by any means, automated testing is a fast and free process that can contribute to your understanding of the accessibility of the product. These tools won’t catch all errors, but they can still provide you with a general understanding of the level of attention the vendor has paid to accessibility in developing the software.

Some options for automated testing include:

  • WAVE, WebAIM's free browser extension and online checker
  • axe DevTools, a browser extension built on the open-source axe-core engine
  • Lighthouse, the auditing tool built into the Chrome browser's developer tools
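
If you have developer support available, this kind of scan can also be scripted. As one illustration (a minimal sketch, not a campus standard), here is one way to run the open-source axe-core engine through the @axe-core/puppeteer package; the scan function name and the example URL are placeholders:

```typescript
// Minimal sketch: run axe-core against a page and print the failed rules.
// Requires: npm install puppeteer @axe-core/puppeteer
import puppeteer from 'puppeteer';
import { AxePuppeteer } from '@axe-core/puppeteer';

async function scan(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // analyze() injects axe-core into the page and runs its rule set.
  const results = await new AxePuppeteer(page).analyze();

  // Each violation is one failed rule, with the elements it affects.
  for (const v of results.violations) {
    console.log(`${v.id} (${v.impact ?? 'unknown'} impact): ${v.help}`);
  }
  console.log(`${results.violations.length} rule(s) failed.`);

  await browser.close();
}

scan('https://example.com').catch(console.error);
```

As noted above, a clean automated run does not mean the product is accessible; it only means none of the machine-checkable rules failed.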

Arrange for Manual Testing

For the most comprehensive and accurate information about the accessibility of a specific software tool, reach out to the DAO Assessment and Usability team at AUL@colorado.edu to request a manual test. The A&U testing staff includes individuals with disabilities who use different access methods and assistive technologies. We may already have information on software you are interested in or are already using. In an upcoming newsletter, we will provide further information and summaries about commonly used ed tech tools that have recently been tested.

Planning for Accessibility Issues

If you manage software that has already been deployed on campus, you may be asking: what now? At a bare minimum, you should always assume that a product may have accessibility issues and provide clear public documentation about how users can report issues to your team. If a user lets you know that they have encountered an accessibility issue with the product, reach out to our team at DigitalAccessibility@colorado.edu. We can then help you report the issue to the vendor in a way that makes clear it is an accessibility barrier, and provide any additional information we have gleaned from testing.

December Challenge

  • Run an automated test using one of the automated testing tools mentioned earlier in the newsletter.
  • Use your Tab key to navigate through a piece of software you currently use; does the tab order make sense given the way the interface is visually ordered?
  • Download the Colour Contrast Analyser if you don’t already have it on your device.

DAO News

DAO Office Hours are now the 4th Tuesday of every month from 1-2 pm MT. Our next office hours will be held on January 24th.

Your Thoughts

We want to hear from you about any questions or issues you run into while trying out this accessibility practice this month! Please send us your thoughts on this month’s topic.

If you have questions, comments, or would like support with accessibility, please contact us at DigitalAccessibility@Colorado.edu.

If you enjoyed this month's topic, please feel free to share it on LinkedIn or Twitter, or forward it to your colleagues! They can subscribe by selecting the Subscribe button at the bottom of this email.