Testing workflow

Most in-depth tests at AUL have a one-month turnaround, but depending on lab load and tester availability, testing can take longer. Timelines are set by the lab staff in collaboration with the client (the service owner) early in the process. All testing follows a standard workflow with three stages - before, during, and after testing:

  • Before testing
    • decide if a project should be tested by the lab, based on our testing priorities
    • make sure that the project is complete or nearly complete, and that all major functionality has been developed
    • ensure that there is a stable test environment to which all of our testers have been granted access
    • establish a timeline for the project with the client
    • choose the hardware platforms for the test
    • set communication expectations with the client (frequency, depth, and preferred channels)
    • come up with a list of actions (very short for simple apps, longer for complex ones) that users are expected to perform with the site or app being tested
    • write a testing script, asking the client for clarifications when necessary (here's a sample script to give you an idea of what to expect - accessible to anyone with a CU Boulder account on Google Drive)
  • During testing
    • conduct supervised, standardized testing with a sighted observer present, following the testing script strictly
    • have a client representative on call to address unexpected problems (access denied, data reset, login expired, etc.)
    • if needed, update the client on how the testing is going and share early impressions of the project's accessibility
    • compile the test report and group problems by severity, starting with the blocking issues that render the application inaccessible, and ending with usability problems
    • relate all accessibility problems to the sections and subsections of the WCAG 2.0 standard
  • After testing
    • deliver the final report to the client as a PDF or a Google Doc (here's a short sample report to give you an idea of what to expect - accessible to anyone with a CU Boulder account on Google Drive)
    • optionally, offer video recordings and live demos to showcase the issues. We strongly recommend live demos, especially for service owners and developers with limited accessibility knowledge
    • make recommendations on how to improve the usability of problematic elements (at the UX level rather than the coding level)
    • provide additional testing after the developer attempts to correct the reported issues
    • invite the client to fill out our feedback form to tell us how we did and how we can improve
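The report-compilation step described above - grouping problems by severity, from blocking issues down to usability problems, and relating each one to a WCAG 2.0 success criterion - can be sketched as a simple data structure. This is an illustrative sketch only: the severity labels, field names, and sample findings below are hypothetical, not AUL's actual report format.

```python
from collections import defaultdict

# Illustrative severity order, from blocking issues down to usability problems
SEVERITY_ORDER = ["blocker", "major", "minor", "usability"]

# Hypothetical sample findings; each maps to a WCAG 2.0 success criterion
issues = [
    {"summary": "Login button unreachable by keyboard",
     "severity": "blocker", "wcag": "2.1.1 Keyboard"},
    {"summary": "Images missing alt text",
     "severity": "major", "wcag": "1.1.1 Non-text Content"},
    {"summary": "Low-contrast placeholder text",
     "severity": "usability", "wcag": "1.4.3 Contrast (Minimum)"},
]

def group_by_severity(issues):
    """Group issues by severity, ordered from blocking to usability."""
    groups = defaultdict(list)
    for issue in issues:
        groups[issue["severity"]].append(issue)
    # Emit only the severities that actually occurred, in report order
    return {sev: groups[sev] for sev in SEVERITY_ORDER if sev in groups}

for severity, found in group_by_severity(issues).items():
    print(severity.upper())
    for issue in found:
        print(f"  - {issue['summary']} (WCAG {issue['wcag']})")
```

Ordering the output from blockers down to usability issues mirrors the report structure the lab delivers, so the most serious problems always appear first.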

It is also important to note that there are a few things that AUL will not be able to do:

  • interact directly with non-CU entities, vendors, or developers - all communication with external parties is best handled by the service owner
  • reveal the identity of testers
  • make decisions to launch or stop a service - we can only provide recommendations and evaluations about what we believe to be the best course of action
  • correct the application's source code to remediate the issues we find