Putting Too Much Faith in Tools

Posted by Albert Gareev on Jan 26, 2016 | Categories: Accessibility, My Articles, Stories

This story was featured in my article “The Politics of Accessibility Testing,” published on StickyMinds, January 2016.


Digital accessibility refers to software that works with users’ assistive technologies, as well as accessibility features built into web browsers.

This concept, that software should be usable by the widest possible audience, has been around for more than twenty years, yet it remains outside the mainstream of testing and development efforts. Nowadays, the user-friendliness of information technologies and the overall user experience are gaining more and more importance.

We have also seen diversity and digital inclusion become social priorities. On top of the implied social contract we have explicit legal contracts, such as Section 508 in the US and Canadian provincial legislation (AODA in Ontario, AMA in Manitoba, the Quebec Standards for Accessibility, and others), which define accessibility standards for government and public sector software. This sets a trending example for the overall market. However, laws and regulations do not define accessibility requirements on their own; they refer to the Web Content Accessibility Guidelines and only dictate the level of compliance, from A to AAA. Note that this web standard is evolving, so the same laws mandate keeping up with the changing requirements.

Software developers – designers, programmers, testers, business analysts – must face the challenge of meeting both the legal standards and business needs.

Web accessibility is often spoken about in terms of technical solutions – design and programming. Less voiced are the challenges of the people involved in the change, conflicts of interests, and people’s perceptions that sometimes create problems bigger than technical challenges.

As a testing practice lead taking on the accessibility testing domain, I was surprised by resistance to the initiative. I couldn’t understand why such a noble goal was not welcomed. I was frustrated by misunderstandings and misconceptions – but I also was responsible for causing some of them.

I relate some situations I’ve experienced and provide some insight on people’s perceptions and reactions that may increase challenges in adopting accessibility.

Challenge #3:
Putting Too Much Faith in Tools

Test automation in general is a highly debated topic. I belong to the community advocating for a skilled, high-quality testing approach adaptable to any project. Within the Context-Driven School of Testing, we believe that testing is foremost critical thinking – something inherently only humans can do – but testers may and should use tools where applicable. This doesn’t make testing automated; at best, it is automation-assisted.

Unfortunately, testing is still sometimes viewed as something mechanical, mindlessly repeatable, and finite. Such views may be reinforced by the smoothly delivered wild claims of testing tool vendors, which misrepresent or misplace the real value of tools. This leads to some odd expectations.

On one project, challenged by the need to implement accessibility, the development manager was convinced that a tool could scan the entire application, perform all needed tests, investigate the problems, report them, and even tell the programmers how to fix them. Programmers were expected to learn coding for accessibility by fixing errors.

This decision was backed up by a previous successful experience utilizing a security testing tool that scanned the code for vulnerability patterns and was indeed helpful for programmers as a way to learn and avoid such errors.

People make decisions based on their understanding of the technical challenge and habits that have been proven to work. It seemed so straightforward that the testing team wasn’t even included in making the decision.

Initially, the approach seemed to be working. But it became apparent that the tool could only scan the home page and login screen. Wherever user interaction was required, the tool got stuck. Programmers had to manually navigate through the application to perform scans. This suddenly became very time-consuming. At that point they decided to involve the testing team.

Skilled testers tend to question any tools before trusting them. One tester noticed that the tool seemed to be checking whether images had an alternative textual description provided. She created a new data entry with an image of a kitten described as “guard dog.” The tool didn’t report any problem. In fact, as it was discovered with further testing, the tool also would happily pass on generic texts such as “This is an image” and “This is a link.” As the framework produced these generic textual descriptions, the development team using the tool assumed there were no defects.
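The tester’s “kitten as guard dog” experiment exposed the fundamental limit of such tools: they can check whether alt text is *present* (and, at best, flag obviously generic text), but they cannot verify that the text actually describes the image. As a rough sketch of the difference, here is the kind of check such a tool could reasonably perform on static HTML – all names and the generic-text list are my own illustration, not any specific vendor’s tool:

```python
# Sketch of an automated alt-text check, assuming static HTML input.
# It catches missing and generic alt text, but a wrong-yet-plausible
# description (a kitten labeled "guard dog") still passes unnoticed.
from html.parser import HTMLParser

# Alt texts that are technically present but carry no meaning (illustrative list).
GENERIC_ALTS = {"", "image", "picture", "photo",
                "this is an image", "this is a link"}

class AltTextAuditor(HTMLParser):
    """Collects <img> tags whose alt attribute is missing or generic."""
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        alt = attrs.get("alt")
        src = attrs.get("src", "?")
        if alt is None:
            self.findings.append((src, "missing alt attribute"))
        elif alt.strip().lower() in GENERIC_ALTS:
            self.findings.append((src, f"generic alt text: {alt!r}"))
        # Note: "guard dog" on kitten.jpg passes -- no tool can confirm
        # the text matches the image; that needs a human.

def audit(html: str):
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.findings

page = """
<img src="kitten.jpg" alt="guard dog">
<img src="logo.png" alt="This is an image">
<img src="chart.png">
"""
print(audit(page))
```

Running this flags `logo.png` (generic text) and `chart.png` (missing alt) but passes the mislabeled kitten image – exactly the gap the tester uncovered.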

The news about the tool didn’t make the development manager happy, and the team was blamed for using the tool incorrectly. But the programmers still struggled with gaining a purposeful understanding of the accessibility problems and the code fixes they needed to make.

What really helped was setting up collaborative sessions where the programmers gained the experience of real users: operating the application with the display turned off, with the assistance of a screen reader. This helped them realize the meaning of such WCAG principles as Perceivable and Understandable.

No one likes to hear that their decisions were wrong or based on invalid assumptions. Suggesting a pilot or proof of concept with the tool can be very helpful in setting realistic expectations. Keeping it practical, referring to facts from experience, and helping the team gain such experience all support well-informed decisions.


Accessibility is about Human Rights, Equality, Diversity, and Inclusion. It’s also a social responsibility and a legal obligation, at least for public companies. And this is an opportunity for business growth and demonstrating excellent customer care!

Supporting Accessibility is a noble goal, and people are willing to subscribe to it – in theory. In practice, though, they may resist or appear to resist. Sometimes it’s because supporting Accessibility conflicts with interests they deem more important. Sometimes it’s because they apply an approach they’re used to without considering the specifics of Accessibility. Sometimes it’s just a bit of misunderstanding. After all, this is very new for many teams, and they could use help in learning the technologies and adapting their processes.

As a testing practice lead, I see success in exactly that – helping people gain practical experience in the Web Accessibility domain. This ensures forming the right understanding and a successful approach.

Useful Hints

  • Don’t take anything personally. Really, this applies to all testing.
  • Be patient. Don’t expect immediate changes or big wins right away.
  • Support and create opportunities for trial and learning, like workshops and pilot projects.
  • Encourage participation in global communities and local Accessibility groups. Find a group near you and connect through Twitter (hashtag: #a11y).
  • Refer to the facts. Keep down the drama.
  • Help people avoid feeling bad about mistakes; turn mistakes into the benefit of practical experience.
  • Demonstrate what real help the tools can offer and what their fundamental limitations are.
  • Skills and experience are the main keys to success. Promote learning through practice.
  • Gain supporters across the organization. Make sure to speak their language.
  • Stay positive and be persistent.
  • Remember that your job is the service of testing. Let the product owners make final decisions about quality of the software.

Test Strategy Outline

  • Treat Accessibility just like any other system feature.
  • Supporting Accessibility at the development framework level will significantly reduce the scope of testing.
  • Having standard GUI classes with built-in Accessibility features will help to create UI mocks, test very early and frequently, and reduce the risk of regression.
  • Test the design: begin with UI mock-ups and Wireframes.
  • Use risk-based sampling to quickly identify areas of noncompliance.
  • Use automated checking for timely detection of unwanted changes (regression).
  • Grow your in-house Accessibility Testing specialist or hire a consultant to teach the team through practice.
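The automated-checking point in the outline can be kept deliberately simple: run whatever accessibility checker you have on each build, and compare its findings against a stored baseline so that only *new* problems fail the build. The sketch below illustrates that comparison step; the findings themselves and the baseline file name are hypothetical, and the checker producing them is stubbed out:

```python
# Minimal sketch of regression detection over accessibility findings.
# The real findings would come from an accessibility checker; here they
# are hard-coded to show the baseline comparison, which is the point.
import json

def load_baseline(path):
    """Load previously accepted findings; a missing file means an empty baseline."""
    try:
        with open(path) as f:
            return {tuple(item) for item in json.load(f)}
    except FileNotFoundError:
        return set()

def new_regressions(current_findings, baseline):
    """Return findings absent from the baseline -- the unwanted changes."""
    return sorted(set(current_findings) - baseline)

# Hypothetical findings from two scans of the same application.
baseline = {("login", "missing form label"),
            ("home", "low contrast heading")}
current = [("login", "missing form label"),        # known, accepted for now
           ("home", "low contrast heading"),
           ("profile", "image missing alt text")]  # new -> regression

print(new_regressions(current, baseline))
```

Only the new `profile` finding is reported, so known-but-accepted issues don’t drown out genuine regressions; when a baseline issue is fixed, the baseline file is updated so the fix can’t silently return.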

Image credits

Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported
This work by Albert Gareev is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.