To conclude my internship with Mozilla, I gave a video presentation on Air Mozilla. It is a high-level overview of the solution I designed to automate accessibility testing.

Below are the slides, followed by the transcript and video.

Automating Web Accessibility Testing

Hi, my name is Kimberly Sereduck. I recently completed an internship with Mozilla through the GNOME Outreachy program. As part of the test engineering team, the goal of my internship was to automate web accessibility testing.

Which, for most people, probably raises two new questions:
What is Automated Testing?
What is Web Accessibility?

What is automated testing?

In order to ensure the quality of software and websites, software engineers and test engineers write tests to check that the product works as it should. Automated tests often run on a set schedule, or when an update is released.

What is Web Accessibility?

Accessibility is a word we have probably all heard before, but you may or may not understand what it means. Even while working as a web developer, I wasn’t sure exactly what it was, or why it was important.

Imagine for a moment that, when browsing the web, your knowledge of what a page contains was based exclusively on the spoken output of a screen reader.

A screen reader is a type of software that will read aloud the contents of a page, including headers, links, alternative text for images, input elements, and more.

This technology allows users with limited vision to navigate the resources of the internet.

However, if a website is not designed with accessibility in mind, it can still be very difficult or even impossible for these types of users to understand what a page contains, or to interact with it.

This is just one example of a user who requires an accessible web. There are many different types of users, who utilize different types of assistive technologies.

This brings me to the topic of web accessibility testing.

Web accessibility testing is simply an analysis of how accessible a website is.

There are detailed standards of web accessibility called the Web Content Accessibility Guidelines, or WCAG for short.

An example of one of these rules is that all images must contain alternative text. This text would be read aloud by a screen reader.

If an image is used to help convey a point, providing an alt attribute allows non-sighted users to understand the context more fully.

In the example here, a graph is an image used to convey information, and not used just for display purposes.

A good alternative text would be to concisely describe what information the graph contains.
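As a rough illustration of how a rule like this can be checked automatically, the following sketch uses only Python's standard library to flag `<img>` tags that lack an `alt` attribute. (Real tools like aXe perform far more thorough checks; this is just a minimal, self-contained example.)

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "<unknown>"))

def find_images_missing_alt(html):
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing

# The graph image has no alt text and fails the check;
# the logo, with a descriptive alt, passes.
page = (
    '<img src="graph.png">'
    '<img src="logo.png" alt="Mozilla logo">'
)
```

Running `find_images_missing_alt(page)` here would report only `graph.png`.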

Problem Domain

Accessibility testing for websites is, at this point, largely manual work. There are many tools that exist to generate accessibility reports for websites, but these are mostly limited to browser-based tools.

Here is an example of one of the browser-based tools.

Most of these tools do not offer much in the way of customization or flexibility.

Most of them return a list or report of all accessibility violations found on a single page.

However, almost all websites are made up of multiple pages, and some may even have dozens of different types of pages.

Using one of these browser-based accessibility tests would require someone manually running the tests on each different type of page, reviewing the results, and creating an accessibility report for the site.

For most companies that address the accessibility of their websites, this is done on an irregular basis, and is not integrated into their automated testing workflow.

An example of what I mean by automated testing workflow:

If an update is made to the website, which results in users not being able to download Firefox, there are automated tests in place to catch these errors.

This enables test engineers to be notified right away, rather than the problem going unnoticed until someone happens to catch it.

This type of testing is called regression testing. Regression tests make sure that features that worked before an update still work after the update.

What Problem Does This Solve?

This project solves that problem by integrating regression testing for web accessibility into this automated workflow.

So, if a site is accessible, and an update is released that makes it less accessible, these tests would notify test engineers of the new accessibility violations.

To make this possible, I have written software that allows Python programmers to make use of a tool called aXe in their Python tests.

aXe is an API created by Deque Systems, a company that specializes in web accessibility.

aXe can be run against a web page and return a list of all violations of accessibility rules.

aXe also allows customization, such as including or excluding certain accessibility rules, or testing specific parts of a page.

The software I have written, called axe-selenium-python, maintains this ability to customize your accessibility tests.

The way this software works is that it is included in existing automated tests, adding accessibility checks for each page that the tests visit.

This puts accessibility on the same level as functionality, and accessibility reports will be generated every time a test suite is run, rather than manually and infrequently.
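As a rough sketch of what such a check might look like inside a test, the helpers below work on the JSON results that aXe produces (a dict whose "violations" list holds one entry per violated rule, each with an "id" and the affected "nodes"). In a real test suite, the results would come from axe-selenium-python rather than the sample dict used here.

```python
def summarize_violations(results):
    """Given an aXe results dict, return {rule_id: node_count} for each violation."""
    return {v["id"]: len(v["nodes"]) for v in results.get("violations", [])}

def assert_accessible(results):
    """Fail the test if the page has any accessibility violations."""
    summary = summarize_violations(results)
    assert not summary, f"Accessibility violations found: {summary}"

# In a real Selenium test, `results` would come from aXe, roughly:
#   axe = Axe(driver)      # from axe_selenium_python
#   axe.inject()           # inject the aXe script into the page
#   results = axe.run()    # run the accessibility audit
# The sample below mimics the shape of aXe's JSON output.
sample_results = {
    "violations": [
        {"id": "image-alt", "impact": "critical", "nodes": [{}, {}]},
    ]
}
```

Calling `assert_accessible(sample_results)` would fail the test and report the `image-alt` rule with two affected nodes.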

Creating a more inclusive web is a top priority for the people at Mozilla.

Mozilla has a dedicated accessibility team that does check its websites for accessibility.

However, we would like to give the test engineering team the ability to include accessibility checks in their automated testing workflow.

My work will enable Mozilla to test the accessibility of its websites on a more regular basis, using much less time and fewer resources.

Goals for the Future

Although the software I have written is functional and currently available for public use, there are many goals to improve it, making it simpler to use and more beneficial.


There are three main goals for improving the package I have written. The first is to devise a production-ready solution for testing individual accessibility rules, rather than running a single accessibility check for each page.

The reason for this is two-fold.

Without individual tests, there is a single PASS or FAIL for accessibility. If ANY rule is violated, this test fails. This also means that the results of the failing rules are shown in a single report.

However, if individual tests exist, there will be a pass or fail for each accessibility rule that is tested for, making the results more readable.

Test engineers would be able to quickly see the number of violations, and which rules were violated.
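The difference can be sketched with a small helper that maps each rule to its own pass or fail result. (The rule IDs below are illustrative aXe rule names, and the results dict mimics the shape of aXe's JSON output; this is not the package's actual implementation.)

```python
def per_rule_results(results, rules):
    """Map each accessibility rule ID to 'PASS' or 'FAIL' based on aXe violations."""
    violated = {v["id"] for v in results.get("violations", [])}
    return {rule: ("FAIL" if rule in violated else "PASS") for rule in rules}

# One violated rule out of three checked rules.
rules = ["image-alt", "label", "link-name"]
results = {"violations": [{"id": "image-alt", "impact": "critical", "nodes": [{}]}]}
```

Here `per_rule_results(results, rules)` would report `image-alt` as the single failure, leaving the other two rules passing, rather than a single FAIL for the whole page.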

Having individual tests also makes it possible to mark tests as expected failures.

In the world of test engineering, when a failure is spotted, typically a bug or issue is filed, and the test will continue to fail until the bug is fixed. It is not uncommon for some issues to go unaddressed for weeks.

In most cases, you don’t want to keep getting notifications of a failing test after a bug has been filed.

So the tests are marked as expected failures, and instead of being notified when they fail, the test engineers will be notified when the test begins passing again.
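In pytest, this is typically done with the `pytest.mark.xfail` marker; the bookkeeping it performs can be sketched in plain Python (the rule name and known-issue set below are illustrative):

```python
def classify(rule, failed, known_issues):
    """Classify a per-rule result the way pytest treats xfail-marked tests."""
    if rule in known_issues:
        # XFAIL: still failing, as expected; XPASS: the bug may have been fixed.
        return "XFAIL" if failed else "XPASS"
    return "FAIL" if failed else "PASS"

# Rules with a bug already on file.
known = {"image-alt"}
```

A rule in `known` that fails reports XFAIL and raises no alarm; if it starts passing, the XPASS result tells test engineers the underlying bug has likely been fixed.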

I want to enable this same functionality with my software.

While I have succeeded with a couple of different approaches, neither of them can be easily integrated into existing test suites.

I want my solution to be as hassle-free as possible, allowing users to easily add and customize these individual tests.

The third goal for improving this package is to generate more sophisticated accessibility reports. I want to allow users to view a single report for accessibility, one that is well designed and presents the information in a way that is easier to understand.


A long-term goal of mine, as far as the company is concerned, is to develop an accessibility dashboard for all of Mozilla’s web assets.

This dashboard would use the data from the most recent accessibility tests, and give each Mozilla website a grade or rating for accessibility.

This dashboard would be included with others that are currently displayed in several of the Mozilla offices around the world.

Producing this dashboard would increase the visibility and awareness of Mozilla’s accessibility status, which, in turn, would have a significant impact on Mozilla’s goal of creating a more inclusive web.


Another goal I have for this project is to involve community contributors in Mozilla’s accessibility goals.

One idea for increasing community involvement is to add a new feature to my axe-selenium-python package, or to write a new package altogether.

This feature or package would automatically file a bug when a new accessibility violation is found.

Most accessibility violations are fairly easy to fix, and require only a basic knowledge of HTML.

As such, they would make great First Bugs for people who have never contributed to Mozilla or an open-source project before.

This would help increase community contributions overall, and also expedite the process of fixing these accessibility violations.

This feature could tremendously affect Mozilla’s inclusivity goals, and do so without adding a large workload to full-time employees.


More and more people every day have access to the internet.

Millions of these users have limited vision or limited physical ability, and rely on technologies other than a mouse and a screen to navigate the web.

The internet is a vast resource that opens up a world of knowledge that humans have never had access to before, and it is becoming increasingly important in all aspects of life.

Knowledge, to me, is not a privilege, but a basic human right. As such, I believe that the resources of the internet should be made accessible to all types of users.

An accessible web also allows all people to participate more actively in society.

I am very fortunate to have had an opportunity to help create a more inclusive web. I hope to continue to do so, and make an even more significant impact in the world of accessibility and inclusivity.

If I seem terribly uncomfortable and out of breath, I was! I was almost 9 months pregnant when I recorded this presentation, so please, pity me.

Watch the Video Here