How to do Web Accessibility QA: Part 2
Introducing a simple process and template for doing a11y testing — no WCAG knowledge needed.
Accessibility QA starts with broadening your frame of reference and understanding what it's like to use a computer in unfamiliar ways. With that understanding, we can dive into actual testing.
The usual starting point is to read the Web Content Accessibility Guidelines (aka WCAG), which define the current accessibility standard. (The older Section 508 standard is relevant only for government sites.) But good luck understanding WCAG at first glance.
WCAG is broken into three levels (A, AA, AAA); four principles; 12 guidelines; and 61 success criteria. It's hard to make sense of WCAG's multi-layered categorization, jargon, and sheer number of items.
The good news: You don't have to worry about all that to get started. Instead, I find it easier to think in terms of these broad goals:
- Goal 1: People who don't use a mouse should be able to use and understand a site.
- Goal 2: People who don't look at a screen should be able to use and understand a site.
- Goal 3: A site's content should be visually legible.
- Goal 4: People should have access to alternate versions of video and audio content.
- Goal 5: People should have control over automatic changes to the page.
Goals 1-3 cover most accessibility items for an average site. Goals 4 and 5 cover accessibility items that are less common or more likely to be out of your hands.
Here's my four-step approach for testing with those goals in mind:
- Step 1: Use an automated testing tool as a first check. (Covers goals 1-3.)
- Step 2: Use the site with a keyboard instead of a mouse. (Goal 1.)
- Step 3: Use the site with a screen reader. (Goal 2.)
- Step 4: Test specific WCAG items as needed. (Goals 1-5.)
The great thing about this approach is you don't have to get into the WCAG weeds. The first three steps will cover most common WCAG items without you even knowing it.
Here's a testing spreadsheet template to make things easier:
- You can use the A11y QA tab to enter the URLs that you'll be testing, and to document your test findings. I usually fully test each step, then abstract my spreadsheet notes into tickets.
- In the WCAG AA tab, I've re-organized the WCAG list by the above goals. You can use this tab for reference once you're ready to get deeper into WCAG.
- The sheet only shows WCAG level A and AA items; you don't have to worry about AAA items unless that's a specific target or requirement.
One caveat: only test for the accessibility items your designers and developers have actually tried to achieve. It's a waste of time to test things that don't exist, unless the team is prepared to retroactively design and implement what's missing.
Now let's look at the four testing steps in detail.
Step 1: Use an Automated Testing Tool
I start by using an automated tool called WAVE to test visual contrast (part of Goal No. 3) and code structure (which underpins Goals 1-3), and as an overall first check.
To use WAVE, I add it as a Chrome browser extension, go to each URL that I'm testing, and click the WAVE extension button.
The WAVE tool parses the page's code and identifies possible WCAG problems:
I usually only pay attention to the general errors (denoted in red) and the contrast errors (denoted in black).
For example, this WAVE report found that some form fields did not have labels, and some linked elements — in this case, social media icons — did not have text:
WAVE also generates a lot of noise. For a well-implemented site, I've found that some errors and most alerts (denoted in yellow) are not actual problems. Also, if you encounter a lot of contrast errors, it might be because your designers didn't check for color contrast in the first place.
Initially it's worth talking with your developer colleagues about what you find in WAVE. After a couple projects, you'll get a feel for which errors should be addressed and which you can ignore.
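To build intuition for what tools like WAVE are doing under the hood, here's a toy sketch of two of the checks mentioned above: form fields without associated labels, and images without alt text. This is a deliberately simplified illustration using only Python's standard library (it ignores wrapped labels, `aria-label`, and many other real-world cases) — not a substitute for WAVE or any other audit tool.

```python
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    """Toy static check: unlabeled form fields and images missing alt text."""
    def __init__(self):
        super().__init__()
        self.labeled_ids = set()      # ids referenced by <label for="...">
        self.input_ids = []           # id (or None) for each form field seen
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "label" and "for" in attrs:
            self.labeled_ids.add(attrs["for"])
        elif tag in ("input", "select", "textarea"):
            # Hidden fields and buttons don't need visible labels.
            if attrs.get("type") not in ("hidden", "submit", "button"):
                self.input_ids.append(attrs.get("id"))
        elif tag == "img" and "alt" not in attrs:
            self.images_missing_alt += 1

def check(html):
    checker = A11yChecker()
    checker.feed(html)
    unlabeled = [i for i in checker.input_ids if i not in checker.labeled_ids]
    return {"unlabeled_inputs": len(unlabeled),
            "images_missing_alt": checker.images_missing_alt}
```

Running `check()` on a page fragment with one labeled field, one unlabeled field, and an alt-less image would flag the latter two — the same category of error WAVE surfaced for the social media icons above.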
Step 2: Use the Site With a Keyboard Instead of a Mouse
Now we can apply our newfound understanding of using a keyboard. In this step, you aren't testing specific WCAG items so much as trying to experience the site as any keyboard user would.
To test comprehensively:
- Tab/Return through all navigation and subnavigation, including any submenus.
- Tab/Return through the footer.
- Tab through at least one instance of every unique page.
- Tab/Return through at least one instance of every unique interactive element (carousels, video players, modal windows).
- Tab/Return/type through every unique form.
Even without deep WCAG knowledge, any frustrations or confusion will be obvious. Some common frustrations:
- There aren't jumplinks, so you have to tab through every link or button to reach a link farther down the page.
- After tabbing through the main nav, focus jumps to a sidebar, social links, or something other than the main content. (This isn't always a problem, but it's often annoying and reflects poorly thought-out code order.)
- You can't tell which element currently has focus.
- An interactive element is wonky, e.g. a carousel doesn't advance when you press Return while focused on the "forward" button.
- You can't move focus state into an interactive element or a form field.
- You can't move focus state out of an interactive element or a form field.
- You can't move focus state onto a form submit button; or pressing Return on the focused button doesn't submit the form.
- There's an element on the page that you think you should be able to interact with but can't.
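Two markup patterns frequently cause the last few frustrations: positive `tabindex` values (which scramble the natural tab order) and click handlers on elements like `div` or `span` that a keyboard can't focus at all. Here's a rough, hypothetical static check for both — a sketch with Python's standard library, not a replacement for hands-on keyboard testing or audit tools like axe-core.

```python
from html.parser import HTMLParser

class FocusChecker(HTMLParser):
    """Toy check for two common causes of keyboard-navigation problems."""
    def __init__(self):
        super().__init__()
        self.positive_tabindex = 0       # tabindex > 0 overrides natural tab order
        self.unfocusable_clickables = 0  # onclick on elements keyboards can't reach

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        tabindex = attrs.get("tabindex")
        if tabindex and tabindex.lstrip("+").isdigit() and int(tabindex) > 0:
            self.positive_tabindex += 1
        # divs/spans aren't focusable unless given a tabindex, so a bare
        # onclick means mouse users can interact but keyboard users can't.
        if "onclick" in attrs and tag in ("div", "span") and tabindex is None:
            self.unfocusable_clickables += 1

def audit(html):
    c = FocusChecker()
    c.feed(html)
    return (c.positive_tabindex, c.unfocusable_clickables)
```

A clickable `div` found this way is exactly the last item on the list above: an element you'd expect to interact with but can't reach by tabbing.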
Step 3: Use the Site With a Screen Reader
Next, fire up VoiceOver on a computer. Again, you don't need to test specific WCAG items; just using VoiceOver thoroughly across the site will reveal most problems.
Start by going through the same scenarios as you did with keyboard testing (navigation, footer, interactive elements, etc.). Look for the same frustrations or confusion around traversing and interacting with the site.
Also look for common frustrations around content and context that can arise for people who don't see the screen. (You'll quickly realize how much of your web comprehension is based on visual perception and cues.)
- When VoiceOver is reading a navigation block, is it apparent that you're hearing navigation being read?
- Is it apparent when VoiceOver is reading a link or other clickable element?
- Does VoiceOver clearly identify images, video, or other non-text elements?
- Does VoiceOver describe an image, skip over the image, or read the image's filename? (The first two can be OK depending on the context; the third is never useful.)
- Does VoiceOver give all necessary information about a form? Or is it unclear what the form fields are, what field you're in, what validation a field has, etc.?
- Does VoiceOver read any instructions that wouldn't make sense if you closed your eyes?
- For example, does it read instructions like "Click the left button" (the spatial description wouldn't make sense) or "Click the blue button" (the color description wouldn't make sense)? Does it read generic copy like "Read more" that wouldn't make sense without seeing surrounding content?
I'd wait until any issues are addressed before doing iPhone VoiceOver testing, as you'd likely encounter the same problems.
Step 4: Test Specific WCAG Items
After keyboard and screen reader testing, I'll go back to the WCAG list and pick out any items I need to specifically test. Common items include making sure page titles provide adequate context, and testing legibility when the browser is zoomed.
Double-check the uncommon items in Goals 1 and 2, and all of Goals 4 and 5, to see if they're relevant for your site.
For example, all Goal 4 items are about providing alternative versions of video and audio content. These would be moot if your site doesn't have video or audio content, or if you're an agency and the client is responsible for providing that content. Goal 5 items are about automatic changes to the UI or page context — e.g. a carousel that automatically scrolls for more than 5 seconds — which many sites don't have.
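For the page-title item mentioned above, the check is simple enough to sketch in code: every page should have a non-empty `<title>`, and titles shouldn't be duplicated across pages (identical titles give no context about which page you're on). This is an illustrative stdlib sketch; the input shape (`{url: html}`) is assumed for demonstration.

```python
from html.parser import HTMLParser

class TitleChecker(HTMLParser):
    """Extracts the text of a page's <title> element."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

def title_problems(pages):
    """Given {url: html}, return URLs whose titles are empty or duplicated."""
    titles = {}
    for url, html in pages.items():
        c = TitleChecker()
        c.feed(html)
        titles[url] = c.title.strip()
    bad = [u for u, t in titles.items() if not t]
    seen = {}
    for u, t in titles.items():
        if t:
            seen.setdefault(t, []).append(u)
    for urls in seen.values():
        if len(urls) > 1:
            bad.extend(urls)
    return sorted(set(bad))
```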
End Step: Thank Your Colleagues
Like I said in Part 1, it's profound to finally understand what it's like for people to use a computer in different ways than you do.
It's equally profound to do accessibility testing and see how much your colleagues already understand this — to see the care and effort they put into making the site accessible.
So thank them, praise them, and help them spread that understanding, care, and effort throughout your company and the broader web community.