The a11y Monthly: How to fix navigation in JavaScript

In modern web development, building web applications using JavaScript frameworks is a growing trend. And with good reason, as JavaScript frameworks offer several advantages. However, the shift to a different interaction model creates new accessibility challenges that haven’t been fully addressed yet. In this post, I’d like to talk about a fundamental aspect of accessibility: page navigation feedback. Specifically, how to restore the native accessibility that our JavaScript applications often break.

The WebAIM Screen Reader User Survey

I was inspired by the WebAIM screen reader user survey published in December. This survey was an excellent start to the new year for accessibility. WebAIM (Web Accessibility In Mind) is a non-profit organization based at the Center for Persons with Disabilities at Utah State University. For years, they’ve been doing tremendous work. Among other things, they publish a lot of educational resources. Periodically, WebAIM surveys screen reader users’ preferences, and the collected feedback is always enlightening.

One thing in the survey caught my eye. Under “Problematic items”, right after CAPTCHA, the most challenging barrier screen reader users face on the web is unexpected screen changes. The most interesting thing is how this item’s position has evolved:

The order and indicated difficulty for the items in this list are largely unchanged over the last 8 years. There is one notable exception — “Screens or parts of screens that change unexpectedly”. This item has moved from 7th most problematic in 2009 to 5th most problematic in 2012 to 2nd most problematic in 2017. This is likely a result of more complex and dynamic web applications.

Wait, are we creating new accessibility barriers? Yes. It’s not because of the technology in use, though; it’s because of the implementation. Sometimes developers, including the ones who build JavaScript frameworks, are simply unaware of the problem.

Navigation is the first unexpected change

In a normal HTTP request life cycle, the browser sends a request. The server responds to that request by sending new data. After that, the browser reloads the page to show the new data. This is a classic interaction model, where the page reload is actually the first feedback for users.

When a page reload occurs, as far as I’m aware, all screen readers start announcing the new page by reading the document <title> tag. Some screen readers, for example VoiceOver, also play a “beep” sound to indicate that a navigation occurred.

Instead, what happens with Single Page Applications and the like? Usually, just a portion of the page gets updated. Maybe a new UI component or an entire view gets rendered, but there’s no real “navigation.” Even if, as a developer, you’re taking advantage of the browser’s History API and you’ve implemented some routing mechanism, that’s not a navigation assistive technologies can understand.
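To make the problem concrete, here is a minimal, framework-free sketch of this kind of client-side “navigation”. The #app container, the data-route links, and the views object are made up for the example; the point is that the URL changes and the view gets swapped in place, but from the browser’s point of view it’s just a DOM update, so screen readers announce nothing:

// Hypothetical views keyed by route name, purely for illustration.
const views = {
	home: '<h1>Home</h1>',
	settings: '<h1>Settings</h1>',
};

document.addEventListener( 'click', ( event ) => {
	const link = event.target.closest( 'a[data-route]' );
	if ( ! link ) {
		return;
	}
	event.preventDefault();
	// Update the URL without a page load...
	window.history.pushState( {}, '', link.getAttribute( 'href' ) );
	// ...and swap the view in place: no navigation happens as far as
	// assistive technologies are concerned, so nothing gets announced.
	document.getElementById( 'app' ).innerHTML = views[ link.dataset.route ];
} );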

In all my tests with screen readers, clicking a link in a single page application based on a JavaScript framework doesn’t give any audible feedback to users. After a link gets activated, there’s just resounding silence. No feedback at all.

The reason why this happens is simple: assistive technologies are designed based on existing specifications and recommendations, because they need predictable, standardized behaviors and interaction models to behave correctly. Assistive technologies can’t read developers’ minds. They can’t infer: “Dear developer, was that meant to be a sort of navigation to a new view? OK, let me announce that to the user”.

What we’ve done at Yoast

At Yoast, we’ve built a React-based single page application for our customers. Of course, it has a navigation menu and a routing mechanism. However, navigating to a new “page” wasn’t announced to screen readers at all.

Our solution was simple enough to be implemented programmatically. Each time a new page (which is a React component) loads, which in React’s terminology is when the component mounts, we send a message to an aria-live region, taking advantage of the speak module from the WordPress packages. This ensures an audible message like “XYZ page has loaded” gets announced by screen readers when a new “page” gets rendered. Users now have proper feedback, and native accessibility is, in a way, rebuilt.
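In broad strokes, the approach looks like the sketch below. The component name, the message text, and the page content are illustrative rather than our actual code; the key part is the speak function from the @wordpress/a11y package, which writes the message into an aria-live region for us:

import { Component } from 'react';
import { speak } from '@wordpress/a11y';

// Illustrative page component: the name and message are made up for the example.
class CheckoutPage extends Component {
	componentDidMount() {
		// Announce the new "page" as soon as the component has mounted.
		// "polite" lets the screen reader finish its current announcement first.
		speak( 'Checkout page has loaded', 'polite' );
	}

	render() {
		return (
			<main>
				<h1>Checkout</h1>
				{ /* ...page content... */ }
			</main>
		);
	}
}

export default CheckoutPage;

The same idea works with a routing library’s callbacks or hooks: what matters is that the announcement happens once, right after the new view is in the DOM.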

What to expect in the future

ARIA provides mechanisms to announce content updates, but using them is up to the developers’ implementation. On the other hand, the new interaction model typical of single page applications and JavaScript frameworks is here to stay. Navigation is just one example: dynamic content updates are used everywhere, to update the entire screen or just parts of it.
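For reference, the underlying ARIA mechanism is a live region: an element with an aria-live attribute whose text changes get announced by screen readers. A bare-bones sketch, with element names and messages of my own choosing, could look like this:

// Create a visually hidden live region that screen readers monitor for changes.
const region = document.createElement( 'div' );
region.setAttribute( 'aria-live', 'polite' );
region.style.cssText =
	'position: absolute; width: 1px; height: 1px; overflow: hidden; clip: rect(1px, 1px, 1px, 1px);';
document.body.appendChild( region );

// Any text inserted into the region gets announced by screen readers.
function announce( message ) {
	region.textContent = '';
	// A short delay helps some screen readers pick up repeated messages.
	window.setTimeout( () => {
		region.textContent = message;
	}, 100 );
}

announce( 'Results updated' );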

Browsers are aware of dynamic content changes. Assistive technologies can now understand when changes occur and update their data representation accordingly. However, the issue of informing users of content changes has not been fully addressed yet. For the future, I’d hope for a new, standard, native way to ensure all users are always informed of content changes.

In the meantime, it’s important to understand when our implementations break the accessibility of a specific feature. It’s our responsibility, as developers, to rebuild the native accessibility we’ve just destroyed.

Want to help?

At Yoast, accessibility matters. We know it’s a process, and we’re continuously improving, testing, iterating, and developing. We’re always open to feedback and contributions, so please don’t hesitate to let us hear your voice: report any issues or potential improvements you notice in our products.

Read more: 5 easy things you can do to improve accessibility »
