Listening to the web, part two: it’s all semantics
Part 2 of a three-part series in which I share my thoughts on considering accessibility while developing and on working with screen readers.
- Part 1: thinking in accessibility
- Part 2: it’s all semantics
- Part 3: working with screen readers
As designers and developers, we ultimately do what we do because we want people to benefit from our work. As we discussed in yesterday’s post, it’s essential that we consider the many ways people might be interacting with our sites. Are they sitting comfortably at home on a laptop? Do they have an infant in one arm and a phone in the other? Are they in a rush, trying to find information on the closest hospital? The fact of the matter is, we can’t know.
We will never know when, why, and, in some cases, how someone will interact with and consume the content we put out into the world. What we can do is acknowledge this, embrace the uncertainty, and try to create user interfaces that work in as many contexts as possible.
I believe one of the best ways to achieve this is to keep accessibility at the forefront of the design and development workflow.