1. 19
    1. 5

      In the genre of “disability simulation”, this is an okay article.

      Please be aware of a couple of things, though: this genre is regularly viewed critically; you could also just ask a screenreader user to write up their experiences. It’s abled people talking about disability, which is probably leaky and filtered, instead of directly approaching guest authors for the subject. It also means abled people getting money for writing about a subject they are not experts in. If you are uncomfortable with this line of reasoning, you do have to understand that most disabled people have a hard time getting into job markets, even for jobs that would be appropriate for them. “Where do I get money from?” is a very regular problem, so abled people writing about subjects they first have to learn qualifications for is a practical problem all the time.

      A practical side note: I was recently at a conference for Deaf people, and one of their biggest complaints was how bad the research into computer-recognizable gestures is, for the sole reason that no one considers that there are humans who know a ton about recognizable gestures.

      A couple of the analysed cases are known offenders: that autoplay and autocontinue are inaccessible has been codified in accessibility guidelines/rules since before YouTube existed. They flat-out ignore that for “user engagement” reasons.

      It also hints that about 1 in 3 users rely on braille lines for output, but doesn’t go into detail on whether special rules apply.

      It recommends using screenreaders and such in your testing procedures, but does not recommend integrating actual screenreader and braille line users into your testing process. This is a very strong miss that kind of illustrates our problem: we talk about disability, but frequently miss the point of integrating disability. And that works by integrating disabled people, not by learning their usage patterns. You don’t need to know the quirks of every screenreader; there are consultants for that.

      If you want to learn about disability, I highly recommend talking to disabled people instead and watching them use your product. Just be aware that (back to the above) it is customary to pay for that.

      (By the way, the new rust-lang website has a pool of contributors for that. Currently unpaid: most of them do Rust as their day job, though we have offered.)

      1. 2

        Question: how do you go about finding blind people to test your app? I work at a company that definitely won’t hire or contract blind usability testers, and I’d imagine most companies are this way. I also don’t know anyone blind whom I could ask to give it a quick test.

        For this, I figure running through the site with a screenreader is better than nothing, no? I test for color-blind people using Axe and color blindness simulators, and I test for impaired vision by stepping further away from the screen. While I do sometimes ask a person with the actual disability to take a look, oftentimes I can’t because the things I’m working on are confidential. As a developer, I feel like I’m doing my part; hiring people with disabilities to test applications seems like a management problem.

      2. 4

        This is all good advice.

        One thing I’m surprised they didn’t mention. Reading web pages from a screen reader is super tiring, at least in my experience. Like I’d want to just fall asleep after about half an hour.

        1. 2

          One thing I’m surprised they didn’t mention. Reading web pages from a screen reader is super tiring, at least in my experience. Like I’d want to just fall asleep after about half an hour.

          Because that’s an effect that mainly applies to people unused to screenreaders. If this is your main source of information, you become incredibly well tuned to it (also, be aware that some people listen to the voice at around 4x to 8x speed).