Evaluating current state of a screen reader in JavaScript

Posted 2019-04-11 11:12

Question:

I've been doing screen reader optimization for the last 2 years without issue, but now I'm developing an application that has audio playback as a core piece of functionality. As I understand, there's no way to defer playback while a screen reader is running and all of my audio streams are talking over each other right now. I've been through the WAI-ARIA specs many times to the point that I doubt the feature I'm looking for is included there.

Is there any overall screen reader API accessible from JavaScript that would allow me to coordinate my application's audio so it does not overlap with accessibility devices? Something where I can just listen to a callback or subscribe to an event, for example window.addEventListener('screenReaderAudioFinished', handlerFn); ?

TL;DR I'm looking for some way in JavaScript to be notified when a screen reader is finished speaking. Callbacks, events, anything.

Answer 1:

TL;DR What you are looking for does not exist in the browser

It sounds like you are developing a game. If that is true, then what you could do is completely take over the announcements yourself, essentially implementing your own screen reader. This can be achieved with an application region and a virtual "cursor" that the keyboard moves between controls. This calendar widget shows how it can be done for both keyboard and gesture control:

http://dylanb.github.io/datepicker/datepicker.html
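
A minimal sketch of that idea (not taken from the calendar widget; the element, the cell names, and the announce() helper are made up for illustration):

    // Mark the container as an application region so the screen reader hands
    // keyboard events to your script instead of its own virtual cursor.
    const board = document.getElementById('game');
    board.setAttribute('role', 'application');
    board.setAttribute('aria-label', 'Game board');
    board.setAttribute('tabindex', '0');

    const cells = ['Start', 'Forest', 'River', 'Castle']; // stand-ins for your real controls
    let cursor = 0;                                        // the virtual "cursor" position

    function announce(text) {
      // Replace with your own audio pipeline: pre-recorded clips, speech synthesis, etc.
      console.log('announce:', text);
    }

    board.addEventListener('keydown', (e) => {
      if (e.key === 'ArrowRight') cursor = Math.min(cursor + 1, cells.length - 1);
      else if (e.key === 'ArrowLeft') cursor = Math.max(cursor - 1, 0);
      else return;
      e.preventDefault();
      announce(cells[cursor]); // you decide what gets spoken and when
    });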

Audio output could be implemented through your own audio pipeline. If you have the ability to play audio, then writing this should be reasonably trivial, but the mixing choices and priorities might be the hard part.
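
For the announcement side, one option is the Web Speech API: unlike a screen reader, speechSynthesis does report when an utterance has finished via its onend callback, so you can queue announcements without them talking over each other. A rough sketch (note that this is the browser's own TTS, not the user's screen reader, and voice support varies by browser):

    const queue = [];
    let speaking = false;

    function speakNext() {
      if (speaking || queue.length === 0) return;
      const utterance = new SpeechSynthesisUtterance(queue.shift());
      speaking = true;
      utterance.onend = () => {   // fires when this announcement has finished speaking
        speaking = false;
        speakNext();              // move on to the next queued announcement
      };
      window.speechSynthesis.speak(utterance);
    }

    function announce(text) {
      queue.push(text);
      speakNext();
    }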

You should not only announce the control the user lands on as they move between controls, but also provide an "audio description" track that describes what is happening in the animations, if those are important.

How you mix the sounds depends on what is most important at any given time and whether there is any "real-time" component to the user's interactions with the game.
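
One simple way to express those priorities is to route the game audio through a gain node and "duck" it while an announcement is playing; a sketch, assuming the announce()/onend hooks from the snippet above:

    const ctx = new AudioContext();
    const musicGain = ctx.createGain();
    musicGain.connect(ctx.destination);
    // ...connect your background music / ambience source nodes to musicGain...

    function duck(on) {
      const target = on ? 0.2 : 1.0;  // drop game audio to 20% while speech plays
      musicGain.gain.setTargetAtTime(target, ctx.currentTime, 0.1);
    }

    // Call duck(true) just before speaking an announcement and duck(false) in its onend.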



Answer 2:

You should read the audio control part of WCAG (http://www.w3.org/TR/UNDERSTANDING-WCAG20/visual-audio-contrast-dis-audio.html).

In fact, what you want is quite the opposite of what people with disabilities would want.

What is needed is to be able to stop the audio in your application, and to start it again on demand.

What I would do is provide a control to pause the audio in your application, and, while audio is turned off, let the user know that your application wants to tell them something with a short beep (under 3 seconds).
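
A rough sketch of that pattern (the button, the audio element, and their ids are assumptions about your markup):

    const toggle = document.getElementById('audio-toggle'); // a regular <button>
    const appAudio = document.getElementById('app-audio');  // an <audio> element, or your own pipeline
    let audioOn = true;

    toggle.addEventListener('click', () => {
      audioOn = !audioOn;
      toggle.textContent = audioOn ? 'Pause audio' : 'Resume audio';
      if (audioOn) appAudio.play(); else appAudio.pause();
    });

    // A short cue, far below the 3-second limit, used only while audio is paused.
    function attentionBeep() {
      if (audioOn) return;
      const ctx = new AudioContext();
      const osc = ctx.createOscillator();
      osc.connect(ctx.destination);
      osc.start();
      osc.stop(ctx.currentTime + 0.3); // ~300 ms beep
    }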

Screen readers are not fully integrated with browsers, so it's quite difficult to find a better solution if you want full support for accessibility devices.