DHS Section 508 Compliance Test Process for Android Mobile Applications
Section 1: Introduction and Rationale for Tests
All interface elements, including images, charts, and tables that provide meaningful information, must convey that information correctly
to assistive technology through their accessibility properties (name, role, state, and value) in a consistent manner for users
without vision or with low vision. For the purposes of testing, the expected name, role, and state outcomes are listed in the
Elements Table. The tester verifies that sufficient characteristics of each element are announced.
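As an illustration, an Android control's name is typically supplied in the layout, while its role and state come from the widget class itself. A hypothetical layout fragment (the ID and label are assumptions, not from this document):

```xml
<!-- Hypothetical layout fragment: TalkBack derives the role ("switch") and
     the state ("on"/"off") from the Switch widget class; the accessible
     name comes from the visible label set in android:text (or, for
     unlabeled views, from android:contentDescription). -->
<Switch
    android:id="@+id/wifi_toggle"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Wi-Fi" />
```

A tester comparing this control against the Elements Table would expect the screen reader to announce its name ("Wi-Fi"), role (switch), and current state.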
To aid navigation with screen-reading AT software, users can bring up a list of navigation controls on their screens. Users
can read through content and decide which of the links in the content they wish to follow (i.e., they do not have to navigate
back to the link itself).
Links must have unique and descriptive names. Suppose each item for sale has a 'click here' link next to it, and the user calls up
the list of controls: the list will contain multiple 'click here' links that cannot be distinguished from one another. Another common
problem occurs when links contain only URLs, so the purpose of each link may not be apparent. Meaningful and unique names are
therefore required for links and user controls, to aid navigation using assistive technology for users without
vision or with low vision.
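A minimal sketch of the problem in an Android layout (the IDs and label strings are hypothetical): the first control's name is ambiguous in a controls list, while the second identifies its target.

```xml
<!-- Ambiguous: in a list of controls this is announced only as
     "Click here, button", indistinguishable from any other such link. -->
<Button
    android:id="@+id/buy_item_1"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Click here" />

<!-- Descriptive and unique: announced as "Buy garden hose, button",
     meaningful even out of visual context. -->
<Button
    android:id="@+id/buy_hose"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Buy garden hose" />
```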
For screen reader users, screen orientation is important to ensure that the proper gestures are used. Some apps rotate
as the device moves; others are static. If the app supports device rotation, the mobile screen reader must announce
the orientation.
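Whether an activity rotates is typically declared in the app manifest, which a tester can use to determine the expected behavior. A hypothetical manifest fragment (activity names are assumptions):

```xml
<!-- Hypothetical manifest entries: the first activity follows the device
     sensor and may rotate, so orientation changes should be announced by
     the screen reader; the second is locked to portrait. -->
<activity
    android:name=".CatalogActivity"
    android:screenOrientation="fullSensor" />
<activity
    android:name=".CheckoutActivity"
    android:screenOrientation="portrait" />
```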
3. VIDEO, AUDIO, AND MULTIMEDIA
This section addresses audio files, animations, video files, and multimedia. Screen reader software cannot interpret images,
animation, video, or multimedia. Screen readers will, however, read text that has been associated with interface elements.
The interpretation (meaning) of an interface element must therefore be conveyed textually in the interface programming
for assistive technology for users without vision or with low vision.
Animation includes sequences of overlaid images, dynamic changes of state such as a moving speed dial, a chart illustrating
dynamic flow changes from one state to another, and so on. Video-only files include animations, screen captures, video captures, etc.
The visual information provided through animation and video-only content must be provided through alternative means for assistive
technology for users without vision or with low vision.
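In an Android app, the alternative for a purely visual element is usually a text description attached to the view itself. A hypothetical layout fragment (the ID, drawable, and description text are assumptions):

```xml
<!-- Hypothetical: the animation's meaning is conveyed textually through
     android:contentDescription, which the screen reader announces in
     place of the visual content. -->
<ImageView
    android:id="@+id/network_status_animation"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/anim_network_status"
    android:contentDescription="Network status: connecting, signal strength rising from one to four bars" />
```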
Audio-only files include speeches, sound bites, ambient (background) sounds, etc. Equivalent text descriptions must be
provided for users who are deaf or hard of hearing.
Synchronized media is a presentation consisting of time-synchronized video and audio. Synchronized media includes public
information films, Webcasts, press conferences, and online training presentations.
Some users will not be able to hear the content. There must therefore be another mode of providing the audio
information, such as captions (text showing what is being said, along with other relevant sounds). Captions need to be available,
but do not necessarily need to be turned on by default. For example, users who need captions can switch them on with a
control (usually a 'CC' button, for Closed Captions). If there is no means of switching modes, then the default mode must be
accessible (i.e., Open Captions).
Because captions must be time-synchronized, separate transcripts will not meet this requirement on their own.
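On Android, one way an app can make a time-synchronized caption track available is through VideoView's subtitle API. A minimal sketch, assuming a WebVTT caption file shipped as a hypothetical raw resource R.raw.captions_en and a hypothetical view ID; this is one possible approach, not the only compliant implementation:

```kotlin
// Sketch (API 19+): attach an external WebVTT caption track to a VideoView
// so the player can render time-synchronized captions.
// R.raw.captions_en and R.id.training_video are hypothetical names.
val videoView = findViewById<VideoView>(R.id.training_video)
val captionFormat = MediaFormat.createSubtitleFormat("text/vtt", "en")
val captionStream = resources.openRawResource(R.raw.captions_en)
videoView.addSubtitleSource(captionStream, captionFormat)
```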
Some users will not be able to see the content. There must therefore be another mode of providing descriptions of the
visual information. In synchronized media, this usually means additional narration (audio description) inserted during breaks in the
dialogue, describing visual events and cues.