1.2 Ensure compatibility with assistive technologies

Users with impairments often use assistive technologies to provide input and perceive output. These include special keyboards, alternative pointing devices, screen readers, and screen magnifiers.

Applications should accept input from assistive technologies and generate output in a form that assistive technologies can detect.


If an application manages input and output in a way that is not compatible with assistive technologies, some users may not be able to perceive the application's interface elements or act on them. They will then be unable to use the application.

Directions and Techniques

Use the standard system inputs

Assistive input technologies, such as special keyboards and pointing devices, use the standard system input mechanisms. If an application bypasses these mechanisms, for example by taking input directly from the keyboard rather than from the input buffer, it may not be able to detect input from the assistive technology.

Use the standard system outputs

Assistive output technologies, such as screen readers, monitor the use of the standard operating system drawing routines to determine what is displayed on the screen. This includes routines for displaying or erasing user interface components, text and graphics. If an application bypasses these routines, for example by drawing or writing text directly to the screen, the assistive technology may not be able to detect the output.

Use standard system user interface components, such as menus, buttons and dialogs

Standard user interface components provided by the operating system or Java Foundation Classes (JFC) expose their state without requiring additional work from the application.
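As a minimal Swing sketch of this point: a standard JButton carries an AccessibleContext from the moment it is created, so its role and name are available to a screen reader with no application code beyond setting the label text. The class and method names here are illustrative, not part of any particular application.

```java
import javax.swing.JButton;
import javax.accessibility.AccessibleContext;

// A standard Swing component exposes its accessible role and name
// automatically; the application does no extra work.
public class StandardComponentDemo {
    public static String describe(JButton button) {
        AccessibleContext ctx = button.getAccessibleContext();
        // Role comes from the component type, name from the label text.
        return ctx.getAccessibleRole() + ": " + ctx.getAccessibleName();
    }

    public static void main(String[] args) {
        JButton ok = new JButton("OK");
        System.out.println(describe(ok));
    }
}
```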

For custom user interface elements, use APIs that support assistive technology

Custom user interface elements should be exposed to assistive technologies through a suitable Application Programming Interface. This should enable assistive technologies to do the following:

  • Find out which user interface element is at a particular location
  • Find out the identity and state of a user interface element, including its type, handle, availability, content and value
  • Receive notifications when the state of a user interface element changes, for example, when a control becomes disabled or when a text string changes
  • Carry out actions that affect a user interface element, for example, click a button or select a text box

Applications running on Microsoft Windows can use Microsoft Active Accessibility for this purpose.
Java applications can use the Java Accessibility API.
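The sketch below shows how a hypothetical custom control (a "dial", with an invented name and accessible description) could cover the points in the list above using the Java Accessibility API: it reports its type via a role, exposes a name, and fires a property-change notification when its value changes. It is an illustration of the pattern, not a complete control.

```java
import javax.accessibility.Accessible;
import javax.accessibility.AccessibleContext;
import javax.accessibility.AccessibleRole;
import javax.swing.JComponent;

// Hypothetical custom control drawn by the application itself.
// Because it is not a standard component, it must describe itself
// to assistive technologies explicitly.
public class DialControl extends JComponent implements Accessible {
    private int value = 0;

    public void setValue(int v) {
        int old = value;
        value = v;
        // Notify assistive technologies that the control's value changed.
        if (accessibleContext != null) {
            accessibleContext.firePropertyChange(
                AccessibleContext.ACCESSIBLE_VALUE_PROPERTY,
                Integer.valueOf(old), Integer.valueOf(v));
        }
    }

    public int getValue() { return value; }

    @Override
    public AccessibleContext getAccessibleContext() {
        if (accessibleContext == null) {
            accessibleContext = new AccessibleDial();
        }
        return accessibleContext;
    }

    // Exposes the control's identity: its type (role) and name.
    protected class AccessibleDial extends AccessibleJComponent {
        @Override
        public AccessibleRole getAccessibleRole() {
            return AccessibleRole.SLIDER; // closest standard role
        }
        @Override
        public String getAccessibleName() {
            return "Volume dial";
        }
    }

    public static void main(String[] args) {
        DialControl dial = new DialControl();
        dial.setValue(5);
        System.out.println(dial.getAccessibleContext().getAccessibleRole()
            + ": " + dial.getAccessibleContext().getAccessibleName());
    }
}
```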

Use logical event handlers

Use logical event handlers, such as onSelect, rather than device event handlers, such as onMouseUp. This enables the application to respond to events generated by any suitable device.
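In Swing terms, the equivalent choice is an ActionListener (a logical "activated" event) rather than a MouseListener. The sketch below, with an invented makeSaveButton helper, shows that the same handler then fires whether the button is clicked, triggered from the keyboard, or activated programmatically, for example by an assistive technology.

```java
import javax.swing.JButton;
import java.util.ArrayList;
import java.util.List;

// A logical event handler (ActionListener) responds to activation
// from any input device; a MouseListener would respond only to the mouse.
public class LogicalEventDemo {
    public static List<String> log = new ArrayList<>();

    public static JButton makeSaveButton() {
        JButton save = new JButton("Save");
        save.addActionListener(e -> log.add("saved"));
        return save;
    }

    public static void main(String[] args) {
        JButton save = makeSaveButton();
        save.doClick(); // simulates activation from any suitable device
        System.out.println(log);
    }
}
```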

Use the system cursors and pointers, or keep them tied to the focus

Assistive technologies used by people who are visually impaired need to track the position of the system cursors and pointers. For example, screen magnifiers enlarge the portion of the screen around the mouse pointer, so they need to know where the mouse pointer is. Screen readers follow the system cursor or insertion bar during text entry, so they need to know where that is.

If an application uses its own method to indicate focus, such as special highlighting or moving an object around the screen, the position of the system pointers or cursors may not correspond to the application's focus point. The assistive technology will then be looking in the wrong place.

If alternative focus indicators are used, assistive technologies can still be supported by moving the system cursors and pointers so that they always coincide with the focus indicator. This works even if the system cursors and pointers are not visible.

If the focus covers a larger area, such as a table cell, it may be possible to define the system cursor to be the same size. This will help screen readers determine exactly what is within the focus area.

Do not require the application to take up the whole screen

Some assistive technologies, such as screen magnifiers and on-screen keyboards, need to be permanently visible on the screen. They take up a portion of the screen, usually at the top or bottom, and the remainder of the screen can be used for application windows. If an application requires the whole screen and cannot be resized, the assistive technology will therefore obscure part of the application. The user will have to move the assistive technology in order to view that part.

Ensure that content makes sense when linearised

Screen readers traverse the screen and read out the elements in the order in which they are encountered. This converts the two-dimensional screen layout into a linear sequence. Problems arise if this linear sequence does not follow the logical ordering of the elements: it can render multi-column text unreadable and cause difficulties with forms.

For example, if two text blocks are presented side by side with only blank space separating them, a screen reader may treat them as a single block and read the first line of the first column, then the first line of the second column, before moving down to the second line. The two columns will be interleaved and the text will make no sense to the user.

Another example is a form requesting personal details, consisting of text entry boxes each with a prompt. The logical order is for the boxes to be read in the conventional sequence (first name, family name, address, and so on), with the appropriate prompt read before each input box. If the screen reader were to read out all the prompts before reaching the input boxes, it would be very difficult for the user to know which input box related to which prompt. If the input boxes themselves appeared in a non-logical order, such as first name, address, then family name, the user might also have difficulty.
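For the form case, Swing lets the application state the prompt-to-field relationship explicitly with JLabel.setLabelFor(), so a screen reader can read the right prompt with each field regardless of visual layout. The makeField helper and the prompt text below are illustrative.

```java
import javax.swing.JLabel;
import javax.swing.JTextField;

// Associating each prompt with its input box gives the screen reader
// an explicit link, independent of where the two sit on screen.
public class FormLabelDemo {
    public static JTextField makeField(String prompt) {
        JTextField field = new JTextField(20);
        JLabel label = new JLabel(prompt);
        label.setLabelFor(field); // explicit prompt-to-field association
        return field;
    }

    public static void main(String[] args) {
        JTextField firstName = makeField("First name:");
        // The field's accessible name is now derived from its label.
        System.out.println(firstName.getAccessibleContext().getAccessibleName());
    }
}
```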

How you could check for this:

Test with real users of assistive technologies

The only way to find out for sure whether an application is compatible with assistive technologies is for people to try using it with their assistive technologies. As far as possible, this should be done by people who routinely use the assistive technologies in their everyday life, rather than the application developers themselves.

It would be possible for developers to test the application by obtaining an assistive technology, such as a screen reader, and using it themselves. However, this would be wasteful of resources and of questionable validity. For a non-impaired person to learn to use a screen reader effectively may take a few weeks of constant use. Unless that person relies on the screen reader completely, to the extent of unplugging their monitor, they may not end up using it in the same way that a non-sighted user would. This would make the test unrepresentative.

A basic tenet of user testing is that representative users should carry out representative tasks in a representative situation. For accessibility, this means that users with impairments should be asked to use the application to carry out the tasks it is designed for, using their own assistive technology set up according to their own preferences.
About user testing