A well-designed product is accessible to users of all abilities, including those with low vision, blindness, hearing impairments, cognitive impairments, motor impairments, or situational disabilities such as a broken arm. Improving your product’s accessibility enhances usability for all users, and Material Design’s built-in accessibility considerations will help you accommodate these needs.
This section primarily applies to mobile UI design. For more information on designing and developing fully accessible products, visit the Google accessibility site.
Help users navigate your app by designing clear layouts with distinct calls to action. Every added button, image, and line of text makes the UI more complicated. Simplify your app with:
Design your app to accommodate a variety of users. A user may be new to your product or use a text-only screen reader (a program that reads text aloud or uses a braille display). Your app should make it easy to:
Support assistive technologies specific to your platform, just as you support the input methods of touch, keyboard, and mouse. For example, ensure your Android app works with Google’s screen reader, TalkBack.
Assistive technology helps increase, maintain, or improve the functional capabilities of individuals with disabilities, through devices like screen readers, magnification devices, wheelchairs, hearing aids, or memory aids.
Apps should give users feedback and a sense of where they are in the app. Navigation controls should be easy to locate and clearly written. Visual feedback (such as labels, colors, and icons) and touch feedback show users what is available in the UI.
Navigation should have clear task flows with minimal steps. Focus control, or the ability to control keyboard and reading focus, should be implemented for frequently used tasks.
Screen readers give users multiple ways to navigate a screen, including:
Users may switch between “explore by touch” and “linear navigation” modes. Some assistive technologies allow users to navigate between page landmarks, such as headings, when these landmarks use the appropriate semantic markup.
Hardware or software directional controllers (such as a D-pad, trackball, or keyboard) allow users to jump from selection to selection in a linear fashion.
Place items on the screen according to their relative level of importance.
Placing important actions at the top of the screen gives them more importance in the hierarchy.
Input focus should follow the order of the visual layout, from the top to the bottom of the screen. It should traverse from the most important to the least important item. Determine the following focus points and movements:
Clarify where the focus exists through a combination of visual indicators and accessibility text.
The black circles indicate the order in which onscreen elements should receive focus.
Group similar items under headings that communicate what the groupings are. These groups organize content spatially.
Focus traversal between screens and tasks should be as continuous as possible. If a task is interrupted and then resumed, place focus on the element that was previously focused.
Use color and contrast to help users see and interpret your app’s content, interact with the right elements, and understand actions.
Color can help communicate mood, tone, and critical information. Using color so that all users can understand the content is fundamental to accessible design. Choose primary, secondary, and accent colors for your app that support usability. Ensure sufficient color contrast between elements so that users with low vision can see and use your app.
Material Design’s Color Tool can help you choose colors with sufficient contrast between elements, so that all users can see and use your app.
According to the World Wide Web Consortium (W3C), the contrast ratio between a color and its background ranges from 1 to 21 based on luminance (the intensity of light emitted).
Contrast ratios represent how different one color is from another, and are commonly written as 1:1 or 21:1. The higher the ratio, the greater the difference in relative luminance between the colors.
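The relationship between luminance and contrast ratio can be sketched in code. This is a minimal illustration of the WCAG 2.x formulas; the function names are ours, not from any library:

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB color, with channels in 0-255."""
    def linearize(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, ranging from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background yields the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Identical colors produce the minimum ratio of 1:1, which is why low-contrast pairings are hard to read.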
The W3C recommends the following contrast ratios for body text and image text: at least 4.5:1 for small text, and at least 3:1 for large text (14 pt bold or 18 pt regular and up).
These lines of text follow the color contrast ratio recommendations and are legible against their background colors.
These icons follow the color contrast ratio recommendations and are legible against their backgrounds.
While decorative elements (such as logos or illustrations) don’t have to meet contrast ratios, they should be distinguishable if they possess important functionality.
Decorative logos that are distinguishable don’t have to meet contrast ratios.
For users who are colorblind or otherwise cannot see differences in color, include design elements beyond color alone so that they receive the same information.
Because colorblindness takes different forms (including red-green, blue-yellow, and monochromatic), use multiple visual cues to communicate important states. Elements such as strokes, indicators, patterns, texture, or text can describe actions and content.
The text field error state is communicated through multiple cues: title color, text field stroke, and an error message below the field.
Material Design’s touch target guidelines enable users who aren’t able to see the screen, or who have difficulty with small touch targets, to tap elements in your app.
Touch targets are the parts of the screen that respond to user input. They extend beyond the visual bounds of an element. For example, an icon may appear to be 24 x 24 dp, but the padding surrounding it comprises the full 48 x 48 dp touch target.
Touch targets should be at least 48 x 48 dp. A touch target of this size results in a physical size of about 9mm, regardless of screen size. The recommended target size for touchscreen elements is 7-10mm. It may be appropriate to use larger touch targets to accommodate a larger spectrum of users.
Pointer targets are similar to touch targets, but apply to the use of motion-tracking pointer devices such as a mouse or a stylus. Pointer targets should be at least 44 x 44 dp.
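The arithmetic above can be sketched as follows; the constants mirror the guidance, but the helper name is illustrative:

```python
MIN_TOUCH_TARGET_DP = 48    # minimum touch target per the guidance above
MIN_POINTER_TARGET_DP = 44  # minimum pointer target

def padding_for_target(visual_size_dp, target_dp=MIN_TOUCH_TARGET_DP):
    """Padding (in dp) to add on each side of an element so its
    touch target reaches target_dp; zero if it is already large enough."""
    return max(0, (target_dp - visual_size_dp) / 2)

# A 24dp icon needs 12dp of padding on each side to reach a 48dp target.
print(padding_for_target(24))  # 12.0
```

The same helper gives 10dp per side for a 24dp element aimed at the 44dp pointer-target minimum.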
In both examples, the touch target is 48dp.
In most cases, touch targets should be separated by 8dp of space or more to ensure balanced information density and usability.
Touch targets and buttons: a button 36dp in height sits within a touch target 48dp in height.
Flexible, responsive layouts help content scale in relation to the screen size. Content shouldn’t be truncated as a result of device type or resolution.
Keeping related items in proximity to one another is helpful for those who have low vision or have trouble focusing on the screen.
The slider value is in close proximity to the slider control.
To improve readability, users might increase font size. Mobile devices and browsers include features that allow users to adjust font size system-wide. To support the system font size in an Android app, mark text and its associated containers to be measured in scalable pixels (sp).
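Conceptually, sp behaves like dp with the user’s font-scale preference applied on top. A simplified sketch of the conversion (the function names and sample values are illustrative, not the Android API):

```python
def dp_to_px(dp, density):
    """Density-independent pixels to physical pixels on a given screen."""
    return dp * density

def sp_to_px(sp, density, font_scale):
    """Scalable pixels additionally honor the user's system font-size setting."""
    return sp * density * font_scale

# On a 2x-density screen with the system font size set to 1.3x,
# 14sp text renders larger than a fixed 14dp measurement would.
print(dp_to_px(14, 2.0))                  # 28.0
print(round(sp_to_px(14, 2.0, 1.3), 1))   # 36.4
```

Sizing text in dp would ignore the user’s font-scale setting, which is why text and its containers should use sp instead.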
Make sure to allot enough space for large and foreign language fonts. See Line Height for information on the recommended sizes of foreign language fonts.
Clear and helpful accessibility text is one of the primary ways to make UIs more accessible. Accessibility text refers to text that is used by screen reader accessibility software, such as TalkBack on Android, VoiceOver on iOS, and JAWS on desktop. Screen readers read all text and elements (such as buttons) on screen aloud, including both visible and nonvisible alternative text.
Accessibility text includes both visible text (including labels for UI elements, text on buttons, links, and forms) and nonvisible descriptions that don’t appear on screen (such as alternative text for buttons with icons). Sometimes, an on-screen label may be overridden with accessibility text to provide more information to the user.
Both visible and nonvisible text should be descriptive and meaningful, as some users navigate by using all headings or links on a screen. Test your app with a screen reader to identify areas that are missing or need better accessibility text.
Keep content and accessibility text short and to the point. Screen reader users hear every UI element read aloud. The shorter the text, the faster the screen reader users can navigate it.
Screen readers may automatically announce a control’s type or state through a sound or by speaking the control name before or after the accessibility text.
If the control type or state isn’t being read correctly, the control’s accessibility role may be improperly set or be a custom control.
Every element should have an associated accessibility role on a website or be coded to be announced properly. This means a button should be set as a button, and a checkbox as a checkbox, so that the control’s type or state is communicated correctly to the user.
If you extend or inherit from a native UI element, you will get the correct role. If not, you can override this information for accessibility on each platform: use ARIA for the web, or AccessibilityNodeInfo (https://developer.android.com/reference/android/view/accessibility/AccessibilityNodeInfo) for Android. For example, on Android, set the class name field of the control’s AccessibilityNodeInfo to "android.widget.Button".
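As a toy illustration of how a screen reader might combine a control’s accessibility text, role, and state into a single announcement (this does not model any real screen reader’s exact behavior):

```python
def announce(label, role, state=None):
    """Compose a spoken announcement: accessibility text, then role, then state."""
    parts = [label, role]
    if state:
        parts.append(state)
    return ", ".join(parts)

# A control with the correct role is announced as what it is.
print(announce("Subscribe", "button"))                   # Subscribe, button
print(announce("Email updates", "checkbox", "checked"))  # Email updates, checkbox, checked
```

If the role were missing or wrong, the same label would be announced without the “button” or “checkbox” cue, leaving the user unsure how to interact with the control.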
Use action verbs to indicate what an element or link does, not what an element looks like, so a visually impaired person can understand.
Link text should:
Ensure an element has the same description everywhere it’s used.
The description read aloud indicates the action represented by the icon.
Accessibility text for a navigation menu could be “Show navigation menu” and “Hide navigation menu” (preferred) or “Show main menu” and “Hide main menu” (acceptable).
For icons that toggle between values or states, announce the icon according to how it is presented to the user.
Announce elements based on how they are used, not how they appear. For example, if a star icon represents the action of adding something to a wishlist, the app should verbally state “Add to wishlist” or “Remove from wishlist.”
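The wishlist example can be sketched as a label that follows state rather than appearance; the function name is hypothetical:

```python
def wishlist_label(in_wishlist: bool) -> str:
    """Describe the action the star icon performs, not what it looks like.

    This string would be assigned to the control's accessibility label
    (e.g. contentDescription on Android, aria-label on the web) and
    updated whenever the state toggles.
    """
    return "Remove from wishlist" if in_wishlist else "Add to wishlist"

print(wishlist_label(False))  # Add to wishlist
print(wishlist_label(True))   # Remove from wishlist
```

Announcing “star icon” in either state would describe the appearance without telling the user what tapping it does.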
Don’t tell users how to physically interact with a control, as they may be navigating with a keyboard or other device, not with their fingers or a mouse. Accessibility software will describe the correct interaction for the user.
The command “voice search” describes the user task (search) paired with the input method (voice).
Hint speech provides extra information for actions that aren't clear. For example, Android's “double-tap to select” feature prompts the user to tap twice when landing on an item without taking action. Android TalkBack will also announce any custom actions associated with an element. Use hint speech sparingly and only for complex UI.
Give visual alternatives to sound, and vice versa. Provide closed captions, a transcript, or other visual cues to critical audio elements and sound alerts.
Allow users to navigate your app using sound by adding descriptive labels to UI elements. When using a screen reader such as TalkBack and navigating by touch exploration, labels are spoken aloud when users touch UI elements with their fingertips.
The following sounds should be avoided:
Material Design uses motion to guide focus between views. Surfaces transform into focal points for the user to follow, and unimportant elements are removed.
To allow users with motion and vision sensitivities to use interfaces comfortably, adhere to the Material Design motion guidance, which supports the following from the W3C:
Controls in an app may be set to disappear after a certain amount of time. For example, five seconds after starting a video, playback controls may fade from the screen.
Avoid using timers on controls that perform high-priority functions, as users may not notice these controls if they fade away too quickly. For example, TalkBack reads controls aloud when they receive focus; a control on a timer may disappear before the user can act on it.
For controls that enable other important functions, make sure the user can turn the controls back on or perform the same function another way. Learn more in the Composition section.
By using standard platform controls and semantic HTML (on the web), your app will automatically contain the markup and code needed to work well with a platform’s assistive technology. Adapt your app to meet each platform's accessibility standards and assistive technology (including shortcuts and structure) to give users an efficient experience.
Use native elements, such as the standard platform dialog.
Use scalable text and a spacious layout to accommodate users who may have large text, color correction, magnification, or other assistive settings turned on.
Keyboard and mouse interfaces should make every task, and all information shown on hover, accessible by keyboard alone.
Scale your UI to work well with magnification and large text.
Screen-reader users need to know which UI elements are tappable on-screen. To enable screen readers to read the names of components out loud, add the contentDescription attribute to components such as buttons, icons, and tabs containing icons that have no visible text. For web apps, add an aria-label.
Any features with special accessibility considerations should be included in help documentation. Make help documentation relevant, accessible, and discoverable. As an example, review this guide on how to use a screen reader with Google Drive.
Following these accessibility guidelines will help improve the accessibility of your app, but does not guarantee a fully accessible experience. It is recommended that you also:
Talk to your users, particularly those who use assistive technology, to learn about their needs, what they want out of your app, which tools they use, and how they use them. Become familiar with these tools so you can give them the best experience.
People use assistive technology in different ways.