Accessible iOS Apps: Up To 11!

Fri, Oct 23, 2020 22-minute read

My talk from #a11yTo Conf.

To celebrate, Apress is offering a 20% discount on my book “Developing Accessible iOS Apps” with the code ‘a11yTOConf’ on their website. Valid until October 30, 2020.

Description:

Let’s go on a journey to create not just accessible iOS apps, but awesome accessible experiences that absolutely all your users will be able to use and love. How do we bring VoiceOver support to the next level? How should we deal with Dynamic Type, so users can choose their preferred font size? How can we support Smart Invert Colors? What about Reduce Motion, or Reduce Transparency? What is the Rotor? What can we do to improve the user’s experience with Voice Control and Switch Control? And how can we test all these features? We’ll try to answer all these questions to help you bring the accessibility levels of your iOS app up to eleven!

Slides:

PDF version of the slides

Presenter Notes:

Hi everyone, my name is Dani Devesa. I’m from Xàbia, a town on the east coast of Spain. But today I’m speaking to you from the city of València! I spent a few years here while studying at university. I’m now based in London, where I work as a Software Engineer at Spotify. I am hugely passionate about developing accessible iOS apps, and that’s why today I’d like to talk to you a bit about how we can create great accessible apps for iOS devices. This is Accessibility, up to 11.


The idea behind the title comes from a movie called “This Is Spinal Tap”. It is a comedy, a mockumentary from the 80s about an English rock band called “Spinal Tap”. There is a scene where the director is interviewing one of the guitarists, who proudly shows off his amplifiers. All the numbers go up to 11 (instead of 10). The director asks if that actually means anything, and the guitarist says that of course it does: it means they’re one louder than the rest.

It is just a joke but it has become an expression for showing that you are not happy with average, that you want to go one step further. I like to think that we should do that when it comes to accessibility. Our iOS apps being accessible should be a given by now, and we should be willing to go the extra mile to create great accessibility experiences that absolutely all our users love.

So I’m going to show you 11 things we can do to bring the accessibility levels of our app up to 11.


Let’s start then! We’ll start from zero with some of the basics and we’ll keep tuning the accessibility levels all the way up to 11. We’ll have our accessibility dial in the top right corner of the slides at all times.

In this slide, we try to exemplify how the accessibility model works in iOS. Imagine that we are creating an app for keeping track of our favourite conferences and we run it on one of the latest iPhones on the market, like this Apple Newton from 1993. In this case, we have a text field to enter the name of the conference. Of course, A eleven Y T O is one of my favourite conferences. If we move VoiceOver’s focus to the text field, it will announce something like: “Conference name. #a11yTo Conf. Text Field. Double tap to edit.”. This is not a single blob of text. Instead, it is composed of four different properties: a label, a value, a trait and a hint.

By the way, you’ll see that a lot of the examples in the talk, like in this case, are focused on VoiceOver. This is mainly because it is arguably the most popular assistive technology in iOS, but also because it can be challenging, and because by creating a good VoiceOver experience, it is very likely that we will be improving the experience for other assistive technology users too, such as Voice Control and Switch Control users. We’ll see some examples where this happens later in the talk.


And how would this look in code? We can install the latest version of Xcode on our work Mac, like this Macintosh from 1984, and configuring any of the properties we’ve seen before would look something like this.

VoiceOver, like many other assistive technologies in iOS, relies on the UIAccessibility protocol, which describes the necessary accessibility information to represent a UI control. These are some of the main properties that you’ll find yourself working with to improve your app’s accessibility.

And please, don’t feel intimidated by the code I’ll be showing during the talk. I’ll keep it as simple as possible for the purpose of illustrating what I’m trying to explain, and so you can see that Apple has done a great job of making some things that seem quite complicated actually fairly easy to implement. Often, a couple of lines of code is all you’ll need. And hopefully some developers will find the examples useful too.

So here we have isAccessibilityElement: a boolean. UI controls have this property set to true by default; it is false for plain UIViews. It is very tempting to make everything an accessibility element, but this is not always necessary, or even good practice. For example, in the previous example, the label before the text field may be considered redundant, and therefore we might want to set this property to false. Or we might want to skip a decorative image that is not conveying any relevant information.

accessibilityLabel: a string, and it needs to be localised if you support more than one language. It is derived from the control’s title, if there is one, so a lot of the time you won’t need to set one.

accessibilityTraits: the role of the component; it indicates how you can interact (or not) with it. A UI control may have more than one trait, and they are combined with the traits of its superclass. Some examples of traits are: button, header, adjustable, selected, not enabled… And they’re localised for you.

accessibilityValue: we’ve seen the text field example where the value is the text that you are typing. Another example could be a slider, whose value you can increment or decrement.

accessibilityHint: it is optional. It is useful if the UI component needs further clarification on how it is used. Hints are read after a pause, so experienced users can just skip them. It is good practice to start with a verb, as a call to action.
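
Putting these properties together for the conference text field example could look something like this minimal sketch (the view names and strings are made up for illustration):

```swift
import UIKit

final class ConferenceViewController: UIViewController {
    private let titleLabel = UILabel()
    private let nameTextField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()

        // The text field announces its text as the accessibilityValue for us,
        // so we just give it a label and, optionally, a hint.
        nameTextField.accessibilityLabel = "Conference name"
        nameTextField.accessibilityHint = "Enter the name of the conference."

        // The visual label duplicates what VoiceOver already announces,
        // so we skip it.
        titleLabel.isAccessibilityElement = false
    }
}
```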

Apple offers a bunch of UI Controls in its UIKit framework. When using these controls, you’ll barely have to do anything as all these components have sensible preconfigured properties to make them accessible.

You may also know that you can now use SwiftUI to build the UI of your iOS apps too. Most of the time, you’ll find accessibility modifiers equivalent to the properties in the UIAccessibility protocol.
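
For instance, a rough SwiftUI sketch of the same text field might look like this (using the modifier names introduced in iOS 14; earlier versions used the .accessibility(label:) form):

```swift
import SwiftUI

struct ConferenceView: View {
    @State private var name = ""

    var body: some View {
        TextField("Conference name", text: $name)
            .accessibilityLabel("Conference name")
            .accessibilityHint("Enter the name of the conference.")
    }
}
```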


Let’s talk a bit more about accessibility labels, because good accessibility labels are the foundation of good accessible apps. It is important to frequently test your app on a real device, because you may find that it doesn’t always work as you might expect! Sometimes the user might figure out what is going on, but it is probably not quite right, and definitely not the best experience.

Like in these examples. You can see for example that it is very easy to forget to properly configure accessibility labels that contain times, units or abbreviations. These are all real examples I’ve found in apps I’ve worked on.

“4 days ago” is better than “4 d”. VoiceOver reads “3 h 24 m” as “Three h twenty-four meters”, but “Three hours twenty-four minutes” is better. And “Monday” is clearer than “Mon”.

For times and durations, you can use date formatters to get a readable version.
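
For example, a DateComponentsFormatter can produce a spoken-friendly version of a duration. A small sketch (the output shown assumes an English locale):

```swift
import Foundation

let formatter = DateComponentsFormatter()
formatter.allowedUnits = [.hour, .minute]
formatter.unitsStyle = .full

// Use this as the accessibility label instead of "3 h 24 m".
let duration = TimeInterval(3 * 3600 + 24 * 60)
let spokenDuration = formatter.string(from: duration) // "3 hours, 24 minutes"
```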

Sometimes you may need to tweak how VoiceOver vocalises one of your accessibility labels, like in the “Most read” example. Changing the spelling in the accessibility label is a bad practice though. It may work fine for VoiceOver, but it can be confusing for users of braille displays. We’ll see how to do this properly now.


VoiceOver is really customisable. You can have attributed accessibility labels that let you specify extra properties for all, or part of the accessibility label. You can, like in this case, tweak how VoiceOver vocalises something by specifying the IPA (or International Phonetic Alphabet) notation.

Paella is a rice dish that is very typical of València, the region I’m from in Spain, where we are right now. When using VoiceOver in English, it reads it as “Pila”, and obviously I get very angry. But luckily, you can do something as simple as specifying the IPA notation to correct the pronunciation.

Another solution for this particular usecase would be to specify the language VoiceOver should use. If you specify that VoiceOver should read Paella in Spanish, it will switch to the Spanish voice when it gets to that word and then back to the English one.

Other things you can do with an attributed accessibility label: you can make VoiceOver read punctuation marks, if it makes sense; you can change the pitch; or you can even specify whether a new announcement should be queued or whether it should interrupt any ongoing announcements.
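
Both fixes for the paella example could look something like this sketch, using the accessibilityAttributedLabel property (the IPA transcription is approximate):

```swift
import UIKit

let dishLabel = UILabel()
dishLabel.text = "Paella"

// Option 1: give VoiceOver the IPA notation for the whole word.
dishLabel.accessibilityAttributedLabel = NSAttributedString(
    string: "Paella",
    attributes: [.accessibilitySpeechIPANotation: "paˈeʎa"]
)

// Option 2: ask VoiceOver to switch to a Spanish voice for this label.
dishLabel.accessibilityAttributedLabel = NSAttributedString(
    string: "Paella",
    attributes: [.accessibilitySpeechLanguage: "es-ES"]
)
```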


We could spend the whole talk discussing accessibility labels, but I wanted to show you one more example. This is actually one of the most common mistakes you can find in iOS apps. As we’ve seen, the accessibility label derives from the title of the component. But sometimes there are buttons without a title. In those cases, you have to manually configure an accessibility label. Otherwise, VoiceOver may try to figure out the next most suitable text to use as an accessibility label. Sometimes it may get the name of the image (and as you can see in the slide, you don’t want VoiceOver to say things like “Close underscore icon underscore at 3 x” or “Close underscore 0 0 1 underscore 2019”), sometimes it may say just “Button”, and sometimes you may be lucky enough that VoiceOver figures out what your button means and says something like “Possibly close”. But none of these make for a good experience.

And just a reminder that you don’t need to specify the type of control in the label. This would introduce redundancy, and VoiceOver would say something like: “Close button, button”.

In the case of a button to close a modal view, “Close” is a perfectly fine accessibility label.


After configuring good labels, it is important to configure the right accessibility traits. I’m going to focus now on a particular trait that I consider very important and that is often missed: the header trait.

Headers are important to visually structure information. They help us quickly find what we are looking for. With VoiceOver, you can also navigate through headings by using the rotor. With VoiceOver on, rotate two fingers on the screen, as if you were trying to rotate an invisible knob. The rotor menu will show up and will allow you to select one of its multiple navigation options.

If you select Headings, a single swipe up/down will move VoiceOver’s focus to the previous/next heading. That’s much better than having to swipe multiple times to the right before you get to the next logical piece of information.

For this to work well with your app, you have to add the header trait to anything that represents a header in your app.

In this example, where we have topic clouds for Artists, Genres and Albums, you’d need 10 swipes to the right to get from Artists to Genres. But a single swipe down is enough when navigating through Headings.

The rotor is highly customisable and some other options that you can find and configure are: navigating character by character or word by word, or you can change the speaking rate or the typing mode, and more.


And configuring the header trait on a label that represents a heading is as simple as assigning UIAccessibilityTraits.header to the accessibilityTraits property of the label. A simple change in the code with a huge impact for your users.
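
In Swift, that could look like this (sectionTitleLabel is a made-up example name):

```swift
import UIKit

let sectionTitleLabel = UILabel()
sectionTitleLabel.text = "Artists"

// VoiceOver users can now jump straight here with the Headings rotor option.
sectionTitleLabel.accessibilityTraits = .header
```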


Another important concept in VoiceOver is notifications. A simplified way of explaining how VoiceOver works: VoiceOver asks your app for the elements on screen, and your app returns a structured tree with all the necessary metadata. That means that sometimes, if things on screen change, we may need to post a notification to UIAccessibility to communicate that.

A common use-case for this is when presenting custom popups, popovers, modals, action sheets, etc. In this case we can post a screen changed notification and move VoiceOver’s focus to the new element presented on screen.

If, after a user interaction, the layout changes and a new element appears on screen, but it doesn’t represent a huge part of the screen, we can post a layout changed notification. Dropdown menus, or item descriptions that can be expanded and collapsed, are some examples.

The difference to the user between a screen changed and a layout changed notification is that the screen changed notification plays a sound indicating to the user that they have basically moved to a new screen.


Posting both a screen changed and a layout changed notification would look something like this. You can call post on UIAccessibility. The first parameter is the type of notification, and the second one is the view you would like VoiceOver’s focus to move to.

The third line of code is also very important. When presenting a UI component that overlays the existing UI (like a custom popup), you may have found that VoiceOver starts to randomly jump between the overlaid UI and the elements underneath. To avoid that, you can specify that the presented view is a modal view in accessibility terms, and VoiceOver will then ignore its sibling views.
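
A sketch of those three lines in context (the view and function names are hypothetical):

```swift
import UIKit

func didPresentPopup(_ popupView: UIView, titleLabel: UILabel) {
    // Tell VoiceOver a new "screen" appeared and move its focus to the title.
    UIAccessibility.post(notification: .screenChanged, argument: titleLabel)

    // Stop VoiceOver from jumping to the sibling views underneath the popup.
    popupView.accessibilityViewIsModal = true
}

func didExpandDescription(_ descriptionView: UIView) {
    // A smaller change on the same screen: no "new screen" sound is played.
    UIAccessibility.post(notification: .layoutChanged, argument: descriptionView)
}
```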


A third type of notification is the announcement notification. It is useful to alert the user of something that happened in a part of the screen that is not currently in focus and that is not important enough to disrupt the user and move VoiceOver’s focus to a different place.

It can be used, for example, to announce certain messages like errors, long-running tasks that have finished, or information on screen that has been updated.


As we’ve seen, when posting a screen changed or a layout changed notification, you can pass the view you want VoiceOver’s focus to move to as a parameter. For the announcement notification, you can instead pass the message you want VoiceOver to announce to the user.

The message can be a plain string or an attributed string. If you pass an attributed string, you can configure an attribute, called accessibility speech queue announcement, that tells VoiceOver whether the announcement should be queued (when the value is true) or should interrupt any ongoing announcements (when the value is false).
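
A small sketch of both variants:

```swift
import UIKit

// A plain string announcement.
UIAccessibility.post(notification: .announcement, argument: "Upload failed")

// An attributed string that asks VoiceOver to queue the announcement
// instead of interrupting whatever it is currently reading.
let message = NSAttributedString(
    string: "Upload finished",
    attributes: [.accessibilitySpeechQueueAnnouncement: true]
)
UIAccessibility.post(notification: .announcement, argument: message)
```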


Here are a couple of VoiceOver gestures for power users. The first one is the escape gesture (a two-finger scrub on the screen, kind of like drawing a “z”). It means “back”.

You’ll see that it works by default for navigation controllers. But you’ll need to implement this gesture’s behaviour yourself for modal views or for any other component presented in a custom way.

I call it the escape room because, if not implemented, it may leave the user stuck in some parts of your app. If you have any custom popovers, modal views or bottom sheet menus, you may need to add a couple lines of code for this to work.


The good news is that, by just overriding the accessibilityPerformEscape method in your presented view, you’ll capture that gesture and be able to dismiss your view. The last step is to return true if you successfully handled the gesture.
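
A minimal sketch of what that override could look like in a presented view controller:

```swift
import UIKit

final class PopupViewController: UIViewController {
    // Called when the user performs the two-finger scrub ("z") gesture.
    override func accessibilityPerformEscape() -> Bool {
        dismiss(animated: true)
        return true // We successfully handled the gesture.
    }
}
```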


The second gesture I wanted to show you is the magic tap (a double tap with two fingers). The magic tap is meant to execute the main action in the current screen of your app. Think of playing/pausing some music or video, taking a picture or starting/stopping a timer.


And it is as simple as the perform escape example. You can just override accessibilityPerformMagicTap(), which captures this gesture, and you can then trigger the main functionality for the current state of your app.
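
For example, a timer screen could start or stop the timer on a magic tap. A sketch (the timer logic is made up):

```swift
import UIKit

final class TimerViewController: UIViewController {
    private var isRunning = false

    // Called on a two-finger double tap anywhere on this screen.
    override func accessibilityPerformMagicTap() -> Bool {
        isRunning ? stopTimer() : startTimer()
        return true // We handled the gesture.
    }

    private func startTimer() { isRunning = true }
    private func stopTimer() { isRunning = false }
}
```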


A very important part for iOS apps being accessible is avoiding the one size fits all strategy. It is better to have a dynamic app that adapts to its users. By allowing them to do the necessary adjustments and customisations, our users will be able to enjoy our apps independently of their needs and abilities.

iOS actually offers a ton of configurations that let you know what the user’s preferences are, so you can react and offer them a suitable experience. One of the latest options introduced by Apple, in iOS 13, was shouldDifferentiateWithoutColor. We know that we shouldn’t rely just on color to signal important information in our apps. Now you can even read that preference from the user, so you can reinforce this principle even more when it is enabled.

I think the best example for this particular configuration is games. Imagine you are developing a UNO-like game, where a user plays a card based on the color or number of the previous player’s card. You could use this option to add a symbol to the card, so color-blind users can easily identify which card they should play. How else could you have differentiated the color of each of these cards in this black & white presentation?

By the way, Mattel actually has a colour-blind-friendly version of its game that uses the colour-blind alphabet to differentiate the colors of its cards.


In iOS there are two things you can do with each one of these configurations: you can check if they’re enabled or not, and you can listen to notifications in case the user changes them while using your app.
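
The pattern looks something like this sketch, checking shouldDifferentiateWithoutColor and Reduce Motion, and observing the Reduce Motion change notification (the helper methods are made up):

```swift
import UIKit

final class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // 1. Check the user's current preferences.
        if UIAccessibility.shouldDifferentiateWithoutColor {
            showCardSymbols()
        }
        if UIAccessibility.isReduceMotionEnabled {
            disableShuffleAnimation()
        }

        // 2. Listen for changes while the app is running.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(reduceMotionStatusDidChange),
            name: UIAccessibility.reduceMotionStatusDidChangeNotification,
            object: nil
        )
    }

    @objc private func reduceMotionStatusDidChange() {
        if UIAccessibility.isReduceMotionEnabled { disableShuffleAnimation() }
    }

    private func showCardSymbols() { /* overlay a shape on each card */ }
    private func disableShuffleAnimation() { /* use a cross-fade instead */ }
}
```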


But there are many more. Reduce Motion and Reduce Transparency are two important ones. Reduce Motion lets you soften some animations, or remove them altogether, if the user has this preference on, as they could, for example, suffer from motion sickness and feel dizzy with repetitive or abrupt animations. Reduce Transparency lets you make certain parts of your app less transparent, improving contrast and removing distractions.

And other configurations available are: Bold Text, On/Off Switch Labels, Button Shapes, Prefer Cross-Fade Transitions, and more!


Custom actions let you perform an action on the element focused by VoiceOver, again by using the rotor. Custom actions are great when a particular action is executed with a gesture that is overridden by VoiceOver; think of the reply and archive actions when swiping a cell in your mail app.

But they’re also great for making navigation faster by ‘hiding’ certain actions you can perform on every single item of a collection. A great example would be a Twitter-like app. Imagine you are on one tweet and want to get to the next one. Ideally, a single swipe should be enough. If we let VoiceOver focus on all its elements though (username, retweet, like and share), it means you’ll need five swipes. It would be very redundant; most of the info you’d be listening to would be the same. Plus, it may no longer be clear which tweet you are liking. Was it part of the previous tweet or the next one?

With custom actions, you can stop at any tweet, swipe up/down to rotate through the different actions and double tap on the screen to perform it.


You can then create an array of custom actions. Each action has a name and an action to perform. Since iOS 13, there is a Swiftier, block-based API that can be used instead of a target-selector pair.

And in iOS 14, you can assign them an image too. Why an image? So they can be shown in the Switch Control menu.
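
A sketch of the tweet example could look like this (the handlers are left as comments; the image property requires iOS 14):

```swift
import UIKit

func configureCustomActions(for tweetCell: UITableViewCell) {
    // iOS 13+ block-based API; no target-selector pair needed.
    let likeAction = UIAccessibilityCustomAction(name: "Like") { _ in
        // Like the tweet here.
        return true
    }
    // iOS 14+: an image makes the action easier to spot in Switch Control.
    likeAction.image = UIImage(systemName: "heart")

    let retweetAction = UIAccessibilityCustomAction(name: "Retweet") { _ in
        // Retweet here.
        return true
    }

    tweetCell.accessibilityCustomActions = [likeAction, retweetAction]
}
```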


As we said at the beginning of the talk, sometimes by improving the VoiceOver experience, you can benefit other assistive technology users too. Custom actions are a good example: they make navigating and interacting with your app with Switch Control easier too. And now, by giving custom actions an icon, it is even easier to spot them in the menu.


Since iOS 13 you can control your phone with your voice. Apple called this feature Voice Control. It is a great way to spot issues in your accessibility labels. Just say: “Show names” and you’ll see on screen all the accessibility labels of components you can interact with. If it just says “Button”, it is probably missing an appropriate label, for example.

Sometimes though, you may find that the labels are correct, but too long to use for interactions. Or different users may think of the same interaction with different names. For example, for a settings button, a user could say: “Tap settings”, “Tap configuration” or “Tap cog”. But if the accessibility label of the settings button is “settings”, only the first of these three commands would work.

Apple has provided a solution for these situations though. With accessibilityUserInputLabels you can provide input alternatives.


It is as simple as passing an array of localised accessibility label alternatives to the accessibilityUserInputLabels property of a UI control.
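
A sketch for the settings button example:

```swift
import UIKit

let settingsButton = UIButton(type: .system)
settingsButton.accessibilityLabel = "Settings"

// Voice Control users can now say "Tap settings", "Tap configuration" or "Tap cog".
settingsButton.accessibilityUserInputLabels = ["Settings", "Configuration", "Cog"]
```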


Another important iOS feature is Dynamic Type. It allows the user to change their preferred font size in the system settings. And I would say it is probably the most widely used “assistive technology” available in iOS, with many of us likely needing it at some point in our lives.

Sizes go from extra small to extra-extra-extra large. And you can enable 5 extra accessibility sizes: from accessibility medium to accessibility extra-extra-extra large.

Using Dynamic Type is pretty straightforward. Instead of defining your own font sizes and styles, you can get preferred fonts based on predefined text styles: Title, Headline, Body, etc.


Since iOS 11, all the text styles scale when using accessibility sizes, and you can even use custom fonts that scale accordingly by using the UIFontMetrics class.

Setting adjustsFontForContentSizeCategory to true will automatically adapt the size of your font if the user changes their preference while using your app.
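
Putting it together, a sketch (the custom font name is just an example):

```swift
import UIKit

let label = UILabel()

// A preferred system font, based on a predefined text style.
label.font = UIFont.preferredFont(forTextStyle: .body)

// Or a custom font that scales with the user's preferred size.
if let customFont = UIFont(name: "AvenirNext-Regular", size: 17) {
    label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: customFont)
}

// Update automatically if the preference changes while the app is running.
label.adjustsFontForContentSizeCategory = true
```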


But what is the point of letting your user go from this…


… To this. The font scales to the user’s preference, but it doesn’t play well with the current layout! A bigger font with less content doesn’t necessarily translate into a better experience.

How do we solve this? Let’s go to level 9.


If your current UI doesn’t scale appropriately to accommodate all your content, you can actually provide an alternative layout for larger font sizes in a few different ways.

The most common case I’ve seen is table view cells with an image on the left and text on the right, like in the previous example.

One thing you can do is have an entirely different layout as a separate nib file. If you prefer to create your UI layout in code, you can have different sets of layout constraints (for smaller and larger font sizes) and activate/deactivate the appropriate one, depending on the text size chosen by the user. And another option is to lay things out with stack views and change the axis from horizontal to vertical when the user configures a font larger than a certain threshold.


You can check whether the preferred content size category (from your view’s trait collection) is an accessibility size (or whether it is bigger than a certain one, like the extra large size), and in that case use an alternative layout for your UI.
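
A sketch of the stack view approach from the previous slide:

```swift
import UIKit

final class ConferenceCell: UITableViewCell {
    private let stackView = UIStackView()

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)

        // Stack image and text vertically for accessibility text sizes.
        let isAccessibilitySize = traitCollection.preferredContentSizeCategory.isAccessibilityCategory
        stackView.axis = isAccessibilitySize ? .vertical : .horizontal
    }
}
```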


And in case you think it is not important to support such large font sizes, this slide, which blurs two UIs (one with the default text size and another with the largest one), tries to exemplify how a user with low vision could perceive their device.

As you can see, the example with the largest font size is still readable, and the user could probably use the app without even needing VoiceOver, while the example with the default font size is barely usable.


A less obvious accessibility feature is Invert Colors. I know we now have dark mode, but not all apps currently support it. Invert Colors is meant for people who may find it easier to read light text over dark backgrounds if they find the screen too bright.

But Invert Colors inverts the colors of everything. Images then go negative, as you can see in the example with Steve Jobs’ picture. And using Invert Colors shouldn’t come at the expense of an awful experience with images. Images in negative look horrible. And more importantly, Steve Jobs’ turtleneck was black, not white.

That’s why, in iOS 11, Smart Invert Colors was introduced. It allows you to opt elements of the UI out of being inverted. That means you can keep images as they are, and even the branding colors of the app, if it makes sense.

Classic Invert Colors is still available too.


You can basically set the accessibilityIgnoresInvertColors property to true for any image or control you don’t want Smart Invert Colors to invert. Yes, you have to do it manually yourself, so it is not that smart after all…
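
It really is a one-liner (the image name is made up):

```swift
import UIKit

let photoImageView = UIImageView(image: UIImage(named: "steve-jobs"))

// Keep the photo (and its black turtleneck) untouched under Smart Invert Colors.
photoImageView.accessibilityIgnoresInvertColors = true
```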


Testing is very important. It allows you to find issues in your app and it lets you check if the fixes you applied worked as expected. You can’t bring the accessibility levels of your app up to 11 without proper testing.

Please, always test on real devices. But while developing, it might be more convenient to use the Accessibility Inspector in the simulator. It is built into Xcode.

It has three tabs:

Inspect: Lets you check which accessibility properties (label, value, traits, identifiers, etc.) you’ve configured for the elements you can see on screen. New in Xcode 11, there is a very handy VoiceOver player simulator.

Audit: Runs a simple accessibility audit in your app (on the visible screen in the simulator). It finds potential issues like small hit areas, colour contrast issues (there is a built-in tool to help you with that too), text that doesn’t support Dynamic Type, images that have their file name as their accessibility label, and more.

Settings: Very convenient for simulating some accessibility settings like Dynamic Type, Increased Contrast or Reduced Motion.


And we could have spent the whole talk discussing ways of testing your app, but I just wanted to quickly mention a couple more tips.

Can you use your app with your eyes closed? After activating VoiceOver, triple tap the screen with three fingers to switch off the display while still being able to keep using your phone with VoiceOver. This feature is called Screen Curtain. As the creator of your app, you hopefully shouldn’t get too frustrated navigating through the main flows with VoiceOver and Screen Curtain on.

Right in Xcode, you now have a feature called Environment Overrides that lets you try some of the settings available in the Accessibility Inspector, plus others: bold text, button shapes, or grayscale.

And it is extremely convenient to enable shortcuts to turn VoiceOver (or any of the other assistive technologies) on or off. You can access these shortcuts by triple tapping the home button (or the side button, if you don’t have a home button anymore). If you have Accessibility Shortcuts configured, you’ll find yourself testing your app much more often.


And that’s all I had for you today. I hope you’ve found some useful info in this talk and that it helps you bring the accessibility levels of your iOS app up to 11.

You can find me on the internet with the handle @dadederk and I have a GitHub repo with code examples for everything we’ve seen in the presentation and more.

Thank you!