Alara – Accessibility Enabled

10 Jul

Alara 1.1 is now available on the App Store. I’m proud to say that Alara now fully supports VoiceOver. Accessibility is something I really wanted to get into 1.0, but the schedule just didn’t permit it.

So while 1.0 was waiting in the review queue, I got to work studying how to best bring accessibility support to Alara.

I have to admit something that is a bit shameful of me…I’ve never shipped an app with true accessibility support before. Sure, all apps get some accessibility support for free (big kudos to the Accessibility Team for their hard work there). But to truly be a good citizen, you need to take some time to add specific support.

Which of the Accessibility features you need to support largely depends on the style and usage of your app. If your app is primarily visual, as is the case with Alara, you’ll want to support VoiceOver. This will help users with vision impairments use your app.

When it comes to VoiceOver, the Accessibility Team has really done a lot of the legwork for us. You’ll go a long way by just giving some meaningful labels and traits to your views. You do this either by using the Accessibility Inspector in Interface Builder, or in code by setting the accessibilityLabel, accessibilityValue, accessibilityFrame, accessibilityTraits, and other properties as needed.
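As a minimal sketch of what setting those properties looks like in code (the `BarReading` model and `configureAccessibility` function are hypothetical names, not Alara’s actual code):

```swift
import Foundation

// Hypothetical model for one bar in a UV-index chart.
struct BarReading {
    let uvIndex: Int
    // VoiceOver speaks the value separately from the label.
    var spokenValue: String { "\(uvIndex)" }
}

#if canImport(UIKit)
import UIKit

// Apply label, value, and traits to a bar view so VoiceOver
// has something meaningful to read aloud.
func configureAccessibility(of barView: UIView, with reading: BarReading) {
    barView.isAccessibilityElement = true
    barView.accessibilityLabel = "UV Index"          // what this element is
    barView.accessibilityValue = reading.spokenValue // its current reading
    barView.accessibilityTraits = .staticText        // spoken as static text
}
#endif
```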

The work really comes down to truly understanding how your app works, how it can convey its information clearly and concisely, and then expressing that in a style that is more auditory than visual.

Take a look at this screenshot:


If you don’t do anything specific to support VoiceOver, the app isn’t very helpful. For example, tapping on a bar in the bar graph gives you no information at all. You could tap on the UV index label in the bar (it’ll just read the number aloud), or the time label below it (hearing the time isn’t very helpful without any context). That information is separated, and as you tap or swipe around it isn’t very clear what you’re pointing at.

In this case it makes more sense to combine these views in the “eyes” of VoiceOver so that they are seen as one entity that, when selected, reads something aloud like “UV Index 5 at 4 PM”.
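Here’s a rough sketch of that combining step, assuming a hypothetical container view holding the bar and its two labels (the helper and function names are mine, not Alara’s):

```swift
import Foundation

// Hypothetical helper: build one spoken description for the combined
// element, e.g. "UV Index 5 at 4 PM".
func uvAccessibilityLabel(index: Int, time: String) -> String {
    return "UV Index \(index) at \(time)"
}

#if canImport(UIKit)
import UIKit

// Make the container the single VoiceOver element. Once a view is an
// accessibility element itself, its subviews (the separate UV index and
// time labels) are no longer individually focusable.
func combineForVoiceOver(container: UIView, uvIndex: Int, time: String) {
    container.isAccessibilityElement = true
    container.accessibilityLabel = uvAccessibilityLabel(index: uvIndex, time: time)
    container.accessibilityTraits = .staticText
}
#endif
```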


I’m not certain I’ve got the wording just right yet, and I’ll continue working to make it better as I go. If you’re a user of VoiceOver, or other accessibility features, and have recommendations, please let me know either on Twitter @leemorgan or by email.

The hardest part for me with accessibility was knowing where to get started. I’ll give you a hint. Turn on VoiceOver, launch your app, and start tapping around. You’ll get an idea about what needs a better description pretty quickly. Switch over to Xcode and give them whatever labels you think make sense right then. Now go back and play with your app for a bit longer. You’ll probably notice all sorts of little things that don’t feel quite right. Maybe some information contained in two views is better understood by combining their descriptions together? Maybe a button shouldn’t be VoiceOver enabled, and instead its parent view should be? These are all things you’re likely to think about once you start using your own app with VoiceOver. What are you waiting for? Go ahead, turn it on and play, then get to work. Your users will thank you.
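That last case, where a parent view should speak instead of a child button, might look something like this (a sketch with assumed names and wording, not Alara’s actual code):

```swift
import Foundation

// Assumed spoken label for the combined element (hypothetical wording).
let forecastCardLabel = "Detailed forecast"

#if canImport(UIKit)
import UIKit

// Hide a child button from VoiceOver and let its parent card act as the
// single, tappable element instead.
func exposeParentInsteadOfButton(card: UIView, button: UIButton) {
    button.isAccessibilityElement = false // the bare button says too little
    card.isAccessibilityElement = true
    card.accessibilityLabel = forecastCardLabel
    card.accessibilityTraits = .button    // VoiceOver still announces "button"
}
#endif
```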