Accessibility on iOS: Working with VoiceOver support

Accessibility on iOS gives visually impaired users the ability to use an awesome touch screen platform to its full extent. It’s your job as an iOS developer to ensure your app works wonderfully with Accessibility on iOS.

When considering adding features or optimizing your app for Accessibility, don’t assume we’re only talking about users who are completely blind. Even though iOS devices are all full touch screen, there are lots of visually impaired users out there, ranging from users who are blind to users with low vision or colour blindness, for example.

Apps such as Mail, numerous Twitter clients and more include the option of changing the text on screen to a larger font size. Matt Rix’s Trainyard includes an option for players who are colour blind. These are just a few examples of improving apps for Accessibility.

Also consider interaction. While scrolling normally requires just a one-finger flick, with VoiceOver enabled scrolling is done with a three-finger swipe and moves the UIScrollView (and any class that inherits from UIScrollView) a page at a time. So while pull-to-refresh is awesome, for users with VoiceOver enabled it might not be as apparent or as easy to use.
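If you do keep a pull-to-refresh, one option is to announce the result once the refresh finishes, so VoiceOver users at least hear that something happened. This is only a minimal sketch (the method name is a placeholder, not from any particular app):

```objc
// Hypothetical refresh-completion handler: announce the outcome so a
// VoiceOver user knows the pull-to-refresh actually did something.
- (void)refreshDidFinish
{
    if (UIAccessibilityIsVoiceOverRunning()) {
        UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                        @"Arrival times updated");
    }
}
```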

In this post, I will focus a bit on optimizing your app for VoiceOver. Before reading on, check out the video below, where a blind user shows off VoiceOver and demonstrates how to use it.

I also recommend enabling VoiceOver on your own device and playing around with it for a while, just to get familiar with it. It’s one thing to know about it and use it with the Accessibility Inspector turned on in the Simulator; it’s another to experience it for yourself!

While most of UIKit works fairly well out of the box with VoiceOver, there are plenty of things you can do to improve how your app works with VoiceOver. If you’re doing any custom UI (especially custom drawing), VoiceOver will most likely not work, or it may speak the internal names of your objects. So if you call a custom UIButton “createNewTaskButton,” that’s what it will say. Instead, it should probably say “Create New Task”.
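Fixing that is usually a one-liner: set the button’s accessibilityLabel to what it should say. A minimal sketch (the image name here is just a placeholder):

```objc
UIButton *createNewTaskButton = [UIButton buttonWithType:UIButtonTypeCustom];
[createNewTaskButton setImage:[UIImage imageNamed:@"create-task.png"]
                     forState:UIControlStateNormal];

// Without this, VoiceOver falls back to whatever name it can find,
// rather than something meaningful to the user.
createNewTaskButton.accessibilityLabel = @"Create New Task";
```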

Consider your own app(s) and what use visually impaired users may have for them. I made a transit app, so it was only natural to optimize it for visually impaired users and add custom, improved VoiceOver support. With VoiceOver enabled, users can tap the time until the next transit vehicle arrives and the device speaks out in plain English “Next TTC vehicle arrives in 5 minutes and 32 seconds. Tap again to hear updated time.” If I had not customized it, it would have just said “5:32.”
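A rough sketch of how a spoken string like that can be built (the method name and exact wording are illustrative, not my literal code):

```objc
// Builds a VoiceOver-friendly sentence from a countdown in seconds.
- (NSString *)spokenLabelForSecondsToArrival:(NSInteger)seconds
{
    NSInteger minutes = seconds / 60;
    NSInteger remainingSeconds = seconds % 60;
    return [NSString stringWithFormat:
            @"Next TTC vehicle arrives in %ld minutes and %ld seconds. "
            @"Tap again to hear updated time.",
            (long)minutes, (long)remainingSeconds];
}
```

The result then goes into the accessibilityLabel of the view showing the countdown, so VoiceOver speaks the full sentence instead of the raw “5:32”.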

Underneath are the subsequent vehicle arrival times. As you can see from the Simulator screenshot, without VoiceOver optimization, the device would just say “20 min bullet 25 min bullet 26 min.” With VoiceOver optimization, it says “Subsequent TTC vehicles arrive in 15 minutes (pause) 20 minutes (pause) and 25 minutes. Tap again to hear updated times.” Notice the (pause) and look at the screenshot. VoiceOver speaks pretty fast, so if you would like a bit of a pause, just insert a period, and it will pause briefly between the words. This is a great way to split up important information.

Also notice how it will say “and” before the last arrival time. Whether there are more or fewer times, it will only say it for the last item. Pay attention to detail in cases like these. If someone were to tell you the information for what you touched on the screen, how would you best understand it? There’s a huge difference between spoken language and UI. Keep this in mind!
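As a sketch of the string building involved (names and wording are illustrative, not my exact implementation), something like this joins the times with periods for pauses and adds the “and” before the last one:

```objc
// Joins spoken arrival times ("15 minutes", "20 minutes", ...) with periods,
// so VoiceOver pauses between them, and adds "and" before the last item.
- (NSString *)spokenLabelForArrivalTimes:(NSArray *)times
{
    NSMutableString *label =
        [NSMutableString stringWithString:@"Subsequent TTC vehicles arrive in "];
    [times enumerateObjectsUsingBlock:^(NSString *time, NSUInteger index, BOOL *stop) {
        if (index > 0) {
            [label appendString:@". "]; // the period is what creates the pause
            if (index == [times count] - 1) {
                [label appendString:@"and "];
            }
        }
        [label appendString:time];
    }];
    [label appendString:@". Tap again to hear updated times."];
    return label;
}
```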

For the buttons with which users can change a route, the label looks like this: “506 • Carlton.” In this case, the literal VoiceOver sentence would be “506 bullet Carlton.” So I went ahead and customized it to “506 Carlton,” so VoiceOver wouldn’t speak the “bullet” part. For the custom UIButton where the user can add or delete a route/stop selection to their Favourites, I added “This is a favourite. Double tap to remove from your favourites,” and if the selection is not a favourite, it says “Not a favourite. Double tap to add to your favourites.”
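In code, that kind of customization could look roughly like the following. It’s a sketch with placeholder names, and I’ve put the “Double tap to…” part into accessibilityHint, which VoiceOver speaks after a short pause:

```objc
// The visual label stays "506 • Carlton", but VoiceOver skips the bullet.
routeButton.accessibilityLabel = @"506 Carlton";

// Favourite toggle: tell the user the current state and what a double tap does.
if (isFavourite) {
    favouriteButton.accessibilityLabel = @"This is a favourite.";
    favouriteButton.accessibilityHint  = @"Double tap to remove from your favourites.";
} else {
    favouriteButton.accessibilityLabel = @"Not a favourite.";
    favouriteButton.accessibilityHint  = @"Double tap to add to your favourites.";
}
```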

If you’re doing any custom drawing, as I have done in the example on the right, you will need to pay extra attention to VoiceOver optimization.

For flat, drawn views, VoiceOver cannot access any of the text or other information on screen. In this example, I draw each cell, so there are no UILabels or UIImageViews for VoiceOver to recognize and speak.

I therefore set the accessibilityLabel property on each cell, which provides VoiceOver with the information about what to say. As you can see from the example, according to Accessibility Inspector, VoiceOver will say the route number and name, and because the item is the current selection, I also made sure VoiceOver lets the user know by saying “Current selection” after the route number and name.

For non-VoiceOver users, the checkmark is a great visual marker for the item being the current selection. But for VoiceOver users, that would not be the case; especially since it’s a custom drawn view, VoiceOver has no clue about the checkmark. In your Accessibility optimization, don’t only consider what VoiceOver would say, but also consider the UI and what VoiceOver can see, or should see, related to the item the user is touching. The user may not know about a certain UI object or visual marker without VoiceOver telling them.
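A minimal sketch of the kind of setup I mean for a custom-drawn cell (the method name and shape are illustrative; setting UIAccessibilityTraitSelected on top of the spoken label is an extra touch, not something from my original code):

```objc
// Custom-drawn route cell: nothing inside is a standard UIKit control,
// so the cell itself has to expose its content to VoiceOver.
- (void)configureAccessibilityForRoute:(NSString *)routeName
                    isCurrentSelection:(BOOL)isCurrentSelection
{
    self.isAccessibilityElement = YES;
    if (isCurrentSelection) {
        // The drawn checkmark is invisible to VoiceOver, so say it explicitly.
        self.accessibilityLabel =
            [routeName stringByAppendingString:@". Current selection."];
        self.accessibilityTraits = UIAccessibilityTraitSelected;
    } else {
        self.accessibilityLabel = routeName;
        self.accessibilityTraits = UIAccessibilityTraitNone;
    }
}
```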

Accessibility Inspector in the iOS Simulator is a great tool to help optimize your app for full VoiceOver support. Tapping the X button temporarily disables the Inspector and hides most of its UI. This lets you use the app as if VoiceOver were off, so you can quickly get to the content you’re currently debugging, and since VoiceOver scrolling requires three fingers, temporarily disabling the Inspector also lets you scroll normally.

While Accessibility Inspector shows what VoiceOver will say, you should again test on an actual device with VoiceOver enabled. As mentioned, it’s one thing how something looks and “sounds” when it’s written; it’s another how it sounds when VoiceOver actually speaks it.

So these were a few of the considerations I made in optimizing one of my apps for full VoiceOver support. For more on the subject, I urge you to read Apple’s Accessibility Programming Guide for iOS, which gives a more detailed overview of the possibilities as well as more in-depth examples and usage guides.

Furthermore, Matt Gemmell has written a great post on it as well with more details on Accessibility properties, among other things, and it’s totally worth checking out.

Core Location presentation at TACOW

Last night, I did a presentation on Core Location for TACOW’s (Toronto Area Cocoa and WebObjects developers group) quarterly meet-up. I volunteered during a lunch with the group’s co-founders David (@rebeld) and Karl (@kolpanic) on the last day of WWDC.

I picked Core Location since I had gained some good insights at WWDC and had also worked with the framework extensively while building my app Next TTC.

The presentation went well and was well received, and I am now posting it here for everyone else to enjoy. I have left my notes in the presentation, but I am not sure how much help they will be. I suggest you view the presentation in Play mode, so the animations help you understand the content a bit better.

Download: Core Location Keynote presentation

Enjoy!

Next TTC follow-up + TACOW Meet-up tomorrow

It’s been a long day and I think I’ll *just* make the deadline for my #idevblogaday post. This morning, the Toronto Transit Commission (TTC) released the much anticipated bus data for their Next Vehicle Arrival Service (NVAS), which previously only included streetcar arrival time predictions. Read more about the app in my previous post, Next TTC: Behind the Scenes.

In anticipation of the release of the bus data, I had done some updates to my app, including hosting some of the data on my own server for speedier transfer. However, with a jump from 300KB to nearly 4MB of data for the initial config XML file, I have some work ahead of me to improve the app and its handling of the data. Furthermore, some of the bus routes have slightly different configurations in terms of their route format (some routes have a/b/c/etc. directions/destinations for the same route), so I have to change the parsing a bit to properly display this to users on those routes. I also have to change the way I store data and access it on the fly, so there’s lots of work ahead of me this week! I need more time in the day!
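For the curious, the general direction for the config file is event-driven parsing rather than loading the whole document into memory at once. This is a rough sketch, not the actual Next TTC code; the element name, attributes and helper method are placeholders:

```objc
// Stream the ~4MB config XML instead of building the whole tree in memory.
- (void)loadConfigFromURL:(NSURL *)configURL
{
    NSXMLParser *parser = [[NSXMLParser alloc] initWithContentsOfURL:configURL];
    [parser setDelegate:self];
    [parser parse];
    [parser release]; // omit under ARC
}

// NSXMLParserDelegate: each route element is handled as it streams by,
// including the a/b/c branch variants some bus routes have.
- (void)parser:(NSXMLParser *)parser
didStartElement:(NSString *)elementName
  namespaceURI:(NSString *)namespaceURI
 qualifiedName:(NSString *)qName
    attributes:(NSDictionary *)attributeDict
{
    if ([elementName isEqualToString:@"route"]) {
        // Hypothetical helper: stash the route's tag, title and branch.
        [self beginRouteWithAttributes:attributeDict];
    }
}
```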

But, apart from the slowness of the app (until I finish the update), today has been pretty good. I was able to update the app to include the new data this morning, which meant I got some good press today.

I was mentioned in the National Post, on CP24 and CityTV, and finally interviewed for a great article in The Globe and Mail, which was posted tonight (hopefully making it to press for tomorrow’s paper!). I’m really happy about these mentions. As far as I can tell from my app’s ranking (it went from around #30 in Navigation to #2 in that category and made it into the Top 100 overall, both for the Canadian App Store of course, which is quite different from the US Store in terms of downloads), sales have been pretty good today. But most exciting of all is the awesome feedback I have been getting from users who are ecstatic about finally being able to use the app for their bus commute (and for night buses and streetcars, not previously included).

All in all, today has made me pretty excited about my app again. You always end up getting a bit tired of a project after staring at the code for a long time, or literally using every aspect of your app possible, or painstakingly tweaking every little pixel of the UI if you’re me. I have a bunch of ideas on the table that I want to get to after fixing what needs to be fixed. And I can honestly say, it’s all a bit more interesting now that there’s finally a larger user base to cater to, and more copies to be sold.

Finally, if you’re in Toronto and free tomorrow night, I am presenting at tomorrow’s TACOW (Toronto Area Cocoa and WebObjects developers group) meet-up. I’m doing a presentation on Core Location: the how-to-use-it basics plus some insights, tips and experiences from using it myself. Should be fun! The details for the meet-up are on the website, and we’ll be heading out to a pub afterwards for some drinks.