We’re just putting the finishing touches to VoiceOver accessibility support for our National Rail Enquiries iPhone app. When adapting the app for VoiceOver, we found that Apple’s developer documentation for accessibility was pretty good, but there were still several questions we couldn’t answer. After some help from Apple, and some experimentation and research, we’ve managed to answer most of our queries. I thought it might be useful to share what we discovered, in case other developers have run into the same problems. Here are our questions and findings.
1. Is there a way to set the accessibility label of a UIAlertView or UIActionSheet button to be something other than the title of the button? For example, we have a button with the UK spelling of “Add to favourites”, but VoiceOver does not pronounce this correctly, so we would like to set the accessibility label to use the US spelling (“Add to favorites”) to ensure that VoiceOver pronounces it correctly. Likewise, is there a way to set the accessibility label of the UIAlertView or UIActionSheet’s title to be something other than the title?
A: Unfortunately, UIAlertView and UIActionSheet don’t expose their UIAccessibility model publicly. The only way to achieve this would be to create our own custom modal views, which we’d rather not do. The alternative is to query and modify the view hierarchy of the UIAlertView / UIActionSheet instance, but this is generally a bad idea, as the internal hierarchy might change in a future OS release.
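For illustration only, here is a sketch of the discouraged view-hierarchy approach. It relies on the alert’s subviews exposing the visible text via their default accessibility labels – an internal detail Apple may change at any time – and the strings are our hypothetical example:

```objc
// Fragile: walks UIAlertView's private view hierarchy, which may
// change in a future OS release. Not recommended.
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Confirm"
                                                message:nil
                                               delegate:self
                                      cancelButtonTitle:@"Cancel"
                                      otherButtonTitles:@"Add to favourites", nil];
for (UIView *subview in alert.subviews) {
    if ([subview.accessibilityLabel isEqualToString:@"Add to favourites"]) {
        // Override with the US spelling so VoiceOver pronounces it correctly.
        subview.accessibilityLabel = @"Add to favorites";
    }
}
[alert show];
[alert release];
```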
2. Is it possible to get VoiceOver to read out an action after an event has completed? For example, our app uses a refresh button to reload a list of live train times. Depending on the current network coverage and network speed, the reload can take several seconds to complete. We would like to notify the user via VoiceOver once the refresh has finished, to prompt them to re-read the results detail.
A: Because the iPhone OS accessibility model is driven by user actions and gestures, speech can’t be triggered programmatically. We could use System Sound Services to play a sound, but this would require detecting whether VoiceOver is turned on, which isn’t possible (see 6 below).
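If VoiceOver detection were possible, the completion sound could be played via System Sound Services along these lines – a sketch, assuming a short `refresh-done.caf` file is bundled with the app:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Play a short sound once the refresh has finished.
// Assumes refresh-done.caf is included in the app bundle.
- (void)refreshDidFinish {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"refresh-done"
                                                     ofType:@"caf"];
    if (path) {
        SystemSoundID soundID;
        NSURL *url = [NSURL fileURLWithPath:path];
        AudioServicesCreateSystemSoundID((CFURLRef)url, &soundID);
        AudioServicesPlaySystemSound(soundID);
    }
}
```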
3. Is it possible to assign VoiceOver focus to a particular accessibility element when programmatically initialising a view controller? We are finding that for most of our view controllers, the UINavigationBar’s back button receives the VoiceOver focus as soon as the controller is loaded, presumably because it is the top-left-most UIControl. It would be more useful for the user if the view controller’s title had the focus and was read out in full, and/or if we could set the focus on a particular user interface element ourselves.
A: For some scenarios, it might be possible to affect this by changing the hierarchy order of the subviews in Interface Builder. In this example, however, the UINavigationBar’s subview hierarchy isn’t something we can modify, and so we can’t change this behaviour.
4. When should an element be given a trait of UIAccessibilityTraitButton, and when should it be given a trait of UIAccessibilityTraitLink? We have many UITableViewCells with disclosure arrows, each of which takes the user to a deeper UIViewController, and the documentation is not clear as to whether these should be buttons, links, or should have some other kind of trait.
A: Apple’s own apps seem to use UIAccessibilityTraitButton for table view rows with detail disclosure arrows. UIAccessibilityTraitLink is really only for elements that behave as a URL link would – such as opening Mobile Safari, or making a phone call.
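Following that convention, a disclosure cell that pushes a deeper view controller can be given the button trait when the cell is configured – a sketch, inside `tableView:cellForRowAtIndexPath:`:

```objc
// A disclosure cell that pushes a deeper view controller reads best
// as a button, matching what Apple's own apps appear to do.
cell.accessoryType = UITableViewCellAccessoryDisclosureIndicator;
cell.accessibilityTraits |= UIAccessibilityTraitButton;

// By contrast, a cell that opens Mobile Safari or dials a phone number
// behaves like a URL link, so the link trait fits better:
// cell.accessibilityTraits |= UIAccessibilityTraitLink;
```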
5. How should UIAccessibilityTraitSummaryElement be used? Our application restores its previous state when launched, and we would like to give an indication of the restored state. However, the documentation does not explain how to use this trait to provide information to the user.
A: The UIAccessibilityTraitSummaryElement trait is a hint-like trait, rather than a control-type trait (like UIAccessibilityTraitButton). It is most appropriate when you have some changing information that the user should hear right away when the application starts. For example, the Weather app uses this trait on the small “City and current temperature” control when it starts up.
This works for Weather because this application’s view hierarchy is fairly flat. If you had a summary trait element inside a subview inside a navigation controller hierarchy (i.e. something not at the root level on startup), then VoiceOver might not recognise it at startup.
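Mirroring the Weather example, the trait could be applied to a label summarising the restored state – a sketch, where `summaryLabel` and its text are our hypothetical example:

```objc
// A label summarising restored state, read out when the app launches.
// Assumes summaryLabel sits near the root of the view hierarchy,
// since the trait may be missed deep inside a navigation stack.
summaryLabel.isAccessibilityElement = YES;
summaryLabel.accessibilityLabel = @"London to Brighton, next train 10:42";
summaryLabel.accessibilityTraits |= UIAccessibilityTraitSummaryElement;
```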
6. Is it possible to detect whether VoiceOver is activated, in order to make the app behave differently? Obviously the primary aim is to make the existing interface accessible, but there are occasions where the visual interface is not best suited to VoiceOver accessibility use, and an alternative / simpler interface would be preferable. Since VoiceOver can only be turned on or off in the Settings app, we could be confident within a given app launch session that VoiceOver was consistently either On or Off, and could respond accordingly.
A: No. (If like us you’d find this useful, do file a request using Apple’s bugreporter system.)
7. Our app uses the word Live (pronounced “liyve”, meaning “real-time”) in several places, but VoiceOver reads this as “liv”, meaning “to live”. Is there a way to provide an alternative pronunciation in these cases? Should we just use a phonetic spelling for the accessibility label?
A: Phonetic spelling seems to be the way to go for alternative pronunciation. You can use the Mac OS “say” command to try out alternate spellings.
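In practice that means keeping the correct spelling for the visible text and using a phonetic spelling only for the spoken label – a sketch, where “Lyve” is our guess at a spelling the synthesiser pronounces correctly, best verified first with `say "Lyve departures"` in Terminal:

```objc
// Visible text keeps the correct spelling; only the spoken label is phonetic.
liveLabel.text = @"Live departures";
liveLabel.accessibilityLabel = @"Lyve departures";
```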
8. Is there a way to set the accessibility label of a UIViewController’s title view to be something other than the title of the UIViewController, without having to set a custom view as its UINavigationItem?
A: Same problem as question 1 – unfortunately, the underlying accessibility for the UIViewController isn’t publicly available.
9. Sometimes, when we return a custom view from tableView:viewForHeaderInSection: for a grouped table, we are finding that the custom view’s VoiceOver highlight rectangle is drawn in the top left of the screen, regardless of the actual position of the view when drawn on screen. This means that the user cannot move their finger over the actual rendered position of the header view on screen in order to read its accessibility info. The view is read when using the single-finger-swipe-from-left-to-right command to move to the next control in the controls list (and likewise when swiping from right to left to move back), but the rectangle is still drawn at the very top left of the screen.
A: This seems to be a bug. If you see this problem too, please log a bug with Apple using their bugreporter.