Accessibility in Firefox for Android: Introduction
Over the past summer I was an intern on the accessibility team at Mozilla. I primarily worked on making Firefox for Android (codenamed Fennec) accessible to blind users. I improved the speech output and touch interaction capabilities in Fennec, and I also added the ability to interact with Firefox using a braille device instead of touching the phone directly. In this post, I describe the basic interaction model a user follows to interact with the phone. A recorded demo/presentation on Air Mozilla is also available for your viewing pleasure.
One of the primary ways blind users interact with a touchscreen phone on Android is through a mode of interaction called Explore-by-Touch. In both the native Android UI and Firefox, a user can explore the screen with a finger, and the screen reader (the most common one is TalkBack) will read out the object underneath the finger. Another way to explore the items on screen is to swipe left and right with one finger; this gives linear access through all UI elements, which is particularly useful when there are only a small button or two on a large tablet screen. Once users find an object they wish to interact with (e.g. a link), instead of simply tapping the screen once (which would be interpreted as an exploration touch), they double-tap on the screen to activate the object. Adding a second finger or a second tap is the common way to regain the “normal” behaviour. For example, instead of long-pressing an object to open its context menu, you double-tap and hold. Similarly, you need two fingers to scroll the viewport. Most of this behaviour already worked when I arrived.
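For the curious, these gestures bottom out in Android’s standard accessibility APIs. Here is a minimal, generic Android sketch (not Fennec’s actual implementation, which exposes Gecko’s own accessibility tree) of how a view describes itself to the screen reader and receives the click that TalkBack synthesizes from a double-tap:

```java
import android.os.Bundle;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;

class CustomButtonDelegate extends View.AccessibilityDelegate {
    @Override
    public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(host, info);
        // Text read out when the user's finger lands on this view.
        info.setContentDescription("Login button");
        // Advertise that the view can be activated (via double-tap).
        info.addAction(AccessibilityNodeInfo.ACTION_CLICK);
    }

    @Override
    public boolean performAccessibilityAction(View host, int action, Bundle args) {
        if (action == AccessibilityNodeInfo.ACTION_CLICK) {
            // The screen reader turned the user's double-tap into a click.
            return host.performClick();
        }
        return super.performAccessibilityAction(host, action, args);
    }
}
```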
I spent a fair amount of time improving and extending the more interactive behaviours in Fennec, most notably text editing. When I started, you could type into an input field, but reviewing and correcting text wasn’t possible (well, you could correct text by deleting and retyping everything, but that’s a major pain when your typo is at the start of a long string of text). Users can now, by selecting an option within the screen reader, move the input cursor at different granularities (i.e. by character, word or paragraph) by swiping left and right. This makes it possible both to review the spelling of what was typed and to correct it easily. Moving by granularity is also useful outside of text editing; using the same principle, it is possible to read a paragraph character-by-character or word-by-word.
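Under the hood, this maps onto Android’s movement-granularity accessibility actions. A minimal sketch, again using the generic Android API rather than Fennec’s internals:

```java
import android.os.Bundle;
import android.view.accessibility.AccessibilityNodeInfo;

class GranularityMover {
    // Move the cursor one step at the given granularity, e.g.
    // AccessibilityNodeInfo.MOVEMENT_GRANULARITY_CHARACTER,
    // MOVEMENT_GRANULARITY_WORD or MOVEMENT_GRANULARITY_PARAGRAPH.
    static boolean moveNext(AccessibilityNodeInfo node, int granularity) {
        Bundle args = new Bundle();
        args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
                granularity);
        // A swipe right maps to "next"; a swipe left would use
        // ACTION_PREVIOUS_AT_MOVEMENT_GRANULARITY instead.
        return node.performAction(
                AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, args);
    }
}
```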
Using touch with speech output is only one way to interact with the phone and Fennec. It is also possible to control the phone using a braille device. The braille device pairs with the phone over Bluetooth; because of that, the phone doesn’t even have to be within reach to use it. At the start of the summer, you couldn’t use a braille device to interact with Fennec: there would be speech output, but the braille display would remain entirely blank. This has now been fixed, so there is braille output as you navigate around the screen, and that output is optimized for the braille display (more on that in a later post).
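As an aside, braille displays typically expose a serial channel over Bluetooth. Assuming a display that speaks the standard Serial Port Profile (SPP), which is an assumption on my part since real drivers are display-specific, connecting to an already-paired device from Android looks roughly like this:

```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.IOException;
import java.util.UUID;

class BrailleConnection {
    // Well-known UUID for the Bluetooth Serial Port Profile.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    // Requires the BLUETOOTH permission; assumes the display is already paired.
    static BluetoothSocket connect(String deviceName) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        for (BluetoothDevice device : adapter.getBondedDevices()) {
            if (device.getName().equals(deviceName)) {
                BluetoothSocket socket =
                        device.createRfcommSocketToServiceRecord(SPP_UUID);
                socket.connect();  // blocking; run off the UI thread
                return socket;
            }
        }
        throw new IOException("No paired display named " + deviceName);
    }
}
```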
The braille device has its own controls for moving around the screen. There are buttons that function essentially like swiping left and right on the touch screen. When you land on an object, the pins in the braille output cells rise to render the output. Each cell (which, for simplicity, you can think of as a letter) has an associated routing key that the user can press to activate the object the letter belongs to. For example, if the braille output is “Login button”, hitting the routing key above any of the letters in “Login button” is equivalent to actually pressing the button. Editing text also works somewhat differently from touch. As the user inputs text, the braille output updates in real time to match what has been entered. The input cursor is represented by blinking the bottom dots of the cell the cursor is adjacent to. The cursor can then be moved by pressing the routing key at the desired position.
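To make the routing-key and cursor behaviour concrete, here is a hypothetical sketch; all class and method names are invented for illustration, and this is not the actual braille driver code:

```java
// A region of braille output tied to one on-screen object.
class BrailleRegion {
    final String text;        // e.g. "Login button"
    final Runnable activate;  // action for the object this text describes

    BrailleRegion(String text, Runnable activate) {
        this.text = text;
        this.activate = activate;
    }

    // Outside an editor, a routing key press anywhere within the
    // region activates the associated object.
    void onRoutingKey(int cellIndex) {
        if (cellIndex >= 0 && cellIndex < text.length()) {
            activate.run();
        }
    }
}

class BrailleEditor {
    // Dots 7 and 8 (the bottom row on an 8-dot display), as a per-cell bitmask.
    static final int DOTS_7_8 = 0b11000000;

    // Overlay the cursor on the translated cells; the display driver
    // blinks whatever mask we OR in at the cursor's cell.
    static byte[] overlayCursor(byte[] cells, int cursorCell) {
        byte[] out = cells.clone();
        if (cursorCell >= 0 && cursorCell < out.length) {
            out[cursorCell] |= DOTS_7_8;
        }
        return out;
    }

    // Inside an editor, a routing key moves the input cursor to that
    // cell instead of activating anything.
    static int onRoutingKey(int cellIndex) {
        return cellIndex;  // new cursor position
    }
}
```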
Over the past four months, accessibility in Firefox for Android has improved significantly. It is now possible to both quickly read and interact with a web page, whether via touch with speech output or via a braille device. These improvements will be released in Firefox 24, 25 and 26.