May 18th is Global Accessibility Awareness Day (#GAAD), so as part of that I’d like to shed a little light on some recent advances in the field of accessibility for blind gamers.
Accessibility for blind gamers has been a difficult area for the games industry. An under-served audience who want content, developers who want to offer it, yet often an unnecessary technical barrier between them. That is finally starting to change.
Blind gamers are a great audience to develop for: eager, underserved, hungry for content, and loyal fans and advocates for the people who provide it. A fantastic group to work with, always keen to help. It’s an audience that is often written off as too small, with accessibility dismissed as an act of altruism, but that’s not necessarily the case.
There are a couple of mobile games now that have tracked data on their blind players, with some interesting results. MUDRammer, an iOS MUD client: blind accessibility took about a day to implement, and afterwards the developer found that 14% of his players were blind. The app sells for $5 a time, so he made an instant profit. And Solara, a visual strategy game, so it took a bit more effort to make accessible – about one week of dev time. They tracked the data, and it came out at around 1% of their players, which is about the proportion of the population who are blind. That 1% however played for much longer than any other players, and spent much more on IAPs as a result – they were Solara’s whales.
Just anecdotally I’ve even known a blind gamer to spend over $1000 on IAPs for one single game.
These things all come down to the same phenomenon; catering for an underserved audience. There’s high demand, and low competition. When a game surfaces that is blind-accessible, word gets around the blind gamer community very quickly.
So there can be a real business case. But you do have to be realistic: the audience size is low and the functionality is niche, without benefit to other players. So for it to be financially viable you really need to:
1. Create a mainstream game that happens to also be accessible, rather than building niche games only for blind gamers.
2. Keep your costs low.
Unfortunately, the second point has been a real blocker. There are many games that are perfectly mechanically suited to blind accessibility, with gamers requesting it and the developers wanting to implement it, but the technology has made it unfeasible, too difficult and expensive.
Screenreaders
Blind accessible games fall into a few camps:
1. Games that have accidentally accessible gameplay, due to good sound design and sometimes assists (e.g. Killer Instinct)
Blind gamer Carlos Vasquez (OBSKHRattlehead) competing at EVO, playing Mortal Kombat X through good sound design alone
2. Audiogames, designed from the ground-up to be played primarily through audio (e.g. Papa Sangre)
3. Games that have gameplay that is compatible with external screen-reader software (e.g. King of Dragon Pass)
It’s predominantly the third item that I’m going to discuss a little in this post. However, screenreaders are also relevant to the other two. Mainstream games with audio-accessible gameplay, such as Killer Instinct, would still benefit from screenreader accessibility as a cost-effective way to make their UIs blind accessible, and the UIs of dedicated audiogames are often based on screenreader compatibility too.
Screenreaders are the primary way that people with low or no vision interact with technology. They literally read out what’s on the screen, via configurable synthesised speech. All modern desktop/mobile operating systems have screenreader software built in.
Cursor based navigation doesn’t really work if you’re blind, so on PC navigation instead works by moving through each item on the screen in a linear fashion. On mobile it’s a bit different: you still have the choice of cycling through each element in turn, but the screenreader also changes the way gestures work, so that you can drag your finger around the surface of the screen and it will speak out whatever your finger is over. Rather than a linearised equivalent, you can use your finger like an eye and build up a mental picture of the layout of the screen.
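That “explore by touch” model can be sketched as a simple hit test: find the labelled element under the finger and speak its label when it changes. This is purely illustrative – the `Element` class and `speak()` stub are my own stand-ins, not any platform’s screenreader API.

```python
# Minimal sketch of "explore by touch": speak whatever labelled element
# is under the user's finger. All names here are illustrative stand-ins.
from dataclasses import dataclass

@dataclass
class Element:
    label: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def speak(text):
    print(f"[TTS] {text}")  # stand-in for the platform's speech synthesis

def on_touch_move(elements, px, py, _last=[None]):
    """Hit-test the touch point and announce the element it lands on."""
    for el in elements:
        if el.contains(px, py):
            if el.label != _last[0]:  # only announce when focus changes
                _last[0] = el.label
                speak(el.label)
            return el.label
    return None

ui = [Element("Play", 0, 0, 100, 50), Element("Settings", 0, 60, 100, 50)]
on_touch_move(ui, 10, 10)  # finger over "Play"
on_touch_move(ui, 10, 70)  # finger over "Settings"
```

The same element list can also be walked linearly (next/previous), which is the PC-style navigation described above.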
If you’re developing a native app it is easy to make it compatible with screenreaders. The biggest piece of work is just making sure things are labelled correctly, and the software takes care of the rest. As using decent labels should just be standard practice anyway, it can even be so straightforward that it is possible to do unintentionally.
Blind gamer Debbie Fisher with Hanging with Friends, a game that she was able to play using a screenreader without developers having put in any blind-specific dev time
However, this relies on the screenreader software being able to see the native UI elements present, and game engines do not output any native UI elements. As far as the OS is concerned, a game rendered by an engine is pretty much a single UI component containing a bunch of pixels.
This is where the cost barrier comes in. It means the only options for a developer using an engine are to recreate screenreader-type functionality within the game, to record voiceover for all interface elements, or to dump their engine and start again natively. In some rare instances one of these three options is chosen, but generally none of them is either realistic or financially viable for most developers.
This causes all kinds of problems, including difficulties with dev/gamer relationships. I’m not going to go into detail on that here, but it is something that I’ve written about before.
Unity accessibility plugin
That has now changed. Michelle Martin has just released a plugin that works around it by essentially replicating the entirety of the iOS and Android screenreaders inside Unity (quite a feat!). Once you have it installed, it’s almost as easy to make your UIs blind accessible as it is in native app development. It’s just about making sure UI elements have the right attributes (sensible labels, indication of state where necessary, and an optional usage hint if needed), and that the right text strings are sent when the player needs to be notified of something happening (e.g. “you have a new chat message in your inbox”, “building complete”, etc.). All of the gestures, focus handling and so on are taken care of for you. And best of all, it’s a standardised set of information that works across different platforms.
Adding accessibility information to game UI using the UAP plugin
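The kind of per-element information described above – label, state, optional hint, plus event notifications – can be modelled very simply. To be clear, this is a hypothetical sketch of the concept, not the plugin’s actual API; `AccessibleElement` and `notify()` are names I’ve made up for illustration.

```python
# Hypothetical model of the accessibility information a UI element carries:
# a label, an optional state, and an optional usage hint. Not real plugin API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AccessibleElement:
    label: str                   # e.g. "Music"
    role: str = "button"         # e.g. "button", "checkbox", "slider"
    state: Optional[str] = None  # e.g. "checked", "80 percent"
    hint: Optional[str] = None   # e.g. "double tap to toggle"

    def spoken_description(self):
        """Compose the string a screenreader would read out on focus."""
        parts = [self.label, self.role]
        if self.state:
            parts.append(self.state)
        if self.hint:
            parts.append(self.hint)
        return ", ".join(parts)

def notify(message):
    """Send an event string to be read out, e.g. when something happens in-game."""
    print(f"[TTS] {message}")

music = AccessibleElement("Music", "checkbox", "checked", "double tap to toggle")
print(music.spoken_description())
notify("Building complete")
```

The point is that the developer supplies only this kind of metadata; the gesture handling and focus movement sit in the plugin.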
There are plenty of game mechanics that just aren’t suitable for blind accessibility, but if you’re building an interface-based game without need for precise timing (as many mobile games are, from Football Manager to Words With Friends, Reigns to Hearthstone), it’s now drastically easier to tap into that underserved and very loyal audience.
More info:
https://m.youtube.com/watch?v=SJuQWf7p9T4
http://www.metalpopgames.com/assetstore/accessibility/doc/index.html
It would be better for the engines to handle this kind of thing by default at engine level rather than going through a plugin. But Michelle’s work plugs that gap, and gives devs a way in that wasn’t previously there. It is currently mobile only, with some limited functionality working on PC, but she’s looking at getting it working across as many platforms as possible.
Tolk
For PC games there’s a library called Tolk, by Davy Kager. This is a simpler push-based system than the Unity Accessibility Plugin: it essentially just waits for text strings to be sent to it, then pushes them on out to the various different PC screenreaders available.
https://davykager.com/projects/tolk/
Even though it is push based – meaning you have to handle focus management yourself, and implement your own system for relaying UI labels to the library on focus – it’s still pretty straightforward. Here’s an example of a game that uses it, Skullgirls, which took a total of five hours to implement:
https://mobile.twitter.com/MikeZSez/status/857516611925704704
Every area of the Skullgirls UI is screenreader accessible on PC, using Tolk
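That game-level focus management can be as small as a menu index that pushes the newly focused label out on every move. A minimal sketch, assuming a `tolk_output()` stand-in for the library’s output call (Tolk itself is a native library; the class and function names here are mine, not Tolk’s API):

```python
# Push-based pattern: the game tracks focus itself and pushes a string to
# the screenreader library whenever focus changes. tolk_output() is a
# stand-in for the real native call.
def tolk_output(text):
    print(f"[screenreader] {text}")  # stand-in for Tolk's speech/braille output

class MenuFocus:
    """Game-level focus management, relaying each focused label on change."""

    def __init__(self, items):
        self.items = items
        self.index = 0
        tolk_output(self.items[self.index])  # announce initial focus

    def move(self, delta):
        self.index = (self.index + delta) % len(self.items)
        tolk_output(self.items[self.index])
        return self.items[self.index]

menu = MenuFocus(["Versus", "Training", "Options"])
menu.move(+1)  # announces "Training"
menu.move(-1)  # announces "Versus"
```

Wiring something like this into an existing menu system is why an implementation like Skullgirls’ can be measured in hours rather than weeks.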
Xbox
Hot off the press from this year’s GDC are two new APIs from Xbox, that work across both Xbox and PC. One for text to speech for UI, and another for text <-> speech for in-game communication.
So firstly, the text to speech for UI. The Xbox has had a text to speech system for some time now for navigating system menus and some third-party apps, but it has now been made available for game developers to use. It is a very similar system to Tolk – just send it a text string and it will be read out. So it isn’t a full screenreader; you still need to have focus management and a way to send the messages implemented at game level (or better still – Unreal/Unity etc., if you’re listening – engine level).
Xbox Narrator
As with Tolk it doesn’t have an engine-side set of attributes for states etc., so you need to handle that kind of thing yourself – for example stating that a button is a button, and whether a checkbox is ticked.
Then the second API, the text <-> speech for online communication. This is a great piece of functionality for people with all kinds of impairments – not just permanent ones, but temporary and situational ones too. It quite simply transcribes in near-realtime all voice messages to text, and all text messages to voice. It isn’t perfect, but it is a great start, and can make a huge difference to the ability to participate of anyone who has difficulty seeing or reading text, difficulty hearing, or difficulty speaking.
Live text <-> speech transcription in action in Halo Wars 2
More information on both of the Xbox APIs, together with a range of other Xbox functionality for both players and developers:
https://channel9.msdn.com/Events/GDC/GDC-2017/GDC2017-009
Contact the Xbox accessibility team if you’re interested, they are keen to help devs with implementation.
Testing
Even if your game is mechanically suited to blind accessibility and has a solution like one of the above implemented for an accessible UI – either on its own or in conjunction with gameplay made accessible through sound design – that’s no guarantee that the game is blind accessible. An interesting example is an audiogame called Ear Monsters. The dev thought it would be sensible to auto-rotate the screen in case blind gamers couldn’t see that they were holding it the wrong way up, but then got a load of feedback from blind gamers who couldn’t play it.
He had been designing to a picture in his head of how people in general hold devices, not thinking that of course people who have no vision have no reason to hold a device up to their face or even in front of them. The players in question had been playing with it on their lap, with the far end sloping slightly downwards away from them, so the auto rotate was actually causing the game to be upside down.
As previously mentioned, blind gamers are a fantastic bunch to work with and get feedback from. If you want to find blind gamers to test with, check the forums at audiogames.net and applevis.com – they’ll be falling over themselves to help. You can also find people through social media, for example a Twitter people search for ‘blind gamer’. For in-person testing, your best bet is local blindness organisations and groups. The national orgs can sometimes be understandably a bit protective of their members, but if you get in touch with one of the groups in your local city they’ll be very keen to help.
“The blind community is ready to work with you. We want to play with your games”
– Brandon Cole, #GAConf 2017
What does it all mean?
Blind-accessibility is an area that is really starting to gain traction now.
There isn’t a magic button to make games accessible to blind gamers, and the number of games that are mechanically suited to blind accessibility is smaller than the number of games that are not. But for the right games, the barriers are tumbling down, the doors are opening up.
This is great for blind gamers, and great for the industry in general. It’s also an opportunity like never before for developers who are willing and able to get in there early, while competition is still low.