I’m someone who, in theory, loves the idea of online multiplayer competitive shooters, even if I suck at playing them in practice. I have a few disabilities that impact my fine motor control, as well as my reaction speeds, which means I’m never going to be amazing at games that require precision and fast responses, but that doesn’t mean I don’t enjoy the genre on paper.

However, one of the aspects of online First Person Shooters I struggle with, but don’t talk about very often, is processing and acting on directional audio.

I live with intermittent hearing loss in one of my ears. It’s not so severe that I can’t hear on that side at all, but it’s notable enough that there is an imbalance in the volume and clarity each of my ears perceives. This has been something I’ve lived with for years, a condition that flares up at times to around 80% hearing loss on one side for brief periods, and never truly goes away entirely.

This makes acting on audio cues in first person shooters tricky, even with a good pair of surround sound headphones. I can’t hear a gunshot or footsteps and instantly intuit which direction the sound came from.

This is why I was so interested when I came across Sector’s Edge, a Free to Play First Person Shooter on Steam that is experimenting with showing ray-traced audio in-game as coloured radiating light effects, to help deaf players identify sounds, their sources, their volume intensity, and the direction they’re coming from.

So this week, I loaded up Sector’s Edge to see what the mode was like in practice. The game’s far from perfect, but I am nonetheless impressed by what they’re attempting to do.

For transparency, at the time of recording this video, “Deaf Mode” in Sector’s Edge is only available in beta. The mode can be accessed by going to Steam’s beta branch menu for the game and typing “betabetabeta”, all in lowercase with no spaces between the words.

Once signed up for the beta branch, players can navigate to the audio settings and turn on “Deaf Mode”, which switches off all game audio and removes the audio customisation menu, but in its place opens a menu for customising the game’s audio visualisation settings.

Players can customise footstep colour, gunfire colour, and the colour used for “other sounds”, which covers things like explosions. They can also independently control the opacity of loud and quiet sounds, and the size of the coloured dots used for each. By tweaking the opacity and size scale options, it becomes easier to see at a glance whether a sound is loud or quiet, and as a result how nearby its source is likely to be.

As someone who struggles with directional audio cues in video games, I honestly really like what this setting is trying to do. I found it a LOT more natural to see a red light coming from my right hand side than to try to place a sound by ear. I found it a lot easier to know not only which direction to turn, but roughly how far to turn to face the source of the sound, and roughly how close the person shooting at me was likely to be. I undoubtedly found it easier to know where enemy combatants were using this mode.

Now, before we go much further into this video, I do want to take a moment to acknowledge that Sector’s Edge mutes all game audio when this mode is turned on. I want to talk about what that choice is likely trying to achieve, and how I feel about it.

By muting player audio when using deaf mode, I believe that the developers of Sector’s Edge are trying to create equity of experience for deaf players, rather than equality, an admirable aim that I understand, and want to discuss.

For those unaware, the difference between equality and equity, when we’re talking about accessibility as a concept, is the difference between one-size-fits-all support that gives the same tools to everyone, and tailored support aiming to give everyone an equal experience, levelling the playing field.

You may have seen this classic image around the internet, or some variant of it, in which equality is represented as everyone being given a box to stand on to see over a fence. That framing doesn’t take into account that some people can already see over the fence unassisted, some people will need more than one box to be included, and some people will need something other than a box to get to the height they would need to be on a level playing field. Equity, by contrast, attempts to offer support that brings those most in need in line with the experiences of others, to create equal outcomes in the end, even if that means not offering everyone the same one-size-fits-all tools whether they need those tools or not.

Circling back to Sector’s Edge, the intention of Deaf Mode here, in the context of a competitive first person shooter, seems to be to ensure that deaf players are able to get support that brings their competitive experience in line with those of players with perfect hearing, without in the process giving hearing players extra tools and recreating that imbalance. Basically, if hearing players can hear the audio, and deaf players can see it, they should be on a fairly even playing field when playing competitively.

The fear, I suspect, is that allowing audio to remain audible while deaf mode is switched on would encourage hearing players to play with both audio and visual cues at the same time, meaning that deaf players would still be at a competitive disadvantage, having one less source of sensory feedback to act upon.

In order to create equity of experience for deaf players, the mode is only available with sound off. If hearing players want to use the visual tools for sound representation, they are welcome to, but they must do so on an even playing field with deaf players.

There is a part of me that is disappointed at the lack of ability to use both audio and visual feedback at once, due to my own personal use case. My autism and ADHD in combination cause me to struggle with sensory processing, and I tend to find media easier to follow if I have more than one sense engaged on the same information. It’s why I default to watching TV with subtitles on: I find reading and listening together easier than just one or the other, and having both audio and visual feedback for sound is honestly a dream come true for my specific situation. I would personally really benefit from being able to hear a sound, even if I can’t hear which direction it came from, then use the visual information as a supplement to know where it came from. Even mono audio, alongside this visual direction indicator, would be really helpful for me.

But, in the end, I think Sector’s Edge has made the right choice on balance by turning off audio while this mode is active. Aiming for equity in an online competitive first person shooter, or honestly in any kind of competitive video game, and offering tools aimed exclusively at improving the chances of one group of disabled players competing on an even playing field, is something we very rarely see, and something I do want to applaud. These developers made an active choice to try and make sure that deaf players got a unique option that helped them, and wasn’t provided to people who didn’t need it, and I see huge benefits to that.

And here’s the thing: as a gamer who can hear, just not very well on one side, I would happily play with this mode on in any competitive FPS that offered it. Giving up sound but gaining visual information I find easier to process and directionally understand is a wonderful option, and one I really appreciate. If turning off game audio is the price I pay to access that, it’s a price I will pay gladly.

Sector’s Edge is not a perfect game. There are very few people playing online to matchmake with (at least on the beta branch), so I was mainly playing against bots, and the game could benefit from incorporating Apex Legends style communication tools, giving deaf or nonverbal players ways to communicate with teammates other than voice chat. EA released the patents on those, by the way, so, like, maybe look into incorporating some of that stuff into your game. Still, I really respect the aim and the execution of what the developers are trying with this gameplay option.

More competitive video games should honestly consider creating accessibility modes aiming for equity of experience for disabled players. Not every tool has to be available to every player, and a trade-off of tools can help ensure that specific players get to compete on an even footing.
