Accessibility hacks! adventures in hands-free gaming

hey folks, so long story short: I’ve been struggling with some injury or RSI-like thing that is making it hard to use mice, keyboards, and controllers. It’s been going on for a while, and I’ve been experimenting with various things to make it possible to play games. (I’ve been working with a doctor on the root cause too.)

I recently hit upon a really useful tip and thought it would be cool to start a thread to share it, but also to solicit advice, suggestions, stories, and so on from any other folks who have figured out neat ways to play games with alternate inputs!

Okay, so first a story: most of the games I play are strategy games, which are best with a mouse & keyboard. I’ve really struggled to find a good replacement; the best was a Steam Controller, which I successfully used to play Total War 3K, but it still flared up my symptoms. Anyway, I had kinda given up but then found this neat hack: velcro a cheap gyro remote to a hat and combine it with (also cheap) voice software for clicks. The video of someone playing Into the Breach with it was truly revelatory to me, it looked so easy! So I bought this setup a couple of weeks ago and have been getting used to it. It’s hardly perfect, but given that other head mice cost an order of magnitude more, it is amazingly effective, and it means I’m no longer compromising by continuing with inputs that cause problems. So far I haven’t had any neck pain or anything, but that’s obviously something I’m trying to watch out for.

And now a question: I’ve also been playing Ring Fit Adventure, which is very doable for me (well, it wipes me out because I’m out of shape, but it doesn’t cause me pain), and I’ve been thinking about how that controller could be repurposed as a general-purpose input (think of the menu mapping with rotate left/right/up/down: you basically have a d-pad). That got me thinking about DDR pads. There are all those folks doing wild things like beating Dark Souls with a dance pad, and I wondered how hard it would be to use one as a general-purpose controller for slower-paced turn-based games like ITB, or other tile-based RPGs and so on, where a d-pad and some buttons are all you need. Has anyone messed with this at all? The little research I did wasn’t super helpful, and I was a little disheartened that the hardware seems real hit or miss.

Anyway! hope this topic is not out of place, and thanks for reading …

7 Likes

Rebinding a DDR pad shouldn’t be too hard, actually, assuming it follows the standard gamepad input protocol; I think there are also a few third-party manufacturers making pads for PC. Probably combine it with JoyToKey or UCR to remap to whatever you need.
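If you ever wanted to roll your own instead, the core of what those remappers do is tiny. Here’s the idea sketched in plain Python (the button indices here are made up for illustration; real pads vary, which is part of the hit-or-miss hardware situation):

```python
# Hypothetical sketch of the remapping logic JoyToKey/UCR handle for you:
# translate pressed pad-button indices into key names. Indices are
# illustrative; a real pad reports whatever its firmware decides.

PAD_TO_KEY = {
    0: "up",      # top arrow panel
    1: "down",    # bottom arrow panel
    2: "left",
    3: "right",
    4: "enter",   # start button
    5: "escape",  # select/back button
}

def keys_for_buttons(pressed):
    """Translate a set of pressed pad-button indices into key names,
    ignoring any buttons we haven't mapped."""
    return [PAD_TO_KEY[b] for b in sorted(pressed) if b in PAD_TO_KEY]
```

The real tools also handle reading the device and synthesizing the key events, but the mapping itself really is just a lookup table like this.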

Logitech sells a lot of stuff I feel could be repurposed for accessibility, including an Adaptive Gaming Kit. I believe (though I have not confirmed this) that almost all of their devices play nice with JoyToKey and UCR.

Razer sells an interesting mini keyboard with an 8-way directional d-pad attached to it, which could be good if it’s just one hand/arm that you can’t use.

Kensington makes a lot of different trackball devices including this rather large one

There are apparently now wireless handheld trackball devices that might work for you.

Look up “USB foot pedal” on Amazon; there’s a decent selection of these types of devices now.

People have built a few homebrew PC programs for use with the Xbox Kinect to control mouse movement.

The DIY electronics market has exploded over the last few years, so being able to make your own custom controller setup is not too far outside the realm of possibility. Look into arcade parts; a lot of that works well with an Arduino, which your PC can then recognize as a normal HID device.

yeah I assumed it was straightforward, I was more curious whether anyone has experience doing this or has recommendations wrt hardware…

this depends on the Xbox Adaptive Controller right? or are the bits from Logitech standalone USB devices? I had looked at the XAC a while ago but dismissed it since it can’t really replace a mouse afaik, but I should look into it again to combine with my new setup

I actually have a foot pedal and forgot to mention it in my post. It’s a little awkward, but I have been trying to figure out how to fit it into my setup.

That’s a good tip as I had forgotten about the Kinect, will look into it

Looking more into it, yes, that would appear to be the case. However, the Xbox Adaptive Controller is an Xbox controller, so it should register as a normal Xbox controller on your PC; it just has different input methods.

Quite honestly, I’m a bit surprised that there isn’t a company out there trying to be the Mad Catz of accessibility gaming. Like I said, we are kind of in a golden age of small electronics, and there’s a lot of potential for building out cool accessibility devices using off-the-shelf electronics parts.

For example, capacitive touch sensors could be used to let someone make their own custom buttons.

Coming back to this thread after a few months of experimentation and research, I think you are totally right about this, but there is a kind of William Gibson “the future is already here, it’s just not evenly distributed” element to it. I’ll elaborate, but first I have to explain why watching Crusader Kings 3 during Save Point convinced me to buy a DDR dance pad…

CK3 should be pretty playable with a head mouse, but one of the things I’ve found is that RTS-style camera movement with WASD really doesn’t translate well to voice control. Having to say “pan left”, “pan up”, etc. really slows the game down, so I’ve been sticking to games with a single screen, like Into the Breach. There was a brief discussion about disabling edge panning during the CK3 segment, and I got it into my head that if I could just add a d-pad to my setup, it would make playing these games a lot smoother.

This was also partly inspired by a recent PAX Online video from a Twitch/YouTube fellow named Super Louis 64, someone whose stuff I was vaguely familiar with but hadn’t paid super close attention to. I think I had written DDR pads off because of videos exactly like Super Louis 64’s, where the emphasis is on playing notoriously hard games like Dark Souls. But his intro video had such a disarming and fun attitude to it that I started thinking twice.

I also got an urge to play Pyre after watching Save Point, so I decided to try it with the mouse controls and my foot pedals, and to my surprise it worked way better than I expected. But my foot pedals only have 3 switches (Pyre really needs 4 buttons at minimum, and is better with 5), and they’re not really designed for gaming, so they don’t feel like they should be stomped on in a real-time game.

That was enough to convince me to finally buy a USB dance pad and experiment with combining it with my head mouse. I’m already pretty comfortable with the head mouse, but I figured the dance pad would add a big level of challenge, so I turned on aim assist in Pyre. It works so well, though, that aim assist made the game too easy, and I’ve since turned it off :joy:

The new twist this added was that I really need to raise my display closer to standing-desk height to be comfortable playing while standing. I ended up using an old 720p projector I have, which is not great (it’s not doing Pyre’s gorgeous art any favors), but it’s fine. However, this doesn’t really work for CK3 because some of the text gets hard to read at that resolution, so now I’m in the market for a standing desk or a better projector setup…

There are a bunch of other things I want to experiment with, specifically Super Louis 64’s Ring Fit mods, which make it possible to use the Ring Fit controller in other Switch games, and also on PC I think? The other thing I’ve been meaning to check out is an open-source machine learning project that lets you train a custom voice recognition model using sounds instead of words.

So anyway, this is what I mean when I invoke that William Gibson quote. Games like The Last of Us 2 get a lot of press and praise for their accessibility efforts, and the results are certainly laudable. But my experience has been that the biggest benefit for my specific issues has come from exploring the work that hobbyists are doing at the fringes.

3 Likes

I’ve been continuing my quixotic search for the perfect hands-free WASD over the past month and started looking into this. Although I’m still playing Pyre with the DDR pad, that setup (barring an expensive display upgrade) really doesn’t work well for the kind of text-heavy strategy games that typically use RTS-style camera controls.

I was talking to a friend about this and they were confused about why I’ve been so focused on edge panning and camera movement, and I realized it might not be obvious, so I’ll try to explain…

why edge panning is annoying with a head mouse, and getting a global mouse fence working

Basically it’s the same reason I think some people turn it off even when they’re using a mouse with their hand, just exacerbated by the fuzziness of the head mouse. The biggest problem I find is that since I’m constantly moving my head a little bit, the mouse cursor will drift and eventually get off-center. The easiest way to fix this is to turn my head to the side, or to put the cursor in a corner and sort of hold it in place while I re-align my head position. But if moving the mouse to the edge of the screen moves the camera, I end up in an annoying loop of having to keep re-adjusting. It’s just so much more natural to disable edge panning where possible and use another input to control the camera.

A lot of games straight-up won’t let you disable this though (Othercide and Star Renegades being the two most recent examples I’ve run across, and XCOM 2 being another, I think), and although there are a couple of freeware programs that will do it for you, I’ve had trouble finding one that didn’t also come with weird performance problems. I recently figured out that there’s a really simple incantation you can give to AutoHotkey that will fence the cursor just inside the screen edges without a performance hit, so now I’ve got that bound to a voice command I can fire off whenever I need to.

Anyway, to cut to the chase, I started looking into both those Adafruit capacitive sensors and a little board called Makey Makey that turns closed circuits into USB input, without code or any soldering. (Another find from Super Louis 64.)

Hardware and electronics hacking is absolutely not my forte, even though my day job is computer programming, and I didn’t want this project to turn into something that felt like work. I just wanted to quickly prototype ideas to see if they would even solve my problem, so I went with the Makey Makey.

What I did was create panels made of aluminum foil and tape them to the underside of my desk drawer, then wire them to the Makey Makey’s inputs for W, A, S, and D (I later added F and G as extra bonus keys).

The drawer then goes back in the desk, with the wires running up and back inside the drawer, where the Makey Makey sits. The USB cable comes out the front of the drawer and plugs into my PC, along with an anti-static strap that I’m using as the ground connection. To activate one of the panels, I lift a foot to touch the aluminum foil and close the circuit, triggering the associated key.

I’m pretty happy with how it turned out. I’ve been playing Othercide using it and it works great, even though that game has kind of janky UX (there are weird extra button presses needed for things? maybe a cross-platform console/PC thing) and you don’t really need to pan around that much.

A better test was this weekend, when I tried the re-release of Age of Empires 1 on Game Pass, looking for some nostalgia kicks. Unfortunately, the first game does not support issuing orders while paused, but AoE2 does. I haven’t played that game since high school, but it was surprisingly playable with my setup, and pausing to micro gave the game a whole new feel (I have never been strong at RTS micro).

I haven’t jumped into CK3 yet but now I really have no excuses…

4 Likes

Another month has passed, so I’m back with another post about some weird setup. I appreciate everyone who keeps reading these; it’s not exactly what I had imagined for this thread, but if nothing else it can serve as notes for myself…

Sadly, both of my previous posts have not exactly stood the test of time. I find the DDR pad combined with my shitty 720p projector to be suboptimal, plus I sprained my ankle on a long walk, so I didn’t want to stand up to play games for a week. And my under-desk d-pad is a little bit flaky: most of the panels work fine, but one of them keeps needing to be adjusted to maintain conductivity. I’ll probably need to look into soldering, or maybe an alternative material. In the meantime…

One of the challenges I’ve faced is that using my head mouse and voice commands limits the games I can play. With VoiceAttack the best I can manage is about 1200ms response time, sometimes as bad as 1500ms. That’s fine for turn-based games, but it’s really too slow for most non-strategy real-time games. For a while I was sure this was just a question of voice recognition software performance, but I think that’s actually only half the story. At the end of the day, it just takes a non-trivial amount of time to say a word, and I think that inherently causes a delay.

I came to this conclusion after experimenting with Parrot.py, an open-source program I discovered earlier in the year but didn’t try until recently. It lets you record a series of sounds, train a noise recognition model, and finally wire those sounds up to key presses or mouse actions. The reason it took me so long to try it was that the readme says you need thousands of recordings to train the model. I assumed that would be really time-consuming, but it turns out each recording is only 30ms, and there’s an interactive tutorial, so the whole thing only takes a few minutes.
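The loop is essentially: record labelled ~30ms windows, turn each into a feature vector, fit a classifier, then classify live audio. Parrot uses proper spectral features and (by default) a random forest; here’s a deliberately toy stand-in in pure Python, just to show the shape of the pipeline. Everything here (features, labels, nearest-centroid “training”) is illustrative, not Parrot’s actual code:

```python
import math

def featurize(samples):
    """Reduce one short audio window to two crude features: RMS loudness
    and zero-crossing rate (how 'buzzy' the sound is). Parrot.py uses
    much richer spectral features than this."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / len(samples)
    return (rms, zcr)

def train(labelled_windows):
    """Average the features per label: a toy stand-in for fitting
    Parrot's random forest on thousands of recorded windows."""
    sums, counts = {}, {}
    for label, samples in labelled_windows:
        rms, zcr = featurize(samples)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += rms
        s[1] += zcr
        counts[label] = counts.get(label, 0) + 1
    return {lb: (s[0] / counts[lb], s[1] / counts[lb]) for lb, s in sums.items()}

def classify(model, samples):
    """Return the label whose average features are nearest this window's."""
    rms, zcr = featurize(samples)
    return min(model, key=lambda lb: (model[lb][0] - rms) ** 2
                                     + (model[lb][1] - zcr) ** 2)
```

A sharp hiss has a very different loudness/noisiness signature than a lip pop, which is why even simple models can tell a handful of sounds apart reliably.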

The results are really impressive. I’d already seen videos of the developer playing StarCraft 2, but it wasn’t until I got a simple model of my own working, using a “hiss” sound to activate the left mouse button and a “pop” for right click, that it really sunk in. The response time I could achieve was closer to 400ms, which is faster than with a foot pedal. After using voice software for mouse clicking for 10 months it just felt so fast!! (And I’m sure I don’t need to tell listeners of WPR that the difference between winning and losing is measured in milliseconds :wink: ) The hardest part was getting a stable Python environment on Windows, and I ran into some issues with something automatically adjusting my microphone levels, causing Parrot to pick up way too much noise.

But last week I put together everything I needed to play Pyre with it. (I keep coming back to this game because it’s one of my favorites, and I never finished the main story, so I genuinely want to keep playing it; plus it’s kind of the perfect sandbox because it’s got such simple inputs.) This video I put together for Reddit goes into some more detail on the specific sounds and key bindings, but here’s the code for the custom module, for anyone curious. It’s basically just a 5-way if statement for the 5 inputs.
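To give a flavor of what that looks like, here’s a simplified sketch (not the linked module: the real Parrot module API differs, the sound names and bindings here are made up, and `press` is a stand-in for whatever actually delivers the input to the game):

```python
def press(key):
    """Stand-in for the code that actually sends the input to the game."""
    print(f"press {key}")

def handle_sound(label):
    """The '5-way if statement': map one recognized sound to one input.
    Returns the key it pressed, or None for unrecognized sounds.
    All sound names and bindings here are illustrative."""
    if label == "hiss":
        key = "left_click"
    elif label == "pop":
        key = "right_click"
    elif label == "cluck":
        key = "space"
    elif label == "whistle":
        key = "escape"
    elif label == "hum":
        key = "tab"
    else:
        return None
    press(key)
    return key
```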

I’ve been in touch with the Parrot dev a little bit when he streams Hollow Knight on Twitch, and it sounds like he doesn’t have that many users yet. I’m really hoping I can figure out how to help evangelize it somehow.

I think the other reason I keep playing Pyre is all the hype around Hades this year. I had really written it off as a game that I was just not going to play for a long time or maybe ever, and that made me sad as a fan of Supergiant. But now that suddenly feels like it’s a real possibility, so I actually grabbed it while it’s on sale on Steam…

5 Likes

These posts are always great! Using simple audio as a form of input is such an ingenious way to play games that I hadn’t thought about before.

Yeah, it’s very clever. I don’t know enough about machine learning to understand the details, but I’m not even using the configuration that trains a neural network on the GPU; it’s just in a mode using a random forest algorithm of some kind. Some of the sounds I tried overlapped too much for this mode to distinguish, so I’m very curious to see what the full neural network mode can do.

Honestly, using short sounds as activation cues is so smart. Kudos to whoever came up with that. I went to a seminar almost 20 years ago now where they were using portable fMRI rigs and meditation techniques to produce a “mental grammar” (of up to 20 symbols) that could control devices. But honestly, for a lot of people, I bet that it would be a lot easier and less stressful to just learn 20+ sounds.

You might want to see if you can use WSL2 to do your Python-ing if you are having trouble with Python stability on Windows. It runs a real Linux environment (and your Python) in a Windows-managed hypervisor, and I use it for a ton of my development work at home.

Yeah, that’s a good idea. I’ve actually recently been using WSL2 to build a TypeScript/JavaScript project (plugins for LipSurf), and I’ve used Docker to run one-off Python scripts on Windows before. But in this case, because Parrot depends on some libraries that interface with the Windows API, and I’ve had trouble in the past with voice commands submitting input through DirectX, I was worried something like that might be more trouble than it’s worth. This was the specific issue I ran into: some generated code from the win32com module mysteriously went bad and had to be rebuilt.

Now that I’ve got it working I might look into those options.

Ahh, yeah. I think if you are depending directly on win32 calls, WSL2 is probably a no-go (since it is a real linux environment). You’d probably need to reconfigure parrot to be a client/server based application (and do the win32 stuff in the client). At which point, you still need a stable python-on-windows solution, so it’s probably not worth the hassle.

As far as python-win32 interface issues go, needing to delete pyc files from time to time isn’t too bad.

Quick(-ish) Hades-hands-free update. First of all, I can now pet the dog. Parrot.py has a clever feature where the same sound can control multiple keys, depending on screen regions you define. This is how I’ve implemented movement controls, since Hades doesn’t have click-to-move like Pyre, but I’m also now using it for non-combat stuff like talking, gifting, and petting Cerberus. I suspect “summon” does more than just pet, but for now this works.
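The idea is easiest to see in code. This is a sketch of region-based dispatch (not Parrot’s actual API; the region boxes and keys are made up): the same sound resolves to a different key depending on where the cursor currently is.

```python
# Each region is a (left, top, right, bottom) box in pixels, paired with the
# key that a given sound should trigger while the cursor is inside it.
# Thirds of a hypothetical 1280x720 screen, mapped to movement keys:
REGIONS = [
    ((0,    0,  426, 720), "a"),  # left third  -> move left
    ((426,  0,  854, 720), "w"),  # middle third -> move up
    ((854,  0, 1280, 720), "d"),  # right third -> move right
]

def key_for_cursor(x, y, default=None):
    """Resolve which key a sound should press, given the cursor position."""
    for (left, top, right, bottom), key in REGIONS:
        if left <= x < right and top <= y < bottom:
            return key
    return default
```

One sound plus a handful of regions effectively buys you a whole virtual d-pad, which is why this feature is such a big deal for a game without click-to-move.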

Second, over the weekend I switched from Parrot’s random forest algorithm to a neural net mode. This isn’t even the most advanced mode (that’s a PyTorch GPU-based neural net which I haven’t gotten set up yet), but it feels like a big improvement. I did have to redo all my fine-tuning, but it feels like I’m getting a cleaner distinction between sounds.

I started to make a video showcasing the fine-tuning, but honestly it’s not that interesting; it’s just a lot of trial and error. If anyone is curious, here’s the code: Parrot has a lot of options for narrowing activation based on loudness, frequency, etc.
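As an illustration of what that narrowing amounts to (the parameter names and threshold values below are invented for the example, not Parrot’s actual config), it’s basically a gate in front of each binding:

```python
def accept_detection(probability, power, frequency,
                     min_probability=0.8, min_power=20000,
                     freq_range=(200, 8000)):
    """Only act on a detection when the model is confident enough, the
    sound was loud enough, and its dominant frequency sits in the band
    where this particular sound is expected. All thresholds here are
    illustrative; tuning them per-sound is the trial-and-error part."""
    return (probability >= min_probability
            and power >= min_power
            and freq_range[0] <= frequency <= freq_range[1])
```

Tightening these per sound is how you stop a cough or a chair squeak from firing an ability mid-fight.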

This game definitely pushes my ability to use the system to its limits. I’ve put a decent chunk of time into it and feel like I’m only just making some progress in terms of being able to get to the bosses (I’m trying to give it a decent shot on “normal” without “god mode” yet). Despite that I am really enjoying the game itself, and it’s not something I thought I would be able to play, so basically a success.

4 Likes

Hello👋 it’s me, your friendly neighborhood RSI haver… I decided to rename this thread to better reflect its contents but also have another update.

Surprising no one, the honeymoon between me, Hades, and Parrot.py is over. There’s nothing specifically wrong with Parrot or with Hades,* but it turns out that making noises at your computer for hours, after spending all day dictating code at your computer, is bad for your vocal cords. I basically had to take a break from Hades and start being more conscious about how much time I’m spending talking, because I was straining my vocal cords. On top of that, I was finding that using my gyroscopic remote (attached to my gaming headset) as my primary pointing device for work, web browsing, and games was starting to take a toll on my neck.

I spent a few hours playing Per Aspera last weekend, thinking it would be a nice slow-paced game as a break from Hades, but instead I started developing a crick in my neck. After that, I decided to splurge a bit and bought myself a Tobii eye tracker for Christmas.

The software I’ve been using to write code at work for the past couple of months is called Talon. It’s free as in beer, and it’s by far the most flexible voice control software I’ve used. Although the raw accuracy is not quite as good as something like Dragon or Google’s voice dictation, the way it parses short commands and chains them together, as well as its extreme flexibility and customizability, makes for a very powerful tool. Here’s a talk from last year’s Strange Loop that has some good demos of using it for programming.

The developer has also written drivers and control software that allow you to turn Tobii gaming eye trackers into general-purpose pointing devices. This is something you can apparently do with the older model of their gaming tracker (the 4C) using Windows’ built-in eye control, but that device is no longer being manufactured, and the newest model (the 5) doesn’t support it yet for whatever reason.

Anyway, I had shied away from eye tracking so far because my gyroscopic remote was working fine, but now I realize that being able to sit in a slightly more relaxed posture and not move my head so much is a big improvement in terms of ergonomics. In addition, although the eye tracking is pretty fuzzy at the moment, the Talon developer has a clever mode that lets you zoom in on a region of the screen for more detailed and accurate clicking. Talon also supports noise commands (just two right now, pop and hiss) for lower-latency clicking.

I’m probably not going to end up using this for tons of gaming; instead it will replace my gyroscopic remote for things like work and general-purpose computing. However, it is possible and convenient to use it for turn-based and slower-paced games (although the Parrot.py dev uses eye tracking for pointing, I don’t think he’s using Talon for it). The first video is using the “zoom mouse” mode, which is the most convenient, but you kind of need the “control mouse” mode to hover and get tooltips. The second video is entirely in the direct control mode. The jitter is apparently going to improve as the dev works to support the latest Tobii, but it looks worse than it feels to play.

* well actually no, that’s not true, I have some very specific complaints about Hades but I’ll save them for the game of the year thread

3 Likes

Curious to know what this experience is like. Is there a huge slowdown in workflow, or is it actually better in your experience, since you probably need to think more before you write the code? I’m definitely someone who is constantly changing and rerunning their code, so I’m not sure how I would adapt to this.

1 Like

My experience has been that it’s neither a huge slowdown nor better; it’s just different. I’m probably a bit slower in terms of pure WPM vs. typing by hand, but it’s possible to get quite fast with Talon because of the chaining I mentioned: you can say multiple commands in a single utterance that Talon parses and interprets separately, or pipeline commands with very short pauses that Talon then executes together. It helps that I’m using IntelliJ and so can leverage a lot of built-in templates and shortcuts, but folks in the Talon community do similar stuff with dynamic languages in Emacs, so a lot is possible. Having some kind of pointing device helps a lot in terms of fast editing, navigating files, etc., IMHO.

In terms of day-to-day work, I’m writing basic business CRUD backend services and not churning out lots of novel code, so I spend a lot of time reading documentation for whatever framework or database we’re using, and my actual productivity is about the same. YMMV if you’re doing real computer science, although I’m aware of a few academics using Talon successfully.

(hit send too soon…)
If you have an iterative development style, I would imagine the bigger hurdle would be the kind of live coding/REPL/notebook environment that’s popular with some languages, but I see no reason something like that would be hard with Talon, given the right scripts. Talon itself ships with almost no commands out of the box; almost everything is user scripts. For instance, here are the Git commands in the main community scripts repo: https://github.com/knausj85/knausj_talon/blob/master/misc/git.talon
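To give a taste of what those scripts look like (a simplified illustration in the style of that file, not a verbatim excerpt), a `.talon` file is basically spoken phrases mapped to actions:

```
git status: insert("git status")
git add all: insert("git add --all")
git push: insert("git push")
```

Saying “git status” types the command for you, and because they’re plain text files you can add or reword commands to match however you naturally speak.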

1 Like