During last week’s 2023 Google I/O event, the annual conference where the company’s developers showcase new tech developments, Google revealed that it has been working on Project Gameface, a program designed to enable mouse control using facial gestures, requiring nothing but a standard webcam to operate.
The program was announced as part of a video in which Lance Carr, a disabled gamer, explains that he used to game using an expensive head tracking mouse peripheral, which was lost when his home was destroyed in a fire. The idea behind Project Gameface was apparently to create a facial gesture mouse control program that works without that kind of high cost investment in a specialist peripheral.
The program, which recognises points on a person’s face and tracks their head movements and facial gestures, is powered by Google MediaPipe, an AI implementation toolset, and aims to be adaptable to a wide range of faces and levels of facial movement.
A build of Project Gameface is available to download, currently Version 0.3.30 at the time of writing this video script, and while it’s far from a finished program, it does already support a number of features that gamers can experiment with.
The program can be used to click the left, right, or middle mouse buttons, as well as to recentre the mouse position on screen and switch focus between monitors. You can also assign keyboard button presses to facial gestures, with the exception of the ESC key, which is used to close the program.
At present, controls can be mapped to opening your mouth, moving your mouth left or right, raising your left or right eyebrow, and lowering your left or right eyebrow.
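At its core, this kind of gesture-to-control mapping can be thought of as a thresholding step: each frame, the tracker outputs a confidence score per gesture, and any score above its user-set threshold fires the bound action. Here is a minimal sketch of that idea in Python — the gesture names are loosely modelled on MediaPipe-style blendshape labels, and the thresholds and bindings are hypothetical examples, not Gameface’s actual configuration.

```python
# Illustrative sketch only, not Gameface's real code: map per-frame
# facial-gesture confidence scores to bound actions via thresholds.
# Gesture names, thresholds, and actions here are invented examples.

# User-configurable bindings: gesture name -> (threshold, action)
BINDINGS = {
    "jawOpen":         (0.6, "left_click"),
    "browOuterUpLeft": (0.5, "right_click"),
    "mouthLeft":       (0.5, "key_A"),
}

def actions_for_frame(scores, bindings=BINDINGS):
    """Return the actions whose gesture score meets or exceeds its threshold."""
    return [action
            for gesture, (threshold, action) in bindings.items()
            if scores.get(gesture, 0.0) >= threshold]

# Example: one frame of hypothetical gesture scores from the tracker.
frame = {"jawOpen": 0.82, "browOuterUpLeft": 0.31, "mouthLeft": 0.55}
print(actions_for_frame(frame))  # -> ['left_click', 'key_A']
```

Letting users adjust each threshold independently is what makes this approach adaptable to different levels of facial movement: a gesture someone can only perform subtly just needs a lower threshold.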
While I am not the target audience for Project Gameface, I’d like to highly recommend Can I Play That’s coverage of the program, which includes impressions from Marijn, a gamer with a facial difference who often has difficulty using programs that rely on facial recognition. They found that Project Gameface’s customisation options did help somewhat, but that the program still occasionally registered misclicks, suggesting there is room for more diverse data in its training dataset.
While this is an early proof of concept right now, and likely not as accurate or reliable as expensive bespoke solutions, the fact that Project Gameface can be operated with just a basic webcam does make it an appealing budget option, assuming it recognises your facial movements accurately and you can perform those movements reliably.
While it doesn’t look like Project Gameface will replace more expensive adaptive controllers any time soon, the work being done here is promising, and hopefully the project continues to develop rather than ending up on the pile of discarded Google software projects.