Last night, a Roblox testing place called “Avatar Showcase-(Working)” briefly became available, revealing an array of new avatar features, including a dynamic face tracking system that lets users animate their avatars’ facial expressions with their own real-life movements. The place has since been set to private and the feature disabled, but its brief appearance could indicate that Roblox will soon launch the technology to the public.
The dynamic face tracking feature works by connecting the user’s camera, whether on a phone or a PC, to their Roblox account. Once the camera and microphone are turned on, an avatar preview window can be opened from the in-experience menu, letting users watch their avatar’s mouth, eye, and head movements mirror their own in real time.
If you want to see it in action, check out the following video:
Roblox first announced this update in March 2022 as a critical step towards making the metaverse a part of people’s daily lives through natural and believable avatar interactions. The technology is based on the Facial Action Coding System (FACS), which defines a set of controls, based on facial muscle placement, that deform the 3D face mesh. The deep learning-based method uses a two-stage architecture: face detection followed by FACS regression.
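To make the two-stage split concrete, here is a minimal Python sketch of how such a pipeline fits together. The `FaceDetector` and `FACSRegressor` classes, the control count of 50, and the `animate_frame` helper are illustrative stand-ins under assumed interfaces, not Roblox’s actual code.

```python
import numpy as np

class FaceDetector:
    """Stage 1: return a face bounding box (x, y, w, h) plus landmark points."""
    def detect(self, frame: np.ndarray):
        h, w = frame.shape[:2]
        # A real detector (an MTCNN variant in Roblox's case) would run here;
        # this stub simply assumes the face fills the whole frame.
        return (0, 0, w, h), np.zeros((5, 2))

class FACSRegressor:
    """Stage 2: map an aligned face crop to FACS control weights in [0, 1]."""
    NUM_CONTROLS = 50  # assumed number of FACS controls driving the face mesh

    def predict(self, face_crop: np.ndarray) -> np.ndarray:
        # A trained CNN would produce these weights; return a neutral face here.
        return np.zeros(self.NUM_CONTROLS)

def animate_frame(frame, detector=FaceDetector(), regressor=FACSRegressor()):
    (x, y, w, h), _landmarks = detector.detect(frame)
    face_crop = frame[y:y + h, x:x + w]    # crop to the detected face region
    return regressor.predict(face_crop)    # weights that deform the avatar's 3D face mesh
```

Running this per camera frame yields one vector of FACS weights per frame, which is what ultimately drives the avatar’s facial rig.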
To keep performance real-time, Roblox implemented a fast variant of the well-known MTCNN face detection algorithm. The modified algorithm provides a 10x speed-up, allowing real-time face detection across a range of devices. The facial landmarks predicted by MTCNN are also used to align the face bounding box before the FACS regression stage, reducing the computation required there.
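The following sketch illustrates the kind of landmark-based alignment described above, using the two eye positions to rotate and scale the crop before it reaches the regressor. The function name, the 128-pixel crop size, and the placement constants are assumptions for illustration; OpenCV is used only for the affine warp.

```python
import cv2
import numpy as np

def align_face(frame: np.ndarray, left_eye, right_eye, out_size: int = 128):
    """Rotate, scale, and crop the frame so the eyes sit on a horizontal line,
    giving the FACS regressor a consistently oriented, fixed-size input."""
    (lx, ly), (rx, ry) = left_eye, right_eye
    angle = np.degrees(np.arctan2(ry - ly, rx - lx))        # head roll angle
    eye_center = ((lx + rx) / 2.0, (ly + ry) / 2.0)
    # Scale so the inter-ocular distance maps to a fixed fraction of the crop.
    eye_dist = np.hypot(rx - lx, ry - ly)
    scale = (0.4 * out_size) / max(eye_dist, 1e-6)
    M = cv2.getRotationMatrix2D(eye_center, angle, scale)
    # Translate so the eye midpoint lands in the upper-centre of the crop.
    M[0, 2] += out_size * 0.5 - eye_center[0]
    M[1, 2] += out_size * 0.4 - eye_center[1]
    return cv2.warpAffine(frame, M, (out_size, out_size))
```

Feeding the regressor a crop that is always upright and similarly scaled is what lets the second stage stay small and cheap.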
Roblox’s synthetic data pipeline has allowed iterative improvement of the model’s expressivity and robustness: the developers added synthetic sequences to improve responsiveness to previously missed expressions and to balance training across different facial identities. Temporal filtering inside the FACS weights subnetwork, an optimized backbone, and the error-free ground truth that synthetic data provides all contribute to high-quality animation at minimal computational cost.
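As a rough illustration of what temporal filtering of FACS weights accomplishes, here is a standalone exponential smoother in Python. Roblox’s filtering is part of the FACS weights subnetwork itself, so the `FACSWeightSmoother` class and its `alpha` parameter below are purely illustrative assumptions, not the actual mechanism.

```python
import numpy as np

class FACSWeightSmoother:
    """Damp frame-to-frame jitter in predicted FACS weights."""
    def __init__(self, num_controls: int, alpha: float = 0.6):
        self.alpha = alpha                    # higher alpha = more responsive, less smooth
        self.state = np.zeros(num_controls)   # previous smoothed weights

    def update(self, raw_weights: np.ndarray) -> np.ndarray:
        # Blend the new prediction with the previous output so the avatar's
        # expression changes smoothly without adding much latency.
        self.state = self.alpha * raw_weights + (1.0 - self.alpha) * self.state
        return np.clip(self.state, 0.0, 1.0)
```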
Since some Roblox experiences already have access to the experimental dynamic face tracking feature, the brief availability of the Avatar Showcase may simply have been a mistake. However, it could also indicate that the technology is nearing its official release.