Roblox Avatar Showcase: Dynamic Face Tracking Previewed Early, Likely Launching Soon

The Roblox Avatar Showcase was briefly available to users, previewing a new dynamic face tracking feature that animates avatars in real time based on the user's facial expressions.
[Image: Roblox logo over a collection of Roblox game icons. Credit: Roblox]

Last night, a Roblox testing place called “Avatar Showcase-(Working)” briefly became available, revealing an array of new avatar features, including a dynamic face tracking system that lets users animate their avatars’ facial expressions with their own real-life movements. This experimental feature, now disabled and set to private, could indicate that Roblox will soon launch the technology to the public.

The dynamic face tracking feature works by connecting the user’s camera, whether on a phone or PC, to their Roblox account. Once the feature is activated, the avatar preview window can be enabled from the in-experience menu, letting users see their avatars’ mouth, eye, and head movements mirrored in real time while their microphone and camera are on.

If you want to see it in action, check out the following video:

[Embedded YouTube video]

Roblox first announced this update in March 2022 as a critical step towards making the metaverse a part of people’s daily lives through natural and believable avatar interactions. The technology is based on the Facial Action Coding System (FACS), which defines a set of controls to deform the 3D face mesh according to the user’s facial muscle placement. The deep learning-based method uses a two-stage architecture: face detection and FACS regression.
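The two-stage design described above can be illustrated with a minimal sketch: stage one finds the face in a camera frame, and stage two regresses FACS weights from the face crop that then drive the avatar's face mesh. All function names, action-unit labels, and the stub logic below are illustrative assumptions, not Roblox's actual implementation.

```python
# Hypothetical two-stage face tracking pipeline:
# stage 1 = face detection, stage 2 = FACS weight regression.
from dataclasses import dataclass

@dataclass
class FaceBox:
    x: int
    y: int
    w: int
    h: int

def detect_face(frame):
    # Stage 1: a detector (e.g. an MTCNN-style network) would return
    # a bounding box; this stub simply assumes a centered face.
    h, w = len(frame), len(frame[0])
    return FaceBox(w // 4, h // 4, w // 2, h // 2)

def regress_facs(crop):
    # Stage 2: a regression network would map the aligned face crop
    # to FACS weights in [0, 1], one per action unit (AU).
    return {"AU1_inner_brow_raiser": 0.0, "AU12_lip_corner_puller": 0.0}

def track_frame(frame):
    box = detect_face(frame)
    crop = [row[box.x:box.x + box.w] for row in frame[box.y:box.y + box.h]]
    return regress_facs(crop)
```

The split matters for performance: the expensive regression network only ever sees a small, cropped region rather than the full camera frame.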

To ensure optimal performance, Roblox implemented a fast variant of the well-known MTCNN face detection algorithm. The modified algorithm provides a 10x speed-up, allowing for real-time face detection on various devices. Additionally, facial landmarks predicted by MTCNN are used for aligning the face bounding box before the FACS regression stage, reducing computation.
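The alignment step can be sketched as follows: the angle between the two eye landmarks gives the head's roll, which is undone before cropping so the regression stage always sees an upright, canonical face. The landmark layout and rotation approach here are assumptions for illustration, not the article's stated method.

```python
# Minimal landmark-based alignment sketch: estimate head roll from
# the eye landmarks, then rotate points to undo it.
import math

def eye_roll_angle(left_eye, right_eye):
    """Roll angle (radians) of the face, from two (x, y) eye landmarks."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.atan2(dy, dx)

def align_point(point, center, angle):
    """Rotate a landmark by -angle around center, cancelling head roll."""
    c, s = math.cos(-angle), math.sin(-angle)
    x, y = point[0] - center[0], point[1] - center[1]
    return (center[0] + c * x - s * y, center[1] + s * x + c * y)
```

Reusing the detector's landmarks this way avoids running a separate alignment network, which is part of how the computation stays cheap.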

The synthetic data pipeline used by Roblox has allowed for iterative improvement of the expressivity and robustness of the model. The developers have added synthetic sequences to enhance responsiveness to missed expressions and balanced training across different facial identities. The temporal filtering in the FACS weights subnetwork, an optimized backbone, and error-free ground-truth data from the synthetic data contribute to high-quality animation with minimal computation.
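One common form of temporal filtering, shown here as an illustration, is an exponential moving average over the per-frame FACS weights, which suppresses frame-to-frame jitter from the regression network. The filter form and the `alpha` value are assumptions; the article does not specify Roblox's exact filter.

```python
# Exponential moving average over FACS weight dicts, one dict per frame.
def smooth_facs(frames, alpha=0.5):
    """Smooth a sequence of {action_unit: weight} dicts over time."""
    state, out = {}, []
    for weights in frames:
        for au, w in weights.items():
            prev = state.get(au, w)  # first observation seeds the filter
            state[au] = alpha * w + (1 - alpha) * prev
        out.append(dict(state))
    return out
```

Lower `alpha` values give smoother but laggier animation; tuning that trade-off is what keeps the avatar responsive without visible jitter.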

Since some Roblox experiences already have access to the experimental dynamic face tracking feature, the brief availability of the Avatar Showcase may simply have been a mistake. It could, however, also indicate that the technology is nearing its official release.

Shaun Savage

Shaun Savage is the founder and editor-in-chief of Try Hard Guides. He has been covering and writing about video games for over 9 years. He is a 2013 graduate of the Academy of Art University with an A.A. in Web Design and New Media. In his off-time, he enjoys playing video games, watching bad movies, and spending time with his family.
